US20040063501A1 - Game device, image processing device and image processing method - Google Patents


Info

Publication number
US20040063501A1
US20040063501A1 (application US10/441,031)
Authority
US
United States
Prior art keywords
character
virtual camera
distance
damage
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/441,031
Inventor
Hitoshi Shimokawa
Yukio Tsuji
Kazutomo Sanbongi
Junji Shibasaki
Takenao Sata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Corp filed Critical Sega Corp
Assigned to KABUSHIKI KAISHA SEGA reassignment KABUSHIKI KAISHA SEGA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SANBONGI, KAZUTOMO, SATA, TAKENAO, SHIBASAKI, JUNJI, SHIMOKAWA, HITOSHI, TSUJI, YUKIO
Publication of US20040063501A1 publication Critical patent/US20040063501A1/en


Classifications

    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/577: Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F13/837: Shooting of targets
    • A63F2300/64: Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/643: Computing dynamical parameters of game objects by determining the impact between objects, e.g. collision detection
    • A63F2300/6661: Methods for processing data by generating or executing the game program for rendering three-dimensional images, for changing the position of the virtual camera
    • A63F2300/6684: Changing the position of the virtual camera by dynamically adapting it to keep a game object in its viewing frustum, e.g. for tracking a character or a ball
    • A63F2300/8076: Shooting

Definitions

  • the present invention relates to an image processing device, particularly to a game device.
  • An image processing device defines various kinds of characters in a virtual space formed by the computer and loads the manipulation information of players into the computer through peripheral equipment such as joysticks, thereby realizing image processing for moving characters, etc.
  • images viewed from a viewpoint in the three-dimensional virtual space, called a virtual camera, are displayed on a TV monitor for the players to see.
  • One example of the image processing device is a game device in which players compete against each other over shooting characters displayed on the screen (for example, “House of the Dead” available from Sega Enterprises, Ltd.).
  • This game device is composed such that the virtual camera moves along a predetermined course in the three-dimensional space, and a player moves while shooting enemies (zombies).
  • Each enemy has some weak points, and when the player accurately hits a weak point, the enemy incurs damage points; when the total damage points exceed a predetermined value, the enemy is defeated.
  • a time limit is predetermined for the player to defeat the enemies. Therefore, if the player fails to defeat the enemies within the predetermined time limit, the player is attacked by the enemies and incurs damage points.
  • This game device is also composed such that, when the player switches on a trigger of a gun pointing at the screen, the time elapsed for the gun to detect a scanning line on the screen is computed, and the coordinates of the location pointed at by the gun is further computed, and thereby a decision is made as to whether or not a bullet will hit the enemy character.
  • the conventional game device has configurations unfavorable to the player: a time to fight the enemies is predetermined, and if the player cannot defeat the enemies within the predetermined time limit, the player is attacked by the enemies and his/her damage points increase.
  • even when the player does defeat the enemies within the time limit, there is no arrangement favorable to the player. For instance, when the player defeats the enemies just in time, or even with time to spare, there are no positive effects on the future development of the game or the player's game score. Accordingly, there is a problem that players who are skilled in the shooting technique and can defeat the enemies within the time limit lose their fighting spirits.
  • the conventional game device is hardly realistic beyond deciding whether a bullet hits an enemy. Therefore, the player cannot develop a strategy by familiarizing himself/herself with the types and characteristics of weapons. Accordingly, the conventional game device does not present enough entertaining characteristics for a gun shooting game.
  • the present invention provides an image processing method for moving a virtual camera located in a three-dimensional virtual space at a predetermined speed and changing the distance between the virtual camera and a character defined in the three-dimensional virtual space, wherein the moving speed of the virtual camera changes based on the distance between the virtual camera and the character.
  • a first character defined in the three-dimensional virtual space and a second character manipulated by the player are displayed, and the moving speed of the virtual camera changes based on the distance between the first and second characters.
  • the present invention also provides an image processing method for directing a virtual camera to a character located in a three-dimensional virtual space, wherein a fixation point of the virtual camera is set on the character in a manner so that the speed of directing the virtual camera to the character changes based on the distance between the virtual camera and the character.
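As a rough sketch of this idea, the camera's fixation point can be interpolated toward the character with a weight that depends on the camera-to-character distance. The mapping and its constants (`near`, `far`, `fast`, `slow`) are assumptions for illustration, not values from the patent.

```python
def turn_rate(distance, near=2.0, far=20.0, fast=0.5, slow=0.05):
    """Per-frame interpolation weight: aim quickly at close range, slowly at long
    range (the direction of the mapping is an assumption for illustration)."""
    if distance <= near:
        return fast
    if distance >= far:
        return slow
    t = (distance - near) / (far - near)
    return fast + (slow - fast) * t

def update_fixation_point(fixation, target, distance):
    """Move the camera's fixation point a fraction of the way toward the character."""
    w = turn_rate(distance)
    return tuple(f + (t - f) * w for f, t in zip(fixation, target))
```

Calling this every frame would make the camera settle quickly on a nearby character while panning gently toward distant ones.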
  • the present invention further provides a game device that is composed such that a virtual camera located in a three-dimensional virtual space moves at a predetermined speed and the distance between the virtual camera and a character defined in the virtual space changes, comprising: virtual camera controlling means for changing a moving speed of the virtual camera based on the distance between the virtual camera and the character.
  • the game device displays a first character defined in the three-dimensional virtual space and a second character manipulated by the player, and the virtual camera controlling means changes the moving speed of the virtual camera based on the distance between the first and second characters.
  • the virtual camera controlling means controls the virtual camera moving speed such that the moving speed decreases as the distance between the virtual camera and the character becomes shorter.
  • in the game device, it is desirable that a plurality of areas be provided in the three-dimensional virtual space with the virtual camera at the center, and that the virtual camera controlling means determine in which area the character closest to the virtual camera exists and control the virtual camera moving speed in accordance with the determined area.
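One way to realize such area-based control is to define concentric distance bands around the camera and look up a speed for the band containing the nearest character. The band limits and speeds below are invented for illustration.

```python
import math

# (outer distance limit, camera speed) pairs, nearest band first; values are illustrative.
SPEED_BANDS = [(5.0, 0.0), (10.0, 0.2), (20.0, 0.6)]
CRUISE_SPEED = 1.0  # speed when no enemy is near

def camera_speed(camera_pos, enemy_positions):
    """Slow the camera down the closer the nearest enemy is (stop when very close)."""
    if not enemy_positions:
        return CRUISE_SPEED
    d = min(math.dist(camera_pos, e) for e in enemy_positions)
    for limit, speed in SPEED_BANDS:
        if d < limit:
            return speed
    return CRUISE_SPEED
```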
  • the present invention provides a game device for directing a virtual camera to a character located in a three-dimensional virtual space, comprising: fixation point setting means for setting a fixation point of the virtual camera on the character such that the speed of directing the virtual camera to the character changes in accordance with the distance between the virtual camera and the character.
  • the present invention also provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the player's shooting of it, based on a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • the damage point computing means compute a damage point of the character caused by the player's shooting of it, by multiplying a damage value, which is determined based on the distance between the virtual camera and the character, by a damage rate that is determined based on the distance between the character and the center of the effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • the damage value be determined such that the further the distance between the virtual camera and the character is, the smaller the damage value is
  • the effective shooting radius be determined such that the further the distance between the virtual camera and the character is, the larger the effective shooting radius is
  • the damage rate be determined such that the further the distance between the character and the center of the effective shooting radius is, the smaller the proportion is.
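Taken together, the three rules above can be sketched as a base damage value that falls off with range, an effective radius that widens with range, and a damage rate that shrinks toward the edge of that radius. All constants and falloff shapes are assumptions for illustration, not values from the patent.

```python
def damage_value(dist, base=100.0, max_range=40.0):
    """Bullet strength: weaker the further the target is (assumed linear falloff)."""
    return max(base * (1.0 - dist / max_range), 0.0)

def effective_radius(dist, near_r=0.5, spread=0.05):
    """Shot spread: the effective shooting radius widens with range."""
    return near_r + spread * dist

def damage_rate(offset, radius):
    """Fraction applied: full at the center of the spread, zero at its edge."""
    if offset >= radius:
        return 0.0
    return 1.0 - offset / radius

def damage_point(dist, offset):
    """Damage = damage value multiplied by damage rate, as in the claim."""
    return damage_value(dist) * damage_rate(offset, effective_radius(dist))
```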
  • the present invention provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion of an overlapping area, in which an effective shooting radius and a collision sphere of the enemy overlap, to the shooting radius.
  • the present invention provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion of an overlapping area, in which an effective shooting radius and a collision sphere of the enemy overlap, to the collision sphere of the enemy.
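The two overlap-based variants can be illustrated in two dimensions with the standard circle-circle intersection area; the patent gives no formulas, so the geometry below (flat circles rather than spheres, a linear damage scale) is an assumption.

```python
import math

def circle_overlap_area(d, r1, r2):
    """Intersection area of two circles whose centers are distance d apart."""
    if d >= r1 + r2:
        return 0.0                       # disjoint
    if d <= abs(r1 - r2):
        r = min(r1, r2)
        return math.pi * r * r           # one circle entirely inside the other
    a1 = math.acos((d*d + r1*r1 - r2*r2) / (2*d*r1))
    a2 = math.acos((d*d + r2*r2 - r1*r1) / (2*d*r2))
    tri = 0.5 * math.sqrt((-d+r1+r2) * (d+r1-r2) * (d-r1+r2) * (d+r1+r2))
    return r1*r1*a1 + r2*r2*a2 - tri

def overlap_damage(d, shot_r, enemy_r, max_damage=100.0, relative_to="shot"):
    """Damage proportional to the overlap, measured against the effective shooting
    area ("shot") or the enemy's collision area ("enemy")."""
    overlap = circle_overlap_area(d, shot_r, enemy_r)
    ref_r = shot_r if relative_to == "shot" else enemy_r
    return max_damage * overlap / (math.pi * ref_r * ref_r)
```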
  • the present invention provides an image processing device that is composed to change a distance between a character and a virtual camera both located in a three-dimensional virtual space, comprising: means for computing the distance between the virtual camera and the character; and virtual camera controlling means for changing a moving speed of the virtual camera in accordance with the computed distance.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the player's shooting of it is computed on the basis of a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with the proportion of an overlapping area, in which an effective shooting radius and a collision sphere of the enemy overlap, to the shooting radius.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with the proportion of an overlapping area, in which an effective shooting radius and a collision sphere of the enemy overlap, to the collision sphere.
  • the present invention provides a game controlling method whereby a game device is controlled such that it determines whether a hit decision area generated in a virtual space in accordance with a player's manipulation overlaps with an object located in the virtual space, and the character is damaged when it is determined that the hit decision area overlaps with the object, the method comprising: a step of obtaining first positional information indicating a position of a character being manipulated by the player in the virtual space; a step of obtaining second positional information indicating a position of the object in the virtual space; a step of computing a distance between the character and the object based on the obtained first and second positional information; a step of changing the size of the hit decision area based on the obtained distance; and a step of computing, when it is determined that the hit decision area and the object overlap with each other, a damage amount for the object based on the obtained distance, and realizing damage to the object based on the obtained damage amount.
  • it is desirable that the hit decision area be made small and the damage amount large when the obtained distance is shorter than a predetermined distance.
  • conversely, it is desirable that the hit decision area be made large and the damage amount small when the computed distance is longer than the predetermined distance.
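The steps of this method, with the two distance rules above, can be sketched as a single routine. The threshold and the concrete sizes and amounts are invented; the patent only requires the inverse relationship (near: small area, large damage; far: large area, small damage).

```python
import math

THRESHOLD = 10.0  # the claim's "predetermined distance" (value illustrative)

def process_shot(character_pos, object_pos, impact_pos):
    """Distance -> hit-decision-area size -> overlap test -> damage amount."""
    distance = math.dist(character_pos, object_pos)
    # Near: small hit decision area but large damage; far: the reverse.
    hit_radius = 0.5 if distance < THRESHOLD else 1.5
    damage = 80.0 if distance < THRESHOLD else 30.0
    hit = math.dist(impact_pos, object_pos) <= hit_radius
    return damage if hit else 0.0
```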
  • the present invention provides a game controlling method whereby a game device is controlled such that it determines whether an effective area, generated in a virtual space with a predetermined target point at its center in accordance with a player's manipulation, overlaps with an object located in the virtual space, and the object is damaged when it is determined that the effective area overlaps with the object, the method comprising: a step of obtaining positional information indicating a position of the object in the virtual space; a step of computing an area wherein the effective area and the object overlap with each other, based on the effective area and the obtained positional information; and a step of generating data of a damage amount to be attributed to the object, based on the computed area, and providing damage to the object based on the generated damage amount data.
  • the present invention provides a shooting game controlling method, wherein the shooting game is controlled such that it simulates a player's shooting of a character defined in a three-dimensional virtual space, while a virtual camera located in the three-dimensional virtual space is moving at a predetermined speed, comprising: a step of changing the distance between the character and the virtual camera; a step of changing the moving speed of the virtual camera or the speed of directing the virtual camera to the character, based on the distance between the character and the virtual camera; a step of changing the player's effective shooting radius based on the distance between the virtual camera and the character; a step of determining whether a bullet has hit the character, based on the character's position and the location of the effective shooting radius; and a step of computing, when it is determined that the bullet has hit the character in the determination step, a damage amount caused to the character by the shooting, based on both the distance between the virtual camera and the character as well as the distance between the character and the center of the effective shooting radius.
  • FIG. 1 is a block diagram indicating the general structure of a game device according to one embodiment of the present invention.
  • FIG. 2 is a flow chart of the entire general process performed by the CPU according to the embodiment.
  • FIG. 3 is a flow chart of the process executed in the game mode.
  • FIG. 4 illustrates the relationship between virtual camera movements and the enemies.
  • FIG. 5 is a flow chart of one example for the controlling process of the virtual camera.
  • FIG. 6(A) illustrates the relationship between the enemy sensing distance and the distance d of the enemy closest to the virtual camera.
  • FIG. 6(B) shows examples of formulas for computing acceleration.
  • FIG. 7 is a diagram explaining a fixation point for the virtual camera.
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength.
  • FIG. 9 is a diagram showing one example of the relationship between an effective shooting scope and the bullet strength.
  • FIG. 10 is a diagram showing one example of an effective shotgun radius.
  • FIG. 11 is a flow chart explaining the entire hit decision process.
  • FIG. 12(A) is a flow chart of the hit decision process.
  • FIG. 12(B) is an example in which a collision cone of the shotgun's bullet is divided into 16 sections.
  • FIG. 13(A) is a flow chart explaining the damage process.
  • FIG. 13(B) is an example of a computation for damage rate.
  • FIG. 14 shows an example of the configuration of the damage chart.
  • FIG. 15 shows image examples of objects (enemies) being shot.
  • FIG. 16 is a flow chart of the injury process.
  • FIG. 17 shows an example of the configuration for an injury progression value (damage progression value) chart.
  • FIG. 18 is a diagram explaining a second damage process.
  • FIG. 19 is a diagram explaining a second damage process.
  • FIG. 1 is a block diagram indicating one example of the game device of an arcade type game for playing a gun shooting game, according to the present invention.
  • the basic components of this game device include a game device main body 10 , an input device 11 , a TV monitor 13 , and a speaker 14 .
  • the input device 11 is a weapon such as a gun, shotgun, or a machine gun, for shooting enemies in the game.
  • the weapon is a shotgun used by the game player.
  • a shotgun includes a photoreceptor for reading a scanning spot (a light spot of the electron beam) at an impact point on the TV monitor, and a trigger switch that is equivalent to the trigger of a regular shotgun and is pulled to fire. Scanning spot detection signals and trigger signals are transmitted to the interface 106 , which will be described hereinafter, via a connecting cord.
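The scanning-spot technique works because a CRT draws the screen line by line at a fixed rate: the delay between the start of a frame and the moment the photoreceptor sees the beam identifies the scan line (y) and the position along it (x). A minimal sketch, with illustrative NTSC-like timing constants that the patent does not specify:

```python
# Illustrative CRT timing constants (not from the patent).
LINE_PERIOD_US = 63.6    # time to scan one line, in microseconds
VISIBLE_WIDTH_US = 52.6  # active-video portion of a line (ignores blanking)

def gun_position(elapsed_us):
    """Map the delay from vertical sync to photoreceptor trigger onto screen coords."""
    line, offset = divmod(elapsed_us, LINE_PERIOD_US)
    y = int(line)                   # scan line number
    x = offset / VISIBLE_WIDTH_US   # fraction of the way across the visible line
    return x, y
```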
  • the TV monitor 13 displays images showing the status of the game development.
  • the TV monitor can be replaced by a projector.
  • the game device main body 10 comprises a central processing unit (CPU) 101 , a ROM 102 , a RAM 103 , a sound device 104 , an input/output interface 106 , a scroll data processor 107 , a coprocessor (auxiliary processor) 108 , a landform contour data ROM 109 , a geometrizer 110 , a form data ROM 111 , a drawing device 112 , a texture data ROM 113 , a texture map RAM 114 , a frame buffer 115 , an image composition device 116 , and a D/A converter 117 .
  • Examples of a storage medium used in this invention as the ROM 102 may include a hard disc, a cartridge-type ROM, a CD-ROM, other well-known media, and communication media (the internet and other personal computer communication networks).
  • the CPU 101 is connected through a bus-line to: the ROM 102 having a predetermined program stored therein; RAM 103 for storing data; sound device 104 ; input/out interface 106 ; scroll data processor 107 ; coprocessor 108 ; and geometrizer 110 .
  • the RAM 103 is operated as a RAM buffer.
  • Various commands (to display objects, etc) to the geometrizer 110 and matrices obtained by computing the transformation matrix are written on the RAM 103 .
  • the input device 11 (shotgun) is connected to the input/output interface 106 .
  • CPU 101 checks whether the shotgun was fired based on a scanning spot detection signal sent from the shotgun 11 and a trigger signal indicating that the shotgun switch was pulled, and identifies an impact point and the number of shots fired in accordance with the current coordinates (X, Y) of the location of the scanning electron beam on the TV monitor and a location of a target. Then, CPU 101 sets various corresponding flags at predetermined positions in the RAM 103 .
  • the sound device 104 is connected to the speaker 14 through a power amplifier 105 , and audio signals generated by the sound device 104 are amplified in electric power, and then transmitted to the speaker 14 .
  • the CPU 101 reads, on the basis of a program stored in the ROM 102 , the game story development, the landform data in ROM 109 , or the form data (three-dimensional data of objects such as enemy characters and of the game scenery, including landscape, buildings, interiors, and underground passages) in the ROM 111 ; then the CPU 101 determines the situation in the three-dimensional virtual space and executes the shooting process in correspondence with the trigger signals sent from the input device 11 .
  • for the various objects in the virtual game space, their coordinate values in the three-dimensional space are determined; then the transformation matrix for transforming the coordinate values to a viewpoint coordinate system, and the form data (buildings, landforms, interiors, laboratories, furniture, etc.), are specified to the geometrizer 110 .
  • the coprocessor 108 is connected to the landform data ROM 109 , and accordingly, the landform data for the predetermined movement course of the camera is delivered to the coprocessor 108 (and the CPU 101 ).
  • the coprocessor 108 decides whether a bullet hits a target, computes deviations of objects from the camera's line of sight, executes the process for the movement of the line of sight, and takes on computing floating-points upon such decision and computation. Consequently, the results of the coprocessor's decision as to whether a bullet hit an object and the process for the movement of the line of sight which is movement relative to the position of the objects, are transmitted to the CPU 101 .
  • the geometrizer 110 is connected to the form data ROM 111 and the drawing device 112 .
  • Prestored on the form data ROM 111 is the polygon form data (three-dimensional data of buildings, walls, corridors, interiors, landforms, scenery, the main character, characters on the main character's side, various kinds of enemies (zombies), etc., all being composed of respective vertices).
  • This form data is delivered to the geometrizer 110 .
  • the geometrizer 110 executes the perspective transformation of the form data specified by the transformation matrix sent from the CPU 101 , and obtains the form data in which the coordinate system of the three-dimensional virtual space has been transformed to the coordinate system of a visual field.
  • the drawing device 112 pastes together the transformed form data of the visual field coordinate system and the textures and then outputs it to the frame buffer 115 .
  • the drawing device 112 is connected to the texture data ROM 113 , the texture map RAM 114 , and the frame buffer 115 .
  • Polygon data refers to a data group of relative or absolute coordinates of respective vertices each composing a polygon (mainly a trigon or tetragon), which consists of a set of plural vertices.
  • the polygon data stored in the landform data ROM 109 is set relatively roughly but enough to move the camera in the virtual space along the game storyline.
  • the polygon data stored in the form data ROM 111 is set in more detail concerning the forms, such as enemies and backgrounds, that compose the screen.
  • Scroll data processor 107 processes data such as letters on a scroll screen.
  • the scroll data processor 107 and the frame buffer 115 are connected to the TV monitor 13 via the image composition device 116 and the D/A converter 117 .
  • the polygon screen (a simulation result) for objects (enemies) and landforms (backgrounds) stored temporarily in the frame buffer 115 and the scroll screen for text information (for example, the player's LifeCount value, damage points, etc.) are composed to generate final frame-image data.
  • This frame-image data is converted into analog signals by the D/A converter 117 and transmitted to the TV monitor 13 , thereby, real-time images of the shooting game are displayed.
  • FIG. 2 is a flow chart explaining the outline of the game, and the process flow is broadly classified into a movement mode and a game mode.
  • in the movement mode (S 10 ), the virtual camera moves in the virtual game space created in the computer system in accordance with the pre-programmed game story, and also projects various game status updates onto the screen.
  • it is then determined whether the game is over.
  • the player can go back to the game mode of a different status or the exact game mode in which the player lost, if, for example, the main character's damage is not heavy (S 50 ; NO).
  • the game is over (S 50 ; YES) when the set time of each section of the game runs out or when game parameters, such as damage point, satisfy the game termination requirements.
  • FIG. 3 is a flow chart explaining the process in the game mode (S 30 ).
  • the virtual camera moves in the virtual space and, when an enemy appears, an enemy appearance means executes the process to make enemies appear of the type and number preprogrammed for the scene (S 302 ).
  • for making the enemies appear, well-known techniques such as the Japanese Patent Laid-Open Publication No. Hei 10-185547 may be used.
  • a virtual camera controlling means changes the moving speed of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S 304 ).
  • the process to control the moving speed of the virtual camera will be explained hereinafter with reference to FIGS. 4 to 6 .
  • the virtual camera controlling means also changes a fixation point of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S 304 ). This process will be explained hereinafter with reference to FIG. 4.
  • a shooting result determination means determines a shooting result (S 306 ). At first, the shooting result determination means determines whether the bullet has hit the enemy (hit decision), and if the bullet has hit the enemy, a hit flag is set for the shot enemy, and a damage point and an injury progression value incurred by the shot are computed. The hit decision and the computation of a damage point are executed in accordance with the shotgun properties, and the details of this process will be hereinafter explained with reference to FIGS. 11 to 13 .
  • an enemy moving means executes the enemy moving process (S 308 ) for moving another enemy towards a clear space or the place where the previous enemy was shot.
  • for the moving process of the enemies, well-known techniques such as the above-cited Japanese Patent Laid-Open Publication No. Hei 10-165547 may be used.
  • in Step 310 , a determination is made as to whether the fighting will continue. If the fighting is not yet finished (S 310 ; YES), such as in the case that some enemies still remain on the screen, it is determined whether to make further enemies appear based on the program of the game story (S 312 ). If another enemy should appear (S 312 ; YES), the enemy is made to appear (S 302 ). If enemies should no longer appear (S 312 ; NO), the process moves on to the determination of the shooting results for the remaining enemies (S 306 ), and Steps 308 to 310 are repeated. When the fighting is determined to be over (S 310 ; NO), the process returns to the above-mentioned Step 40 . Then, it is determined whether to return to the movement mode that leads to another fight scene (S 40 ), or to finish the game (S 50 ).
  • the virtual camera is a viewpoint located in the three-dimensional virtual space, and images seen from this viewpoint are presented to the player through the monitor.
  • the virtual camera stops its movement. Accordingly, when an enemy appears (i.e. when the virtual camera arrives at a predetermined position), the player stops to shoot the enemy. Further, the moving speed for the virtual camera of a conventional game is constant.
  • the player's speed of defeating the enemies affects the progression of the game.
  • the moving speed of the camera is changed such that the shorter the distance becomes, the slower the moving speed of the camera becomes. Accordingly, the faster the player defeats the approaching enemies (the more the player defeats the enemies far off in the distance), the faster the game progresses, and consequently, the player can obtain a higher score.
  • the more slowly the player defeats the enemies, the more slowly the game progresses; therefore, the player cannot obtain a high score.
  • the speed of defeating the enemies influences the game development and the game results. Further, since the player finds and shoots the enemies while moving toward a destination, the player feels a sense of urgency and thus, an increased enjoyment of the game.
  • the virtual camera moves along a predetermined track with a predetermined speed and angle.
  • There is a certain distance (enemy sensing distance) from the virtual camera wherein the player can sense an enemy and the virtual camera moving speed changes depending on the distance between the virtual camera and the enemy.
  • the virtual camera moving speed decelerates. The closer the enemy approaches the virtual camera, the more the virtual camera moving speed decelerates, and when the enemy reaches a certain proximity, the virtual camera stops its movement.
  • the enemy sensing distance is divided into three areas having the virtual camera at the center: a normal moving speed area in which the virtual camera moves at a normal speed; a low moving speed area in which the virtual camera moves at a low speed; and a non-moving area in which the virtual camera stops its movement. These areas are defined by the enemy's distance from the virtual camera.
  • FIG. 5 is a flow chart explaining the process for controlling the virtual camera (S 304 in FIG. 3).
  • FIG. 6(A) shows the areas of the different moving speeds of the camera.
  • FIG. 6(B) explains formulas for computing an acceleration for the camera.
  • a position d of an enemy that is closest to the virtual camera is determined (S 304 a ).
  • the non-moving area of the camera is within the 2.5-meter distance from the virtual camera; the low moving speed area of the camera is within the 10-meter distance from the virtual camera (excluding the non-moving area of the camera); and the normal moving speed area of the camera is the area further than the 10-meter distance from the virtual camera.
  • it is determined whether the enemy's position d is within the non-moving area of the camera (S 304 b ). If it is determined that it is within the non-moving area (S 304 b ; YES), an acceleration in the non-moving area is computed (S 304 c ). The acceleration is obtained by the formula shown in FIG. 6(B).
  • the virtual camera moving speed s is computed based on the above-obtained acceleration (S 304 g ), and it is further determined whether the obtained moving speed s is less than zero (S 304 h ). If the obtained moving speed s is less than zero (S 304 h ; YES), the moving speed s is set to zero (S 304 i ). Conversely, when the obtained moving speed s is not less than zero (S 304 h ; NO), it is determined whether it is more than one, and if it is more than one (S 304 k ; YES), the moving speed s is set to one.
  • the position d of the enemy (the distance between the virtual camera and the enemy character) is computed. Then, an area that includes the enemy's position d is determined, the acceleration of the virtual camera is computed based on the specified area, and the virtual camera moving speed is computed on the basis of the obtained acceleration.
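The control loop described above (find the nearest enemy's distance, pick the matching area, compute an acceleration, then clamp the speed to [0, 1]) can be sketched as follows. The 2.5 m and 10 m thresholds come from the description; the acceleration values and function names are illustrative assumptions standing in for the formulas of FIG. 6(B).

```python
# Area thresholds taken from the description; accelerations are invented.
NON_MOVING_DIST = 2.5   # meters: camera must stop inside this distance
LOW_SPEED_DIST = 10.0   # meters: camera slows inside this distance

def camera_acceleration(d):
    """Acceleration chosen per area (placeholder for FIG. 6(B))."""
    if d <= NON_MOVING_DIST:
        return -0.2   # decelerate hard inside the non-moving area
    elif d <= LOW_SPEED_DIST:
        return -0.05  # decelerate gently in the low moving speed area
    else:
        return 0.05   # accelerate back toward normal speed

def update_camera_speed(s, d):
    """Update moving speed s from the nearest enemy's distance d and
    clamp it to [0, 1], mirroring steps S304g to S304k."""
    s += camera_acceleration(d)
    return max(0.0, min(1.0, s))
```

When all enemies are defeated, the same clamp lets the speed climb back to the normal value (1.0 here) frame by frame.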
  • the virtual camera moving speed may be set back to the normal speed. Specifically, it is determined whether the enemy characters in the virtual space have all been defeated. If so determined, the virtual camera moving speed is set back to the normal speed.
  • the virtual camera moving direction and the game story development may be changed (or may follow other preprogrammed branches) in accordance with the time elapsed for enemy characters to appear in the virtual space and to be annihilated. Specifically, the time elapsed is measured, then a virtual camera moving direction is selected and the game story is specified for how it will develop in accordance with the time.
  • the virtual camera moving speed changes in accordance with the distance between the virtual camera and the enemies, and the shorter the distance becomes, the slower the virtual camera moving speed becomes. Accordingly, if the player shoots the enemies from far away, the virtual camera moving speed does not decrease. In other words, the faster the player defeats the enemies, the faster the game advances, thereby the player obtains a higher score.
  • the game device can provide a tense mood. For example, while a target enemy approaches from the back of the screen, the player (the virtual camera) moves toward a certain destination. The moving speed of the player does not change as long as the enemy is far away, therefore, the player feels as if he/she is advancing towards the destination on his/her own. However, as time passes and the enemy approaches the player, the player's moving speed decelerates and the player recognizes that he/she will have a battle with the enemies and so he/she becomes nervous.
  • the player's character stops its movement and remains in that position until the battle with the enemy is over.
  • the player fights with the enemies with a sense of urgency, fearing that he/she might be defeated. Accordingly, the combination of the player's movements and the enemies' movements can provide a real-life tense atmosphere.
  • the virtual camera moves in the three-dimensional virtual space according to the program.
  • the line of sight of the camera is directed to a certain point (a fixation point) in the virtual space and images are generated with the fixation point at the center of the display screen.
  • the fixation point is controlled according to the situation of the enemy located in the direction of the virtual camera's line of sight. Control of the fixation point is executed such that the speed with which the fixation point follows the enemy changes based on the distance between the virtual camera and the enemy. More specifically, as the enemy reaches within the enemy sensing distance, the fixation point starts to follow the enemy, and the shorter the distance becomes, the more the speed of the fixation point for following the enemy increases.
  • the fixation point of the virtual camera is predetermined according to the program.
  • an enemy reaches within the enemy sensing distance (enemy 1 in FIG. 4)
  • the fixation point of the virtual camera starts to follow the enemy, but since the enemy is still far away from the virtual camera, the speed of the fixation point for following the enemy is set to slow.
  • the enemy approaches the virtual camera (enemy 2 in FIG. 4)
  • the enemy following speed of the fixation point is set to fast.
  • the speed of the fixation point for following the enemy will be at the maximum setting.
  • a fixation point setting means sets a fixation point of the virtual camera. Next, it selects an enemy on which the fixation point should be fixed. This enemy is the one within the enemy sensing distance and closest to the virtual camera. Then, the fixation point setting means determines the speed to move the fixation point in accordance with the enemy's position and moves the fixation point at the determined speed.
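The fixation-point behavior above can be sketched as follows. The 15 m sensing distance, the linear speed ramp, and the per-axis interpolation are assumptions; the description only states that the following speed is zero outside the sensing distance and increases as the enemy gets closer.

```python
def follow_speed(distance, sensing_distance=15.0):
    """Speed at which the fixation point follows the enemy: zero
    outside the sensing distance, growing linearly toward the maximum
    (1.0) as the enemy closes in.  Values are illustrative."""
    if distance >= sensing_distance:
        return 0.0
    return 1.0 - distance / sensing_distance

def move_fixation_point(fixation, target, speed):
    """Move the fixation point a fraction `speed` of the way toward
    the followed enemy each frame (simple per-axis interpolation)."""
    return tuple(f + (t - f) * speed for f, t in zip(fixation, target))
```

Called every frame with the enemy nearest to the virtual camera as `target`, this makes the camera's gaze drift slowly toward distant enemies and snap quickly onto close ones.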
  • the player's weapon is a shotgun. Accordingly, it is desirable that the shooting results be determined in a manner that effectively demonstrates the shotgun property in which “bullets scatter in a wide radius”. This characteristic of the shotgun means that, if fired at an object close by, the bullets impact a small area with high density, thereby demonstrating their greatest strength. On the other hand, if fired at a distant object, the bullets scatter and impact a wide area with low density, and a deadly force cannot be fully realized.
  • damage to be suffered by an enemy is determined based on the following points: the amount of damage changes according to the distance to the enemy; the effective shooting scope (bullet strength) changes in accordance with the above distance; and the damage also changes in accordance with the bullet's impact point within the effective radius.
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength.
  • the bullet strength and the effective shooting scope are determined to change according to the distance to the enemy. For example, if the distance is 3 meters, the bullet strength is 100 points, but the bullet strength decreases to 60 points when the distance is 5 meters, and to 30 points when the distance is 7 meters. Whereas, if the distance to the enemy is 3 meters, the effective shooting scope is 20 centimeters, and it expands to 60 centimeters when the distance is 5 meters, and to 70 centimeters when the distance is 7 meters.
  • FIG. 9 shows an example of the relationship between the effective shooting scope and the bullet strength.
  • the bullet strength is set in a manner so that the further the impact point is located from the target point of the player (the center of the concentric circle), the more the bullet strength and the damage suffered by the enemy decrease.
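The distance-dependent bullet strength and effective shooting scope can be sketched as a lookup with interpolation. The three sample points (3 m → 100 points / 20 cm, 5 m → 60 points / 60 cm, 7 m → 30 points / 70 cm) are taken directly from the description of FIG. 8; linear interpolation between them is an assumption, since the patent only gives sample values.

```python
# (distance in meters, bullet strength in points, effective scope in cm),
# sample values from the description of FIG. 8.
STRENGTH_TABLE = [(3.0, 100.0, 20.0), (5.0, 60.0, 60.0), (7.0, 30.0, 70.0)]

def strength_and_scope(distance):
    """Return (bullet strength, effective shooting scope) for a given
    distance, linearly interpolating between the sample points and
    clamping outside the table range."""
    pts = STRENGTH_TABLE
    if distance <= pts[0][0]:
        return pts[0][1], pts[0][2]
    if distance >= pts[-1][0]:
        return pts[-1][1], pts[-1][2]
    for (d0, s0, r0), (d1, s1, r1) in zip(pts, pts[1:]):
        if d0 <= distance <= d1:
            t = (distance - d0) / (d1 - d0)
            return s0 + (s1 - s0) * t, r0 + (r1 - r0) * t
```

Note how the two quantities move in opposite directions: strength falls with distance while the scatter scope widens, which is the trade-off the damage process exploits.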
  • FIG. 10 shows one example of the relationship between the distance to the enemy and the effective shooting scope.
  • FIG. 11 is a flow chart explaining the process of the shooting results determination process (S 306 in FIG. 3).
  • the hit decision means executes the hit decision process for determining whether the bullet hits the enemy (S 306 a ). This hit decision process will be described hereinafter with reference to FIGS. 12 (A) and 12 (B).
  • a hit flag is set.
  • a damage computing means executes the damage process for computing a damage point caused by the shooting (S 306 b ).
  • the damage process will be described hereinafter with reference to FIGS. 13 (A) and 13 (B).
  • injury severity computing means executes the injury process for determining the injury severity of the enemies in accordance with the shooting results and expressing the injury visually (S 306 c ). The injury process will be hereinafter described with reference to FIG. 16.
  • FIG. 12(A) is a flow chart explaining the hit determination process flow.
  • the player shoots an enemy S 306 a 1 ; YES
  • the enemy's coordinates are converted to the coordinate system in which the player's position is the origin and the vector of the shooting direction is the Z-axis (S 306 a 2 ).
  • a radius DR i.e., an effective shooting scope (extent of the scatter shot) at the Z position of the enemy is computed (S 306 a 3 ) and a distance L between the enemy and the Z-axis is computed (S 306 a 4 ).
  • a radius R of the enemy's collision sphere is computed (S 306 a 5 ) and it is determined whether the bullet has hit the enemy based on the radius DR, the distance L, and the radius R (S 306 a 6 ).
  • the sum of the radius R and the radius DR is greater than or equal to the distance L, it is considered that the bullet has hit the enemy (S 306 a 6 ; YES).
  • the sum is less than the distance L (S 306 a 6 ; NO)
  • the cross section of the collision cone of the shotgun pellets at the enemy's Z position is divided into sections of a predetermined number (for example, 16 sections), and it is determined which sections cover the enemy (S 306 a 8 ).
  • FIG. 12(B) shows the collision cone, which is divided into sections 1 to 16 .
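The hit decision just described reduces to a few quantities: the enemy's depth z along the shooting axis, the lateral distance L from that axis, the scatter radius DR at that depth, and the collision-sphere radius R. A sketch, where the scatter cone's half-angle is an illustrative assumption (the patent derives DR from the effective shooting scope instead):

```python
import math

def to_shooter_frame(enemy_pos, player_pos, aim_dir):
    """Express the enemy position relative to a frame whose origin is
    the player and whose Z-axis is the shooting direction (S306a2).
    Returns (z, L): only depth along the shot and lateral distance
    are needed, so a full basis construction is skipped."""
    rel = [e - p for e, p in zip(enemy_pos, player_pos)]
    norm = math.sqrt(sum(c * c for c in aim_dir))
    unit = [c / norm for c in aim_dir]
    z = sum(r * u for r, u in zip(rel, unit))       # depth along the shot
    lateral_sq = sum(r * r for r in rel) - z * z    # off-axis component
    return z, math.sqrt(max(0.0, lateral_sq))

def hit_decision(z, L, R, scatter_half_angle=0.05):
    """The bullet hits when the scatter radius DR at the enemy's Z
    plus the collision-sphere radius R reaches the lateral distance L
    (S306a6).  The half-angle value is an assumption."""
    DR = z * math.tan(scatter_half_angle)
    return z > 0 and (R + DR) >= L
```

On a hit, a hit flag would then be set and the cross-section of the cone divided into sections to see which parts of the enemy are covered (S306a8).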
  • FIG. 13 is a flow chart explaining the damage process.
  • the body of an enemy is composed of predetermined body sections (for example, head, arms, legs, chest, etc.), and each body section is composed of predetermined body parts (for example, an “arm” has a “shoulder,” “upper arm,” “lower arm,” and “hand”).
  • the presence or absence of a hit flag, which is set in the hit decision process explained by FIG. 12(A), tells whether the bullet has hit any body section or body sections of the enemy's body as well as which body section or body sections were hit.
  • the damage rate can be obtained by the formula shown in FIG. 13(B).
  • the minimum damage rate (MIN_DAMAGE_RATE) is the bullet strength percentage at the furthest position from the impact point within the shotgun radius; for example, the minimum damage rate is set as 0.1.
  • the maximum damage rate (MAX_DAMAGE_RADIUS_RATE) is a bullet strength percentage that determines a radius around the impact point within which the same strength should be applied. In short, the bullet strength at the impact point is maintained within a certain radius from the impact point.
  • the damage radius is a radius wherein the bullet strength at the impact point is effective, and it also represents a range in which bullets scatter (i.e., hit decision area).
  • the distance from the center of the trajectory is obtained by subtracting the radius of the enemy's collision sphere from the distance between the center of the trajectory and the enemy.
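The two parameters above suggest a damage rate that stays at full strength near the impact point and falls off toward the edge of the damage radius. The exact formula appears only in FIG. 13(B), so the linear fall-off below is an approximation; MIN_DAMAGE_RATE = 0.1 is from the text, while MAX_DAMAGE_RADIUS_RATE = 0.25 is invented.

```python
MIN_DAMAGE_RATE = 0.1          # strength fraction at the radius edge (from text)
MAX_DAMAGE_RADIUS_RATE = 0.25  # fraction of the damage radius keeping
                               # full strength (illustrative assumption)

def damage_rate(dist_from_center, damage_radius):
    """Approximate the FIG. 13(B) damage rate: 1.0 within the
    full-strength radius, falling linearly to MIN_DAMAGE_RATE at the
    edge of the damage radius."""
    full_radius = damage_radius * MAX_DAMAGE_RADIUS_RATE
    if dist_from_center <= full_radius:
        return 1.0
    if dist_from_center >= damage_radius:
        return MIN_DAMAGE_RATE
    t = (dist_from_center - full_radius) / (damage_radius - full_radius)
    return 1.0 + (MIN_DAMAGE_RATE - 1.0) * t
```

Here `dist_from_center` is the distance from the trajectory center after subtracting the collision-sphere radius, as described above.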
  • FIG. 14 is one example of the configuration of the damage chart.
  • the damage chart stores the damage values that determine a damage point of the enemies that have been shot.
  • the average physical power value of the enemies is set at 200 points.
  • the damage values are set according to the distance between the player and an enemy and a body section that has been shot.
  • the physical power value of an enemy which has been hit is calculated by: at first, obtaining the damage point of the enemy by multiplying the damage value by the damage rate based on the distance to the impact point (center of the trajectory); and then subtracting the computed damage point from the physical power value which the enemy owned before it was shot.
  • the damage value suffered by the enemy is “ 30 ” points.
  • a damage point of the body part is computed by multiplying the damage rate by the damage value (S 306 b 7 ).
  • if damage points have not yet been computed for all the body sections (S 306 b 8 ; NO), the damage points of the remaining body sections are computed.
  • the damage points are summed up for the respective body sections and the total damage point to the enemy is obtained (S 306 b 9 ).
  • the sum of the damage points of the respective body sections is the total damage point suffered by the enemy, and this total damage point is subtracted from the physical power value of the enemy. If after the subtraction, the physical power value is less than a predetermined value, the enemy vanishes from the screen.
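The chart lookup and physical-power update described above can be sketched as follows. The text gives one sample damage value (30 points) and an average physical power of 200; the chart entries and distance bands below are otherwise invented for illustration.

```python
# Hypothetical damage chart in the spirit of FIG. 14: damage value by
# (distance band, body section).  Only the shape is meaningful here.
DAMAGE_CHART = {
    ("short", "chest"): 30, ("short", "head"): 50,
    ("long",  "chest"): 15, ("long",  "head"): 25,
}

def apply_shot(physical_power, hits, distance_band, vanish_below=0):
    """hits: list of (body_section, damage_rate) pairs for the body
    sections whose hit flag is set.  Sums the per-section damage
    points (S306b7-S306b9), subtracts the total from the enemy's
    physical power, and reports whether the enemy vanishes."""
    total = 0.0
    for section, rate in hits:
        total += DAMAGE_CHART[(distance_band, section)] * rate
    remaining = physical_power - total
    return remaining, remaining < vanish_below
```

For example, a short-range shot flagging the chest at full rate and the head at half rate costs a 200-point enemy 55 points of physical power.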
  • FIG. 15 shows image examples of objects (enemies) being shot.
  • FIG. 15 shows two examples wherein the objects were shot at the same impact point but from different distances, the effective shooting scopes being shown with circles of dashed lines, and damage being shown with marks. If the enemy is shot at short range as shown in FIG. 15(A), the bullets scatter around the abdomen, and each body part will be heavily damaged even though there are only a few impact points. Whereas, if the enemy is shot at long range as shown in FIG. 15(B), the bullets scatter in a wide range throughout the whole body, but the damage to each body part is small.
  • the distance between the virtual camera and an enemy character affects not only the virtual camera's moving speed, but also the amount of damage suffered by the enemy character that was shot. Due to this fact, the player will be conflicted since on the one hand, shooting at short range demonstrates great bullet strength and enables the player to defeat one enemy in a short time, but the player's moving speed will become slow. On the other hand, if there are many enemies, it may be better to shoot them at long range even with small bullet strengths because the player can damage enemies over a wide range, thereby defeating the enemies more quickly, and the player can move forward in the game. Thus, the entertaining characteristics of the game are enhanced.
  • Each body part on the enemy is provided with damaged body parts that express damage (injury status) corresponding to the predetermined levels.
  • the body part, the “chest,” of an enemy A is provided with damaged body parts that correspond to five levels (0→1→2→3→4) of damage.
  • the damaged body parts are composed such that the severity increases at each level.
  • level 0 shows an image of the chest with no damage; at level 1 , a part of the chest is bleeding; at level 2 , a part of the chest is damaged; at level 3 , the entire chest is damaged; and at level 4 , the chest is shattered. Damage levels and their modes of expression may be set differently depending on the enemy types.
  • FIG. 16 is a flow chart explaining the injury process.
  • the presence or absence of a hit flag which is set in the hit decision process explained with reference to FIG. 12, indicates whether a bullet hit a body section of the enemy's body.
  • a hit flag is set for a predetermined body section (S 306 c 1 ), and if the hit flag is set for the body section (S 306 c 1 ; YES), a body part closest to the impact point within the present body section is selected (S 306 c 2 ). Then, the injury progression value chart (damage progression value chart) in FIG. 17 is referred to, and an injury progression value is specified based on the player's distance to the enemy (S 306 c 3 ).
  • FIG. 17 shows one example of the configuration of the injury progression value chart, wherein the injury progression values are set in accordance with the distance to the enemy.
  • the injury progression values of a body part A of a certain body section are set such that the shorter the distance between the player and the enemy is, the more the value increases; and the further the distance is, the more the value decreases.
  • the injury progression values only of the body part A are indicated.
  • Other body parts (for example, lower arms) and other body sections (for example, the head) are omitted in FIG. 17, but the injury progression values of those parts are similarly set.
  • the injury progression value that is already stored in a predetermined storage area is added (S 306 c 4 ).
  • the injury progression value of the present impact is added to the injury progression value of the previous impact, thereby increasing the total injury progression value.
  • Parameters of the damaged body parts are referred to based on the accumulated injury progression value, and the damaged body part to be displayed is specified to show the shooting result (S 306 c 5 ).
  • the injury progression value of the body part is “7”
  • the damaged body part of level “0” is displayed.
  • the severity of the enemy's injury increases even if impacted only a few times.
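The accumulation-and-lookup mechanism above can be sketched as follows. The per-distance progression values and the level thresholds are invented, but they are chosen so that, consistent with the example in the text, an accumulated value of 7 still displays the level-0 (undamaged) body part.

```python
# Accumulated injury progression value needed to display each
# damaged-body-part level 0..4 (illustrative thresholds).
LEVEL_THRESHOLDS = [0, 10, 25, 45, 70]

# Injury progression value per impact by distance band for one body
# part, in the spirit of FIG. 17: closer shots progress injury more.
PROGRESSION_BY_DISTANCE = {"short": 12, "mid": 7, "long": 3}

def register_hit(accumulated, distance_band):
    """Add this impact's progression value to the stored total
    (S306c3-S306c4)."""
    return accumulated + PROGRESSION_BY_DISTANCE[distance_band]

def damage_level(accumulated):
    """Highest damage level whose threshold the accumulated injury
    progression value has reached (S306c5)."""
    level = 0
    for lv, threshold in enumerate(LEVEL_THRESHOLDS):
        if accumulated >= threshold:
            level = lv
    return level
```

With these numbers, a single mid-range hit (value 7) leaves the chest at level 0, while one short-range hit on top of it (total 19) advances the display to level 1.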
  • a proportion (“first overlapping proportion”) of the overlapping area (hit determined portion) to the entire effective shooting radius (damage radius) is computed, and in the overlapping area, the damage radius and the enemy's collision sphere overlap.
  • a damage point is then computed based on the obtained proportion. Specifically, the damage point is computed by multiplying a damage value of the damage radius by the first overlapping proportion. Details will be explained with reference to FIG. 18.
  • FIG. 18 explains the second damage process.
  • the proportion of the hit determined portion to the damage radius is 100%.
  • second overlapping proportion of the overlapping area (hit determined portion) to the entire collision sphere of the enemy is computed, and in the overlapping area, the damage radius and the enemy's collision sphere overlap.
  • a damage point is then computed based on the obtained second overlapping proportion. Specifically, the damage point is obtained by multiplying the damage value of the damage radius by the second overlapping proportion. Details will be explained with reference to FIGS. 18 (C) and 18 (D).
  • the proportion of the hit determined portion to the enemy's collision sphere is 100%.
  • the proportion of the hit determined portion to the enemy's collision sphere is 50%. Accordingly, when the damage value provided by the entire damage radius is set to 100, the enemy's damage point is 50 by calculating “damage value (100) ⁇ second overlapping proportion (50%)”. The fact that the proportion of the hit determined portion to the enemy's collision sphere is 50%, means that 50% of the enemy's collision sphere overlaps with the damage radius.
  • the damage radius is divided into lattices of a predetermined size. Then, the number of lattices overlapping the collision sphere is counted. Finally, in the case of the first overlapping proportion, the proportion of the overlapped lattices to all of the damage radius lattices is computed. Whereas, in the case of the second overlapping proportion, the proportion of the overlapped lattices to all of the collision sphere lattices is computed.
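The lattice counting just described can be sketched in a 2D cross-section as follows. Treating both the damage radius and the collision sphere as circles in the plane, and classifying each lattice cell by its center point, are simplifying assumptions; the cell size is illustrative.

```python
def _cells_inside(center, radius, cell):
    """Centers of lattice cells whose center lies inside the circle."""
    cx, cy = center
    n = int(radius / cell) + 1
    return [(cx + i * cell, cy + j * cell)
            for i in range(-n, n + 1) for j in range(-n, n + 1)
            if (i * cell) ** 2 + (j * cell) ** 2 <= radius ** 2]

def overlap_proportions(dmg_c, dmg_r, sph_c, sph_r, cell=0.05):
    """Return (first, second) overlapping proportions:
    first  = overlapped lattices / all damage-radius lattices,
    second = overlapped lattices / all collision-sphere lattices."""
    dmg_cells = _cells_inside(dmg_c, dmg_r, cell)
    sph_cells = _cells_inside(sph_c, sph_r, cell)
    in_both_d = [p for p in dmg_cells
                 if (p[0] - sph_c[0]) ** 2 + (p[1] - sph_c[1]) ** 2 <= sph_r ** 2]
    in_both_s = [p for p in sph_cells
                 if (p[0] - dmg_c[0]) ** 2 + (p[1] - dmg_c[1]) ** 2 <= dmg_r ** 2]
    return len(in_both_d) / len(dmg_cells), len(in_both_s) / len(sph_cells)
```

The pixel-counting variant described next achieves the same estimate by rasterizing both shapes onto a hidden image and counting overlapped pixels.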
  • the damage radius and the collision sphere are projected onto a virtual image (not displayed) for the hit decision, and the number of pixels in the overlapped portion of the virtual image is counted.
  • the overlapped area and the value for multiplying the damage do not necessarily correspond exactly.
  • the value may be separated into levels, for example, when “the overlapped portion is 1% or more, but less than 10%”, “the value is 10%”, and when “the overlapped portion is 10% or more, but less than 30%”, “the value is 30%”.
  • the damage radius is not limited to being circular, but may be oval or polygonal as appropriate. Further, the combination of the damage process of the present invention with other damage processes makes it possible to execute a more fractionalized damage process.
  • FIG. 19 explains the second damage process when an enemy is human-shaped.
  • a proportion of an area for each body part to the damage radius is computed and a damage point is also computed based on the total obtained proportion. For example, when the damage radius is taken up by 10% for each of the arms, 10% by the head, 20% by the waist, and 30% by the chest, the summation of 20 for the arms, 10 for the head, 20 for the waist, and 30 for the chest, i.e. “20+10+20+30”, will equal the total damage points of the enemy, i.e., 80.
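The per-body-part summation in the example above is just a weighted sum of the damage radius shares. A minimal sketch, reproducing the figures of the FIG. 19 example (10% per arm, 10% head, 20% waist, 30% chest, with a full damage value of 100):

```python
def total_damage_point(part_proportions, full_value=100):
    """Sum each body part's share of the damage radius times the full
    damage value; the parts and shares are the FIG. 19 example."""
    return sum(p * full_value for p in part_proportions.values())

parts = {"left arm": 0.10, "right arm": 0.10,
         "head": 0.10, "waist": 0.20, "chest": 0.30}
```

With these shares the total is 20 + 10 + 20 + 30 = 80 damage points, matching the worked example in the text.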
  • damage can be determined more realistically and fairly. Further, the player can develop his/her skills for the shooting game, for example, “aiming at a target in a manner so that more enemies are included in the effective shooting scope” and “shooting stronger enemies in a manner so that bullets scatter in a wide range”.
  • the present invention is applied to a gun shooting game; however, the present invention is not limited to this application, but can be applied to other types of games.
  • an explanation will be given of a game wherein multiple characters are defined in a three-dimensional virtual space, a first character (for example, an enemy character) being manipulated under a predetermined program while a second character (for example, a player's character) is manipulated in accordance with the manipulation information from the player.
  • a position of the player's character is employed instead of the position of the virtual camera and the moving speed of the virtual camera is controlled by the distance between the enemy character and the player's character.
  • the speed of the fixation point to follow enemy characters may be controlled on the basis of the distance between the player's character and the enemy character.
  • an effective attack range of the player is employed instead of the effective shooting radius.
  • damage points of the enemy characters may be computed on the basis of: the distance between the player's character and the enemy character; the effective attack range that changes in accordance with the above distance; and the distance between the enemy character and the center of the effective attack range. The same computing manner is used when the player's character (a character on the player's side) is being attacked.
  • a damage point may be computed based on the proportion of the overlapping area to the entire effective attack range, and in the overlapping area, the effective attack range and the enemy's collision spheres overlap.
  • This damage point computing process may be applied to games in which a player damages objects located in a virtual space. Specifically, the positional information of a character manipulated by the player and the positional information of objects are obtained. Then the distance between the player's character and an object is computed, and the size of a hit decision is determined based on the distance. When it is determined that the hit decision area overlaps with an object, the damage amount of the object is computed, and then the damage to the object based on the damage amount is caused.
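The generalized pipeline of the paragraph above (positions → distance → hit-decision size → overlap test → damage amount) can be sketched as follows. The growth rate of the hit-decision radius and the linear damage fall-off are illustrative assumptions, not values from the patent.

```python
import math

def hit_radius(distance):
    """Hit-decision area grows with distance, like a scatter cone;
    the 0.1 growth factor is an assumption."""
    return 0.1 * distance

def object_damage(player_pos, aim_point, obj_pos, obj_radius, base=100):
    """Compute the damage caused to an object in the virtual space:
    zero when the hit-decision area misses the object's collision
    sphere, otherwise a base value falling off with distance
    (illustrative fall-off)."""
    dist = math.dist(player_pos, obj_pos)
    r = hit_radius(dist)
    # Simplification: the aim point is taken at the object's range,
    # so its offset from the object is the off-axis distance.
    off_axis = math.dist(aim_point, obj_pos)
    if off_axis > r + obj_radius:
        return 0.0                              # no overlap, no damage
    return base * max(0.1, 1.0 - dist / 20.0)   # damage shrinks with distance
```

A direct shot at an object 5 units away yields 75 points under these assumptions, while a shot aimed 5 units off target misses entirely.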
  • the shooting results are determined in correspondence with the characteristics of the gun, the player can enjoy the game by developing fight strategies using his/her knowledge of the gun properties.
  • the shooting results and the enemies' damage point correspond to each other precisely, and damage can be determined more realistically and fairly.
  • a product invention can be interpreted as a method invention and vice versa.
  • This invention can also be implemented as a program or a recording medium that has a program stored therein for making a computer implement predetermined functions.
  • Examples of the recording medium include a hard disk (HD), a DVD-RAM, a floppy disk (FD), a CD-ROM, and types of memory such as a RAM and a ROM.
  • Examples of the computer include a so-called microcomputer wherein a central processing unit such as a CPU or an MPU interprets programs to execute predetermined processes.
  • a means does not simply imply a physical means; it can also imply a function of the means implemented by software or a hardware circuit.
  • a function of one means may be realized by two or more physical means and functions of two or more means may be realized by one physical means.
  • means in this specification can be implemented by hardware or software, or the combination of both.
  • Implementation by the combination of the hardware and the software is, for example, the implementation by a computer system having a predetermined program therein.
  • a function of one means may be realized by two or more types of hardware or software, or by the combination of both, while two or more functions of one means may also be realized by one type of hardware or software, or by the combination of both.

Abstract

The present invention is an image processing method for moving a virtual camera located in a three-dimensional virtual space at a predetermined speed, and changing a distance between the virtual camera and a character defined in the virtual space, wherein a moving speed of the virtual camera changes based on the distance between the virtual camera and the character.

Description

    BACKGROUND
  • The present invention relates to an image processing device, particularly to a game device. [0001]
  • In recent years, numerous image processing devices of the type called three-dimensional game devices have been proposed. An image processing device defines various kinds of characters in a virtual space formed by the computer and loads the manipulation information of players into the computer through peripheral equipment such as joysticks, thereby realizing image processing for moving characters, etc. As a result of the image processing, images which are viewed from a three-dimensional virtual space viewpoint, called a virtual camera, are displayed on a TV monitor for the players to see. [0002]
  • One example of the image processing device is a game device in which players compete against each other over shooting characters displayed on the screen (for example, “House of the Dead” available from Sega Enterprises, Ltd.). This game device is composed such that the virtual camera moves along a predetermined course in the three-dimensional space, and a player moves while shooting enemies (zombies). Each enemy has some weak points and when the player accurately hits a weak point, the enemy incurs a damage point, and when the total damage point exceeds a predetermined value, the enemy is defeated. Further, a time limit is predetermined for the player to defeat the enemies. Therefore, if the player fails to defeat the enemies within the predetermined time limit, the player is attacked by the enemies and the player incurs damage points. [0003]
  • This game device is also composed such that, when the player switches on a trigger of a gun pointing at the screen, the time elapsed for the gun to detect a scanning line on the screen is computed, and the coordinates of the location pointed at by the gun is further computed, and thereby a decision is made as to whether or not a bullet will hit the enemy character. [0004]
  • This type of image processing device, however, has the following problems. [0005]
  • First of all, the conventional game device has configurations unfavorable to the player; for example, a time to fight with the enemies is predetermined, and if the player cannot defeat the enemies within the predetermined time limit, the player is attacked by the enemies and his/her damage points increase. However, in the case where the player does defeat the enemies within the time limit, there is no arrangement favorable to the player. For instance, when the player defeats the enemies just in time, or even when the player defeats them with time to spare, there are no positive effects on the future development of the game or the player's game score. Accordingly, there is a problem that players who are skilled in the shooting technique and can defeat the enemies within the time limit lose their fighting spirit. [0006]
  • Secondly, the conventional game device gives little consideration to realism when deciding whether a bullet hits an enemy. Therefore, the player cannot develop a strategy by familiarizing himself/herself with the types and characteristics of weapons. Accordingly, the conventional game device does not present enough entertaining characteristics for a gun shooting game. [0007]
  • SUMMARY
  • In order to achieve the above objects, the present invention provides an image processing method for moving a virtual camera located in a three-dimensional virtual space at a predetermined speed and changing the distance between the virtual camera and a character defined in the three-dimensional virtual space, wherein the moving speed of the virtual camera changes based on the distance between the virtual camera and the character. [0008]
  • According to the image processing method, a first character defined in the three-dimensional virtual space and a second character manipulated by the player are displayed, and the moving speed of the virtual camera changes based on the distance between the first and second characters. [0009]
  • The present invention also provides an image processing method for directing a virtual camera to a character located in a three-dimensional virtual space, wherein a fixation point of the virtual camera is set on the character in a manner so that the speed of directing the virtual camera to the character changes based on the distance between the virtual camera and the character. [0010]
  • The present invention further provides a game device that is composed such that a virtual camera located in a three-dimensional virtual space moves at a predetermined speed and the distance between the virtual camera and a character defined in the virtual space changes, comprising: virtual camera controlling means for changing a moving speed of the virtual camera based on the distance between the virtual camera and the character. [0011]
  • It is desirable that the game device display a first character defined in the three-dimensional virtual space and a second character manipulated by the player, and the virtual camera controlling means change the moving speed of the virtual camera based on the distance between the first and second characters. [0012]
  • It is also desirable that the virtual camera controlling means control the virtual camera moving speed in a manner that the shorter the distance between the virtual camera and the character becomes, the more the virtual camera moving speed decreases. [0013]
  • Regarding the game device, it is desirable that a plurality of areas be provided in the three-dimensional virtual space with the virtual camera at the center, and the virtual camera controlling means determine in which area a character closest to the virtual camera exists, and control the virtual camera moving speed in accordance with the determined area. [0014]
  • The present invention provides a game device for directing a virtual camera to a character located in a three-dimensional virtual space comprising: fixation point setting means for setting a fixation point of the virtual camera on the character such that the speed of directing the virtual camera to the character changes in accordance with the distance between the virtual camera and the character. [0015]
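  As a concrete illustration of the distance-dependent fixation-point control described above, the sketch below moves the camera's fixation point toward the enemy by a per-frame fraction that grows as the enemy gets closer. All numeric rates and thresholds here are illustrative assumptions, not values taken from the specification.

```python
def turn_rate(dist, near=2.5, far=10.0, slow=0.02, fast=0.2):
    """Per-frame tracking fraction: closer enemies are tracked faster.

    The 2.5 m / 10 m boundaries and the 0.02 / 0.2 rates are assumed
    values for illustration only.
    """
    if dist <= near:
        return fast
    if dist >= far:
        return slow
    # linearly interpolate between the slow and fast rates in between
    t = (far - dist) / (far - near)
    return slow + t * (fast - slow)

def update_fixation(fix, target, dist):
    """Move the fixation point a fraction of the way toward the enemy."""
    k = turn_rate(dist)
    return tuple(f + k * (t - f) for f, t in zip(fix, target))
```

  Called once per frame, this yields a camera that swings only lazily toward distant enemies but snaps quickly onto nearby ones, matching the claimed behavior.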
  • The present invention also provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the player's shooting of it, based on a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character. [0016]
  • It is desirable that the damage point computing means compute a damage point of the character caused by the player's shooting of it, by multiplying a damage value, which is determined based on the distance between the virtual camera and the character, by a damage rate that is determined based on the distance between the character and the center of the effective shooting radius that changes in accordance with the distance between the virtual camera and the character. [0017]
  • It is desirable that: the damage value be determined such that the further the distance between the virtual camera and the character is, the smaller the damage value is; the effective shooting radius be determined such that the further the distance between the virtual camera and the character is, the larger the effective shooting radius is; and the damage rate be determined such that the further the distance between the character and the center of the effective shooting radius is, the smaller the proportion is. [0018]
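  The damage computation of the preceding paragraphs can be sketched as follows. The patent specifies only the monotonic relationships (the damage value shrinks, and the effective shooting radius grows, with camera distance; the damage rate shrinks with offset from the radius center), so the linear fall-off curves and all constants below are assumptions for illustration.

```python
def damage_value(camera_dist, max_value=100.0, max_range=30.0):
    """Base damage: the farther the character, the smaller the value."""
    return max_value * max(0.0, 1.0 - camera_dist / max_range)

def effective_radius(camera_dist, base_radius=0.5, spread=0.05):
    """Shot spread: the farther the character, the larger the radius."""
    return base_radius + spread * camera_dist

def damage_rate(center_offset, camera_dist):
    """Rate falls off as the hit moves away from the radius center."""
    r = effective_radius(camera_dist)
    return max(0.0, 1.0 - center_offset / r)

def damage_point(camera_dist, center_offset):
    """Damage point = damage value x damage rate, per the claim."""
    return damage_value(camera_dist) * damage_rate(center_offset, camera_dist)
```

  For example, a point-blank, dead-center hit yields the full base damage, while the same offset from center costs proportionally more damage at short range, where the effective radius is small.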
  • The present invention provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion, relative to the effective shooting radius, of an overlapping area in which the effective shooting radius and a collision sphere of the enemy overlap. [0019]
  • The present invention provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion, relative to the collision sphere of the enemy, of an overlapping area in which the effective shooting radius and the collision sphere overlap. [0020]
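  Computing damage from the overlap between the effective shooting radius and the enemy's collision sphere can be illustrated with a planar approximation: treat both as circles in the screen plane and use the classic circle-circle intersection (lens) area. This is a sketch of one plausible implementation, not the patent's actual geometry.

```python
import math

def circle_overlap_area(d, r1, r2):
    """Area of intersection of two circles whose centers are d apart."""
    if d >= r1 + r2:
        return 0.0  # disjoint: no overlap
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # one circle contains the other
    # partial overlap: standard lens-area formula
    a1 = r1 ** 2 * math.acos((d ** 2 + r1 ** 2 - r2 ** 2) / (2 * d * r1))
    a2 = r2 ** 2 * math.acos((d ** 2 + r2 ** 2 - r1 ** 2) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def damage_proportion(d, shot_radius, collision_radius):
    """Damage scale: overlap area relative to the shot circle's area
    (claim [0019]); dividing by the collision circle's area instead
    would give the variant of claim [0020]."""
    overlap = circle_overlap_area(d, shot_radius, collision_radius)
    return overlap / (math.pi * shot_radius ** 2)
```

  A shot circle fully inside the collision circle then yields a proportion of 1.0, and a shot that misses entirely yields 0.0, with partial overlaps grading smoothly in between.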
  • The present invention provides an image processing device that is composed to change a distance between a character and a virtual camera both located in a three-dimensional virtual space, comprising: means for computing the distance between the virtual camera and the character; and virtual camera controlling means for changing a moving speed of the virtual camera in accordance with the computed distance. [0021]
  • The present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the player's shooting of it is computed on the basis of a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character. [0022]
  • The present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with a proportion of an overlapping area to the shooting radius, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap. [0023]
  • The present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with a proportion of an overlapping area to the collision sphere, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap. [0024]
  • The present invention provides a game controlling method whereby a game device is controlled such that it determines whether a hit decision area generated in a virtual space in accordance with a player's manipulation overlaps with an object located in the virtual space, and the object is damaged when it is determined that the hit decision area overlaps with the object, the method comprising: a step of obtaining first positional information indicating a position of a character being manipulated by the player in the virtual space; a step of obtaining second positional information indicating a position of the object in the virtual space; a step of computing a distance between the character and the object based on the obtained first and second positional information; a step of changing the size of the hit decision area based on the computed distance; and a step of computing, when it is determined that the hit decision area and the object overlap with each other, a damage amount for the object based on the computed distance, and realizing damage to the object based on the computed damage amount. [0025]
  • It is desirable that the hit decision area be set small and the damage amount be set large when the computed distance is shorter than a predetermined distance. [0026]
  • It is desirable that the hit decision area be set large and the damage amount be set small when the computed distance is longer than a predetermined distance. [0027]
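  Taken together, the two preceding paragraphs suggest a scheme like the following, where the hit decision area shrinks and the damage grows as the target gets closer. The thresholds, sizes, and the linear interpolation between them are hypothetical; the patent only fixes the behavior at the two extremes.

```python
def hit_area_and_damage(dist, near=5.0, far=20.0,
                        small_area=0.3, large_area=1.2,
                        large_damage=50, small_damage=10):
    """Return (hit-area size, damage amount) for a target at `dist`.

    Near targets: small hit area, heavy damage. Far targets: large
    hit area, light damage. All numbers are assumed for illustration.
    """
    if dist < near:
        return small_area, large_damage
    if dist > far:
        return large_area, small_damage
    # linear interpolation in between (an assumption; the patent
    # leaves the intermediate behavior unspecified)
    t = (dist - near) / (far - near)
    area = small_area + t * (large_area - small_area)
    damage = large_damage + t * (small_damage - large_damage)
    return area, damage
```

  This trade-off rewards accuracy at close range (a small area that pays off heavily) while keeping distant targets hittable but less lucrative.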
  • The present invention provides a game controlling method whereby a game device is controlled such that it determines whether a hit decision area, generated in a virtual space with a predetermined target point at the center in accordance with a player's manipulation, overlaps with an object located in the virtual space, and the object is damaged when it is determined that the hit decision area overlaps with the object, the method comprising: a step of obtaining positional information indicating a position of the object in the virtual space; a step of computing an area wherein the hit decision area and the object overlap with each other, based on the hit decision area and the obtained positional information; and a step of generating data of a damage amount to be attributed to the object, based on the computed area, and providing damage to the object based on the generated damage amount data. [0028]
  • The present invention provides a shooting game controlling method, wherein the shooting game is controlled such that it simulates a player's shooting of a character defined in a three-dimensional virtual space, while a virtual camera located in the three-dimensional virtual space is moving at a predetermined speed, comprising: a step of changing the distance between the character and the virtual camera; a step of changing the moving speed of the virtual camera or the speed of directing the virtual camera to the character, based on the distance between the character and the virtual camera; a step of changing the player's effective shooting radius based on the distance between the virtual camera and the character; a step of determining whether a bullet has hit the character, based on the character's position and the location of the effective shooting radius; and a step of computing, when it is determined that the bullet has hit the character in the determination step, a damage amount caused to the character by the shooting, based on both the distance between the virtual camera and the character as well as the distance between the character and the center of the effective shooting radius.[0029]
  • DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram indicating the general structure of a game device according to one embodiment of the present invention. [0030]
  • FIG. 2 is a flow chart of the entire general process performed by the CPU according to the embodiment. [0031]
  • FIG. 3 is a flow chart of the process executed in the game mode. [0032]
  • FIG. 4 illustrates the relationship between virtual camera movements and the enemies. [0033]
  • FIG. 5 is a flow chart of one example for the controlling process of the virtual camera. [0034]
  • FIG. 6(A) illustrates the relationship between the enemy sensing distance and the position d of the enemy closest to the virtual camera. [0035]
  • FIG. 6(B) shows examples of formulas for computing acceleration. [0036]
  • FIG. 7 is a diagram explaining a fixation point for the virtual camera. [0037]
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength. [0038]
  • FIG. 9 is a diagram showing one example of the relationship between an effective shooting scope and the bullet strength. [0039]
  • FIG. 10 is a diagram showing one example of an effective shotgun radius. [0040]
  • FIG. 11 is a flow chart explaining the entire hit decision process. [0041]
  • FIG. 12(A) is a flow chart of the hit decision process. [0042]
  • FIG. 12(B) is an example in which a collision cone of the shotgun's bullet is divided into 16 sections. [0043]
  • FIG. 13(A) is a flow chart explaining the damage process. [0044]
  • FIG. 13(B) is an example of a computation for damage rate. [0045]
  • FIG. 14 shows an example of the configuration of the damage chart. [0046]
  • FIG. 15 shows image examples of objects (enemies) being shot. [0047]
  • FIG. 16 is a flow chart of the injury process. [0048]
  • FIG. 17 shows an example of the configuration for an injury progression value (damage progression value) chart. [0049]
  • FIG. 18 is a diagram explaining a second damage process. [0050]
  • FIG. 19 is a diagram explaining a second damage process.[0051]
  • DETAILED DESCRIPTION
  • An embodiment of the present invention will be explained with reference to the drawings. In this embodiment, explanations are given for the case wherein the game device of the present invention is applied to a so-called arcade-type gun shooting game. Nevertheless, this invention is not limited to this type and can also be applied to game software for home game devices. [0052]
  • [Block Diagram of Game Device][0053]
  • FIG. 1 is a block diagram indicating one example of an arcade-type game device for playing a gun shooting game, according to the present invention. The basic components of this game device include a game device main body 10, an input device 11, a TV monitor 13, and a speaker 14. [0054]
  • The input device 11 is a weapon, such as a gun, a shotgun, or a machine gun, for shooting enemies in the game. In this embodiment, the weapon is a shotgun used by the game player. The shotgun includes a photoreceptor for reading a scanning spot (a light spot of an electron beam) for an impact point on the TV monitor, and a trigger switch equivalent to the trigger of a regular shotgun. Scanning spot detection signals and trigger signals are transmitted to the interface 106, which will be described hereinafter, via a connecting cord. The TV monitor 13 displays images showing the status of the game development. The TV monitor can be replaced by a projector. [0055]
  • The game device main body 10 comprises a central processing unit (CPU) 101, a ROM 102, a RAM 103, a sound device 104, an input/output interface 106, a scroll data processor 107, a coprocessor (auxiliary processor) 108, a landform contour data ROM 109, a geometrizer 110, a form data ROM 111, a drawing device 112, a texture data ROM 113, a texture map RAM 114, a frame buffer 115, an image composition device 116, and a D/A converter 117. Examples of storage media that may be used in this invention as the ROM 102 include a hard disc, a cartridge-type ROM, a CD-ROM, other well-known media, and communication media (the internet and other personal computer communication networks). [0056]
  • The CPU 101 is connected through a bus-line to: the ROM 102 having a predetermined program stored therein; the RAM 103 for storing data; the sound device 104; the input/output interface 106; the scroll data processor 107; the coprocessor 108; and the geometrizer 110. The RAM 103 operates as a buffer. Various commands (to display objects, etc.) to the geometrizer 110 and matrices obtained by computing the transformation matrix are written to the RAM 103. [0057]
  • The input device 11 (shotgun) is connected to the input/output interface 106. The CPU 101 checks whether the shotgun was fired based on a scanning spot detection signal sent from the shotgun 11 and a trigger signal indicating that the trigger was pulled, and identifies an impact point and the number of shots fired in accordance with the current coordinates (X, Y) of the location of the scanning electron beam on the TV monitor and the location of a target. Then, the CPU 101 sets various corresponding flags at predetermined positions in the RAM 103. [0058]
  • The sound device 104 is connected to the speaker 14 through a power amplifier 105, and audio signals generated by the sound device 104 are amplified in electric power and then transmitted to the speaker 14. [0059]
  • In this embodiment, the CPU 101 reads, on the basis of a program stored in the ROM 102, the game story development, the landform data in the ROM 109, and the form data (three-dimensional data of “objects such as enemy characters” and “the game scenery including landscape, buildings, interiors, and underground passages”) in the ROM 111; the CPU 101 then determines the situation in the three-dimensional virtual space and executes the shooting process in correspondence with the trigger signals sent from the input device 11. [0060]
  • Regarding the various objects in the virtual game space, their coordinate values in the three-dimensional space are determined, and then the transformation matrix for transforming the coordinate values to a viewpoint coordinate system, together with the form data (buildings, landforms, interiors, laboratories, furniture, etc.), is specified to the geometrizer 110. The coprocessor 108 is connected to the landform data ROM 109, and accordingly, the landform data for the predetermined movement course of the camera is delivered to the coprocessor 108 (and the CPU 101). The coprocessor 108 decides whether a bullet hits a target, computes deviations of objects from the camera's line of sight, executes the process for the movement of the line of sight, and performs the floating-point computations required for these decisions and computations. Consequently, the results of the coprocessor's decision as to whether a bullet hit an object, and of the process for the movement of the line of sight (movement relative to the position of the objects), are transmitted to the CPU 101. [0061]
  • The geometrizer 110 is connected to the form data ROM 111 and the drawing device 112. Prestored on the form data ROM 111 is the polygon form data (three-dimensional data of buildings, walls, corridors, interiors, landforms, scenery, the main character, characters on the main character's side, various kinds of enemies (zombies), etc., all being composed of respective vertices). This form data is delivered to the geometrizer 110. The geometrizer 110 executes the perspective transformation of the form data specified by the transformation matrix sent from the CPU 101, and obtains the form data in which the coordinate system of the three-dimensional virtual space has been transformed to the coordinate system of a visual field. [0062]
  • The drawing device 112 pastes the textures onto the transformed form data of the visual field coordinate system and then outputs the result to the frame buffer 115. For pasting the textures, the drawing device 112 is connected to the texture data ROM 113, the texture map RAM 114, and the frame buffer 115. Polygon data refers to a data group of relative or absolute coordinates of respective vertices, each composing a polygon (mainly a trigon or tetragon) consisting of a set of plural vertices. The polygon data stored in the landform data ROM 109 is set relatively roughly, but in enough detail to move the camera in the virtual space along the game storyline. On the other hand, the polygon data stored in the form data ROM 111 is set in more detail concerning the forms, such as enemies and backgrounds, that compose the screen. [0063]
  • The scroll data processor 107 processes data such as letters on a scroll screen. The scroll data processor 107 and the frame buffer 115 are connected to the TV monitor 13 via the image composition device 116 and the D/A converter 117. Accordingly, the polygon screen (a simulation result) for objects (enemies) and landforms (backgrounds) stored temporarily in the frame buffer 115, and the scroll screen for text information (for example, the player's LifeCount value, damage points, etc.), are composed to generate final frame-image data. This frame-image data is converted into analog signals by the D/A converter 117 and transmitted to the TV monitor 13; thereby, real-time images of the shooting game are displayed. [0064]
  • [Entire Game Flow][0065]
  • Now, the entire flow of the game will be explained with reference to FIG. 2. FIG. 2 is a flow chart explaining the outline of the game, and the process flow is broadly classified into a movement mode and a game mode. In the movement mode (S10), the virtual camera moves in the virtual game space created in the computer system in accordance with the pre-programmed game story, and also projects various game status updates onto the screen. [0066]
  • When the virtual camera moves to a point where a preprogrammed enemy appears, an enemy is displayed on the screen (S20) and the game switches to the game mode for developing the shooting game (S30). During the game mode, the player can move forward while shooting the enemies. When the player defeats the enemies, the virtual camera again moves according to the preprogrammed game story and enters another game status (S40; YES), thereby further developing the game (S10 to S30). [0067]
  • If the player cannot defeat the enemies and loses, or if the player clears the final game (S40; NO), the game is determined to be over. The player can go back to the game mode of a different status or the exact game mode in which the player lost if, for example, the main character's damage is not heavy (S50; NO). Alternatively, the game is over (S50; YES) when the set time of each section of the game runs out or when game parameters, such as damage points, satisfy the game termination requirements. [0068]
  • [Game Mode][0069]
  • Now, the process flow in the game mode will be explained with reference to FIG. 3. FIG. 3 is a flow chart explaining the process in the game mode (S30). The virtual camera moves in the virtual space, and when an enemy appears, an enemy appearance means executes the process to make enemies appear of the type and number preprogrammed for the scene (S302). Regarding the process to make enemies appear, well-known techniques such as that of Japanese Patent Laid-Open Publication No. Hei 10-185547 may be used. [0070]
  • Along with the appearance of an enemy, a virtual camera controlling means changes the moving speed of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S304). The process to control the moving speed of the virtual camera will be explained hereinafter with reference to FIGS. 4 to 6. The virtual camera controlling means also changes a fixation point of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S304). This process will be explained hereinafter with reference to FIG. 7. [0071]
  • The player can shoot the enemies that appear on the screen. When an enemy is shot, a shooting result determination means determines a shooting result (S306). At first, the shooting result determination means determines whether the bullet has hit the enemy (hit decision), and if the bullet has hit the enemy, a hit flag is set for the shot enemy, and a damage point and an injury progression value incurred by the shot are computed. The hit decision and the computation of a damage point are executed in accordance with the shotgun properties, and the details of this process will be hereinafter explained with reference to FIGS. 11 to 13. [0072]
  • When the enemy is shot and vanishes, an enemy moving means executes the enemy moving process (S308) for moving another enemy towards a clear space or the place where the previous enemy was shot. As for the moving process of the enemies, well-known techniques such as that of the above-cited Japanese Patent Laid-Open Publication No. Hei 10-185547 may be used. [0073]
  • Subsequently, a determination is made as to whether the fighting will continue. If the fighting is not yet finished (S310; YES), such as in the case that some enemies still remain on the screen, it is determined whether to make further enemies appear based on the program of the game story (S312). If another enemy should appear (S312; YES), the enemy is made to appear (S302). If enemies should no longer appear (S312; NO), the process moves on to the determination of the shooting results for the remaining enemies (S306), and Steps 308 to 310 are repeated. When the fighting is determined to be over (S310; NO), the process returns to the above-mentioned Step 40. Then, it is determined whether to return to the movement mode that leads to another fight scene (S40), or to finish the game (S50). [0074]
  • [Virtual Camera Movement][0075]
  • Now, improvements regarding the virtual camera movements by a virtual camera controlling means will be explained. The virtual camera is a viewpoint located in the three-dimensional virtual space, and images seen from this viewpoint are presented to the player through the monitor. In the conventional game, when the virtual camera arrives at the preprogrammed enemy appearance point, the virtual camera stops its movement. Accordingly, when an enemy appears (i.e. when the virtual camera arrives at a predetermined position), the player stops to shoot the enemy. Further, the moving speed for the virtual camera of a conventional game is constant. [0076]
  • On the contrary, in this embodiment, since the moving speed of the camera changes in accordance with the distance between the camera and the enemy, the player's speed of defeating the enemies affects the progression of the game. For example, the moving speed of the camera is changed such that the shorter the distance becomes, the slower the moving speed of the camera becomes. Accordingly, the faster the player defeats the approaching enemies (the more the player defeats the enemies far off in the distance), the faster the game progresses, and consequently, the player can obtain a higher score. On the other hand, the more slowly the player defeats the enemies, the more slowly the game progresses; therefore, the player cannot obtain a high score. In short, in the event that the player defeats the enemies within the time limit, the speed of defeating the enemies influences the game development and the game results. Further, since the player finds and shoots the enemies while moving toward a destination, the player feels a sense of urgency and thus an increased enjoyment of the game. [0077]
  • Now, the relationship between the virtual camera movement and an enemy will be explained with reference to FIG. 4. As shown in FIG. 4, the virtual camera moves along a predetermined track with a predetermined speed and angle. There is a certain distance from the virtual camera (the enemy sensing distance) within which the player can sense an enemy and within which the virtual camera moving speed changes depending on the distance between the virtual camera and the enemy. When an enemy comes within this enemy sensing distance, the virtual camera moving speed decelerates. The closer the enemy approaches the virtual camera, the more the virtual camera moving speed decelerates, and when the enemy reaches a certain proximity, the virtual camera stops its movement. [0078]
  • As FIG. 4 shows, the enemy sensing distance is divided into three areas having the virtual camera at the center: a normal moving speed area in which the virtual camera moves at a normal speed; a low moving speed area in which the virtual camera moves at a low speed; and a non-moving area in which the virtual camera stops its movement. These areas are defined by the enemy's distance from the virtual camera. [0079]
  • When an enemy is in the normal moving speed area of the camera, the player feels that the enemy is quite far away, and the moving speed of the virtual camera does not change and maintains its normal speed. When the enemy enters the low moving speed area of the camera, the player feels that he/she must defeat the enemy and the virtual camera moving speed becomes slower than the normal speed. When the enemy further enters the non-moving area of the camera, the player feels danger that he/she might be defeated, and the virtual camera stops its movement. Areas for determining the virtual camera moving speed are not limited to these three areas. Any area can be appropriately set according to the difficulty level of a game. [0080]
  • Next, explanations will be given for the process flow to change the virtual camera moving speed in accordance with the distance between the camera and the enemy, with reference to FIGS. 5, 6(A) and 6(B). FIG. 5 is a flow chart explaining the process for controlling the virtual camera (S304 in FIG. 3). FIG. 6(A) shows the areas of the different moving speeds of the camera. FIG. 6(B) explains formulas for computing an acceleration for the camera. [0081]
  • At first, a position d of an enemy that is closest to the virtual camera is determined (S304 a). As shown in FIG. 6(A), the non-moving area of the camera is within the 2.5-meter distance from the virtual camera; the low moving speed area of the camera is within the 10-meter distance from the virtual camera (excluding the non-moving area of the camera); and the normal moving speed area of the camera is the area beyond the 10-meter distance from the virtual camera. [0082]
  • Then, it is determined whether the enemy's position d is within the non-moving area of the camera (S304 b). If it is determined that it is within the non-moving area (S304 b; YES), an acceleration in the non-moving area is computed (S304 c). The acceleration is obtained by the formula shown in FIG. 6(B). [0083]
  • If it is determined that the enemy's position d is not within the non-moving area of the camera (S304 b; NO), it is then determined whether the position d is within the low moving speed area of the camera (S304 d). If it is determined that the position d is within the low moving speed area of the camera (S304 d; YES), an acceleration in the low moving speed area of the camera is computed (S304 e). But if it is determined that the enemy's position d is not within the low moving speed area of the camera (S304 d; NO), an acceleration in the normal moving speed area of the camera is computed (S304 f). [0084]
  • The virtual camera moving speed s is computed based on the above-obtained acceleration (S304 g), and it is further determined whether the obtained moving speed s is less than zero (S304 h). If the obtained moving speed s is less than zero (S304 h; YES), the moving speed s is set to zero (S304 i). Conversely, when the obtained moving speed s is not less than zero (S304 h; NO), it is determined whether it is more than one, and if it is more than one (S304 k; YES), the moving speed s is set to one. [0085]
  • To rephrase the above processing, first, the position d of the enemy (the distance between the virtual camera and the enemy character) is computed. Then, an area that includes the enemy's position d is determined, the acceleration of the virtual camera is computed based on the specified area, and the virtual camera moving speed is computed on the basis of the obtained acceleration. [0086]
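  The processing rephrased above can be sketched as a per-frame speed update, using the 2.5-meter and 10-meter area boundaries from FIG. 6(A). The acceleration values themselves are assumptions, since the actual formulas of FIG. 6(B) are not reproduced in the text; the clamp to the 0..1 range follows Steps S304 g through S304 k.

```python
def camera_acceleration(d):
    """Acceleration per area for an enemy at distance d (meters).

    Hypothetical values: the patent's actual formulas appear in
    FIG. 6(B) and are not reproduced here.
    """
    if d < 2.5:          # non-moving area
        return -0.2      # decelerate hard toward a stop
    if d < 10.0:         # low moving speed area
        return -0.05     # decelerate gently
    return 0.1           # normal area: recover toward full speed

def update_camera_speed(s, d):
    """One frame of the speed update, clamped to the range 0..1."""
    s += camera_acceleration(d)
    return min(1.0, max(0.0, s))
```

  Iterating this update as the closest enemy approaches reproduces the described behavior: the camera slows inside the 10-meter area and comes to a complete stop once an enemy is within 2.5 meters.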
  • In the case that plural enemy characters are defined (appear) in the virtual space and when all the enemies have been defeated, the virtual camera moving speed may be set back to the normal speed. Specifically, it is determined whether the enemy characters in the virtual space have all been defeated. If so determined, the virtual camera moving speed is set back to the normal speed. [0087]
  • Further, the virtual camera moving direction and the game story development may be changed (or may follow other preprogrammed branches) in accordance with the time elapsed for enemy characters to appear in the virtual space and to be annihilated. Specifically, the time elapsed is measured, then a virtual camera moving direction is selected and the game story is specified for how it will develop in accordance with the time. [0088]
  • As described above, the virtual camera moving speed changes in accordance with the distance between the virtual camera and the enemies, and the shorter the distance becomes, the slower the virtual camera moving speed becomes. Accordingly, if the player shoots the enemies from far away, the virtual camera moving speed does not decrease. In other words, the faster the player defeats the enemies, the faster the game advances, thereby the player obtains a higher score. [0089]
  • Furthermore, since the virtual camera moving speed changes in accordance with the distance from the enemies, the game device can provide a tense mood. For example, while a target enemy approaches from the back of the screen, the player (the virtual camera) moves toward a certain destination. The moving speed of the player does not change as long as the enemy is far away; therefore, the player feels as if he/she is advancing towards the destination on his/her own. However, as time passes and the enemy approaches the player, the player's moving speed decelerates and the player recognizes that he/she will have a battle with the enemies and so he/she becomes nervous. When the enemy finally reaches a certain range, the player's character stops its movement and remains in that position until the battle with the enemy is over. The player fights with the enemies with a sense of urgency, fearing that he/she might be defeated. Accordingly, the combination of the player's movements and the enemies' movements can provide a real-life tense atmosphere. [0090]
  • [Fixation Point of the Virtual Camera][0091]
  • The virtual camera moves in the three-dimensional virtual space according to the program. As shown in FIG. 7, the line of sight of the camera is directed to a certain point (a fixation point) in the virtual space and images are generated with the fixation point at the center of the display screen. The fixation point is controlled according to the situation of the enemy located in the direction of the virtual camera's line of sight. Control of the fixation point is executed such that the speed with which the fixation point follows the enemy changes based on the distance between the virtual camera and the enemy. More specifically, when the enemy comes within the enemy sensing distance, the fixation point starts to follow the enemy, and the shorter the distance becomes, the faster the fixation point follows the enemy. [0092]
  • Explanations will be given for a case in which the fixation point is controlled in accordance with the distance between the virtual camera and the enemy, with reference to FIG. 4. The fixation point of the virtual camera is predetermined according to the program. When an enemy reaches within the enemy sensing distance (enemy 1 in FIG. 4), the fixation point of the virtual camera starts to follow the enemy, but since the enemy is still far away from the virtual camera, the speed of the fixation point for following the enemy is set to slow. However, when the enemy approaches the virtual camera (enemy 2 in FIG. 4), the enemy-following speed of the fixation point is set to fast. Finally, when the enemy arrives at a certain distance (enemy 3 in FIG. 4), the speed of the fixation point for following the enemy will be at the maximum setting. [0093]
  • To rephrase the above process, a fixation point setting means sets a fixation point of the virtual camera. Next, it selects an enemy on which the fixation point should be fixed. This enemy is the one within the enemy sensing distance and closest to the virtual camera. Then, the fixation point setting means determines the speed to move the fixation point in accordance with the enemy's position and moves the fixation point at the determined speed. [0094]
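The fixation-point control can be sketched as follows. The sensing distance and the speed steps are hypothetical values; only the qualitative behavior (no following outside the sensing distance, faster following as the distance shrinks) comes from the text.

```python
# Illustrative fixation-point control. SENSING_DIST and the speed steps
# are assumed values; only the qualitative behavior is from the text.
SENSING_DIST = 10.0

def follow_speed(d):
    """The shorter the camera-to-enemy distance d, the faster the
    fixation point follows the enemy (0.0 = keep the preset point)."""
    if d > SENSING_DIST:
        return 0.0      # outside the enemy sensing distance
    if d > 6.0:
        return 0.1      # like enemy 1 in FIG. 4: follow slowly
    if d > 3.0:
        return 0.4      # like enemy 2 in FIG. 4: follow faster
    return 1.0          # like enemy 3 in FIG. 4: maximum follow speed

def move_fixation(fix, enemy, d):
    """Move the fixation point a fraction of the way toward the enemy."""
    s = follow_speed(d)
    return tuple(f + s * (e - f) for f, e in zip(fix, enemy))
```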
  • [Shotgun Shooting Results Determination Process][0095]
  • Now, explanations are given for improvements in the shooting result determination process which is performed when the player shoots the enemies. In this embodiment, the player's weapon is a shotgun. Accordingly, it is desirable that the shooting results be determined in a manner that effectively demonstrates the shotgun property in which “bullets scatter in a wide radius”. This characteristic of the shotgun means that, if fired at an object close by, bullets impact a small area with high density, thereby demonstrating their greatest strength. On the other hand, if fired at a distant object, bullets scatter and impact a wide area with low density, so their deadly force cannot be fully realized. [0096]
  • Accordingly, in this embodiment, damage to be suffered by an enemy is determined based on the following points: the amount of damage changes according to the distance to the enemy; the effective shooting scope (bullet strength) changes in accordance with the above distance; and the damage also changes in accordance with the bullet's impact point within the effective radius. [0097]
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength. As shown in FIG. 8, the bullet strength and the effective shooting scope are determined to change according to the distance to the enemy. For example, if the distance is 3 meters, the bullet strength is 100 points, but the bullet strength decreases to 60 points when the distance is 5 meters, and to 30 points when the distance is 7 meters. Whereas, if the distance to the enemy is 3 meters, the effective shooting scope is 20 centimeters, and it expands to 60 centimeters when the distance is 5 meters, and to 70 centimeters when the distance is 7 meters. [0098]
  • FIG. 9 shows an example of the relationship between the effective shooting scope and the bullet strength. As shown in FIG. 9, the bullet strength is set in a manner so that the further the impact point is located from the target point of the player (the center of the concentric circles), the more the bullet strength and the damage suffered by the enemy decrease. FIG. 10 shows one example of the relationship between the distance to the enemy and the effective shooting scope. [0099]
  • The shooting results of the player based on the above settings will be explained with reference to FIGS. 8 and 9. If the player shoots an enemy (for example, the head of the enemy) when it is within three meters of the player, the player can cause damage to the enemy worth 100 points. Whereas, if the player shoots the enemy when it is within five meters of the player, the player can cause damage worth only 60 points, even if aiming at the same head. However, the effective shooting scope, which is 20 centimeters when the distance is 3 meters, expands to 60 centimeters when the distance is 5 meters; accordingly, there is a high possibility that the bullet will hit the head and other parts of the body (chest and shoulders) at the same time, and consequently, this may cause more damage to the enemy. When aiming at the center of the enemy's head, a bullet that hits the chest is outside the radius of 100% bullet strength, but within the radius of 80% bullet strength. On the other hand, if aiming at the neck, the player can damage both the head and the chest with 100% bullet strength. [0100]
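The FIG. 8 sample values quoted above can be kept in a small distance-keyed table. How values between the tabulated distances are obtained is not stated in the text, so the nearest-band lookup below is an assumption.

```python
# The FIG. 8 sample values, held in a distance-keyed table. Lookup for
# in-between distances (nearest band at or beyond) is an assumption.
STRENGTH_TABLE = [   # (distance in m, bullet strength, effective scope in cm)
    (3.0, 100, 20),
    (5.0, 60, 60),
    (7.0, 30, 70),
]

def lookup(distance):
    """Return (bullet strength, effective shooting scope) for a distance,
    clamped to the last table entry beyond 7 meters."""
    for d, strength, scope in STRENGTH_TABLE:
        if distance <= d:
            return strength, scope
    _, strength, scope = STRENGTH_TABLE[-1]
    return strength, scope
```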
  • Since an effective point to aim at varies in accordance with the distance to the enemy, by learning the shotgun properties, strategies to defeat the enemies become possible. Therefore, the entertaining characteristics of the game are enhanced. [0101]
  • Now, the process flow executed when the player shoots an enemy will be explained. FIG. 11 is a flow chart explaining the shooting results determination process (S306 in FIG. 3). At first, when the player shoots the enemy, the hit decision means executes the hit decision process for determining whether the bullet hits the enemy (S306 a). This hit decision process will be described hereinafter with reference to FIGS. 12(A) and 12(B). When the hit decision means determines that the bullet has hit the target in the hit decision process, a hit flag is set. [0102]
  • Then, for the enemies having hit flags set, a damage computing means executes the damage process for computing a damage point caused by the shooting (S306 b). The damage process will be described hereinafter with reference to FIGS. 13(A) and 13(B). Further, an injury severity computing means executes the injury process for determining the injury severity of the enemies in accordance with the shooting results and expressing the injury visually (S306 c). The injury process will be described hereinafter with reference to FIG. 16. [0103]
  • [Hit Decision Process][0104]
  • Now, the process flow of the hit decision executed when the player shoots an enemy will be explained with reference to FIGS. 12(A) and 12(B). FIG. 12(A) is a flow chart explaining the hit determination process flow. When the player shoots an enemy (S306 a 1; YES), the enemy's coordinates are converted to the coordinate system in which the player's position is the origin and the vector of the shooting direction is the Z-axis (S306 a 2). [0105]
  • A radius DR, i.e., the effective shooting scope (extent of the scatter shot) at the Z position of the enemy, is computed (S306 a 3) and a distance L between the enemy and the Z-axis is computed (S306 a 4). Subsequently, a radius R of the enemy's collision sphere is computed (S306 a 5) and it is determined whether the bullet has hit the enemy based on the radius DR, the distance L, and the radius R (S306 a 6). Specifically, when the sum of the radius R and the radius DR is greater than or equal to the distance L, it is considered that the bullet has hit the enemy (S306 a 6; YES). Whereas, when the sum is less than the distance L (S306 a 6; NO), it is considered that the bullet has missed the enemy (S306 a 7). [0106]
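The geometric test of steps S306 a 3 to S306 a 6 can be sketched as follows. The spread rate and the collision-sphere radius R are assumed constants here; in the described process DR and R would come from the game data.

```python
# Sketch of the per-enemy hit test (S306 a 3 to S306 a 6). The shot is
# treated as a cone around the Z-axis; the spread rate and the
# collision-sphere radius R are assumed constants.
import math

SPREAD_PER_METER = 0.1   # assumed: scope radius DR grows with Z
R = 0.5                  # assumed collision-sphere radius

def hit_test(enemy_xyz):
    """Enemy coordinates are already converted to the shot coordinate
    system: origin at the player, Z-axis along the shooting direction."""
    x, y, z = enemy_xyz
    if z <= 0:
        return False                # behind the player: no hit
    DR = SPREAD_PER_METER * z       # effective shooting scope at this Z
    L = math.hypot(x, y)            # distance between enemy and Z-axis
    return R + DR >= L              # hit when R + DR >= L
```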
  • When the bullet has hit the enemy, the cross section of the collision cone of the shotgun pellets at the enemy's Z position is divided into sections of a predetermined number (for example, 16 sections), and it is determined which sections cover the enemy (S306 a 8). FIG. 12(B) shows the collision cone divided into sections 1 to 16. [0107]
  • After executing the process from S306 a 1 to S306 a 8 for all the enemies that have appeared on the screen (S306 a 9), the enemies are realigned according to their Z positions (S306 a 10), and the hit sections of the collision cone are assigned to cover the enemies in order of the shortest distance along the Z-axis from the shotgun (S306 a 11). [0108]
  • It is determined whether the enemy is covered by any of sections 1 to 16 (whether any section is filled with the enemy) (S306 a 12), and if it is determined that the enemy is not covered in any section, a hit flag is not set for the enemy (S306 a 13). On the other hand, if it is determined that the enemy is covered by a section, a hit flag for the enemy is set (S306 a 14). In short, if all the hit sections have already been completely covered by other enemies, it is determined that the bullet missed the enemy, and a hit flag is not set for it. [0109]
  • The process from S306 a 1 to S306 a 14 is repeated until the hit decision process is completed for all the enemies. [0110]
  • With this process, it is possible to execute the shotgun hit decision many times without executing a hit decision between the shot vector and each enemy's collision sphere. [0111]
  • [First Damage Process][0112]
  • Now, explanations will be given with reference to FIG. 13 for the flow of the damage process for computing a damage point incurred by the enemies based on the player's shooting. FIG. 13 is a flow chart explaining the damage process. The body of an enemy is composed of predetermined body sections (for example, head, arms, legs, chest, etc.), and each body section is composed of predetermined body parts (for example, an “arm” has a “shoulder,” “upper arm,” “lower arm,” and “hand”). The presence or absence of a hit flag, which is set in the hit decision process explained with reference to FIG. 12(A), tells whether the bullet has hit any body section of the enemy's body as well as which body section or sections were hit. [0113]
  • It is determined whether a hit flag is set for a predetermined body section (S306 b 1). If the hit flag is set, the body part closest to the impact point is selected (S306 b 2). Then, the effective shotgun radius at the impact point is specified (S306 b 3) and the distance from the selected body part to the impact point is computed (S306 b 4). Subsequently, a damage rate based on the distance from the trajectory is calculated (S306 b 5). [0114]
  • The damage rate can be obtained by the formula shown in FIG. 13(B). In this formula, the minimum damage rate (MIN_DAMAGE_RATE) is the bullet strength percentage at the furthest position from the impact point within the shotgun radius; the minimum damage rate is set to 0.1, for example. The maximum damage radius rate (MAX_DAMAGE_RADIUS_RATE) is a bullet strength percentage that determines a radius around the impact point within which the full strength should be applied. In short, the bullet strength at the impact point is maintained within a certain radius from the impact point. The damage radius (SHOT_GUN_RADIUS) is the radius within which the bullet strength at the impact point is effective, and it also represents the range in which bullets scatter (i.e., the hit decision area). The distance from the center of the trajectory (HIT_LEN) is the distance between the center of the trajectory and the enemy, obtained by subtracting the radius of the enemy's collision sphere from the distance between the center of the trajectory and the enemy. [0115]
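Since FIG. 13(B) itself is not reproduced in this text, the function below is only one plausible reading of the constants it names: full strength within the maximum impact radius, then a linear falloff to the minimum rate at the edge of the damage radius.

```python
# One plausible reading of the FIG. 13(B) constants (the figure itself
# is not reproduced here): full strength within the maximum impact
# radius, linear falloff to MIN_DAMAGE_RATE at the damage radius edge.
MIN_DAMAGE_RATE = 0.1           # strength fraction at the scatter's edge
MAX_DAMAGE_RADIUS_RATE = 0.2    # fraction of the radius kept at full strength

def damage_rate(hit_len, shot_gun_radius):
    """hit_len (HIT_LEN): distance from the trajectory center to the
    enemy, minus the enemy's collision-sphere radius."""
    full = MAX_DAMAGE_RADIUS_RATE * shot_gun_radius
    if hit_len <= full:
        return 1.0                   # inside the maximum impact radius
    if hit_len >= shot_gun_radius:
        return MIN_DAMAGE_RATE       # at or beyond the damage radius
    frac = (hit_len - full) / (shot_gun_radius - full)
    return 1.0 - frac * (1.0 - MIN_DAMAGE_RATE)
```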
  • When the damage rate is obtained at S306 b 5, a damage value is specified with reference to the damage chart shown in FIG. 14 (S306 b 6). The damage value is determined based on the distance to the enemy and the body section to which the body part belongs (S306 b 6). [0116]
  • FIG. 14 is one example of the configuration of the damage chart. The damage chart stores the damage values that determine a damage point of the enemies that have been shot. In FIG. 14, it is assumed that the average physical power value of the enemies is set at 200 points. As shown in FIG. 14, the damage values are set according to the distance between the player and an enemy and the body section that has been shot. The physical power value of an enemy which has been hit is calculated by first obtaining the damage point of the enemy, i.e., multiplying the damage value by the damage rate based on the distance to the impact point (center of the trajectory), and then subtracting the computed damage point from the physical power value which the enemy had before it was shot. [0117]
  • If the distance to the enemy is 3 meters or less and the body part that has been shot is the arm, the damage value suffered by the enemy is “30” points. A damage point of the body part is computed by multiplying the damage rate by the damage value (S306 b 7). [0118]
  • If damage points have not yet been computed for all the body sections, the damage points of the remaining body sections are computed (S306 b 8; NO). When the damage points of all the body sections have been obtained, the damage points of the respective body sections are summed up and the total damage point to the enemy is obtained (S306 b 9). In other words, the sum of the damage points of the respective body sections is the total damage point suffered by the enemy, and this total damage point is subtracted from the physical power value of the enemy. If, after the subtraction, the physical power value is less than a predetermined value, the enemy vanishes from the screen. [0119]
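Steps S306 b 6 to S306 b 9 can be sketched as a chart lookup plus a sum. Only the value 30 (arm, within 3 meters) is given in the text; the other chart entries are placeholders for illustration.

```python
# Sketch of S306 b 6 to S306 b 9. Only the "arm within 3 meters -> 30"
# entry is from the text; the other chart values are placeholders.
DAMAGE_CHART = {
    ("<=3m", "arm"): 30, ("<=3m", "head"): 100, ("<=3m", "chest"): 60,
    ("<=5m", "arm"): 20, ("<=5m", "head"): 60,  ("<=5m", "chest"): 40,
}

def total_damage(hits):
    """hits: (distance band, body section, damage rate) per hit section.
    Each section's damage point is damage value x damage rate; the total
    damage point is the sum over all hit sections."""
    return sum(DAMAGE_CHART[(band, section)] * rate
               for band, section, rate in hits)

def apply_damage(physical_power, hits):
    """Subtract the total damage point from the enemy's physical power."""
    return physical_power - total_damage(hits)
```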
  • FIG. 15 shows image examples of objects (enemies) being shot. FIG. 15 shows two examples wherein the objects were shot at the same impact point but from different distances, the effective shooting scopes being shown with circles of dashed lines, and damage being shown with ⋆ figures. If the enemy is shot at short range as shown in FIG. 15(A), the bullets scatter around the abdomen, and each body part will be heavily damaged even though there are only a few points of damage. Whereas, if the enemy is shot at long range as shown in FIG. 15(B), the bullets scatter in a wide range throughout the whole body, but the damage to each body part is small. [0120]
  • To summarize the above explanations, the distance between the virtual camera and an enemy character affects not only the virtual camera's moving speed, but also the amount of damage suffered by the enemy character that was shot. Due to this fact, the player will be conflicted: on the one hand, shooting at short range demonstrates great bullet strength and enables the player to defeat one enemy in a short time, but the player's moving speed will become slow. On the other hand, if there are many enemies, it may be better to shoot them at long range even with small bullet strength, because the player can damage enemies over a wide range, thereby defeating them more quickly so that the player can move forward in the game. Thus, the entertaining characteristics of the game are enhanced. [0121]
  • [Injury Process][0122]
  • Now, explanations will be given for the flow of the injury process for displaying an injury status of an enemy in accordance with the shooting by the player. By this injury process, how much the enemy is damaged is visually displayed. Every time the enemy is shot, an injury progression value due to the shot is attributed to the enemy, and the damage (injury status) is displayed in correspondence with the accumulated injury progression values. The injury progression value to be attributed to the enemy is set in accordance with the distance to the enemy. Specifically, the shorter the distance is, the larger the injury progression value is set, and the longer the distance is, the smaller the injury progression value is set. [0123]
  • Each body part of the enemy is provided with damaged body parts that express damage (injury status) corresponding to predetermined levels. For example, the body part “chest” of an enemy A is provided with damaged body parts that correspond to five levels (0 to 4) of damage. The damaged body parts are composed such that the severity increases at each level. For example, level 0 shows an image of the chest with no damage; at level 1, a part of the chest is bleeding; at level 2, a part of the chest is damaged; at level 3, the entire chest is damaged; and at level 4, the chest is shattered. Damage levels and their modes of expression may be set differently depending on the enemy types. [0124]
  • FIG. 16 is a flow chart explaining the injury process. The presence or absence of a hit flag, which is set in the hit decision process explained with reference to FIG. 12, indicates whether a bullet hit a body section of the enemy's body. [0125]
  • At first, it is determined whether a hit flag is set for a predetermined body section (S306 c 1), and if the hit flag is set for the body section (S306 c 1; YES), the body part closest to the impact point within the present body section is selected (S306 c 2). Then, the injury progression value chart (damage progression value chart) in FIG. 17 is referred to, and an injury progression value is specified based on the player's distance to the enemy (S306 c 3). [0126]
  • FIG. 17 shows one example of the configuration of the injury progression value chart, wherein the injury progression values are set in accordance with the distance to the enemy. As shown in FIG. 17, the injury progression values of a body part A of a certain body section are set such that the shorter the player's distance to the enemy is, the larger the value becomes, and the longer the distance is, the smaller the value becomes. In FIG. 17, only the injury progression values of the body part A (upper part of an arm) are indicated. Other body parts (for example, lower arms) and other body sections (for example, the head) are omitted in FIG. 17, but the injury progression values of those parts are similarly set. [0127]
  • After specifying the injury progression value, it is added to the injury progression value that is already stored in a predetermined storage area (S306 c 4). In short, if the present body part has previously been hit, the injury progression value of the present impact is added to the injury progression value of the previous impact, thereby increasing the total injury progression value. Parameters of the damaged body parts are referred to based on the accumulated injury progression value, and the damaged body part to be displayed as the shooting result is specified (S306 c 5). [0128]
  • Specifically, the damaged body part is specified by the formula “display damaged body part = injury progression value of the present body part ÷ 10 (fractions omitted)”. For example, when the enemy is shot from a distance of 8 meters, the injury progression value of the body part is “7”, and the formula gives “7÷10=0 (fractions omitted)”; therefore, the damaged body part of level “0” is displayed. When the enemy is shot again from the same distance, the accumulated injury progression value is “7+7=14”, and the formula gives “14÷10=1 (fractions omitted)”, thereby the damaged body part of level “1” is displayed. [0129]
  • Whereas, if the enemy is shot from a distance of 3 meters, the injury progression value of the present body part is “15” and the formula gives “15÷10=1 (fractions omitted)”; accordingly, the damaged body part of level “1” is displayed. When the enemy is shot again from the same distance, the accumulated injury progression value is “15+15=30”, and the formula gives “30÷10=3 (fractions omitted)”, thereby the damaged body part of level “3” is displayed. In this way, when the distance to the enemy at the time of impact is short, the severity of the enemy's injury increases even if impacted only a few times. [0130]
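The worked examples above follow mechanically from the stated formula (accumulated injury progression value ÷ 10, fractions omitted), as a short sketch shows; the chart values 7 (8 meters) and 15 (3 meters) are the ones quoted in the text.

```python
# Sketch of the injury level computation. The chart values 7 (8 m) and
# 15 (3 m) are quoted in the text; other distances are omitted here.
PROGRESSION_BY_DISTANCE = {8: 7, 3: 15}   # distance in meters -> value

def injury_level(shot_distances):
    """Accumulate one injury progression value per shot, then derive the
    damaged-body-part level to display."""
    total = sum(PROGRESSION_BY_DISTANCE[d] for d in shot_distances)
    return total // 10   # fractions omitted
```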
  • [Second Damage Process][0131]
  • Now, explanations will be given for the second damage process of damage suffered by the enemy due to the shooting. A proportion (“first overlapping proportion”) of the overlapping area (the hit determined portion, in which the damage radius and the enemy's collision sphere overlap) to the entire effective shooting radius (damage radius) is computed. A damage point is then computed based on the obtained proportion. Specifically, the damage point is computed by multiplying a damage value of the damage radius by the first overlapping proportion. Details will be explained with reference to FIG. 18. [0132]
  • FIG. 18 explains the second damage process. In FIG. 18(A), the proportion of the hit determined portion to the damage radius is 100%. An enemy's damage point is obtained by the formula “enemy's damage point=damage value×first overlapping proportion (%)”. Accordingly, if the damage value is set to 100, the damage point of the enemy is 100 by calculating “damage value (100)×first overlapping proportion (100%)=100”. [0133]
  • In FIG. 18(B), the proportion of the hit determined portion to the damage radius is 50%. If the damage value is set to 100, the enemy's damage point is 50 by the formula “damage value (100)×first overlapping proportion (50%)=50”. The fact that the proportion of the hit determined portion to the damage radius is 50%, means that 50% of the damage radius overlaps with the enemy's collision sphere. [0134]
  • Now, another example of the second damage process will be explained. In this example, a proportion (“second overlapping proportion”) of the overlapping area (the hit determined portion, in which the damage radius and the enemy's collision sphere overlap) to the entire collision sphere of the enemy is computed. A damage point is then computed based on the obtained second overlapping proportion. Specifically, the damage point is obtained by multiplying the damage value of the damage radius by the second overlapping proportion. Details will be explained with reference to FIGS. 18(C) and 18(D). [0135]
  • In FIG. 18(C), the proportion of the hit determined portion to the enemy's collision sphere is 100%. A damage point of the enemy is obtained by calculating “enemy's damage point=damage value×second overlapping proportion (%)”. Accordingly, when the damage value provided by the entire damage radius is set to 100, the enemy's damage point is 100 by the formula “damage value (100)×second overlapping proportion (100%)=100”. [0136]
  • On the other hand, in FIG. 18(D), the proportion of the hit determined portion to the enemy's collision sphere is 50%. Accordingly, when the damage value provided by the entire damage radius is set to 100, the enemy's damage point is 50 by calculating “damage value (100)×second overlapping proportion (50%)”. The fact that the proportion of the hit determined portion to the enemy's collision sphere is 50%, means that 50% of the enemy's collision sphere overlaps with the damage radius. [0137]
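Both variants of the second damage process reduce to a single multiplication; the only difference is which area serves as the denominator of the proportion. A minimal sketch:

```python
# The two variants of the second damage process differ only in the
# denominator of the overlapping proportion.
def damage_first(damage_value, overlap_area, damage_radius_area):
    """First overlapping proportion: overlap over the whole damage radius."""
    return damage_value * (overlap_area / damage_radius_area)

def damage_second(damage_value, overlap_area, collision_sphere_area):
    """Second overlapping proportion: overlap over the whole collision sphere."""
    return damage_value * (overlap_area / collision_sphere_area)
```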
  • Now, explanations will be given, with reference to FIG. 18(E), as to how to compute the area (“hit determined portion”) wherein the damage radius and the enemy's collision sphere overlap with each other. As shown in FIG. 18(E), the damage radius is divided into lattices of a predetermined size. Then, the number of lattices overlapping the collision sphere is counted. Finally, in the case of the first overlapping proportion, the proportion of the overlapped lattices to all of the damage radius lattices is computed. Whereas, in the case of the second overlapping proportion, the proportion of the overlapped lattices to all of the collision sphere lattices is computed. [0138]
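The lattice method of FIG. 18(E) can be sketched as follows for the first overlapping proportion. The grid resolution is an assumption, and circles stand in for the damage radius and the collision sphere's cross section.

```python
# Lattice counting per FIG. 18(E), for the first overlapping proportion.
# The grid resolution n is an assumption; circles stand in for the
# damage radius and the collision sphere's cross section.
def overlap_proportion(cx, cy, r_damage, ex, ey, r_enemy, n=64):
    """Divide the damage circle's bounding box into an n x n lattice and
    count cells inside the damage circle that also fall in the enemy circle."""
    step = 2.0 * r_damage / n
    inside = overlapped = 0
    for i in range(n):
        for j in range(n):
            x = cx - r_damage + (i + 0.5) * step
            y = cy - r_damage + (j + 0.5) * step
            if (x - cx) ** 2 + (y - cy) ** 2 <= r_damage ** 2:
                inside += 1          # cell belongs to the damage radius
                if (x - ex) ** 2 + (y - ey) ** 2 <= r_enemy ** 2:
                    overlapped += 1  # cell also overlaps the collision sphere
    return overlapped / inside
```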
  • In yet another example, the damage radius and the collision sphere are projected onto a virtual image (not displayed) for the hit decision, and the number of pixels in the overlapped portion of the virtual image is counted. [0139]
  • The overlapping area and the value by which the damage is multiplied do not necessarily correspond exactly. The value may be separated into levels; for example, when the overlapped portion is 1% or more but less than 10%, the value is 10%, and when the overlapped portion is 10% or more but less than 30%, the value is 30%. [0140]
  • The damage radius is not limited to being circular, but may be oval or polygonal as appropriate. Further, the combination of the damage process of the present invention with other damage processes makes it possible to execute a more fractionalized damage process. [0141]
  • Now, explanations will be given for the case in which an enemy, a shooting object, has a human shape and a damage point of the enemy is computed according to the second damage process. FIG. 19 explains the second damage process when an enemy is human-shaped. [0142]
  • As shown in FIG. 19(A), when the damage radius is above the waist, it is determined that approximately 80% of the entire damage radius overlaps with the enemy's collision sphere (the proportion of the hit determined portion to the damage radius is 80%). If the damage value of the entire damage radius is set to 100, the enemy's damage point will be 80 by calculating “damage value (100)×first overlapping proportion (80%)=80”. [0143]
  • If the damage point computation is executed for each body part, a proportion of an area for each body part to the damage radius is computed and a damage point is also computed based on the total obtained proportion. For example, when the damage radius is taken up by 10% for each of the arms, 10% by the head, 20% by the waist, and 30% by the chest, the summation of 20 for the arms, 10 for the head, 20 for the waist, and 30 for the chest, i.e. “20+10+20+30”, will equal the total damage points of the enemy, i.e., 80. [0144]
  • As shown in FIG. 19(B), if the damage radius overlaps with an arm, it is determined that approximately 10% of the entire damage radius overlaps with the enemy's collision sphere. When the damage value is set to 100, the enemy's damage point will be 10 by calculating “damage value (100)×first overlapping proportion (10%)=10”. If the damage point computation is executed for each body part, the damage point of the arm is 10. [0145]
  • As shown in FIG. 19(C), if the damage radius overlaps with both legs, it is determined that approximately 40% of the entire damage radius overlaps with the enemy's collision sphere. When the damage value is set to 100, the enemy's damage point will be 40 by calculating “damage value (100)×first overlapping proportion (40%)=40”. If the damage point computation is executed for each body part, the damage points of both legs will be 40, which is 20 for each leg. [0146]
  • As shown in FIG. 19(D), if the damage radius overlaps with two enemies, their damage points are computed separately. First, regarding the enemy A, the damage radius overlaps with its arm; therefore, it is determined that approximately 10% of the damage radius overlaps with the collision sphere of the enemy A. As for the enemy B, since the damage radius overlaps with its upper body, it is determined that approximately 50% of the damage radius overlaps with the collision sphere of the enemy B. When the damage value is set to 100, a damage point of the enemy A is 10 by calculating “damage value (100)×first overlapping proportion (10%)=10”. A damage point of the enemy B will be 50 by calculating “damage value (100)×first overlapping proportion (50%)=50”. This enables the display of an image in which “even though the enemy B is positioned slightly behind the enemy A, the enemy B is severely damaged since it received more scattered bullets”. [0147]
  • As described, because the overlapping areas of the damage radius and the enemies' collision spheres correspond precisely to the damage values, damage can be determined more realistically and fairly. Further, the player can develop his/her skills for the shooting game, for example, “aiming at a target in a manner so that more enemies are included in the effective shooting scope” and “shooting stronger enemies in a manner so that bullets scatter in a wide range”. [0148]
  • It is possible to combine the second damage process with the first damage process. [0149]
  • [Other Embodiments][0150]
  • In the explanations of the above embodiment, the present invention is applied to a gun shooting game; however, the present invention is not limited to this application and can be applied to other types of games. For example, explanations will be given for a game wherein multiple characters are defined in a three-dimensional virtual space, a first character (for example, an enemy character) being manipulated under a predetermined program while a second character (for example, a player's character) is manipulated in accordance with manipulation information from the player. [0151]
  • In the virtual camera controlling process of this game, a position of the player's character is employed instead of the position of the virtual camera and the moving speed of the virtual camera is controlled by the distance between the enemy character and the player's character. Further, in the moving speed changing process of the fixation point of the virtual camera, the speed of the fixation point to follow enemy characters may be controlled on the basis of the distance between the player's character and the enemy character. [0152]
  • Also, in the damage point computing process, an effective attack range of the player is employed instead of the effective shooting radius. In this case, damage points of the enemy characters may be computed on the basis of: the distance between the player's character and the enemy character; the effective attack range that changes in accordance with the above distance; and the distance between the enemy character and the center of the effective attack range. The same computing manner is used when the player's character (a character on the player's side) is being attacked. [0153]
  • Furthermore, in the second damage point computing process, a damage point may be computed based on the proportion, to the entire effective attack range, of the overlapping area in which the effective attack range and the enemy's collision spheres overlap. [0154]
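One way to realize this proportional computation is to project the effective attack range and the collision sphere onto circles and use the standard circle-circle intersection (lens) area. The projection to 2D circles and the constant `max_damage` are assumptions for illustration; the specification states only that damage is proportional to the overlap share.

```python
import math

def circle_overlap_area(d, r1, r2):
    """Area of intersection of two circles with radii r1, r2 whose
    centers are distance d apart (the classic lens formula)."""
    if d >= r1 + r2:
        return 0.0                         # disjoint circles
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # one circle inside the other
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    a3 = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                         * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - a3

def overlap_damage(d, attack_radius, collision_radius, max_damage=100.0):
    """Damage proportional to how much of the effective attack range
    overlaps the character's collision sphere (projected to 2D circles)."""
    overlap = circle_overlap_area(d, attack_radius, collision_radius)
    return max_damage * overlap / (math.pi * attack_radius ** 2)
```

Dividing instead by the collision sphere's area would yield the variant in which damage reflects how much of the enemy is covered by the attack.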
  • This damage point computing process may be applied to games in which a player damages objects located in a virtual space. Specifically, the positional information of a character manipulated by the player and the positional information of objects are obtained. Then the distance between the player's character and an object is computed, and the size of a hit decision is determined based on the distance. When it is determined that the hit decision area overlaps with an object, the damage amount of the object is computed, and damage based on the damage amount is then inflicted on the object. [0155]
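The steps just listed can be sketched as one function: obtain both positions, compute their distance, size the hit decision area from that distance, test for overlap, and compute a distance-scaled damage amount. The names `aim_pos`, `base_radius`, `growth`, and `base_damage`, and the specific falloff, are hypothetical; the text fixes only the order of the steps and the distance dependence.

```python
import math

def resolve_hit(player_pos, aim_pos, obj_pos, obj_radius,
                base_radius=1.0, growth=0.05, base_damage=50.0):
    """Return the damage to apply to the object, or 0.0 on a miss."""
    d = math.dist(player_pos, obj_pos)     # steps 1-3: positions and distance
    hit_radius = base_radius + growth * d  # step 4: farther -> larger hit area
    # step 5: overlap test between hit decision area and the object
    if math.dist(aim_pos, obj_pos) > hit_radius + obj_radius:
        return 0.0
    return base_damage / (1.0 + growth * d)  # nearer shots deal more damage
```

This matches the dependent-claim behavior below: a short distance gives a small hit area but a large damage amount, and vice versa.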
  • According to the present invention, the faster the player defeats the enemies, the more favorably the game develops for the player; therefore, the player will obtain results that correspond to his/her skills. In addition, in this invention, since the shooting results are determined in correspondence with the characteristics of the gun, the player can enjoy the game by developing fight strategies using his/her knowledge of the gun's properties. Furthermore, with the game device of the present invention, the shooting results and the enemies' damage points correspond to each other precisely, and damage can be determined more realistically and fairly. [0156]
  • In this specification, a product invention can be interpreted as a method invention and vice versa. This invention can also be implemented as a program or a recording medium that has a program stored therein for making a computer implement predetermined functions. Examples of the recording medium include a hard disk (HD), a DVD-RAM, a floppy disk (FD), a CD-ROM, and types of memory such as a RAM and a ROM. Examples of the computer include a so-called microcomputer wherein a central processing unit such as a CPU or an MPU interprets programs to execute predetermined processes. [0157]
  • In this specification, a means does not simply imply a physical means; it can also imply a function of the means implemented by software or a hardware circuit. A function of one means may be realized by two or more physical means, and functions of two or more means may be realized by one physical means. [0158]
  • Moreover, means in this specification can be implemented by hardware or software, or the combination of both. Implementation by the combination of the hardware and the software is, for example, the implementation by a computer system having a predetermined program therein. A function of one means may be realized by two or more types of hardware or software, or by the combination of both, while two or more functions of one means may also be realized by one type of hardware or software, or by the combination of both. [0159]
  • The entire disclosure of Japanese Patent Application No. 2002-146900 filed on May 21, 2002, including the specification, claims, drawings, and summary, is incorporated herein by reference in its entirety. [0160]

Claims (24)

I claim:
1. A game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character caused by the player's shooting of it, based on a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
2. The game device according to claim 1,
wherein the damage point computing means computes a damage point of the character caused by the player's shooting of it, by multiplying a damage value, which is determined based on the distance between the virtual camera and the character, by a damage rate that is determined based on the distance between the character and the center of the effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
3. The game device according to claim 1 or 2, wherein
the damage value is determined such that the further the distance between the virtual camera and the character is, the smaller the damage value is,
the effective shooting radius is determined such that the further the distance between the virtual camera and the character is, the larger the effective shooting radius is, and
the damage rate is determined such that the further the distance between the character and the center of the effective shooting radius is, the smaller the damage rate is.
4. A game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character caused by the shooting in accordance with a proportion of an overlapping area to the shooting radius, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap.
5. A game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character caused by the shooting in accordance with a proportion of an overlapping area to the collision sphere of the enemy, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap.
6. A controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein
a damage point of the character caused by the player's shooting of it is computed based on a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
7. A controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein
a damage point of the character caused by the shooting is computed in accordance with a proportion of an overlapping area to the shooting radius, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap.
8. A controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein
a damage point of the character caused by the shooting is computed in accordance with a proportion of an overlapping area to the collision sphere, and in the overlapping area an effective shooting radius and a collision sphere of the enemy overlap.
9. A game controlling method whereby a game device is controlled such that it determines whether a hit decision area generated in a virtual space in accordance with a player's manipulation overlaps with an object located in the virtual space, and the object is damaged when it is determined that the hit decision area overlaps with the object, the method comprising:
a step of obtaining first positional information indicating a position of a character being manipulated by the player in the virtual space;
a step of obtaining second positional information indicating a position of the object in the virtual space;
a step of computing a distance between the character and the object based on the obtained first and second positional information;
a step of changing the size of the hit decision area based on the obtained distance; and
a step of computing, when it is determined that the hit decision area and the object overlap with each other, a damage amount for the object based on the obtained distance, and realizing damage to the object based on the obtained damage amount.
10. The game controlling method according to claim 9, wherein when the obtained distance is shorter than a predetermined distance, the hit decision area is set to be small and the damage amount is set to be large.
11. The game controlling method according to claim 9 or 10, wherein when the computed distance is further than a predetermined distance, the hit decision area is set to be large and the damage amount is set to be small.
12. A game controlling method whereby a game device is controlled such that it determines whether a hit decision area, generated in a virtual space with a predetermined target point at the center in accordance with a player's manipulation, overlaps with an object located in the virtual space and the object is damaged when it is determined that the hit decision area overlaps with the object, the method comprising:
a step of obtaining positional information indicating a position of the object in the virtual space;
a step of computing an area wherein the hit decision area and the object overlap with each other, based on the hit decision area and the obtained positional information; and
a step of generating data of a damage amount to be attributed to the object, based on the obtained area, and providing damage to the object based on the generated damage amount data.
13. A shooting game controlling method, wherein the shooting game is controlled such that it simulates a player's shooting of a character defined in a three-dimensional virtual space, while a virtual camera located in the three-dimensional virtual space is moving at a predetermined speed, comprising:
a step of changing the distance between the character and the virtual camera;
a step of changing the moving speed of the virtual camera or the speed of directing the virtual camera to the character, based on the distance between the character and the virtual camera;
a step of changing the player's effective shooting radius based on the distance between the virtual camera and the character;
a step of determining whether a bullet has hit the character, based on the character's position and the location of the effective shooting radius; and
a step of computing, when it is determined that the bullet has hit the character in the determination step, a damage amount caused to the character by the shooting, based on both the distance between the virtual camera and the character as well as the distance between the character and the center of the effective shooting radius.
14. An image processing method for moving a virtual camera located in a three-dimensional virtual space at a predetermined speed and changing the distance between the virtual camera and a character defined in the three-dimensional virtual space, wherein
the moving speed of the virtual camera changes based on the distance between the virtual camera and the character.
15. The image processing method according to claim 14, wherein
a first character defined in the three-dimensional virtual space and a second character manipulated by a player, are displayed, and
the moving speed of the virtual camera changes based on the distance between the first and second characters.
16. An image processing method for directing a virtual camera to a character located in a three-dimensional virtual space, wherein
a fixation point of the virtual camera is set on the character in a manner so that the speed of directing the virtual camera to the character changes based on the distance between the virtual camera and the character.
17. A game device composed such that a virtual camera located in a three-dimensional virtual space moves at a predetermined speed and the distance between the virtual camera and a character defined in the virtual space changes, comprising:
virtual camera controlling means for changing a moving speed of the virtual camera based on the distance between the virtual camera and the character.
18. The game device according to claim 17, wherein
a first character defined in the three-dimensional virtual space and a second character manipulated by a player are displayed, and
the virtual camera controlling means changes the moving speed of the virtual camera based on the distance between the first and second characters.
19. The game device according to claim 17, wherein the virtual camera controlling means controls the virtual camera moving speed in a manner that the shorter the distance between the virtual camera and the character becomes, the more the virtual camera moving speed decreases.
20. The game device according to claim 17, wherein
a plurality of areas are provided in the three-dimensional virtual space with the virtual camera at the center, and
the virtual camera controlling means determines in which area a character closest to the virtual camera exists, and controls the virtual camera moving speed in accordance with the determined area.
21. A game device for directing a virtual camera to a character located in a three-dimensional virtual space, comprising:
fixation point setting means for setting a fixation point of the virtual camera on the character such that the speed of directing the virtual camera to the character changes in accordance with the distance between the virtual camera and the character.
22. An image processing device composed to change a distance between a character and a virtual camera both located in a three-dimensional virtual space, comprising:
means for computing the distance between the virtual camera and the character; and
virtual camera controlling means for changing a moving speed of the virtual camera in accordance with the computed distance.
23. An information processing program for making a computer execute the game device controlling method according to any one of claims 6 to 16.
24. A computer-readable recording medium having an information processing program stored therein that makes a computer execute the game device controlling method according to any one of claims 6 to 16.
US10/441,031 2002-05-21 2003-05-20 Game device, image processing device and image processing method Abandoned US20040063501A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2002-146900 2002-05-21
JP2002146900A JP2003334382A (en) 2002-05-21 2002-05-21 Game apparatus, and apparatus and method for image processing

Publications (1)

Publication Number Publication Date
US20040063501A1 true US20040063501A1 (en) 2004-04-01

Family

ID=29705737

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/441,031 Abandoned US20040063501A1 (en) 2002-05-21 2003-05-20 Game device, image processing device and image processing method

Country Status (2)

Country Link
US (1) US20040063501A1 (en)
JP (1) JP2003334382A (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050164794A1 (en) * 2004-01-28 2005-07-28 Nintendo Co.,, Ltd. Game system using touch panel input
US20050187023A1 (en) * 2004-02-23 2005-08-25 Nintendo Co., Ltd. Game program and game machine
US20070171221A1 (en) * 2006-01-26 2007-07-26 Nintendo Co., Ltd. Image processing program and image processing device
US20070270215A1 (en) * 2006-05-08 2007-11-22 Shigeru Miyamoto Method and apparatus for enhanced virtual camera control within 3d video games or other computer graphics presentations providing intelligent automatic 3d-assist for third person viewpoints
US20080207324A1 (en) * 2007-02-28 2008-08-28 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Game apparatus, virtual camera control method, program and recording medium
EP1968350A1 (en) * 2005-12-28 2008-09-10 Konami Digital Entertainment Co., Ltd. Voice processor, voice processing method, program, and information recording medium
US20100151943A1 (en) * 2006-11-09 2010-06-17 Kevin Johnson Wagering game with 3d gaming environment using dynamic camera
US20100160042A1 (en) * 2007-09-27 2010-06-24 Konami Digital Entertainment Co., Ltd. Game program, game apparatus and game control method
US20100267451A1 (en) * 2009-04-20 2010-10-21 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20100315415A1 (en) * 2007-11-01 2010-12-16 Konami Digital Entertainment Co., Ltd. Image Processing Device, Method for Processing Image, Information Recording Medium, and Program
US20110304620A1 (en) * 2010-06-09 2011-12-15 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US20130085410A1 (en) * 2011-09-30 2013-04-04 Motorola Mobility, Inc. Method and system for identifying location of a touched body part
US20130137066A1 (en) * 2011-11-29 2013-05-30 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US20140274239A1 (en) * 2013-03-12 2014-09-18 Fourthirtythree Inc. Computer readable medium recording shooting game
EP2087928A3 (en) * 2007-12-21 2015-02-25 Nintendo Co., Ltd. Computer-readable storage medium having game program stored therein and game apparatus
US20150314194A1 (en) * 2014-05-01 2015-11-05 Activision Publishing, Inc. Reactive emitters for video games
US10406437B1 (en) * 2015-09-30 2019-09-10 Electronic Arts Inc. Route navigation system within a game application environment
US10589180B2 (en) * 2013-04-05 2020-03-17 Gree, Inc. Method and apparatus for providing online shooting game
CN113069770A (en) * 2021-03-29 2021-07-06 广州三七互娱科技有限公司 Game role display method and device and electronic equipment
CN114225419A (en) * 2020-08-27 2022-03-25 腾讯科技(深圳)有限公司 Control method, device, equipment, storage medium and program product of virtual prop
US20220254094A1 (en) * 2021-02-09 2022-08-11 Canon Medical Systems Corporation Image rendering apparatus and method
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11508116B2 (en) * 2017-03-17 2022-11-22 Unity IPR ApS Method and system for automated camera collision and composition preservation
CN117395510A (en) * 2023-12-12 2024-01-12 湖南快乐阳光互动娱乐传媒有限公司 Virtual machine position control method and device

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006006635A (en) * 2004-06-25 2006-01-12 Aruze Corp Game machine
JP2006006634A (en) * 2004-06-25 2006-01-12 Aruze Corp Game machine
JP5030132B2 (en) * 2006-01-17 2012-09-19 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP4094647B2 (en) 2006-09-13 2008-06-04 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME PROCESSING METHOD, AND PROGRAM
JP2008119224A (en) * 2006-11-10 2008-05-29 Namco Bandai Games Inc Program, information storage medium, and game device
JP2009000286A (en) * 2007-06-21 2009-01-08 Taito Corp Game system and game robot operated by remote control
JP5296338B2 (en) * 2007-07-09 2013-09-25 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP5269392B2 (en) * 2007-11-08 2013-08-21 株式会社カプコン Program and game system
JP4392446B2 (en) 2007-12-21 2010-01-06 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME PROCESSING METHOD, AND PROGRAM
JP4384697B2 (en) * 2008-03-26 2009-12-16 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME PROCESSING METHOD, AND PROGRAM
JP4218977B2 (en) * 2008-05-20 2009-02-04 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP5498803B2 (en) * 2010-01-13 2014-05-21 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME CONTROL METHOD, AND PROGRAM
JP5989708B2 (en) * 2014-05-22 2016-09-07 株式会社コロプラ Game program
JP6608171B2 (en) * 2015-05-22 2019-11-20 株式会社コロプラ Game program
CN111265864B (en) * 2020-01-19 2022-07-01 腾讯科技(深圳)有限公司 Information display method, information display device, storage medium, and electronic device
CN112169330B (en) * 2020-09-25 2021-12-31 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3960380A (en) * 1974-09-16 1976-06-01 Nintendo Co., Ltd. Light ray gun and target changing projectors
US4317650A (en) * 1978-09-13 1982-03-02 The Solartron Electronic Group Limited Weapon training systems
USRE35314E (en) * 1986-05-20 1996-08-20 Atari Games Corporation Multi-player, multi-character cooperative play video game with independent player entry and departure
US5382026A (en) * 1991-09-23 1995-01-17 Hughes Aircraft Company Multiple participant moving vehicle shooting gallery
US5988645A (en) * 1994-04-08 1999-11-23 Downing; Dennis L. Moving object monitoring system
US5662523A (en) * 1994-07-08 1997-09-02 Sega Enterprises, Ltd. Game apparatus using a video display device
US5734807A (en) * 1994-07-21 1998-03-31 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
US5880709A (en) * 1994-08-30 1999-03-09 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
US5800265A (en) * 1995-02-24 1998-09-01 Semiconductor Energy Laboratory Co., Ltd. Game machine
US6146278A (en) * 1997-01-10 2000-11-14 Konami Co., Ltd. Shooting video game machine
US6304267B1 (en) * 1997-06-13 2001-10-16 Namco Ltd. Image generating system and information storage medium capable of changing angle of view of virtual camera based on object positional information
US6323895B1 (en) * 1997-06-13 2001-11-27 Namco Ltd. Image generating system and information storage medium capable of changing viewpoint or line-of sight direction of virtual camera for enabling player to see two objects without interposition
US6972756B1 (en) * 1997-11-25 2005-12-06 Kabushiki Kaisha Sega Enterprises Image generating device
US20020190981A1 (en) * 1997-12-12 2002-12-19 Namco Ltd. Image generation device and information storage medium
US6614436B2 (en) * 1997-12-12 2003-09-02 Namco Ltd Image generation device and information storage medium
US7048632B2 (en) * 1998-03-19 2006-05-23 Konami Co., Ltd. Image processing method, video game apparatus and storage medium
US6763325B1 (en) * 1998-06-19 2004-07-13 Microsoft Corporation Heightened realism for computer-controlled units in real-time activity simulation
US6582299B1 (en) * 1998-12-17 2003-06-24 Konami Corporation Target shooting video game device, and method of displaying result of target shooting video game
US6306033B1 (en) * 1999-03-23 2001-10-23 Square Co., Ltd. Video game item's value being adjusted by using another item's value
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US6632137B1 (en) * 1999-06-11 2003-10-14 Konami Co., Ltd. Target-game execution method, game machine, and recording medium
US6532015B1 (en) * 1999-08-25 2003-03-11 Namco Ltd. Image generation system and program
US6458034B1 (en) * 1999-08-27 2002-10-01 Namco Ltd. Game system and computer-usable information
US6504539B1 (en) * 1999-09-16 2003-01-07 Sony Computer Entertainment Inc. Method for displaying an object in three-dimensional game
US6821206B1 (en) * 1999-11-25 2004-11-23 Namco Ltd. Game machine, game route selection method, and information storage medium
US6572476B2 (en) * 2000-04-10 2003-06-03 Konami Corporation Game system and computer readable storage medium
US20010029203A1 (en) * 2000-04-10 2001-10-11 Konami Corporation Game system and computer readable storage medium
US6992666B2 (en) * 2000-05-15 2006-01-31 Sony Corporation 3-dimensional-model-processing apparatus, 3-dimensional-model processing method and program-providing medium
US6852032B2 (en) * 2000-12-06 2005-02-08 Nikon Corporation Game machine, method of performing game and computer-readable medium
US20030064764A1 (en) * 2001-10-02 2003-04-03 Konami Corporation Game device, game control method and program
US20070202946A1 (en) * 2004-03-12 2007-08-30 Konami Digital Entertainment Co., Ltd. Shooting Game Device
US20080100531A1 (en) * 2005-03-31 2008-05-01 Sega Corporation Display control program executed in game machine
US7948449B2 (en) * 2005-03-31 2011-05-24 Sega Corporation Display control program executed in game machine

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050164794A1 (en) * 2004-01-28 2005-07-28 Nintendo Co.,, Ltd. Game system using touch panel input
US20050187023A1 (en) * 2004-02-23 2005-08-25 Nintendo Co., Ltd. Game program and game machine
US7771279B2 (en) 2004-02-23 2010-08-10 Nintendo Co. Ltd. Game program and game machine for game character and target image processing
EP1968350A4 (en) * 2005-12-28 2009-11-18 Konami Digital Entertainment Voice processor, voice processing method, program, and information recording medium
EP1968350A1 (en) * 2005-12-28 2008-09-10 Konami Digital Entertainment Co., Ltd. Voice processor, voice processing method, program, and information recording medium
CN101347043A (en) * 2005-12-28 2009-01-14 科乐美数码娱乐株式会社 Voice processor, voice processing method, program, and information recording medium
US20090180624A1 (en) * 2005-12-28 2009-07-16 Konami Digital Entertainment Co., Ltd. Voice Processor, Voice Processing Method, Program, and Information Recording Medium
US8155324B2 (en) 2005-12-28 2012-04-10 Konami Digital Entertainment Co. Ltd. Voice processor, voice processing method, program, and information recording medium
US7679623B2 (en) * 2006-01-26 2010-03-16 Nintendo Co., Ltd. Image processing program and image processing device
US20070171221A1 (en) * 2006-01-26 2007-07-26 Nintendo Co., Ltd. Image processing program and image processing device
US20070270215A1 (en) * 2006-05-08 2007-11-22 Shigeru Miyamoto Method and apparatus for enhanced virtual camera control within 3d video games or other computer graphics presentations providing intelligent automatic 3d-assist for third person viewpoints
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US20100151943A1 (en) * 2006-11-09 2010-06-17 Kevin Johnson Wagering game with 3d gaming environment using dynamic camera
US8628415B2 (en) * 2006-11-09 2014-01-14 Wms Gaming Inc. Wagering game with 3D gaming environment using dynamic camera
US20080207324A1 (en) * 2007-02-28 2008-08-28 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Game apparatus, virtual camera control method, program and recording medium
US8641523B2 (en) * 2007-02-28 2014-02-04 Kabushiki Kaisha Square Enix Game apparatus, virtual camera control method, program and recording medium
US20100160042A1 (en) * 2007-09-27 2010-06-24 Konami Digital Entertainment Co., Ltd. Game program, game apparatus and game control method
US8241120B2 (en) 2007-09-27 2012-08-14 Konami Digital Entertainment Co., Ltd. Game program, game apparatus and game control method
US20100315415A1 (en) * 2007-11-01 2010-12-16 Konami Digital Entertainment Co., Ltd. Image Processing Device, Method for Processing Image, Information Recording Medium, and Program
EP2087928A3 (en) * 2007-12-21 2015-02-25 Nintendo Co., Ltd. Computer-readable storage medium having game program stored therein and game apparatus
US8740681B2 (en) * 2009-04-20 2014-06-03 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20100267451A1 (en) * 2009-04-20 2010-10-21 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20130116019A1 (en) * 2009-04-20 2013-05-09 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20130122977A1 (en) * 2009-04-20 2013-05-16 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
EP2243527A3 (en) * 2009-04-20 2013-12-11 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US8740682B2 (en) * 2009-04-20 2014-06-03 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US8665285B2 (en) * 2010-06-09 2014-03-04 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US20110304620A1 (en) * 2010-06-09 2011-12-15 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US10932728B2 (en) 2011-09-30 2021-03-02 Google Technology Holdings LLC Method and system for identifying location of a touched body part
US20130085410A1 (en) * 2011-09-30 2013-04-04 Motorola Mobility, Inc. Method and system for identifying location of a touched body part
US9924907B2 (en) * 2011-09-30 2018-03-27 Google Technology Holdings LLC Method and system for identifying location of a touched body part
US8834163B2 (en) * 2011-11-29 2014-09-16 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US20130137066A1 (en) * 2011-11-29 2013-05-30 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US20150243182A1 (en) * 2011-11-29 2015-08-27 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US9333420B2 (en) * 2013-03-12 2016-05-10 Fourthirtythree Inc. Computer readable medium recording shooting game
US20140274239A1 (en) * 2013-03-12 2014-09-18 Fourthirtythree Inc. Computer readable medium recording shooting game
US10589180B2 (en) * 2013-04-05 2020-03-17 Gree, Inc. Method and apparatus for providing online shooting game
US11712634B2 (en) 2013-04-05 2023-08-01 Gree, Inc. Method and apparatus for providing online shooting game
US20230347254A1 (en) * 2013-04-05 2023-11-02 Gree, Inc. Method and apparatus for providing online shooting game
US11192035B2 (en) * 2013-04-05 2021-12-07 Gree, Inc. Method and apparatus for providing online shooting game
US10532286B2 (en) * 2014-05-01 2020-01-14 Activision Publishing, Inc. Reactive emitters of a video game effect based on intersection of coverage and detection zones
US20150314194A1 (en) * 2014-05-01 2015-11-05 Activision Publishing, Inc. Reactive emitters for video games
US10406437B1 (en) * 2015-09-30 2019-09-10 Electronic Arts Inc. Route navigation system within a game application environment
US11235241B2 (en) 2015-09-30 2022-02-01 Electronic Arts Inc. Route navigation system within a game application environment
US11508116B2 (en) * 2017-03-17 2022-11-22 Unity IPR ApS Method and system for automated camera collision and composition preservation
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
CN114225419A (en) * 2020-08-27 2022-03-25 腾讯科技(深圳)有限公司 Control method, device, equipment, storage medium and program product of virtual prop
US20220254094A1 (en) * 2021-02-09 2022-08-11 Canon Medical Systems Corporation Image rendering apparatus and method
US11688126B2 (en) * 2021-02-09 2023-06-27 Canon Medical Systems Corporation Image rendering apparatus and method
CN113069770A (en) * 2021-03-29 2021-07-06 广州三七互娱科技有限公司 Game role display method and device and electronic equipment
CN117395510A (en) * 2023-12-12 2024-01-12 湖南快乐阳光互动娱乐传媒有限公司 Virtual machine position control method and device

Also Published As

Publication number Publication date
JP2003334382A (en) 2003-11-25

Similar Documents

Publication Publication Date Title
US20040063501A1 (en) Game device, image processing device and image processing method
JP3745475B2 (en) GAME DEVICE AND IMAGE PROCESSING DEVICE
US8740681B2 (en) Game machine, program for realizing game machine, and method of displaying objects in game
US7390254B2 (en) Soccer game method for use in game apparatus, involves recognizing areas pertaining to power of character group, based on calculated arrival times of characters up to sample points
US6980207B2 (en) Image processing device and information recording medium
US8556695B2 (en) Information storage medium, image generation device, and image generation method
KR100276549B1 (en) Image generation apparatus, image generation method, game machine using the method
JP5234716B2 (en) PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
US6972756B1 (en) Image generating device
US20100069152A1 (en) Method of generating image using virtual camera, storage medium, and computer device
JP5136742B2 (en) Electronic game device, electronic game control method, and game program
US20100009734A1 (en) Electronic play device, control method for electronic play device and game program
US20030032484A1 (en) Game apparatus for mixed reality space, image processing method thereof, and program storage medium
EP2394716A2 (en) Image generation system, program product, and image generation method for video games
JP3835005B2 (en) GAME DEVICE, GAME CONTROL METHOD, AND STORAGE MEDIUM
JP2010068872A (en) Program, information storage medium and game device
JP4363595B2 (en) Image generating apparatus and information storage medium
JP4292483B2 (en) Computer program
JP4117687B2 (en) Image processing device
JP4114825B2 (en) Image generating apparatus and information storage medium
JP3736767B2 (en) Image processing method
JP2011255114A (en) Program, information storage medium, and image generation system
JP5161384B2 (en) GAME SYSTEM, GAME CONTROL METHOD, PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING THE PROGRAM
JP2005334128A (en) Program, information storage medium and game device
JP2012166068A (en) Game system, game control method, program, and computer readable recording medium with program recorded therein

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA SEGA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHIMOKAWA, HITOSHI;TSUJI, YUKIO;SANBONGI, KAZUTOMO;AND OTHERS;REEL/FRAME:014508/0622

Effective date: 20030916

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION