WO2006003869A1 - Information processing device for controlling an object using a player image, and object control method in the information processing device - Google Patents
Information processing device for controlling an object using a player image, and object control method in the information processing device Download PDF Info
- Publication number
- WO2006003869A1 (PCT/JP2005/011777)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- player
- moving image
- detection target
- image
- information processing
- Prior art date
Links
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/1093—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6692—Methods for processing data by generating or executing the game program for rendering three dimensional images using special effects, generally involving post-processing, e.g. blooming
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
Definitions
- Patent Document 1: Since the input interface disclosed in Patent Document 1 makes the player's motion correspond directly to an operation on the image processing apparatus, it offers operability that is easy for everyone. For this reason, application in the field of entertainment systems, where the user age range is wide, is expected. On the other hand, for such applications, a device with enhanced game characteristics is required.
- FIG. 1 shows an example of the configuration of an entertainment system according to the present embodiment.
- An analog or digital video camera 1 shoots the player 4, who is located at a position facing the display device 3, and the entertainment device 2 continuously captures the resulting moving image.
- A superimposed image, obtained by superimposing a computer image (CG) generated by the entertainment device 2 on a specular moving image of the moving image captured from the video camera 1, is displayed on the display device 3 in real time. The motion of the player 4 is therefore reflected in the superimposed image in real time, and the player 4 can enjoy playing with this superimposed image.
- The specular moving image can be generated by the entertainment device 2 performing specular processing (left-right inversion of the image) on the moving image captured from the video camera 1, but other methods may be used; for example, the mirroring may be performed by the video camera 1 itself.
- the sub CPU 320 performs various operations according to the control program stored in the ROM 323.
- The sub DMAC 322 is a semiconductor device that performs control such as DMA transfer for each circuit connected to the sub bus B2, but only while the bus interface INT disconnects the main bus B1 from the sub bus B2.
- The input unit 331 has a connection terminal 332 to which an input signal from the operation device 335 is input, a connection terminal 333 to which an image signal from the video camera 1 is input, and a connection terminal 334 to which an audio signal from the video camera 1 or the like is input. In this specification only images are described; the description of sound is omitted for convenience.
- FIG. 3 shows a functional block diagram of the entertainment apparatus 2. That is, the entertainment apparatus 2 includes a video image input unit 101, a difference detection unit 102, a main control unit 103, an object data storage unit 104, a superimposed image generation unit 105, and a display control unit 106.
- the entertainment device 2 forms a functional block as shown in FIG. 3 by executing a program recorded on a recording medium such as a DVD-ROM or CD-ROM.
- The video image input unit 101 takes in the video image shot by the video camera 1, performs mirror processing on it, that is, left-right inversion processing, and outputs the result.
- the video image is a moving image, and the video image input unit 101 continuously captures images sent from the video camera 1. Therefore, the video image input unit 101 includes a mirror processing unit 101a. Subsequent processing is performed on the mirrored video image. Note that when the mirror processing of the video image is performed by the video camera 1, the mirror processing unit 101a can be omitted.
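The left-right inversion performed by the mirror processing unit 101a can be sketched as follows. This is a minimal illustration, assuming a frame is represented as rows of pixel values; the function name `mirror_frame` is ours, not the patent's, and a real implementation would operate on camera frame buffers.

```python
def mirror_frame(frame):
    """Return the specular (left-right inverted) version of a frame.

    `frame` is assumed to be a list of rows, each row a list of pixel values.
    """
    return [list(reversed(row)) for row in frame]

frame = [
    [10, 20, 30],
    [40, 50, 60],
]
print(mirror_frame(frame))  # [[30, 20, 10], [60, 50, 40]]
```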
- The difference detection unit 102 stores the specular moving image of a certain frame in the main memory 311 as a "reference image" and compares it with the specular moving images of other frames to detect differences.
- The specular moving image to be stored may be the entire specular moving image for one frame, but it is sufficient to store only the portion from which the difference value of the detection target region can be derived.
- When a difference value has occurred in the area corresponding to a hand movement, the difference detection unit 102 recognizes that area as the player motion detection area 201. This recognition result is notified to the main control unit 103 and used to determine the movement of objects appearing in the game scenario.
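The comparison against the stored reference image can be sketched as a per-pixel absolute difference. This is a hedged illustration with grayscale values as plain lists; the representation and function name are our assumptions.

```python
def frame_difference(reference, current):
    """Per-pixel absolute difference between the reference image and the
    current specular frame; nonzero values mark regions where the player moved."""
    return [
        [abs(c - r) for r, c in zip(ref_row, cur_row)]
        for ref_row, cur_row in zip(reference, current)
    ]

reference = [[100, 100, 100],
             [100, 100, 100]]
current   = [[100, 180, 100],   # one pixel changed by the player's hand
             [100, 100, 100]]
print(frame_difference(reference, current))  # [[0, 80, 0], [0, 0, 0]]
```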
- The player's motion may also be detected by extracting a characteristic portion of the image and taking the difference of that portion between frames. In this case, the direction and speed of the player's movement can be grasped.
- the main control unit 103 controls the entire entertainment system. For example, when the entertainment apparatus 2 is executing a game program, the main control unit 103 determines the progress of the game according to the program. Further, when the main control unit 103 determines the progress of the game, in particular, the movement of an object appearing in the game, the detection result of the difference detection unit 102 may be referred to.
- The main control unit 103 can set one or more detection target areas in the video image. A detection target area is used to determine whether or not the player has acted in that area; that is, it is sufficient for the difference detection unit 102 to obtain the difference value within the detection target region in order to detect the player's action.
- FIG. 5B is a diagram in which the detection target areas 202 and 203 and the player motion detection area 201 are overlapped.
- the player's action is detected in the detection target area on the right side, that is, the detection target area 203.
- The main control unit 103 therefore recognizes that the player's action has been detected in the detection target area 203, and that it has not been detected in the detection target area 202. Note that the main control unit 103 is not affected even if the player's movement is detected outside the detection target areas.
- The detection target areas may be adjacent to each other, in which case, depending on the player's action, the action may be detected even in a detection target area not intended by the player.
- For this reason, a threshold value is set for player motion detection in each detection target region, and it is desirable to recognize that a motion has been detected in a detection target region only when the evaluation value corresponding to the frame difference value exceeds the threshold.
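The thresholded evaluation described above might look like the following sketch. The rectangular region representation, the names, and the threshold value are illustrative assumptions.

```python
def region_evaluation(diff, region):
    """Sum the difference values inside a rectangular detection target
    region given as (x0, y0, x1, y1), upper bounds exclusive."""
    x0, y0, x1, y1 = region
    return sum(diff[y][x] for y in range(y0, y1) for x in range(x0, x1))

def motion_detected(diff, region, threshold):
    """Recognize a player motion only when the evaluation value exceeds
    the threshold, so that stray small differences are ignored."""
    return region_evaluation(diff, region) > threshold

diff = [[0, 0, 0],
        [0, 9, 4],
        [0, 0, 0]]
print(motion_detected(diff, (1, 1, 3, 2), threshold=5))  # True
```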
- A detection target area can be moved according to the progress of the game scenario, for example; its position, shape, etc. can be changed according to the movement of the object to be moved in the game.
- The main control unit 103 generates a computer image to be presented to the player according to the game scenario. For this purpose, a CG generation unit 103a is provided. The CG generation unit 103a generates various computer images along the game story, referring to the object data recorded in the object data storage unit 104.
- The superimposed image generation unit 105 generates a superimposed image in which the video image mirror-processed by the video image input unit 101 and the computer image generated by the CG generation unit 103a are superimposed. At this time, the video image can be superimposed with semi-transparency processing or the like so that the player can more easily see the computer image.
- the display control unit 106 causes the display device 3 to display the superimposed image generated by the superimposed image generation unit 105. That is, the player can perform various operations on the entertainment system while viewing the superimposed image of his / her specular moving image and the computer image.
- Next, a first example of the object movement processing mainly performed by the main control unit 103 will be described.
- As a computer image, a ball 210 is displayed as the object to be moved, as shown in FIG. 6A. The ball 210 is then moved along the course 211 in accordance with the player's action. Note that, on the display, the course 211 and the background can be moved relative to the ball 210, so that the ball 210 appears to roll along the course 211.
- the computer image can include, for example, play time (time limit), score, and other information presented to the player according to the game scenario.
- FIG. 6B is a diagram showing a screen at the time of playing in which the mirror image of the player is superimposed on the computer image shown in FIG. 6A.
- the mirror moving image is made translucent and displayed over the computer image.
- Course 211 has a starting point and a goal point, and a path that turns left and right from the start toward the goal is set.
- the course 211 may be provided with an uphill and a downhill.
- the image data of the ball 210, the image data of the course 211, geographic data, and the like are stored in advance in the object data storage unit 104.
- The main control unit 103 sets the detection target area 210a and the detection target area 210b in correspondence with the area of the ball 210.
- When the player's action is detected in the detection target area 210b on the right side, the ball 210 is displayed as moving to the left front, as shown in FIG. 7B.
- When the player's action is detected in both detection target areas, the ball 210 is displayed as moving forward, as shown in FIG. 7C.
- When the player's action is detected in the detection target area 210a on the left side, the ball 210 is displayed as moving to the right, as shown in FIG. 7D.
- When the player's action is not detected in either detection target area, the ball 210 is kept stationary if it is stationary, and is displayed so as to slow down if it is moving. Alternatively, the ball 210 may be displayed as continuing to move.
- Since the course has uphill and downhill sections and the like, the entertainment characteristics of the game are further enhanced.
- The player can control the movement of the ball 210 by acting in the vicinity of the ball 210, for example by waving a hand. Specifically, when the player moves on the right side of the ball 210, the ball 210 moves to the left; when the player moves on the left side of the ball 210, the ball 210 moves to the right; and when the player acts on the entire surface of the ball 210, the ball 210 goes straight. This provides a game in which the ball 210 is moved from the start point of the course to the goal point by the player's action. At this time, a time limit may be set, giving the player the task of reaching the goal point within the time limit.
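The mapping from the two detection target areas to the ball's motion can be sketched as a small decision function. The direction labels follow the description of Figs. 7B to 7D; the function and parameter names are our assumptions.

```python
def ball_direction(left_detected, right_detected):
    """Decide the ball's direction from the detections in the two
    detection target areas 210a (left) and 210b (right)."""
    if left_detected and right_detected:
        return "forward"   # action over the whole ball: go straight
    if right_detected:
        return "left"      # action on the right pushes the ball left
    if left_detected:
        return "right"     # action on the left pushes the ball right
    return "stay"          # no action: keep the ball where it is

print(ball_direction(left_detected=False, right_detected=True))  # left
```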
- FIG. 9 is a flowchart for explaining processing of the entertainment system in the first embodiment.
- When the entertainment system starts play based on the player's instruction, the following processing is repeated at a predetermined interval, for example every frame, until the game ends.
- the entertainment system determines whether or not an action of the player is detected in the detection target areas 210a and 210b (S101).
- a plurality of detection target areas are provided for one object (ball 210), and the object is moved based on the detection result of each detection target area.
- a course-out process (S106) is performed and the game is over.
- a computer image in which a ball falls can be displayed, or a display indicating that the game is over can be performed.
- a goal process (S108) is performed to clear the game.
- points can be added to the player or a display indicating that the game has been cleared can be performed.
- Although two detection target regions were provided on the ball 210 in the above description, three may be provided, as shown in Fig. 8A.
- That is, the detection target area 210c is provided on the left side of the ball 210, the detection target area 210d in the center portion, and the detection target area 210e on the right side.
- In this case, the acquired evaluation value is further classified into "large" and "small" according to a predetermined criterion. As a result, "large" and "small" movements of the player are detected for each detection target area.
- Various controls can be performed according to the detection result; for example, when the player's movement is detected only in the center detection target area 210d, the ball 210 moves straight.
- a character 220 is displayed as an object to be moved as shown in FIG. 10A. It is assumed that a plurality of characters 220 can exist. Then, these characters 220 are moved according to the player's action.
- A goal is provided in the computer image, and the goal has a door that can open and close. This door is controlled to open and close as appropriate. The player operates the character 220 so that it reaches the goal through the door while the door is open.
- the computer image can include, for example, time limit, score, and other information presented to the player according to the game scenario.
- FIG. 10B is a diagram showing a screen during play in which a mirror moving image of the player is superimposed on the computer image shown in FIG. 10A.
- the mirrored moving image is made translucent and displayed over the computer image.
- the image data of the character 220 is stored in the object data storage unit 104 in advance.
- The main control unit 103 sets four detection target areas 220a, 220b, 220c, and 220d corresponding to the area where the character 220 exists. That is, an upper-left detection target area 220a, an upper-right detection target area 220b, a lower-right detection target area 220c, and a lower-left detection target area 220d are set around the character 220.
- When the player's motion is detected in any of the detection target areas, the character 220 is moved away from that detection target area. That is, when the player's motion is detected in the upper-left detection target area 220a, the character 220 is displayed as moving to the lower right, as shown in FIG. 11B. Similarly, when the player's motion is detected in the upper-right detection target area 220b, the character 220 is displayed as moving to the lower left; when it is detected in the lower-right detection target area 220c, as moving to the upper left; and when it is detected in the lower-left detection target area 220d, as moving to the upper right.
- When the player's motion is detected in two or more detection target areas, the detection target area whose evaluation value corresponding to the difference value is the largest, that is, the one in which motion is detected over the widest area, is treated as the detection target area in which the player's action was detected.
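Choosing the quadrant with the largest evaluation value and moving the character away from it might be sketched like this. The direction names, the dictionary representation, and the zero threshold are our assumptions.

```python
# Direction opposite each detection target area around the character.
AWAY_FROM = {
    "upper-left": "lower-right",
    "upper-right": "lower-left",
    "lower-right": "upper-left",
    "lower-left": "upper-right",
}

def character_direction(evaluations, threshold=0):
    """Pick the detection target area with the largest evaluation value
    and move the character away from it; None when nothing is detected."""
    area, value = max(evaluations.items(), key=lambda kv: kv[1])
    if value <= threshold:
        return None
    return AWAY_FROM[area]

print(character_direction(
    {"upper-left": 12, "upper-right": 3, "lower-right": 0, "lower-left": 1}
))  # lower-right
```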
- the player can control the movement of the character 220 by moving in the vicinity of the character 220, for example, by waving his hand. Since there can be a plurality of characters 220, the player designates the character 220 to be moved and then performs an operation for movement.
- The character 220 to be moved can be designated, for example, as follows. That is, if the player's motion is detected in any detection target area of a certain character 220 while no character 220 is the target of movement, that character 220 is treated as the movement target. In order to make clear which character 220 is the movement target, it is desirable to distinguish it from the other characters 220 by, for example, changing its color or adding a mark.
- FIG. 13 is a flowchart for explaining the processing of the entertainment system in the second embodiment.
- When the entertainment system starts play based on the player's instruction, the following processing is repeated at a predetermined interval, for example every frame, until the game ends.
- First, the entertainment system determines whether or not any character 220 is an operation target (S201).
- If no character 220 is an operation target, the operation target character 220 is set based on the player's instruction (S202). That is, the character 220 for which the player's movement has been detected in any of its detection target areas is set as the operation target. Note that a plurality of characters 220 may be set as operation targets.
- Next, the movement amount of the character 220 is calculated in accordance with a predetermined rule (S204), and the position of the movement destination of the character 220 is calculated (S205).
- The predetermined rule determines the traveling direction of the character 220 according to the detection results of the upper-left, upper-right, lower-right, and lower-left detection target areas 220a to 220d, as described above. The amount of movement can be set to a predetermined value.
- Alternatively, the calculated position may be set as the movement destination, and the character 220 moved so as to approach that position step by step. In this case, until the character 220 reaches the destination, it is not necessary to determine the detection target areas for each character 220 every frame. Then, display processing for the moving character 220 is performed, and the computer screen is updated (S206).
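Moving the character toward a previously computed destination by a fixed amount per frame might look like the following sketch. The coordinate representation and the per-frame amount are assumptions for illustration.

```python
def step_toward(position, destination, amount):
    """Advance `position` toward `destination` by at most `amount`, so the
    character approaches the destination over successive frames."""
    x, y = position
    dx, dy = destination[0] - x, destination[1] - y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= amount:
        return destination  # arrived: per-frame area detection can resume
    return (x + dx / dist * amount, y + dy / dist * amount)

pos = (0.0, 0.0)
for _ in range(3):  # three frames of movement toward (10, 0)
    pos = step_toward(pos, (10.0, 0.0), 4.0)
print(pos)  # (10.0, 0.0)
```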
- As described above, in the present embodiment, a plurality of detection target areas (220a, 220b, 220c, 220d) are provided for one object (character 220), and the game scenario progresses by moving the object based on the detection results of the respective detection target areas.
- When a character 220 reaches the goal, goal processing is performed for that character 220. In the goal processing, for example, points are added to the player, and the character 220 is excluded from the operation targets.
- The detection target areas may also be provided as follows.
- That is, the upper-left detection target area 220e, the upper-right detection target area 220f, the lower-right detection target area 220g, and the lower-left detection target area 220h are set around the character 220.
- In this case, when the player's motion is detected in any of the detection target areas, the character 220 is moved so as to proceed in the direction of that detection target area. That is, when the player's motion is detected in the upper-left detection target area 220e, the character 220 is displayed as moving to the upper left, as shown in FIG. 12B. Similarly, when the player's motion is detected in the upper-right detection target area 220f, the character 220 is displayed as moving to the upper right; when it is detected in the lower-right detection target area 220g, as moving to the lower right; and when it is detected in the lower-left detection target area 220h, as moving to the lower left.
- For example, when the player acts at the lower right of the character 220, the character 220 moves in the lower-right direction. Furthermore, if the player continues to act at the same position, the character 220 eventually reaches the position of the player's action. Then the position of the character 220 and the position of the player's action overlap, and the player's action falls outside the detection target areas. As a result, the player's action is no longer detected, and the movement of the character 220 stops. In other words, the player can control the destination of the character 220 by performing an action at the desired location.
- As described above, in the present embodiment, a plurality of detection target areas are provided for one object displayed as a computer image, and the object is controlled based on the detection results of the respective detection target areas with respect to the action of the superimposed player.
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/630,748 US7911447B2 (en) | 2004-06-30 | 2005-06-28 | Information processing device for controlling object by using player image and object control method in the information processing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004194815A JP4005061B2 (ja) | 2004-06-30 | 2004-06-30 | Information processing device, program, and object control method in the information processing device |
JP2004-194815 | 2004-06-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006003869A1 true WO2006003869A1 (ja) | 2006-01-12 |
Family
ID=35782677
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/011777 WO2006003869A1 (ja) | 2005-06-28 | Information processing device for controlling an object using a player image, and object control method in the information processing device |
Country Status (3)
Country | Link |
---|---|
US (1) | US7911447B2 (ja) |
JP (1) | JP4005061B2 (ja) |
WO (1) | WO2006003869A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102681658A (zh) * | 2011-01-06 | 2012-09-19 | Samsung Electronics Co., Ltd. | Display device controlled by motion and motion control method thereof |
JP2012230440A (ja) * | 2011-04-22 | 2012-11-22 | Nintendo Co Ltd | Information processing system, information processing device, information processing method, and information processing program |
US9513711B2 (en) | 2011-01-06 | 2016-12-06 | Samsung Electronics Co., Ltd. | Electronic device controlled by a motion and controlling method thereof using different motions to activate voice versus motion recognition |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1821183A4 (en) * | 2004-10-05 | 2011-01-26 | Nikon Corp | ELECTRONIC EQUIPMENT |
US20060197775A1 (en) * | 2005-03-07 | 2006-09-07 | Michael Neal | Virtual monitor system having lab-quality color accuracy |
JP5098004B2 (ja) * | 2006-02-28 | 2012-12-12 | MegaChips Corporation | Mobile terminal |
US7916129B2 (en) * | 2006-08-29 | 2011-03-29 | Industrial Technology Research Institute | Interactive display system |
CA2591808A1 (en) * | 2007-07-11 | 2009-01-11 | Hsien-Hsiang Chiu | Intelligent object tracking and gestures sensing input device |
KR101079598B1 (ko) * | 2007-12-18 | 2011-11-03 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
JP5096258B2 (ja) * | 2008-08-05 | 2012-12-12 | Fujishoji Co., Ltd. | Game machine |
US8305345B2 (en) * | 2008-08-07 | 2012-11-06 | Life Technologies Co., Ltd. | Multimedia playing device |
JP2010134629A (ja) * | 2008-12-03 | 2010-06-17 | Sony Corp | Information processing device and information processing method |
JP2010142592A (ja) | 2008-12-22 | 2010-07-01 | Nintendo Co Ltd | Game program and game device |
JP5113781B2 (ja) * | 2009-02-17 | 2013-01-09 | Sharp Corp | Videophone device |
JP5143085B2 (ja) * | 2009-05-28 | 2013-02-13 | NTT Docomo Inc | Computer, system, simulation method, and simulation program |
JP2011110215A (ja) * | 2009-11-26 | 2011-06-09 | Toyota Motor Kyushu Inc | Rehabilitation system, program, and computer-readable recording medium recording the program |
US8864581B2 (en) | 2010-01-29 | 2014-10-21 | Microsoft Corporation | Visual based identitiy tracking |
JP5255610B2 (ja) * | 2010-08-18 | 2013-08-07 | Konami Digital Entertainment Co., Ltd. | Game device, game device control method, and program |
JP5133380B2 (ja) * | 2010-08-27 | 2013-01-30 | Konami Digital Entertainment Co., Ltd. | Game device, game device control method, and program |
JP5627973B2 (ja) * | 2010-09-24 | 2014-11-19 | Nintendo Co., Ltd. | Program, device, system, and method for game processing |
FR2967804B1 (fr) * | 2010-11-19 | 2013-01-04 | Total Immersion | Method and device for real-time detection and tracking of moving non-rigid objects in a video stream, enabling a user to interact with a computer system |
US9848106B2 (en) * | 2010-12-21 | 2017-12-19 | Microsoft Technology Licensing, Llc | Intelligent gameplay photo capture |
US8928589B2 (en) | 2011-04-20 | 2015-01-06 | Qualcomm Incorporated | Virtual keyboards and methods of providing the same |
JP5583087B2 (ja) * | 2011-08-04 | 2014-09-03 | Toshiba Corp | Image processing device, method, and program |
JP5974422B2 (ja) * | 2011-10-04 | 2016-08-23 | University of Nagasaki, Public University Corporation | Image display device |
US9033795B2 (en) * | 2012-02-07 | 2015-05-19 | Krew Game Studios LLC | Interactive music game |
JP5880199B2 (ja) | 2012-03-27 | 2016-03-08 | Sony Corp | Display control device, display control method, and program |
TW201403454A (zh) * | 2012-07-05 | 2014-01-16 | Asustek Comp Inc | Method and system for rotating a display screen |
JP5689103B2 (ja) * | 2012-11-07 | 2015-03-25 | Nintendo Co., Ltd. | Game program, game system, game device, and game control method |
JP6313666B2 (ja) * | 2014-06-06 | 2018-04-18 | Sony Interactive Entertainment Inc. | Image processing device, image processing method, and image processing program |
WO2015186402A1 (ja) * | 2014-06-06 | 2015-12-10 | Sony Computer Entertainment Inc. | Image processing device, image processing method, and image processing program |
US9696813B2 (en) * | 2015-05-27 | 2017-07-04 | Hsien-Hsiang Chiu | Gesture interface robot |
JP5979450B2 (ja) * | 2014-07-28 | 2016-08-24 | Class Meister Co., Ltd. | Game device control program |
US9977565B2 (en) | 2015-02-09 | 2018-05-22 | Leapfrog Enterprises, Inc. | Interactive educational system with light emitting controller |
CN108932632A (zh) * | 2018-06-01 | 2018-12-04 | Beijing SenseTime Technology Development Co., Ltd. | Advertisement interaction method and device, electronic equipment, and storage medium |
FR3088733A1 (fr) * | 2018-11-16 | 2020-05-22 | Abderrahim Ouabbas | Mirror with restored image |
CN112906553B (zh) | 2021-02-09 | 2022-05-17 | Beijing Zitiao Network Technology Co., Ltd. | Image processing method, device, equipment, and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999034276A2 (en) * | 1997-12-23 | 1999-07-08 | Koninklijke Philips Electronics N.V. | System and method for constructing three-dimensional images using camera-based gesture inputs |
US6088018A (en) * | 1998-06-11 | 2000-07-11 | Intel Corporation | Method of using video reflection in providing input data to a computer system |
JP2001273503A (ja) * | 2000-03-23 | 2001-10-05 | Eiji Kawamura | Motion recognition system |
JP2001307124A (ja) * | 2000-02-15 | 2001-11-02 | Sega Corp | Image processing system, image processing device, and imaging device |
JP2002149302A (ja) * | 2000-11-09 | 2002-05-24 | Sharp Corp | Interface device and recording medium recording an interface processing program |
JP2002292123A (ja) * | 2001-03-29 | 2002-10-08 | Konami Co Ltd | Game device, game method, game program, and game system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4988981B1 (en) * | 1987-03-17 | 1999-05-18 | Vpl Newco Inc | Computer data entry and manipulation apparatus and method |
JP2552427B2 (ja) * | 1993-12-28 | 1996-11-13 | コナミ株式会社 | Television game system |
US5594469A (en) * | 1995-02-21 | 1997-01-14 | Mitsubishi Electric Information Technology Center America Inc. | Hand gesture machine control system |
JP3789088B2 (ja) | 1996-12-06 | 2006-06-21 | 財団法人流通システム開発センタ− | Integrated information communication system |
JP2001030712A (ja) * | 1999-07-15 | 2001-02-06 | Bridgestone Corp | Pneumatic tire |
JP3725460B2 (ja) | 2000-10-06 | 2005-12-14 | 株式会社ソニー・コンピュータエンタテインメント | Image processing device, image processing method, recording medium, computer program, and semiconductor device |
JP2002157606A (ja) * | 2000-11-17 | 2002-05-31 | Canon Inc | Image display control device, mixed reality presentation system, image display control method, and medium providing a processing program |
- 2004-06-30 JP JP2004194815A patent/JP4005061B2/ja active Active
- 2005-06-28 WO PCT/JP2005/011777 patent/WO2006003869A1/ja active Application Filing
- 2005-06-28 US US11/630,748 patent/US7911447B2/en active Active
Non-Patent Citations (1)
Title |
---|
KUBODERA A.: "Creation of Game Software using Action Interface", THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS GIJUTSU KENKYU HOKOKU, vol. 97, no. 85, 3 June 1997 (1997-06-03), pages 99 - 104, XP002997800 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102681658A (zh) * | 2011-01-06 | 2012-09-19 | 三星电子株式会社 | Display apparatus controlled by motion and motion control method thereof |
CN102681658B (zh) * | 2011-01-06 | 2016-03-09 | 三星电子株式会社 | Display apparatus controlled by motion and motion control method thereof |
US9398243B2 (en) | 2011-01-06 | 2016-07-19 | Samsung Electronics Co., Ltd. | Display apparatus controlled by motion and motion control method thereof |
US9513711B2 (en) | 2011-01-06 | 2016-12-06 | Samsung Electronics Co., Ltd. | Electronic device controlled by a motion and controlling method thereof using different motions to activate voice versus motion recognition |
JP2012230440A (ja) * | 2011-04-22 | 2012-11-22 | 任天堂株式会社 | Information processing system, information processing device, information processing method, and information processing program |
Also Published As
Publication number | Publication date |
---|---|
JP4005061B2 (ja) | 2007-11-07 |
US20080030459A1 (en) | 2008-02-07 |
US7911447B2 (en) | 2011-03-22 |
JP2006014875A (ja) | 2006-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4005061B2 (ja) | Information processing device, program, and object control method in an information processing device | |
WO2006003870A1 (ja) | Information processing device for controlling movement of a game character using an image of the player, and game character movement control method |
US8979650B2 (en) | Game apparatus, recording medium having game program recorded thereon, and game system | |
JP4187768B2 (ja) | Game device, progress control method, and program |
EP1768759B1 (en) | Control of data processing | |
JP3841806B2 (ja) | Image processing device and image processing method |
JP2007167533A (ja) | Video game program, video game device, and video game control method |
EP1213044B1 (en) | Video game system, character action control method, and readable storage medium storing character action control program | |
JP2009279038A (ja) | Game program and recording medium |
US6793576B2 (en) | Methods and apparatus for causing a character object to overcome an obstacle object | |
JP4956568B2 (ja) | Game device, game control method, and program |
JP5502043B2 (ja) | Game device and program |
JP2009011371A (ja) | Program for a racing game device, recording medium storing the program, and racing game device |
US8345001B2 (en) | Information processing system, entertainment system, and information processing system input accepting method | |
US7264547B1 (en) | Game apparatus, game image preparation method and information storage medium | |
JP4035652B2 (ja) | Method for executing a game, and game device implementing the same |
JP2009207594A (ja) | Program, information storage medium, and game device |
JP2005261642A (ja) | Entertainment device |
JP4148868B2 (ja) | Game program and game device |
JP3981388B2 (ja) | Video game program, video game device, and video game control method |
US20110183756A1 (en) | Game device, method for controlling game device, program and information memory medium | |
JP3853796B2 (ja) | Information processing device and entertainment device |
CN112891921A (zh) | Game device, game processing method, and recording medium |
KR20060023313A (ko) | Image processing method for motion-sensing games and game method using the same |
JP2001256503A (ja) | Ball trajectory analysis method, ball trajectory analysis device, game execution method, game device, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | ||
WWE | WIPO information: entry into national phase |
Ref document number: 11630748 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWW | WIPO information: withdrawn in national office |
Country of ref document: DE |
122 | Ep: PCT application non-entry into European phase | ||
WWP | WIPO information: published in national office |
Ref document number: 11630748 Country of ref document: US |