US20110195781A1 - Multi-touch mouse in gaming applications - Google Patents

Multi-touch mouse in gaming applications

Info

Publication number
US20110195781A1
Authority
US
United States
Prior art keywords
touch mouse
user interaction
game
user
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/701,150
Inventor
Billy Chen
Hrvoje Benko
Eyal Ofek
Daniel A. Rosenfeld
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to US12/701,150
Assigned to Microsoft Corporation (assignment of assignors interest; assignors: Billy Chen, Hrvoje Benko, Eyal Ofek, Daniel A. Rosenfeld)
Publication of US20110195781A1
Assigned to Microsoft Technology Licensing, LLC (assignor: Microsoft Corporation)
Legal status: Abandoned

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/6045 - Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands

Definitions

  • Today's gaming applications are becoming increasingly comprehensive, allowing players to accomplish a wide range of complex tasks. For example, players may perform tasks ranging from flying F16 fighter jets to commanding army squadrons in military scenarios. Many of these tasks require a large number of input parameters from various input devices, such as a mouse and keyboard combination. For example, in a first-person-shooter (FPS), the player's view orientation may be mapped to the mouse position, the player's movement may be mapped to keyboard keys (e.g., w, a, s, d), and the player's actions may be mapped to a combination of keyboard keys and/or mouse clicks.
  • this type of input scheme may provide limited control in a computer video game.
  • the keyboard's inherent binary control may limit the player's ability to control tasks that require continuous control, such as in navigation (e.g., varying travel speed).
  • the mouse provides limited control in computer video games due to the limited number of inputs (e.g., 3 input mouse buttons may not map easily to 8 different character abilities).
  • these input schemes may have a high learning curve for new players.
  • a multi-touch mouse may be interpreted as a mouse comprising two or more spatial sensors, which may or may not be overlaid on one another.
  • a multi-touch mouse may comprise a first spatial sensor configured to detect the position of the multi-touch mouse (e.g., a position of the mouse based upon the relative location of the mouse on a surface, such as a mouse pad, direction change, velocity, etc.).
  • the multi-touch mouse may comprise a second spatial sensor configured to detect gestures and/or the position of one or more fingers on the surface of the multi-touch mouse.
  • the second spatial sensor may be configured to detect a grip, change in a grip, a swipe gesture, a finger location, a finger press, a first finger to second finger distance, and/or other hand or finger detection.
  • the first sensor can be said to face “down” toward a (desktop/mouse pad) surface upon which the mouse rests/moves, while the second sensor can be said to face “up” away from said surface, and this can be thought of as an “overlying” arrangement as the second sensor (located on an upper surface of the mouse) may be substantially directly above the first sensor (located on a lower surface of the mouse). It will be appreciated, however, that other “overlying” arrangements are possible, and also that such arrangements may not be necessary.
  • a touch sensor may be located substantially directly above a movement sensor, such an arrangement is not necessary.
  • “overlying” as used herein is not meant to be interpreted in a limiting manner, for example, to mean that the sensors are in direct contact with one another, influence one another, need to be acted on concurrently, etc. That is, one sensor can be acted on by a user, for example, while the other sensor is not.
  • a multi-touch mouse may comprise one or more sensors (e.g., buttons) that may or may not overlay one or more other sensors (e.g., none, one, some, all) of the mouse.
  • a spatial sensor may (but need not) detect one or more user interactions concurrently.
  • a second spatial sensor may detect a grip and a swipe gesture concurrently.
  • a user interaction may comprise one or more spatial measurements.
  • a second spatial sensor may detect a finger swipe gesture as comprising a swipe gesture, a swipe gesture direction, a swipe gesture length, and/or a swipe gesture speed.
  • finger may be interpreted as one or more digits of a hand (e.g., thumb, index finger, middle finger, ring finger, pinky, etc.).
  • a character may be interpreted as an in-game character representing a user within the gaming application environment.
  • any type of user interaction is contemplated herein, and said term (e.g., as used in the claims) is not meant to be limited to merely the examples provided herein.
  • a grip, swipe, pinch, finger location, finger press, distance between digits, etc. may be mentioned herein, these and any other gestures and/or interactions are contemplated.
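  • The bullets above describe interaction reports that bundle several spatial measurements (e.g., a swipe with a direction, length, and speed). Below is a minimal sketch, in Python, of how such reports might be modeled; the type names and fields are illustrative assumptions, not part of the disclosure.

```python
# Illustrative data types for multi-touch mouse interaction reports.
# The names (MouseMove, Swipe, Grip, FingerDistance) and fields are assumptions
# made for this sketch; the disclosure does not prescribe a representation.
from dataclasses import dataclass


@dataclass
class MouseMove:
    """Report from the first (downward-facing) spatial sensor."""
    dx: float          # horizontal displacement on the desk surface
    dy: float          # vertical displacement on the desk surface
    speed: float       # magnitude of the displacement per unit time


@dataclass
class Swipe:
    """Finger swipe detected by the second (touch) spatial sensor."""
    direction_deg: float   # swipe gesture direction
    length_mm: float       # swipe gesture length
    speed_mm_s: float      # swipe gesture speed


@dataclass
class Grip:
    """Grip, or change in grip, detected across the touch surface."""
    pressure: float        # normalized grip pressure, 0..1
    roll_deg: float        # how far the palm has rolled across the shell


@dataclass
class FingerDistance:
    """Distance between a first finger and a second finger on the surface."""
    distance_mm: float


# One input report may carry several interactions detected concurrently,
# e.g. the mouse moving on its pad while a finger swipes across its surface.
report = [MouseMove(dx=4.0, dy=-1.5, speed=30.0),
          Swipe(direction_deg=90.0, length_mm=12.0, speed_mm_s=80.0)]
```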
  • a mapping component may be configured to maintain a mapping of one or more user interactions within a multi-touch mouse to one or more in-game tasks.
  • a finger swipe gesture on the surface of the multi-touch mouse may be mapped to an avatar movement in-game task.
  • a grip (a change in grip) of the multi-touch mouse may be mapped to a camera view roll in-game task.
  • the mappings may be derived based upon user preferences defined by one or more user settings.
  • the mapping component may be configured to map portions of the multi-touch mouse surface with in-game tasks.
  • the surface of the multi-touch mouse may be treated as one or more portions (regions), such that respective portions are mapped to different in-game tasks (e.g., a first portion relative to the forefinger location may be mapped to an avatar movement in-game task, while a second portion relative to the pinky location may be mapped to a character jump in-game task).
  • a task component may be configured to receive user input comprising a first user interaction with a first spatial sensor of the multi-touch mouse, a second user interaction with a second spatial sensor of the multi-touch mouse, and/or other user interactions with spatial sensors within the multi-touch mouse.
  • the first user interaction and the second user interaction may be received concurrently within the user input because the first spatial sensor may detect the first user input at a substantially similar time as when the second spatial sensor detects the second user input due to an overlaid configuration of the spatial sensors. It may be appreciated that the overlaid configuration allows the spatial sensors to operate independently of one another.
  • the task component may be configured to perform a first in-game task corresponding to the first user interaction, a second in-game task corresponding to the second user interaction, and/or other in-game tasks corresponding to other user interactions based upon the mapping.
  • the task component may receive user input comprising a first user interaction relating to a multi-touch mouse position change detected by the first spatial sensor, a second user interaction relating to a first finger to second finger distance detected by the second spatial sensor, and a third user interaction relating to a grip detected by the second spatial sensor.
  • the task component may be configured to perform a character view change in-game task based upon the multi-touch mouse position change, a view zoom in-game task based upon the first finger to second finger distance, and a camera view tilt based upon the grip as specified in the mapping maintained by the mapping component.
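  • A minimal sketch, assuming a dictionary-based mapping and hypothetical handler functions, of how the mapping component and task component described above might cooperate; none of these names come from the disclosure.

```python
# Hypothetical mapping component: user interaction kind -> in-game task handler.
def change_character_view(measurements):
    print(f"character view change: {measurements}")

def zoom_view(measurements):
    print(f"view zoom: {measurements}")

def tilt_camera(measurements):
    print(f"camera view tilt: {measurements}")

mapping = {
    "mouse_position_change": change_character_view,
    "finger_distance": zoom_view,
    "grip": tilt_camera,
}


def perform_tasks(user_input, mapping):
    """Hypothetical task component: invoke the task mapped to each interaction.

    `user_input` is a list of (interaction_kind, measurements) pairs that may
    have been detected concurrently by the overlaid spatial sensors.
    """
    for kind, measurements in user_input:
        task = mapping.get(kind)
        if task is not None:
            task(measurements)


# The three concurrent interactions from the example above: a mouse position
# change, a first finger to second finger distance, and a grip.
perform_tasks(
    [("mouse_position_change", {"dx": 3, "dy": 0}),
     ("finger_distance", {"distance_mm": 42}),
     ("grip", {"roll_deg": 8})],
    mapping,
)
```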
  • FIG. 1 is a flow chart illustrating an exemplary method of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 2 is a component block diagram illustrating an exemplary system for performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 3A is an illustration of an example of a multi-touch mouse.
  • FIG. 3B is an illustration of an example of a multi-touch mouse.
  • FIG. 3C is an illustration of an example of a multi-touch mouse.
  • FIG. 4 is an illustration of an example of a multi-touch mouse configured to generate user input based upon detected user interaction.
  • FIG. 5 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 6 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 7 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 8 is an illustration of an exemplary computer-readable medium wherein processor-executable instructions configured to embody one or more of the provisions set forth herein may be comprised.
  • FIG. 9 illustrates an exemplary computing environment wherein one or more of the provisions set forth herein may be implemented.
  • Video games may be developed based upon action, role playing, first person shooter, strategy, flight simulation, third person adventure, racing, massive multiplayer online, and/or other genres.
  • different user input devices have been developed. For example, joysticks, keyboards, mice, gamepads, motion sensors, and other peripherals are available to video game players. There are even keyboards customized solely for the purpose of playing particular games.
  • These input devices are developed in an attempt to map user input with in-game tasks. Unfortunately, conventional input devices do not allow for user interaction that adequately maps to the complex input parameters required by today's computer video games.
  • the multi-touch mouse allows a video game player to invoke various in-game tasks within a computer video game by performing user interactions on the multi-touch mouse that are mapped to the various in-game tasks.
  • the multi-touch mouse may comprise a first spatial sensor configured to detect the position of the multi-touch mouse.
  • the mouse position change may comprise one or more measurements, such as direction change, speed, acceleration, velocity, etc.
  • An in-game task, such as a character view change, may be performed based upon receiving user input of the multi-touch mouse position change.
  • the multi-touch mouse may comprise a second spatial sensor configured to detect a plurality of user interactions mapped to one or more in-game tasks.
  • the second spatial sensor may detect a finger position, a finger swipe gesture, a first finger to second finger distance, etc.
  • the first spatial sensor and the second spatial sensor allow a user to perform a wide variety of gestures that may be individually mapped to different in-game tasks. Because a wide variety of gestures are available using just the multi-touch mouse, the need to use an additional input device, such as a keyboard, and/or the need to perform complex input combinations may be mitigated. For example, a user may be able to control the movement, camera view, and actions of a character within a first person shooter video game using just the multi-touch mouse.
  • FIG. 1 One embodiment of performing in-game tasks based upon user input on a multi-touch mouse is illustrated by an exemplary method 100 in FIG. 1 .
  • the method begins.
  • user input comprising a first user interaction with a first spatial sensor of a multi-touch mouse and a second user interaction with a second spatial sensor of the multi-touch mouse is received. It may be appreciated that the first user interaction and the second user interaction may be received concurrently within the user input.
  • user interactions may be mapped to in-game tasks. For example, if the video game player is engaged with a strategy video game, then user interactions may be mapped to in-game tasks of the strategy video game. The in-game tasks (e.g., select an infantry, rotate view, issue infantry movement commands, issue infantry action commands, etc.) may be invoked within the strategy video game based upon user interactions with the multi-touch mouse (e.g., a swipe gesture, a multi-touch mouse position change, a grip or change in grip, etc.).
  • portions (regions) of the multi-touch mouse surface may be mapped to in-game tasks. That is, a user interaction with a particular portion of the multi-touch mouse surface may be mapped to a specific in-game task. For example, if the video game player is engaged with a flight simulation game, then user interactions with different portions of the multi-touch mouse surface may be mapped to different in-game tasks of the flight simulation game. User interaction with a first portion of the multi-touch mouse surface may be mapped to acceleration/deceleration. User interaction with a second portion of the multi-touch mouse surface may be mapped to a view change of the pilot. User interaction with a third portion of the multi-touch mouse surface may be mapped to the flight path direction.
  • the user interaction with the one or more portions of the multi-touch mouse surface may be detected by one or more spatial sensors within the multi-touch mouse. It may be appreciated that a portion (region) of the multi-touch mouse may be mapped to more than one in-game task. For example, an upper left corner portion of the multi-touch mouse may be mapped to a zoom aim in-game task and a shoot in-game task.
  • a first portion may be a first region of the multi-touch mouse associated with a first in-game task.
  • a second portion may be a second region of the multi-touch mouse associated with a second in-game task. It may be appreciated that the first region and the second region may or may not overlap on the surface of the multi-touch mouse. In this way, the first in-game task and second in-game task may be invoked based upon user interaction with the overlap (and/or degree thereof) of the first and second regions.
  • a first in-game task may be performed based upon the first user interaction and a second in-game task may be performed based upon the second user interaction.
  • the in-game tasks may be performed based upon the user interactions mapping to the in-game tasks. It may be appreciated that the first in-game task and the second in-game task may be performed concurrently.
  • the method ends.
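  • The method above also contemplates mapping portions (regions) of the mouse surface to in-game tasks, including overlapping regions and regions carrying more than one task. A short sketch under those assumptions follows; the region coordinates and task names are invented for illustration.

```python
# Illustrative region-to-task table for a normalized (0..1 x 0..1) touch surface.
REGIONS = [
    # (name, (x_min, y_min, x_max, y_max), tasks mapped to the region)
    ("throttle_strip", (0.0, 0.0, 0.3, 1.0), ["accelerate_or_decelerate"]),
    ("view_pad",       (0.3, 0.0, 0.7, 1.0), ["change_pilot_view"]),
    ("flight_path",    (0.7, 0.0, 1.0, 1.0), ["steer_flight_path"]),
    # An overlapping corner region mapped to two tasks at once.
    ("upper_left",     (0.0, 0.6, 0.4, 1.0), ["zoom_aim", "shoot"]),
]


def tasks_for_touch(x, y):
    """Return every task whose region contains the touch point (x, y).

    Because regions may overlap, a single touch can trigger several tasks.
    """
    hits = []
    for _name, (x0, y0, x1, y1), tasks in REGIONS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            hits.extend(tasks)
    return hits


# A touch in the upper-left corner falls inside both the throttle strip and
# the overlapping corner region, so all three mapped tasks are returned.
print(tasks_for_touch(0.1, 0.8))
# ['accelerate_or_decelerate', 'zoom_aim', 'shoot']
```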
  • FIG. 2 illustrates an example of a system 200 configured for performing in-game tasks 212 based upon user input 204 on a multi-touch mouse 202 .
  • the system 200 may comprise a task component 210 and/or a mapping component 206 .
  • the mapping component 206 may be configured to maintain a mapping 208 of one or more user interactions on a multi-touch mouse 202 to one or more in-game tasks of a computer video game 214 .
  • a grip user interaction may be mapped to a camera view roll in-game task
  • a swipe user interaction may be mapped to a movement in-game task
  • a swipe length may be mapped to a movement speed in-game task
  • a multi-touch mouse location may be mapped to a view in-game task, etc.
  • the mapping 208 may be maintained based upon user preferences defined by one or more user settings. It may be appreciated that a spatial sensor of the multi-touch mouse 202 may be configured to detect one or more of the user interactions mapped within the mapping 208 .
  • the mapping component 206 may be configured to maintain the mapping 208 as comprising mappings of one or more portions (regions) of the multi-touch mouse surface to one or more in-game tasks. That is, user interaction with a first portion of the multi-touch mouse surface may be mapped to a first in-game task and user interaction with a second portion of the multi-touch mouse surface may be mapped to a second in-game task. It may be appreciated that a spatial sensor of the multi-touch mouse 202 may be configured to detect user interaction with one or more of the portions (regions) of the multi-touch mouse surface. It may be appreciated that the mapping component 206 may map two or more in-game tasks to a single user interaction. It may be appreciated that the mapping component 206 may map a single in-game task to two or more user interactions.
  • the multi-touch mouse 202 may comprise a first spatial sensor, a second spatial sensor, a button, and/or other spatial sensors.
  • the first spatial sensor and the second spatial sensor may be in an overlaid configuration within the multi-touch mouse 202 .
  • the task component 210 may be configured to receive user input 204 comprising a first user interaction detected by a first spatial sensor of the multi-touch mouse 202 and a second user interaction detected by a second spatial sensor of the multi-touch mouse 202 .
  • the task component 210 may be configured to receive the user input 204 comprising three or more user interactions (e.g., a third user interaction detected by the second spatial sensor, a fourth user interaction with a button of the multi-touch mouse 202 ) within the user input 204 .
  • the task component 210 may be configured to perform in-game tasks 212 within the computer video game 214 based upon the user interactions within the user input 204 and the mapping 208 .
  • the task component 210 may be configured to perform a first in-game task corresponding to the first user interaction and a second in-game task corresponding to the second user interaction based upon the mapping 208 (e.g., the first user interaction is mapped to the first in-game task within the mapping 208 ). It may be appreciated that the task component 210 may be configured to perform one or more in-game tasks concurrently.
  • the task component 210 may receive the user input 204 comprising a first user interaction corresponding to a multi-touch mouse position change and a second user interaction corresponding to a swipe gesture on the surface of the multi-touch mouse 202 .
  • the mapping 208 may comprise a mapping of a multi-touch mouse position change to a character selection in-game task and a mapping of a swipe gesture to a character movement to destination in-game task.
  • the task component 210 may perform the character selection in-game task of one or more characters within a strategy game (the computer video game 214 ) based upon the multi-touch mouse position change (user interaction) moving a cursor over the one or more characters within the strategy game.
  • a video game player may move the multi-touch mouse in such a way that a corresponding cursor within the strategy game hovers over one or more characters within the strategy game. In this way, the one or more characters hovered over by the cursor may be selected.
  • the task component 210 may perform the character movement to destination in-game task of the one or more selected characters based upon the swipe gesture (user interaction). It may be appreciated that the task component 210 may perform the character selection in-game task and the character movement to destination in-game task concurrently.
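  • Since the mapping 208 may be maintained from user preferences, and a single interaction may map to several tasks (and vice versa), one way to rebuild such a table from a settings structure is sketched below; the settings format and task names are assumptions for the example.

```python
# Hypothetical user-settings format: each entry binds one user interaction to
# one or more in-game tasks; nothing here is prescribed by the disclosure.
import json

user_settings = json.loads("""
{
    "grip":                  ["camera_view_roll"],
    "swipe":                 ["character_movement"],
    "swipe_length":          ["movement_speed"],
    "mouse_position_change": ["character_view"],
    "finger_press":          ["fire_weapon", "zoom_aim"]
}
""")


def build_mapping(settings):
    """Rebuild the mapping table from user preferences.

    Returns a dict of interaction kind -> list of in-game task names, so a
    single user interaction may invoke two or more in-game tasks.
    """
    mapping = {}
    for interaction, tasks in settings.items():
        mapping.setdefault(interaction, []).extend(tasks)
    return mapping


mapping = build_mapping(user_settings)
print(mapping["finger_press"])   # ['fire_weapon', 'zoom_aim']
```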
  • FIG. 3A illustrates an example of a multi-touch mouse 300 .
  • the multi-touch mouse surface may be divided into one or more portions (regions), which may be individually mapped to in-game tasks.
  • the multi-touch mouse 300 may comprise a first portion 302 , a second portion 304 , a third portion 306 , a fourth portion 308 , a fifth portion 310 , and/or other portions.
  • a first spatial sensor (not illustrated) within the multi-touch mouse 300 may be configured to detect a change in position of the multi-touch mouse (e.g., an optical eye located at the bottom of the multi-touch mouse 300 ).
  • a second spatial sensor within the multi-touch mouse 300 may be configured to detect user interaction within the various portions of the multi-touch mouse surface.
  • the second spatial sensor may be configured to detect a swipe gesture within the second portion 304 .
  • the second spatial sensor may be configured to detect a first finger to second finger distance based upon the first portion 302 and the third portion 306 .
  • the second spatial sensor may be configured to detect a grip based upon user interaction within the first portion 302 , the second portion 304 , the third portion 306 , the fourth portion 308 , and the fifth portion 310 .
  • the multi-touch mouse 300 may comprise a single surface that is not divided into multiple portions.
  • FIG. 3B illustrates an example of a multi-touch mouse 320 .
  • the multi-touch mouse 320 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 320 ).
  • the multi-touch mouse 320 may comprise a second spatial sensor overlaying the first spatial sensor.
  • the second spatial sensor may comprise a light source 322 , a camera 324 , a mirror 326 , and/or other components (e.g., optical components).
  • the second spatial sensor may be configured to detect user interaction on the surface 328 of the multi-touch mouse 320 .
  • the second spatial sensor may be configured to detect user interaction by a first finger 330 and user interaction by a second finger 332 .
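  • FIG. 3B describes a second spatial sensor built from a light source 322, a camera 324, and a mirror 326 that images the underside of the mouse shell. Purely as an illustration (the disclosure does not specify an algorithm), frames from such a sensor could be reduced to finger positions by thresholding and connected-component labeling, as sketched below.

```python
# Illustrative only: reduce a small grayscale "touch image" to finger blob
# centroids via thresholding and a flood-fill connected-component pass.
from collections import deque


def finger_centroids(image, threshold=128):
    """Return (row, col) centroids of bright blobs in a 2D intensity grid."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one blob, collecting its pixel coordinates.
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    pr, pc = queue.popleft()
                    pixels.append((pr, pc))
                    for nr, nc in ((pr + 1, pc), (pr - 1, pc), (pr, pc + 1), (pr, pc - 1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and image[nr][nc] >= threshold and not seen[nr][nc]):
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids


# Two separated bright regions stand in for a first finger 330 and a second
# finger 332 touching the surface 328.
touch_image = [
    [0,   0,   0, 0,   0, 0],
    [0, 200, 220, 0,   0, 0],
    [0, 210, 230, 0, 190, 0],
    [0,   0,   0, 0, 200, 0],
]
print(finger_centroids(touch_image))   # [(1.5, 1.5), (2.5, 4.0)]
```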
  • FIG. 3C illustrates an example of a multi-touch mouse 340 .
  • the multi-touch mouse 340 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 340 ).
  • the multi-touch mouse 340 may comprise a second spatial sensor 344 and a third spatial sensor 346 . It may be appreciated that the second spatial sensor 344 and the third spatial sensor 346 may be configured to operate as a single spatial sensor or two separate spatial sensors. The second spatial sensor 344 and the third spatial sensor 346 may be configured to detect user interaction with the multi-touch mouse 340 .
  • FIG. 4 illustrates an example of a multi-touch mouse 400 configured to generate user input based upon detected user interaction.
  • the multi-touch mouse 400 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 400 ).
  • the multi-touch mouse 400 may comprise a second spatial sensor configured to detect user interaction on a multi-touch mouse surface 402 of the multi-touch mouse 400 .
  • the multi-touch mouse surface 402 may be divided into multiple portions (e.g., a first portion 404 , a second portion 408 , a third portion 410 , a fourth portion 412 , a fifth portion 416 , a sixth portion 418 , a seventh portion 420 , etc.). Respective portions of the multi-touch mouse surface 402 may be mapped to one or more in-game tasks of a computer video game. In another example, the multi-touch mouse surface 402 may be a single surface that is not divided into multiple portions. It may be appreciated that user interaction with more than one portion may be detected concurrently by a spatial sensor. It may be appreciated that a single portion may be mapped to more than one in-game task.
  • the second spatial sensor may be configured to detect user interaction with the first portion 404 of the surface of the multi-touch mouse surface 402 .
  • the user interaction with the first portion 404 may be mapped to a view change in-game task of a computer video game. For example, a user may roll or swipe their wrist across the first portion 404 to change their view within the computer video game (e.g., rotate the view as though the character in the computer video game was moving his or her head).
  • the second spatial sensor may be configured to detect user interaction with the second portion 408 of the surface of the multi-touch mouse surface 402 .
  • the user interaction with the second portion 408 may be mapped to a camera view roll in-game task of the computer video game. For example, a user may roll their palm across the second portion 408 to tilt the view as though the character within the computer video game tilted his or her head.
  • the second spatial sensor may be configured to detect user interaction with the third portion 410 of the surface of the multi-touch mouse surface 402 .
  • the user interaction with the third portion 410 may be mapped to a weapon fire in-game task of the computer video game. For example, a user may press a finger over the third portion 410 to fire a weapon within the computer video game.
  • the second spatial sensor may be configured to detect user interaction with the fourth portion 412 of the surface of the multi-touch mouse surface 402 .
  • the user interaction with the fourth portion 412 may be mapped to an in-game navigation control task of the computer video game. For example, a user may swipe a finger over the fourth portion to move a character within the computer video game in a direction of the swipe gesture.
  • a swipe gesture length of the swipe gesture may translate into the speed in which the character moves and a swipe gesture direction of the swipe gesture may translate into the direction in which the character moves.
  • the second spatial sensor may be configured to detect user interaction between one or more fingers. That is, a first finger to second finger user interaction 414 may be detected based upon a distance between a first finger and a second finger on the multi-touch mouse surface 402 . For example, the first finger to second finger user interaction 414 may be mapped to a zoom in/out in-game task.
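  • The navigation and zoom examples above translate continuous touch measurements into continuous game parameters (swipe length into movement speed, swipe direction into heading, finger-to-finger distance into zoom). A brief sketch of that translation follows; the scale factors are arbitrary choices for the example.

```python
# Illustrative translation of touch measurements into continuous game inputs.
import math


def swipe_to_velocity(direction_deg, length_mm, max_speed=5.0, full_swipe_mm=30.0):
    """Map a swipe gesture to a 2D movement vector.

    The swipe gesture direction becomes the movement heading, and the swipe
    gesture length scales the movement speed, saturating at max_speed.
    """
    speed = max_speed * min(length_mm / full_swipe_mm, 1.0)
    heading = math.radians(direction_deg)
    return (speed * math.cos(heading), speed * math.sin(heading))


def pinch_to_zoom(prev_distance_mm, distance_mm, sensitivity=0.05):
    """Map a change in first finger to second finger distance to a zoom factor."""
    return 1.0 + sensitivity * (distance_mm - prev_distance_mm)


print(swipe_to_velocity(direction_deg=90.0, length_mm=15.0))   # move "up" at half speed
print(pinch_to_zoom(prev_distance_mm=20.0, distance_mm=30.0))  # zoom factor of 1.5
```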
  • FIG. 5 illustrates an example of performing in-game tasks in a first person shooter computer video game 510 based upon user input on a multi-touch mouse.
  • the multi-touch mouse may comprise one or more spatial sensors.
  • a first spatial sensor may be configured to detect positional changes of the multi-touch mouse.
  • a second spatial sensor may be configured to detect gestures on the multi-touch mouse surface 502 .
  • a mapping component may be configured to maintain a mapping of user interactions with in-game tasks.
  • the mapping component may be configured to maintain a mapping of portions (regions) of the multi-touch mouse surface 502 with in-game tasks.
  • a task component may be configured to receive user input comprising user interactions detected by the spatial sensors of the multi-touch mouse.
  • the task component may be configured to perform in-game tasks based upon the user input and the mappings maintained by the mapping component.
  • the mapping component may maintain a mapping that maps a swipe gesture user interaction 508 with a character movement in-game task, a finger press user interaction 506 with a fire weapon in-game task, and a multi-touch mouse position change user interaction 504 with a character view change.
  • a first spatial sensor may be configured to detect the multi-touch mouse position change user interaction 504 and/or other user interactions.
  • a second spatial sensor may be configured to detect the swipe gesture user interaction 508 , the finger press user interaction 506 , and/or other user interactions.
  • the task component may be configured to receive user input from the multi-touch mouse.
  • the user input may comprise one or more detected user interactions.
  • the task component may invoke one or more in-game tasks based upon the in-game tasks being mapped to the user interactions within the mapping.
  • FIG. 6 illustrates an example of performing in-game tasks within a first person shooter computer video game 610 based upon user input on a multi-touch mouse.
  • the multi-touch mouse may comprise one or more spatial sensors configured to detect user interaction with the multi-touch mouse (e.g., a gesture on the multi-touch mouse surface 602 ).
  • a mapping component may maintain a mapping that maps a swipe gesture user interaction 608 with a character movement in-game task, a first finger to second finger distance user interaction 612 with a character view zoom in-game task, and a multi-touch mouse position change user interaction 604 with a character view change.
  • a task component may be configured to receive user input from the multi-touch mouse.
  • the user input may comprise one or more detected user interactions.
  • the task component may invoke one or more in-game tasks based upon the in-game tasks being mapped to the user interactions within the mapping.
  • FIG. 7 illustrates an example of performing in-game tasks within a strategy computer video game 710 based upon user input on a multi-touch mouse.
  • the multi-touch mouse may comprise one or more spatial sensors configured to detect user interaction with the multi-touch mouse (e.g., a gesture on the multi-touch mouse surface 706 ).
  • a mapping component may maintain a mapping that maps a swipe gesture user interaction 704 with a character selection in-game task 712 and a multi-touch mouse position change user interaction 702 with character movement to destination in-game task 714 .
  • a task component may be configured to receive user input from the multi-touch mouse.
  • the user input may comprise one or more detected user interactions.
  • a user may perform a circle gesture on the multi-touch mouse surface 706 using a finger, while moving the position of the multi-touch mouse to the right.
  • a first spatial sensor within the multi-touch mouse may detect the position change of the multi-touch mouse as the multi-touch mouse position change user interaction 702 .
  • a second spatial sensor within the multi-touch mouse may detect the circle gesture as the swipe gesture user interaction 704 .
  • the task component may receive user input of the two user interactions.
  • a cursor 708 may be moved in a circular motion around a set of characters within the strategy video game 710 .
  • the task component may invoke the character selection in-game task 712 of the characters encompassed within the circular motion of the cursor 708 .
  • the task component may also invoke the character movement to destination in-game task 714 to move the selected characters.
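  • In the FIG. 7 example, the circular cursor path selects the characters it encloses. One conventional way to decide which characters fall inside such a closed path is a ray-casting point-in-polygon test, sketched below with invented character data.

```python
# Illustrative selection of characters enclosed by a closed cursor path,
# using a standard ray-casting point-in-polygon test.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the closed polygon (list of vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def select_characters(cursor_path, characters):
    """Return the names of characters whose positions the cursor path encloses."""
    return [name for name, (x, y) in characters.items()
            if point_in_polygon(x, y, cursor_path)]


# A rough loop traced by the cursor 708 and some hypothetical character positions.
cursor_path = [(0, 2), (2, 4), (4, 2), (2, 0)]
characters = {"infantry_1": (2, 2), "infantry_2": (3, 2.5), "scout": (6, 6)}
print(select_characters(cursor_path, characters))   # ['infantry_1', 'infantry_2']
```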
  • Still another embodiment involves a computer-readable medium comprising processor-executable instructions configured to implement one or more of the techniques presented herein.
  • An exemplary computer-readable medium that may be devised in these ways is illustrated in FIG. 8 , wherein the implementation 800 comprises a computer-readable medium 816 (e.g., a CD-R, DVD-R, or a platter of a hard disk drive), on which is encoded computer-readable data 814 .
  • This computer-readable data 814 in turn comprises a set of computer instructions 812 configured to operate according to one or more of the principles set forth herein.
  • the processor-executable instructions 812 may be configured to perform a method 810 , such as the exemplary method 100 of FIG. 1 , for example.
  • the processor-executable instructions 812 may implement the exemplary method 100 which may be executed via one or more processors.
  • the processor-executable instructions 812 may be configured to implement a system, such as the exemplary system 200 of FIG. 2 .
  • Many such computer-readable media may be devised by those of ordinary skill in the art that are configured to operate in accordance with the techniques presented herein.
  • a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a controller and the controller can be a component.
  • One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
  • the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter.
  • article of manufacture as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
  • FIG. 9 and the following discussion provide a brief, general description of a suitable computing environment to implement embodiments of one or more of the provisions set forth herein.
  • the operating environment of FIG. 9 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment.
  • Example computing devices include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices (such as mobile phones, Personal Digital Assistants (PDAs), media players, and the like), multiprocessor systems, consumer electronics, mini computers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • Computer readable instructions may be distributed via computer readable media (discussed below).
  • Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types.
  • the functionality of the computer readable instructions may be combined or distributed as desired in various environments.
  • FIG. 9 illustrates an example of a system 910 comprising a computing device 912 configured to implement one or more embodiments provided herein.
  • computing device 912 includes at least one processing unit 916 and memory 918 .
  • memory 918 may be volatile (such as RAM, for example), non-volatile (such as ROM, flash memory, etc., for example) or some combination of the two. This configuration is illustrated in FIG. 9 by dashed line 914 .
  • device 912 may include additional features and/or functionality.
  • device 912 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic storage, optical storage, and the like.
  • FIG. 9 Such additional storage is illustrated in FIG. 9 by storage 920 .
  • computer readable instructions to implement one or more embodiments provided herein may be in storage 920 .
  • Storage 920 may also store other computer readable instructions to implement an operating system, an application program, and the like. Computer readable instructions may be loaded in memory 918 for execution by processing unit 916 , for example.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions or other data.
  • Memory 918 and storage 920 are examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 912 . Any such computer storage media may be part of device 912 .
  • Device 912 may also include communication connection(s) 926 that allows device 912 to communicate with other devices.
  • Communication connection(s) 926 may include, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection, or other interfaces for connecting computing device 912 to other computing devices.
  • Communication connection(s) 926 may include a wired connection or a wireless connection. Communication connection(s) 926 may transmit and/or receive communication media.
  • Computer readable media may include communication media.
  • Communication media typically embodies computer readable instructions or other data in a “modulated data signal” such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • Device 912 may include input device(s) 924 such as keyboard, mouse, pen, voice input device, touch input device, infrared cameras, video input devices, and/or any other input device.
  • Output device(s) 922 such as one or more displays, speakers, printers, and/or any other output device may also be included in device 912 .
  • Input device(s) 924 and output device(s) 922 may be connected to device 912 via a wired connection, wireless connection, or any combination thereof.
  • an input device or an output device from another computing device may be used as input device(s) 924 or output device(s) 922 for computing device 912 .
  • Components of computing device 912 may be connected by various interconnects, such as a bus.
  • Such interconnects may include a Peripheral Component Interconnect (PCI), such as PCI Express, a Universal Serial Bus (USB), firewire (IEEE 1394), an optical bus structure, and the like.
  • components of computing device 912 may be interconnected by a network.
  • memory 918 may be comprised of multiple physical memory units located in different physical locations interconnected by a network.
  • a computing device 930 accessible via a network 928 may store computer readable instructions to implement one or more embodiments provided herein.
  • Computing device 912 may access computing device 930 and download a part or all of the computer readable instructions for execution.
  • computing device 912 may download pieces of the computer readable instructions, as needed, or some instructions may be executed at computing device 912 and some at computing device 930 .
  • one or more of the operations described may constitute computer readable instructions stored on one or more computer readable media, which, if executed by a computing device, will cause the computing device to perform the operations described.
  • the order in which some or all of the operations are described should not be construed as to imply that these operations are necessarily order dependent. Alternative ordering will be appreciated by one skilled in the art having the benefit of this description. Further, it will be understood that not all operations are necessarily present in each embodiment provided herein.
  • the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
  • the articles “a” and “an” as used in this application and the appended claims may generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.

Abstract

Keyboards, mice, joysticks, customized gamepads, and other peripherals are continually being developed to enhance a user's experience when playing computer video games. Unfortunately, many of these devices provide users with limited input control because of the complexity of today's gaming applications. For example, many computer video games require a combination of mouse and keyboard to control even the simplest of in-game tasks (e.g., walking into a room and looking around may require several keyboard keystrokes and mouse movements). Accordingly, one or more systems and/or techniques for performing in-game tasks based upon user input within a multi-touch mouse are disclosed herein. User input comprising one or more user interactions detected by spatial sensors within the multi-touch mouse may be received. A wide variety of in-game tasks (e.g., character movements, character actions, character view, etc.) may be performed based upon the user interactions (e.g., a swipe gesture, a mouse position change, etc.).

Description

    BACKGROUND
  • Today's gaming applications are becoming increasingly comprehensive, allowing players to accomplish a wide range of complex tasks. For example, players may perform tasks ranging from flying F16 fighter jets to commanding army squadrons in WWII scenarios. Many of these tasks require a large number of input parameters from various input devices, such as a mouse and keyboard combination. For example, in a first-person-shooter (FPS), the player's view orientation may be mapped to the mouse position, the player's movement may be mapped to keyboard keys (e.g., w, a, s, d), and the player's actions may be mapped to a combination of keyboard keys and/or mouse clicks. Unfortunately, this type of input scheme may provide limited control in a computer video game. For example, the keyboard's inherent binary control may limit the player's ability to control tasks that require continuous control, such as in navigation (e.g., varying travel speed). Furthermore, the mouse provides limited control in computer video games due to the limited number of inputs (e.g., 3 input mouse buttons may not map easily to 8 different character abilities). In addition, these input schemes may have a high learning curve for new players.
  • SUMMARY
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key factors or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • Among other things, one or more systems and/or techniques for performing in-game tasks based upon user input within a multi-touch mouse are disclosed herein. It may be appreciated that a multi-touch mouse may be interpreted as a mouse comprising two or more spatial sensors, which may or may not be overlaid on one another. For example, a multi-touch mouse may comprise a first spatial sensor configured to detect the position of the multi-touch mouse (e.g., a position of the mouse based upon the relative location of the mouse on a surface, such as a mouse pad, direction change, velocity, etc.). The multi-touch mouse may comprise a second spatial sensor configured to detect gestures and/or the position of one or more fingers on the surface of the multi-touch mouse. For example, the second spatial sensor may be configured to detect a grip, change in a grip, a swipe gesture, a finger location, a finger press, a first finger to second finger distance, and/or other hand or finger detection. In this example, the first sensor can be said to face “down” toward a (desktop/mouse pad) surface upon which the mouse rests/moves, while the second sensor can be said to face “up” away from said surface, and this can be thought of as an “overlying” arrangement as the second sensor (located on an upper surface of the mouse) may be substantially directly above the first sensor (located on a lower surface of the mouse). It will be appreciated, however, that other “overlying” arrangements are possible, and also that such arrangements may not be necessary. That is, while it may not be unusual for a touch sensor to be located substantially directly above a movement sensor, such an arrangement is not necessary. Moreover, “overlying” as used herein is not meant to be interpreted in a limiting manner, for example, to mean that the sensors are in direct contact with one another, influence one another, need to be acted on concurrently, etc. That is, one sensor can be acted on by a user, for example, while the other sensor is not. Generally, a multi-touch mouse may comprise one or more sensors (e.g., buttons) that may or may not overlay one or more other sensors (e.g., none, one, some, all) of the mouse.
  • It may be appreciated that a spatial sensor may (but need not) detect one or more user interactions concurrently. For example, a second spatial sensor may detect a grip and a swipe gesture concurrently. It may be appreciated that a user interaction may comprise one or more spatial measurements. For example, a second spatial sensor may detect a finger swipe gesture as comprising a swipe gesture, a swipe gesture direction, a swipe gesture length, and/or a swipe gesture speed. It may be appreciated that the term “finger” may be interpreted as one or more digits of a hand (e.g., thumb, index finger, middle finger, ring finger, pinky, etc.). It may be appreciated that, in one example, a character may be interpreted as an in-game character representing a user within the gaming application environment. It may also be appreciated that any type of user interaction is contemplated herein and that said term (e.g., as used in the claims) is not meant to be limited to merely the examples provided herein. For example, while a grip, swipe, pinch, finger location, finger press, distance between digits, etc. may be mentioned herein, these and any other gestures and/or interactions are contemplated.
  • A mapping component may be configured to maintain a mapping of one or more user interactions within a multi-touch mouse to one or more in-game tasks. In one example, a finger swipe gesture on the surface of the multi-touch mouse may be mapped to an avatar movement in-game task. In another example, a grip (a change in grip) of the multi-touch mouse may be mapped to a camera view roll in-game task. The mappings may be derived based upon user preferences defined by one or more user settings. In another example, the mapping component may be configured to map portions of the multi-touch mouse surface with in-game tasks. For example, the surface of the multi-touch mouse may be treated as one or more portions (regions), such that respective portions are mapped to different in-game tasks (e.g., a first portion relative to the forefinger location may be mapped to an avatar movement in-game task, while a second portion relative to the pinky location may be mapped to a character jump in-game task). It may be appreciated that multiple mapping variations may be maintained for a single computer video game.
  • A task component may be configured to receive user input comprising a first user interaction with a first spatial sensor of the multi-touch mouse, a second user interaction with a second spatial sensor of the multi-touch mouse, and/or other user interactions with spatial sensors within the multi-touch mouse. In one example, the first user interaction and the second user interaction may be received concurrently within the user input because the first spatial sensor may detect the first user input at a substantially similar time as when the second spatial sensor detects the second user input due to an overlaid configuration of the spatial sensors. It may be appreciated that the overlaid configuration allows the spatial sensors to operate independently of one another.
  • The task component may be configured to perform a first in-game task corresponding to the first user interaction, a second in-game task corresponding to the second user interaction, and/or other in-game tasks corresponding to other user interactions based upon the mapping. For example, the task component may receive user input comprising a first user interaction relating to a multi-touch mouse position change detected by the first spatial sensor, a second user interaction relating to a first finger to second finger distance detected by the second spatial sensor, and a third user interaction relating to a grip detected by the second spatial sensor. The task component may be configured to perform a character view change in-game task based upon the multi-touch mouse position change, a view zoom in-game task based upon the first finger to second finger distance, and a camera view tilt based upon the grip as specified in the mapping maintained by the mapping component.
  • To the accomplishment of the foregoing and related ends, the following description and annexed drawings set forth certain illustrative aspects and implementations. These are indicative of but a few of the various ways in which one or more aspects may be employed. Other aspects, advantages, and novel features of the disclosure will become apparent from the following detailed description when considered in conjunction with the annexed drawings.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a flow chart illustrating an exemplary method of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 2 is a component block diagram illustrating an exemplary system for performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 3A is an illustration of an example of a multi-touch mouse.
  • FIG. 3B is an illustration of an example of a multi-touch mouse.
  • FIG. 3C is an illustration of an example of a multi-touch mouse.
  • FIG. 4 is an illustration of an example of a multi-touch mouse configured to generate user input based upon detected user interaction.
  • FIG. 5 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 6 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 7 is an illustration of an example of performing in-game tasks based upon user input on a multi-touch mouse.
  • FIG. 8 is an illustration of an exemplary computer-readable medium wherein processor-executable instructions configured to embody one or more of the provisions set forth herein may be comprised.
  • FIG. 9 illustrates an exemplary computing environment wherein one or more of the provisions set forth herein may be implemented.
  • DETAILED DESCRIPTION
  • The claimed subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, structures and devices are illustrated in block diagram form in order to facilitate describing the claimed subject matter.
  • Today, many computer users spend significant time playing video games through their computers. A wide variety of video game genres have developed over the years. For example, video games may be developed based upon action, role playing, first person shooter, strategy, flight simulation, third person adventure, racing, massive multiplayer online, and/or other genres. To accommodate the different play styles associated with the different video game genres and to provide video game players with an enhanced experience, different user input devices have been developed. For example, joysticks, keyboards, mice, gamepads, motion sensors, and other peripherals are available to video game players. There are even keyboards customized solely for the purpose of playing particular games. These input devices are developed in an attempt to map user input with in-game tasks. Unfortunately, conventional input devices do not allow for user interaction that adequately maps to the complex input parameters required by today's computer video games.
  • Accordingly, one or more systems and/or techniques for performing in-game tasks based upon user input on a multi-touch mouse are provided herein. The multi-touch mouse allows a video game player to invoke various in-game tasks within a computer video game by performing user interactions on the multi-touch mouse that are mapped to the various in-game tasks. For example, the multi-touch mouse may comprise a first spatial sensor configured to detect the position of the multi-touch mouse. It may be appreciated that a multi-touch mouse position change may comprise one or more measurements, such as direction change, speed, acceleration, velocity, etc. An in-game task, such as a character view change, may be performed based upon receiving user input of the multi-touch mouse position change. The multi-touch mouse may comprise a second spatial sensor configured to detect a plurality of user interactions mapped to one or more in-game tasks. For example, the second spatial sensor may detect a finger position, a finger swipe gesture, a first finger to second finger distance, etc.
  • The first spatial sensor and the second spatial sensor allow a user to perform a wide variety of gestures that may be individually mapped to different in-game tasks. Because a wide variety of gestures are available using just the multi-touch mouse, the need to use an additional input device, such as a keyboard, and/or the need to perform complex input combinations may be mitigated. For example, a user may be able to control the movement, camera view, and actions of a character within a first person shooter video game using just the multi-touch mouse.
  • One embodiment of performing in-game tasks based upon user input on a multi-touch mouse is illustrated by an exemplary method 100 in FIG. 1. At 102, the method begins. At 104, user input comprising a first user interaction with a first spatial sensor of a multi-touch mouse and a second user interaction with a second spatial sensor of the multi-touch mouse is received. It may be appreciated that the first user interaction and the second user interaction may be received concurrently within the user input.
  • In one example, user interactions may be mapped to in-game tasks. For example, if the video game player is engaged with a strategy video game, then user interactions may be mapped to in-game tasks of the strategy video game. The in-game tasks (e.g., select an infantry, rotate view, issue infantry movement commands, issue infantry action commands, etc.) may be invoked within the strategy video game based upon user interactions with the multi-touch mouse (e.g., a swipe gesture, a multi-touch mouse position change, a grip or change in grip, etc.).
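  • As a hypothetical illustration of such a mapping (the interaction keys and task names below are assumptions, not taken from the disclosure), an interaction-to-task table for a strategy video game might be represented as follows.

```python
# Hypothetical interaction-to-task mapping for a strategy video game.
# Keys name detectable user interactions; values name in-game tasks.
from typing import Optional

STRATEGY_GAME_MAPPING = {
    "swipe_gesture":         "issue_infantry_movement_command",
    "mouse_position_change": "rotate_view",
    "grip_change":           "select_infantry",
    "finger_press":          "issue_infantry_action_command",
}

def task_for(interaction: str) -> Optional[str]:
    """Look up the in-game task mapped to a detected user interaction."""
    return STRATEGY_GAME_MAPPING.get(interaction)

print(task_for("swipe_gesture"))  # -> issue_infantry_movement_command
```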
  • In another example, portions (regions) of the multi-touch mouse surface may be mapped to in-game tasks. That is, a user interaction with a particular portion of the multi-touch mouse surface may be mapped to a specific in-game task. For example, if the video game player is engaged with a flight simulation game, then user interactions with different portions of the multi-touch mouse surface may be mapped to different in-game tasks of the flight simulation game. User interaction with a first portion of the multi-touch mouse surface may be mapped to acceleration/deceleration. User interaction with a second portion of the multi-touch mouse surface may be mapped to a view change of the pilot. User interaction with a third portion of the multi-touch mouse surface may be mapped to the flight path direction. It may be appreciated that the user interaction with the one or more portions of the multi-touch mouse surface may be detected by one or more spatial sensors within the multi-touch mouse. It may be appreciated that a portion (region) of the multi-touch mouse may be mapped to more than one in-game task. For example, an upper left corner portion of the multi-touch mouse may be mapped to a zoom aim in-game task and a shoot in-game task.
  • In another example, a first portion may be a first region of the multi-touch mouse associated with a first in-game task. A second portion may be a second region of the multi-touch mouse associated with a second in-game task. It may be appreciated that the first region and the second region may or may not overlap on the surface of the multi-touch mouse. In this way, the first in-game task and second in-game task may be invoked based upon user interaction with the overlap (and/or degree thereof) of the first and second regions.
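  • A minimal sketch of such region-to-task mapping, including overlapping regions, is given below; the coordinate convention, region names, and task names are assumptions used only to make the idea concrete.

```python
# Hypothetical sketch: portions (regions) of the multi-touch mouse surface mapped
# to in-game tasks, where overlapping regions each contribute their tasks.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Region:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    tasks: Tuple[str, ...]  # a single region may map to more than one in-game task

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

REGIONS = [
    Region("upper_left", 0.0, 0.0, 0.4, 0.4, ("zoom_aim", "shoot")),
    Region("right_half", 0.5, 0.0, 1.0, 1.0, ("flight_path_direction",)),
    Region("top_strip",  0.0, 0.0, 1.0, 0.2, ("pilot_view_change",)),  # overlaps upper_left
]

def tasks_for_touch(x: float, y: float) -> List[str]:
    """Every region containing the touch point contributes its mapped tasks."""
    hits: List[str] = []
    for region in REGIONS:
        if region.contains(x, y):
            hits.extend(region.tasks)
    return hits

print(tasks_for_touch(0.1, 0.1))  # falls inside both 'upper_left' and 'top_strip'
```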
  • At 106, a first in-game task may be performed based upon the first user interaction and a second in-game task may be performed based upon the second user interaction. The in-game tasks may be performed based upon the user interactions mapping to the in-game tasks. It may be appreciated that the first in-game task and the second in-game task may be performed concurrently. At 108, the method ends.
  • FIG. 2 illustrates an example of a system 200 configured for performing in-game tasks 212 based upon user input 204 on a multi-touch mouse 202. The system 200 may comprise a task component 210 and/or a mapping component 206. The mapping component 206 may be configured to maintain a mapping 208 of one or more user interactions on a multi-touch mouse 202 to one or more in-game tasks of a computer video game 214. For example, a grip user interaction may be mapped to a camera view roll in-game task, a swipe user interaction may be mapped to a movement in-game task, a swipe length may be mapped to a movement speed in-game task, a multi-touch mouse location may be mapped to a view in-game task, etc. The mapping 208 may be maintained based upon user preference defined by one or more user settings. It may be appreciated that a spatial sensor of the multi-touch mouse 202 may be configured to detect one or more of the user interactions mapped within the mapping 208.
  • It may be appreciated that the mapping component 206 may be configured to maintain the mapping 208 as comprising mappings of one or more portions (regions) of the multi-touch mouse surface to one or more in-game tasks. That is, user interaction with a first portion of the multi-touch mouse surface may be mapped to a first in-game task and user interaction with a second portion of the multi-touch mouse surface may be mapped to a second in-game task. It may be appreciated that a spatial sensor of the multi-touch mouse 202 may be configured to detect user interaction with one or more of the portions (regions) of the multi-touch mouse surface. It may be appreciated that the mapping component 206 may map two or more in-game tasks to a single user interaction. It may be appreciated that the mapping component 206 may map a single in-game task to two or more user interactions.
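  • The many-to-many character of the mapping (one interaction mapped to several tasks, one task mapped to several interactions) and its dependence on user settings can be sketched as follows; the class and method names are hypothetical and not part of the disclosure.

```python
# Hypothetical sketch of a mapping component that maintains a user-configurable,
# many-to-many mapping between user interactions and in-game tasks.
from collections import defaultdict
from typing import Dict, List, Set

class MappingComponent:
    def __init__(self) -> None:
        self._interaction_to_tasks: Dict[str, Set[str]] = defaultdict(set)

    def load_user_settings(self, settings: Dict[str, List[str]]) -> None:
        """User preference: interaction name -> list of in-game task names."""
        for interaction, tasks in settings.items():
            self._interaction_to_tasks[interaction].update(tasks)

    def tasks_for(self, interaction: str) -> Set[str]:
        return set(self._interaction_to_tasks.get(interaction, set()))

mapping = MappingComponent()
mapping.load_user_settings({
    "grip":                  ["camera_view_roll"],
    "swipe":                 ["movement"],
    "swipe_length":          ["movement_speed"],
    "mouse_position_change": ["view_change"],
    "double_tap":            ["movement", "shoot"],  # one interaction, two in-game tasks
})
print(mapping.tasks_for("double_tap"))
```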
  • The multi-touch mouse 202 may comprise a first spatial sensor, a second spatial sensor, a button, and/or other spatial sensors. In one example, the first spatial sensor and the second spatial sensor may be in an overlaid configuration within the multi-touch mouse 202.
  • The task component 210 may be configured to receive user input 204 comprising a first user interaction detected by a first spatial sensor of the multi-touch mouse 202 and a second user interaction detected by a second spatial sensor of the multi-touch mouse 202. In one example, the task component 210 may be configured to receive the user input 204 comprising three or more user interactions (e.g., a third user interaction detected by the second spatial sensor, a fourth user interaction with a button of the multi-touch mouse 202) within the user input 204. The task component 210 may be configured to perform in-game tasks 212 within the computer video game 214 based upon the user interactions within the user input 204 and the mapping 208. For example, the task component 210 may be configured to perform a first in-game task corresponding to the first user interaction and a second in-game task corresponding to the second user interaction based upon the mapping 208 (e.g., the first user interaction is mapped to the first in-game task within the mapping 208). It may be appreciated that the task component 210 may be configured to perform one or more in-game tasks concurrently.
  • In one example, the task component 210 may receive the user input 204 comprising a first user interaction corresponding to a multi-touch mouse position change and a second user interaction corresponding to a swipe gesture on the surface of the multi-touch mouse 202. The mapping 208 may comprise a mapping of a multi-touch mouse position change to a character selection in-game task and a mapping of a swipe gesture to a character movement to destination in-game task. The task component 210 may perform the character selection in-game task of one or more characters within a strategy game (the computer video game 214) based upon the multi-touch mouse position change (user interaction) moving a cursor over the one or more characters within the strategy game. That is, a video game player may move the multi-touch mouse in such a way that a corresponding cursor within the strategy game hovers over one or more characters within the strategy game. In this way, the one or more characters hovered over by the cursor may be selected. The task component 210 may perform the character movement to destination in-game task of the one or more selected characters based upon the swipe gesture (user interaction). It may be appreciated that the task component 210 may perform the character selection in-game task and the character movement to destination in-game task concurrently.
  • FIG. 3A illustrates an example of a multi-touch mouse 300. The multi-touch mouse surface may be divided into one or more portions (regions), which may be individually mapped to in-game tasks. For example, the multi-touch mouse 300 may comprise a first portion 302, a second portion 304, a third portion 306, a fourth portion 308, a fifth portion 310, and/or other portions. In one example, a first spatial sensor (not illustrated) within the multi-touch mouse 300 may be configured to detect a change in position of the multi-touch mouse (e.g., an optical eye located at the bottom of the multi-touch mouse 300). A second spatial sensor within the multi-touch mouse 300 may be configured to detect user interaction within the various portions of the multi-touch mouse surface. For example, the second spatial sensor may be configured to detect a swipe gesture within the second portion 304. The second spatial sensor may be configured to detect a first finger to second finger distance based upon the first portion 302 and the third portion 306. The second spatial sensor may be configured to detect a grip based upon user interaction within the first portion 302, the second portion 304, the third portion 306, the fourth portion 308, and the fifth portion 310. It may be appreciated that the multi-touch mouse 300 may comprise a single surface that is not divided into multiple portions.
  • FIG. 3B illustrates an example of a multi-touch mouse 320. The multi-touch mouse 320 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 320). The multi-touch mouse 320 may comprise a second spatial sensor overlaying the first spatial sensor. The second spatial sensor may comprise a light source 322, a camera 324, a mirror 326, and/or other components (e.g., optical components). The second spatial sensor may be configured to detect user interaction on the surface 328 of the multi-touch mouse 320. For example, the second spatial sensor may be configured to detect user interaction by a first finger 330 and user interaction by a second finger 332.
  • FIG. 3C illustrates an example of a multi-touch mouse 340. The multi-touch mouse 340 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 340). The multi-touch mouse 340 may comprise a second spatial sensor 344 and a third spatial sensor 346. It may be appreciated that the second spatial sensor 344 and the third spatial sensor 346 may be configured to operate as a single spatial sensor or two separate spatial sensors. The second spatial sensor 344 and the third spatial sensor 346 may be configured to detect user interaction with the multi-touch mouse 340.
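  • The gestures named above, such as a first finger to second finger distance or a grip, might be derived from the contact points reported by such a touch-sensing layer roughly as sketched below; the contact format and thresholds are assumptions, not details of the disclosed sensors.

```python
# Hypothetical sketch: deriving a finger-to-finger distance and a grip from the
# contact points reported by the touch-sensing (second) spatial sensor.
import math
from typing import List, Optional, Tuple

Contact = Tuple[float, float]  # (x, y) in normalized surface coordinates

def finger_distance(contacts: List[Contact]) -> Optional[float]:
    """Distance between the first two contacts (e.g. mapped to a zoom in-game task)."""
    if len(contacts) < 2:
        return None
    (x1, y1), (x2, y2) = contacts[0], contacts[1]
    return math.hypot(x2 - x1, y2 - y1)

def is_grip(contacts: List[Contact], min_contacts: int = 4) -> bool:
    """Crude grip heuristic: several simultaneous contacts spread across the surface."""
    if len(contacts) < min_contacts:
        return False
    xs = [x for x, _ in contacts]
    return max(xs) - min(xs) > 0.5  # contacts span more than half the surface width

touches = [(0.2, 0.3), (0.6, 0.35), (0.1, 0.7), (0.8, 0.75)]
print(finger_distance(touches), is_grip(touches))
```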
  • FIG. 4 illustrates an example of a multi-touch mouse 400 configured to generate user input based upon detected user interaction. For example, the multi-touch mouse 400 may comprise a first spatial sensor (not illustrated) configured to detect the movement of the multi-touch mouse on a surface, such as a mouse pad (e.g., an optical eye located at the bottom of the multi-touch mouse 400). The multi-touch mouse 400 may comprise a second spatial sensor configured to detect user interaction on a multi-touch mouse surface 402 of the multi-touch mouse 400. In one example, the multi-touch mouse surface 402 may be divided into multiple portions (e.g., a first portion 404, a second portion 408, a third portion 410, a fourth portion 412, a fifth portion 416, a sixth portion 418, a seventh portion 420, etc.). Respective portions of the multi-touch mouse surface 402 may be mapped to one or more in-game tasks of a computer video game. In another example, the multi-touch mouse surface 402 may be a single surface that is not divided into multiple portions. It may be appreciated that user interaction with more than one portion may be detected concurrently by a spatial sensor. It may be appreciated that a single portion may be mapped to more than one in-game task.
  • In one example, the second spatial sensor may be configured to detect user interaction with the first portion 404 of the surface of the multi-touch mouse surface 402. The user interaction with the first portion 404 may be mapped to a view change in-game task of a computer video game. For example, a user may roll or swipe their wrist across the first portion 404 to change their view within the computer video game (e.g., rotate the view as though the character in the computer video game was moving his or her head).
  • The second spatial sensor may be configured to detect user interaction with the second portion 408 of the surface of the multi-touch mouse surface 402. The user interaction with the second portion 408 may be mapped to a camera view roll in-game task of the computer video game. For example, a user may roll their palm across the second portion 408 to tilt the view as though the character within the computer video game tilted his or her head.
  • The second spatial sensor may be configured to detect user interaction with the third portion 410 of the surface of the multi-touch mouse surface 402. The user interaction with the third portion 410 may be mapped to a weapon fire in-game task of the computer video game. For example, a user may press a finger over the third portion 410 to fire a weapon within the computer video game.
  • The second spatial sensor may be configured to detect user interaction with the fourth portion 412 of the surface of the multi-touch mouse surface 402. The user interaction with the fourth portion 412 may be mapped to an in-game navigation control task of the computer video game. For example, a user may swipe a finger over the fourth portion to move a character within the computer video game in a direction of the swipe gesture. In particular, a swipe gesture length of the swipe gesture may translate into the speed at which the character moves and a swipe gesture direction of the swipe gesture may translate into the direction in which the character moves.
  • The second spatial sensor may be configured to detect user interaction between one or more fingers. That is, a first finger to second finger user interaction 414 may be detected based upon a distance between a first finger and a second finger on the multi-touch mouse surface 402. For example, the first finger to second finger user interaction 414 may be mapped to a zoom in/out in-game task.
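  • A minimal sketch of these translations (swipe gesture length to movement speed, swipe gesture direction to movement direction, and finger distance to a zoom level) follows; the scale factors and normalized value ranges are illustrative assumptions.

```python
# Hypothetical sketch of the translations described above; scale factors and
# normalized coordinate ranges are assumptions for illustration only.
import math
from typing import Tuple

def swipe_to_movement(start: Tuple[float, float], end: Tuple[float, float],
                      max_speed: float = 10.0) -> Tuple[float, float]:
    """Return (speed, heading in radians) for a character movement in-game task."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)   # swipe gesture length -> movement speed
    heading = math.atan2(dy, dx)  # swipe gesture direction -> movement direction
    return min(length * max_speed, max_speed), heading

def finger_distance_to_zoom(distance: float, min_zoom: float = 1.0,
                            max_zoom: float = 4.0) -> float:
    """Map a normalized first finger to second finger distance onto a zoom factor."""
    distance = max(0.0, min(1.0, distance))
    return min_zoom + distance * (max_zoom - min_zoom)

print(swipe_to_movement((0.2, 0.5), (0.7, 0.2)))
print(finger_distance_to_zoom(0.42))
```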
  • FIG. 5 illustrates an example of performing in-game tasks in a first person shooter computer video game 510 based upon user input on a multi-touch mouse. The multi-touch mouse may comprise one or more spatial sensors. For example, a first spatial sensor may be configured to detect positional changes of the multi-touch mouse. A second spatial sensor may be configured to detect gestures on the multi-touch mouse surface 502. In one example, a mapping component may be configured to maintain a mapping of user interactions with in-game tasks. In another example, the mapping component may be configured to maintain a mapping of portions (regions) of the multi-touch mouse surface 502 with in-game tasks. A task component may be configured to receive user input comprising user interactions detected by the spatial sensors of the multi-touch mouse. The task component may be configured to perform in-game tasks based upon the user input and the mappings maintained by the mapping component.
  • In one example, the mapping component may maintain a mapping that maps a swipe gesture user interaction 508 with a character movement in-game task, a finger press user interaction 506 with a fire weapon in-game task, and a multi-touch mouse position change user interaction 504 with a character view change. A first spatial sensor may be configured to detect the multi-touch mouse position change user interaction 504 and/or other user interactions. A second spatial sensor may be configured to detect the swipe gesture user interaction 508, the finger press user interaction 506, and/or other user interactions. The task component may be configured to receive user input from the multi-touch mouse. The user input may comprise one or more detected user interactions. The task component may invoke one or more in-game tasks based upon the in-game tasks being mapped to the user interactions within the mapping.
  • FIG. 6 illustrates an example of performing in-game tasks within a first person shooter computer video game 610 based upon user input on a multi-touch mouse. The multi-touch mouse may comprise one or more spatial sensors configured to detect user interaction with the multi-touch mouse (e.g., a gesture on the multi-touch mouse surface 602). In one example, a mapping component may maintain a mapping that maps a swipe gesture user interaction 608 with a character movement in-game task, a first finger to second finger distance user interaction 612 with a character view zoom in-game task, and a multi-touch mouse position change user interaction 604 with a character view change. A task component may be configured to receive user input from the multi-touch mouse. The user input may comprise one or more detected user interactions. The task component may invoke one or more in-game tasks based upon the in-game tasks being mapped to the user interactions within the mapping.
  • FIG. 7 illustrates an example of performing in-game tasks within a strategy computer video game 710 based upon user input on a multi-touch mouse. The multi-touch mouse may comprise one or more spatial sensors configured to detect user interaction with the multi-touch mouse (e.g., a gesture on the multi-touch mouse surface 706). In one example, a mapping component may maintain a mapping that maps a swipe gesture user interaction 704 with a character selection in-game task 712 and a multi-touch mouse position change user interaction 702 with a character movement to destination in-game task 714. A task component may be configured to receive user input from the multi-touch mouse. The user input may comprise one or more detected user interactions.
  • For example, a user may perform a circle gesture on the multi-touch mouse surface 706 using a finger, while moving the position of the multi-touch mouse to the right. A first spatial sensor within the multi-touch mouse may detect the position change of the multi-touch mouse as the multi-touch mouse position change user interaction 702. A second spatial sensor within the multi-touch mouse may detect the circle gesture as the swipe gesture user interaction 704. The task component may receive user input of the two user interactions. In response, a cursor 708 may be moved in a circular motion around a set of characters within the strategy video game 710. In this way, the task component may invoke the character selection in-game task 712 of the characters encompassed within the circular motion of the cursor 708. The task component may also invoke the character movement to destination in-game task 714 to move the selected characters.
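  • The lasso-style selection in this example can be sketched as a point-in-polygon test over the cursor path traced by the circle gesture; the character positions, path coordinates, and printed task names below are hypothetical.

```python
# Hypothetical sketch: the circle gesture's cursor path is treated as a closed
# polygon, and characters whose positions fall inside it are selected before the
# movement-to-destination task is issued.
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(point: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: is the point inside the closed gesture path?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

gesture_path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]  # cursor path from the circle gesture
characters = {"infantry_1": (0.4, 0.5), "infantry_2": (1.6, 0.3)}

selected = [name for name, pos in characters.items() if point_in_polygon(pos, gesture_path)]
print("character selection in-game task:", selected)
print("character movement to destination in-game task for:", selected)
```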
  • Still another embodiment involves a computer-readable medium comprising processor-executable instructions configured to implement one or more of the techniques presented herein. An exemplary computer-readable medium that may be devised in these ways is illustrated in FIG. 8, wherein the implementation 800 comprises a computer-readable medium 816 (e.g., a CD-R, DVD-R, or a platter of a hard disk drive), on which is encoded computer-readable data 814. This computer-readable data 814 in turn comprises a set of computer instructions 812 configured to operate according to one or more of the principles set forth herein. In one such embodiment 800, the processor-executable instructions 812 may be configured to perform a method 810, such as the exemplary method 100 of FIG. 1, for example. That is, the processor-executable instructions 812 may implement the exemplary method 100 which may be executed via one or more processors. In another such embodiment, the processor-executable instructions 812 may be configured to implement a system, such as the exemplary system 200 of FIG. 2. Many such computer-readable media may be devised by those of ordinary skill in the art that are configured to operate in accordance with the techniques presented herein.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
  • As used in this application, the terms “component,” “module,” “system”, “interface”, and the like are generally intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
  • Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
  • FIG. 9 and the following discussion provide a brief, general description of a suitable computing environment to implement embodiments of one or more of the provisions set forth herein. The operating environment of FIG. 9 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Example computing devices include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices (such as mobile phones, Personal Digital Assistants (PDAs), media players, and the like), multiprocessor systems, consumer electronics, mini computers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • Although not required, embodiments are described in the general context of “computer readable instructions” being executed by one or more computing devices. Computer readable instructions may be distributed via computer readable media (discussed below). Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer readable instructions may be combined or distributed as desired in various environments.
  • FIG. 9 illustrates an example of a system 910 comprising a computing device 912 configured to implement one or more embodiments provided herein. In one configuration, computing device 912 includes at least one processing unit 916 and memory 918. Depending on the exact configuration and type of computing device, memory 918 may be volatile (such as RAM, for example), non-volatile (such as ROM, flash memory, etc., for example) or some combination of the two. This configuration is illustrated in FIG. 9 by dashed line 914.
  • In other embodiments, device 912 may include additional features and/or functionality. For example, device 912 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic storage, optical storage, and the like. Such additional storage is illustrated in FIG. 9 by storage 920. In one embodiment, computer readable instructions to implement one or more embodiments provided herein may be in storage 920. Storage 920 may also store other computer readable instructions to implement an operating system, an application program, and the like. Computer readable instructions may be loaded in memory 918 for execution by processing unit 916, for example.
  • The term “computer readable media” as used herein includes computer storage media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions or other data. Memory 918 and storage 920 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 912. Any such computer storage media may be part of device 912.
  • Device 912 may also include communication connection(s) 926 that allows device 912 to communicate with other devices. Communication connection(s) 926 may include, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection, or other interfaces for connecting computing device 912 to other computing devices. Communication connection(s) 926 may include a wired connection or a wireless connection. Communication connection(s) 926 may transmit and/or receive communication media.
  • The term “computer readable media” may include communication media. Communication media typically embodies computer readable instructions or other data in a “modulated data signal” such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • Device 912 may include input device(s) 924 such as keyboard, mouse, pen, voice input device, touch input device, infrared cameras, video input devices, and/or any other input device. Output device(s) 922 such as one or more displays, speakers, printers, and/or any other output device may also be included in device 912. Input device(s) 924 and output device(s) 922 may be connected to device 912 via a wired connection, wireless connection, or any combination thereof. In one embodiment, an input device or an output device from another computing device may be used as input device(s) 924 or output device(s) 922 for computing device 912.
  • Components of computing device 912 may be connected by various interconnects, such as a bus. Such interconnects may include a Peripheral Component Interconnect (PCI), such as PCI Express, a Universal Serial Bus (USB), firewire (IEEE 1394), an optical bus structure, and the like. In another embodiment, components of computing device 912 may be interconnected by a network. For example, memory 918 may be comprised of multiple physical memory units located in different physical locations interconnected by a network.
  • Those skilled in the art will realize that storage devices utilized to store computer readable instructions may be distributed across a network. For example, a computing device 930 accessible via a network 928 may store computer readable instructions to implement one or more embodiments provided herein. Computing device 912 may access computing device 930 and download a part or all of the computer readable instructions for execution. Alternatively, computing device 912 may download pieces of the computer readable instructions, as needed, or some instructions may be executed at computing device 912 and some at computing device 930.
  • Various operations of embodiments are provided herein. In one embodiment, one or more of the operations described may constitute computer readable instructions stored on one or more computer readable media, which if executed by a computing device, will cause the computing device to perform the operations described. The order in which some or all of the operations are described should not be construed as to imply that these operations are necessarily order dependent. Alternative ordering will be appreciated by one skilled in the art having the benefit of this description. Further, it will be understood that not all operations are necessarily present in each embodiment provided herein.
  • Moreover, the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims may generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
  • Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure. In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes”, “having”, “has”, “with”, or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising”.

Claims (19)

1. A method for performing in-game tasks based upon user input on a multi-touch mouse comprising:
receiving user input comprising a first user interaction with a first spatial sensor of a multi-touch mouse and a second user interaction with a second spatial sensor of the multi-touch mouse; and
performing a first in-game task based upon the first user interaction and a second in-game task based upon the second user interaction.
2. The method of claim 1, the first spatial sensor overlaid the second spatial sensor within the multi-touch mouse.
3. The method of claim 1, the user input comprising a third user interaction with the second spatial sensor; and the performing comprising performing a third in-game task based upon the third user interaction.
4. The method of claim 3, the first user interaction comprising a multi-touch mouse position change; the second user interaction comprising a swipe gesture on the surface of the multi-touch mouse; and the third user interaction comprising a grip of the multi-touch mouse.
5. The method of claim 1, the first user interaction comprising a multi-touch mouse position change; and the second user interaction comprising a swipe gesture on the surface of the multi-touch mouse, a swipe gesture length, and swipe gesture direction.
6. The method of claim 5, comprising:
performing a view control in-game task based upon mapping the multi-touch mouse position change with a view change in-game task; and
performing an in-game navigation control task based upon mapping the swipe gesture with movement of a character; mapping the swipe gesture length with a speed of the movement; and mapping the swipe gesture direction with a movement direction.
7. The method of claim 1, the second user interaction comprising a first finger to second finger distance on the multi-touch mouse surface.
8. The method of claim 1, the second user interaction comprising a change of grip of the multi-touch mouse; and the performing the second in-game task comprising mapping the change of grip with a camera view roll in-game task.
9. The method of claim 1, the first user interaction and the second user interaction received concurrently within the user input, and the first in-game task and the second in-game task performed concurrently.
10. The method of claim 3, comprising:
detecting the second user interaction within a first portion of the multi-touch mouse surface mapped to the first in-game task; and
detecting the third user interaction within a second portion of the multi-touch mouse surface mapped to the third in-game task.
11. A system for performing in-game tasks based upon user input on a multi-touch mouse, comprising:
a mapping component configured to:
maintain a mapping of one or more user interactions on a multi-touch mouse to one or more in-game tasks; and
a task component configured to:
receive user input comprising a first user interaction with a first spatial sensor of a multi-touch mouse and a second user interaction with a second spatial sensor of the multi-touch mouse; and
perform a first in-game task corresponding to the first user interaction and a second in-game task corresponding to the second user interaction based upon the mapping.
13. The system of claim 11, the mapping component configured to:
map a first portion of the multi-touch mouse surface to a third in-game task and a second portion of the multi-touch mouse surface to a fourth in-game task.
14. The system of claim 11, the user input comprising a third user interaction with the second spatial sensor of the multi-touch mouse.
15. The system of claim 14, the task component configured to:
perform a third in-game task corresponding to the third user interaction based upon the mapping.
16. The system of claim 11, the first spatial sensor overlaid the second spatial sensor within the multi-touch mouse.
17. The system of claim 16, the task component configured to perform the first in-game task and the second in-game task concurrently.
18. The system of claim 11, the task component configured to:
receive user input comprising the first user interaction corresponding to a multi-touch mouse position change; and the second user interaction corresponding to a swipe gesture on the surface of the multi-touch mouse;
perform a character selection in-game task of one or more characters based upon the multi-touch mouse position change corresponding to locations of one or more characters within a game; and
perform a character movement to destination in-game task of the selected characters based upon the swipe gesture.
19. The system of claim 11, the mapping component configured to:
maintain the mapping based upon user preference defined by one or more user settings.
20. A method for performing in-game tasks based upon user input on a multi-touch mouse comprising:
receiving user input comprising a first user interaction with a first spatial sensor of a multi-touch mouse, a second user interaction with a second spatial sensor of the multi-touch mouse, and a third user interaction with a button of the multi-touch mouse, the first spatial sensor, second spatial sensor, and the button in an overlaid configuration within the multi-touch mouse; and
performing a first in-game task based upon the first user interaction, a second in-game task based upon the second user interaction, and a third in-game task based upon the third user interaction concurrently.
US12/701,150 2010-02-05 2010-02-05 Multi-touch mouse in gaming applications Abandoned US20110195781A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/701,150 US20110195781A1 (en) 2010-02-05 2010-02-05 Multi-touch mouse in gaming applications

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/701,150 US20110195781A1 (en) 2010-02-05 2010-02-05 Multi-touch mouse in gaming applications

Publications (1)

Publication Number Publication Date
US20110195781A1 true US20110195781A1 (en) 2011-08-11

Family

ID=44354145

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/701,150 Abandoned US20110195781A1 (en) 2010-02-05 2010-02-05 Multi-touch mouse in gaming applications

Country Status (1)

Country Link
US (1) US20110195781A1 (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020097229A1 (en) * 2001-01-24 2002-07-25 Interlink Electronics, Inc. Game and home entertainment device remote control
US7168047B1 (en) * 2002-05-28 2007-01-23 Apple Computer, Inc. Mouse having a button-less panning and scrolling switch
US20070152966A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Mouse with optical sensing surface
US20070257891A1 (en) * 2006-05-03 2007-11-08 Esenther Alan W Method and system for emulating a mouse on a multi-touch sensitive surface
US20080179507A2 (en) * 2006-08-03 2008-07-31 Han Jefferson Multi-touch sensing through frustrated total internal reflection
US20080163130A1 (en) * 2007-01-03 2008-07-03 Apple Inc Gesture learning
US20080165132A1 (en) * 2007-01-05 2008-07-10 Microsoft Corporation Recognizing multiple input point gestures
US20090273570A1 (en) * 2008-04-30 2009-11-05 Apple Inc. Multi-touch sensor patterns and stack-ups
US20100117963A1 (en) * 2008-11-12 2010-05-13 Wayne Carl Westerman Generating Gestures Tailored to a Hand Resting on a Surface
US20100242274A1 (en) * 2009-03-30 2010-09-30 Microsoft Corporation Detecting touch on a curved surface
US20100245246A1 (en) * 2009-03-30 2010-09-30 Microsoft Corporation Detecting touch on a curved surface
US20100265178A1 (en) * 2009-04-17 2010-10-21 Microsoft Corporation Camera-based multi-touch mouse
US20110009195A1 (en) * 2009-07-08 2011-01-13 Gunjan Porwal Configurable representation of a virtual button on a game controller touch screen
US20110039602A1 (en) * 2009-08-13 2011-02-17 Mcnamara Justin Methods And Systems For Interacting With Content On A Mobile Device
US20110080341A1 (en) * 2009-10-01 2011-04-07 Microsoft Corporation Indirect Multi-Touch Interaction
US20110109552A1 (en) * 2009-11-09 2011-05-12 Primax Electronics Ltd. Multi-touch multi-dimensional mouse
US20110227947A1 (en) * 2010-03-16 2011-09-22 Microsoft Corporation Multi-Touch User Interface Interaction

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120038582A1 (en) * 2010-08-13 2012-02-16 Immersion Corporation Systems and Methods for Providing Haptic Feedback to Touch-Sensitive Input Devices
US8576171B2 (en) * 2010-08-13 2013-11-05 Immersion Corporation Systems and methods for providing haptic feedback to touch-sensitive input devices
US9134797B2 (en) 2010-08-13 2015-09-15 Immersion Corporation Systems and methods for providing haptic feedback to touch-sensitive input devices
US9367146B2 2011-11-14 2016-06-14 Logitech Europe S.A. Input device with multiple touch-sensitive zones
US9489061B2 (en) 2011-11-14 2016-11-08 Logitech Europe S.A. Method and system for power conservation in a multi-zone input device
US20170075565A1 (en) * 2012-07-18 2017-03-16 Sony Corporation Mobile client device, operation method, recording medium, and operation system
US10007424B2 (en) * 2012-07-18 2018-06-26 Sony Mobile Communications Inc. Mobile client device, operation method, recording medium, and operation system
CN102999175A (en) * 2012-08-29 2013-03-27 漳州宝发光电科技有限公司 Structure of three-dimensional (3D) touch mouse
US20190114740A1 (en) * 2016-04-25 2019-04-18 Panasonic Intellectual Property Management Co., Ltd. Image processing device, imaging system provided therewith, and calibration method
US10872395B2 (en) * 2016-04-25 2020-12-22 Panasonic Intellectual Property Management Co., Ltd. Image processing device, imaging system provided therewith, and calibration method
US20210073942A1 (en) * 2016-04-25 2021-03-11 Panasonic Intellectual Property Management Co., Ltd. Image processing device, imaging system provided therewith, and calibration method
US11209920B2 (en) 2017-10-04 2021-12-28 Hewlett-Packard Development Company, L.P. User interfaces with strike sensors
US20200147481A1 (en) * 2018-11-13 2020-05-14 Steelseries Aps Method and apparatus for enhancing accuracy associated with a gaming accessory
US10888769B2 (en) * 2018-11-13 2021-01-12 Steelseries Aps Method and apparatus for enhancing accuracy associated with a gaming accessory in accordance with a distance of the gaming accessory relative to a surface
US20200192486A1 (en) * 2018-12-18 2020-06-18 Samsung Electronics Co., Ltd. System and method for multipurpose input device for two-dimensional and three-dimensional environments
US10890982B2 (en) * 2018-12-18 2021-01-12 Samsung Electronics Co., Ltd. System and method for multipurpose input device for two-dimensional and three-dimensional environments
CN113426096A (en) * 2021-07-22 2021-09-24 网易(杭州)网络有限公司 Method and device for switching props in game, electronic equipment and storage medium
CN113975803A (en) * 2021-10-28 2022-01-28 腾讯科技(深圳)有限公司 Control method and device of virtual role, storage medium and electronic equipment
WO2024057306A1 (en) * 2022-09-14 2024-03-21 Tactile World Ltd Systems and methods for alleviation of one or more psychiatric disorders

Similar Documents

Publication Publication Date Title
US20110195781A1 (en) Multi-touch mouse in gaming applications
US20210263593A1 (en) Hand gesture input for wearable system
KR101956410B1 (en) Game controller on mobile touch-enabled devices
CN105339884B (en) The classification of user's input
US20150199030A1 (en) Hover-Sensitive Control Of Secondary Display
Teather et al. Position vs. velocity control for tilt-based interaction
KR20160120760A (en) Advanced game mechanics on hover-sensitive devices
KR102260409B1 (en) Method and apparatus for interfacing of game
JP7244249B2 (en) Game program and information processing device
JP2023076611A (en) Game program and information processing device
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
KR102557808B1 (en) Gaming service system and method for sharing memo therein
JP7256627B2 (en) game program
KR102609293B1 (en) Apparatus and method for determining game action
KR102614708B1 (en) Method for selecting target object and gaming device for executint the method
Tsuchida et al. TetraForce: a magnetic-based interface enabling pressure force and shear force input applied to front and back of a smartphone
KR102540798B1 (en) Method for providing user interface and mobile terminal
KR102369251B1 (en) Method for providing user interface and terminal for executing the same
US11908097B2 (en) Information processing system, program, and information processing method
KR102369256B1 (en) Method for providing user interface and terminal for executing the same
KR102185576B1 (en) Smart controler, apparatus for controling user terminal, and method for controling user terminal
WO2018154327A1 (en) Computer interface system and method
Kurabayashi Kinetics: A mathematical model for an on-screen gamepad controllable by finger-tilting
JP6919050B1 (en) Game system, program and information processing method
KR102345395B1 (en) Appartus and method for providing user interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, BILLY;BENKO, HRVOJE;OFEK, EYAL;AND OTHERS;SIGNING DATES FROM 20100129 TO 20100202;REEL/FRAME:023927/0794

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION