US9235969B2 - System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment - Google Patents

System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Info

Publication number
US9235969B2
US9235969B2 (application US13/924,084 / US201313924084A)
Authority
US
United States
Prior art keywords
user
haptic
stimulation
motion
body part
Prior art date
Legal status
Active, expires
Application number
US13/924,084
Other versions
US20140009273A1
Inventor
Danny A. Grant
Robert W. Heubel
David M. Birnbaum
Erin B. Ramsay
Current Assignee
Immersion Corp
Original Assignee
Immersion Corp
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (https://patents.darts-ip.com/?family=42931900). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Immersion Corp
Priority to US13/924,084
Assigned to IMMERSION CORPORATION. Assignment of assignors interest (see document for details). Assignors: BIRNBAUM, DAVID M., RAMSAY, ERIN B., GRANT, DANNY A., HEUBEL, ROBERT W.
Publication of US20140009273A1
Priority to US14/955,694 (published as US9671866B2)
Application granted
Publication of US9235969B2
Priority to US15/600,522 (published as US10139911B2)
Priority to US16/178,711 (published as US20190073037A1)
Status: Active
Adjusted expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/10
    • A63F13/12
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285Generating tactile feedback signals via the game input device, e.g. force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • A63F13/48Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00Tactile signalling systems, e.g. personal calling systems
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/575Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player for trading virtual items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/013Force feedback applied to a game
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the invention relates to a system and method of providing haptic stimulation to a user during performance of a complex control gesture, and/or during the control of virtual equipment.
  • Haptic stimulation provides a physical sensation to users.
  • Haptic stimulation is used in the context of games, virtual worlds, and real world control systems.
  • Such haptic stimulation may be generated to provide feedback to users that a control input has been received, that another user has input a command, that virtual or real objects have collided, exploded, or imploded, that an ambient force is present (e.g., simulated or real wind, rain, magnetism, and/or other virtual forces), and/or that other phenomena have occurred.
  • the parameters of such stimulation are typically static and provide a simple mechanism for instructing a user that a corresponding phenomenon has occurred (or will occur).
  • One aspect of the invention relates to a system configured to provide haptic stimulation to a user of a game.
  • the system comprises a user interface, an actuator, and one or more processors configured to execute computer program modules.
  • the user interface is configured to generate output signals related to the gestures of a user.
  • the actuator is configured to generate haptic stimulation to the user.
  • the computer program modules comprise a gesture module, a stimulation module and an actuator control module.
  • the gesture module is configured to monitor performance of a control gesture by the user based on the output signals of the user interface.
  • the control gesture is a gesture associated with a command input to the game, and includes an initial portion, a first intermediate portion, and an ending portion.
  • the stimulation module is configured to receive information related to performance of the control gesture from the gesture module, and to determine haptic stimulation to be generated for the user associated with the control gesture.
  • the haptic stimulation includes a first stimulation determined responsive to performance of the initial portion of the control gesture, and a second stimulation that is different from the first stimulation and is determined responsive to performance of the first intermediate portion of the control gesture.
  • the actuator control module is configured to control the actuator to generate the stimulation determined by the stimulation module.
  • the method comprises monitoring performance of a control gesture by a user, wherein the control gesture is a gesture associated with a command input to the game, and includes an initial portion, a first intermediate portion, and an ending portion; determining haptic stimulation associated with performance of the control gesture to be generated for the user, wherein the haptic stimulation includes a first stimulation determined responsive to performance of the initial portion of the control gesture, and a second stimulation that is different from the first stimulation and is determined responsive to performance of the first intermediate portion of the control gesture; and generating the determined stimulation during performance of the control gesture.
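  • As an illustration only (not the patented implementation, and with hypothetical names and parameter values), the arrangement described above can be sketched in a few lines of Python: the gesture module reports which portion of the control gesture is being performed, the stimulation module selects a different stimulation for the initial, intermediate, and ending portions, and the actuator control module drives the actuator with the selected parameters.

      # Illustrative sketch only: a control gesture with initial, intermediate,
      # and ending portions, each mapped to a different haptic stimulation.
      # All names and parameter values are hypothetical.
      from dataclasses import dataclass

      @dataclass
      class Stimulation:
          force: float       # normalized 0..1
          period_ms: int     # periodicity of the vibration pattern

      # Stand-in for the stimulation module: a different stimulation per portion.
      STIMULATION_BY_PORTION = {
          "initial":      Stimulation(force=0.3, period_ms=120),
          "intermediate": Stimulation(force=0.6, period_ms=80),
          "ending":       Stimulation(force=1.0, period_ms=40),
      }

      def actuator_control(stimulation: Stimulation) -> None:
          """Stand-in for the actuator control module: would drive a real actuator."""
          print(f"drive actuator: force={stimulation.force}, period={stimulation.period_ms} ms")

      def on_gesture_portion(portion: str) -> None:
          """Called as the gesture module reports progress through the control gesture."""
          actuator_control(STIMULATION_BY_PORTION[portion])

      for portion in ("initial", "intermediate", "ending"):
          on_gesture_portion(portion)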
  • the system comprises a touch sensitive electronic display, an actuator, and one or more processors configured to execute computer program modules.
  • the touch sensitive electronic display has an interface surface that is accessible for engagement by the user, and the touch sensitive user interface is configured to generate output signals related to the position at which the interface surface is engaged, and to present views of the game to the user through the interface surface.
  • the views presented through the interface surface include views of virtual equipment having user selectable sections that are selectable by the user to interact with the virtual equipment by engaging the interface surface at the user selectable sections of the views of the virtual equipment.
  • the actuator is configured to generate haptic stimulation to the user.
  • the computer program modules comprise an equipment module, a stimulation module, and an actuator control module.
  • the equipment module is configured to determine the operating parameters of the virtual equipment in the views, and to simulate operation of the virtual equipment.
  • the equipment module determines the operating parameters of the virtual equipment and/or simulates operation of the virtual equipment based on selections by the user of the user selectable sections of the views of the virtual equipment.
  • the stimulation module is configured to determine haptic stimulation to be generated for the user associated with the operating parameters of the virtual equipment and/or simulated operation of the virtual equipment.
  • the actuator control module is configured to control the actuator to generate the stimulation determined by the stimulation module.
  • Still another aspect of the invention relates to a method of providing stimulation to a user of a game.
  • the method comprises presenting views of a game through an interface surface of a touch sensitive electronic display that is accessible for engagement by a user, wherein the views presented through the interface surface include views of virtual equipment having user selectable sections that are selectable by the user to interact with the virtual equipment by engaging the interface surface at the user selectable sections of the views of the virtual equipment; receiving selection of one of the user selectable sections via engagement by the user of the selected user selectable section on the interface surface; determining the operating parameters of the virtual equipment in the views and/or simulating operation of the virtual equipment based on the received selection; determining, responsive to the received selection, haptic stimulation to be generated for the user associated with the operating parameters of the virtual equipment and/or simulated operation of the virtual equipment; and generating the determined haptic stimulation.
  • a still further aspect of the invention relates to a system and method for providing a game on one or more portable computing devices in which a virtual object (e.g., a ball) travels through views of the game displayed on the interfaces of the one or more portable computing devices.
  • Haptic effects corresponding to the travel of the virtual object (or virtual objects) through the views are provided on the individual portable computing devices.
  • the haptic effects may be determined based on one or more parameters of the travel of the virtual object (e.g., speed, direction, acceleration, etc.), one or more parameters of objects and/or features with which the virtual object interacts (e.g., walls, flippers, blockers, bumpers, etc.), and/or other parameters.
  • the haptic effects may include haptic effects to be provided on portable computing devices that are not currently displaying the virtual object corresponding to the haptic effects. This may enhance the interactivity of the game for a group of users playing the game together on separate portable computing devices.
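  • A minimal sketch, assuming a simple speed-to-magnitude mapping and an arbitrary attenuation factor, of how haptic effects for a travelling virtual object might be dispatched to several portable devices, including devices that are not currently displaying the object:

      # Hypothetical sketch: the device displaying the virtual object receives a
      # full-strength effect; other devices receive an attenuated effect so that
      # all players feel the event. The 0.3 attenuation factor is an assumption.
      def effect_magnitude(speed: float, max_speed: float = 10.0) -> float:
          """Map the virtual object's speed to a haptic magnitude in 0..1."""
          return max(0.0, min(1.0, speed / max_speed))

      def dispatch_effect(devices, displaying_device, speed):
          """Return the per-device effect magnitude for one travel event."""
          base = effect_magnitude(speed)
          return {
              device: base if device == displaying_device else 0.3 * base
              for device in devices
          }

      print(dispatch_effect(["device_a", "device_b"], "device_a", speed=7.5))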
  • FIG. 1 illustrates a system configured to provide haptic stimulation to a user, in accordance with one or more embodiments of the invention.
  • FIG. 2 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
  • FIG. 3 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
  • FIG. 4 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
  • FIG. 5 illustrates a method of providing haptic feedback to a user, according to one or more embodiments of the invention.
  • FIG. 6 illustrates a method of providing haptic feedback to a user, according to one or more embodiments of the invention.
  • FIG. 7 illustrates a portable computing device, in accordance with one or more embodiments of the invention.
  • FIG. 8A illustrates an example of the use of a game to support multi-user play, according to one or more embodiments of the invention.
  • FIG. 8B illustrates an example of the use of a game to support multi-user play, according to one or more embodiments of the invention.
  • FIG. 9 illustrates an example of play areas for two respective users over a plurality of time intervals, in accordance with one or more embodiments of the invention.
  • FIG. 10 illustrates an example of play and depicts the virtual object ricocheting off the border of a play area, according to one or more embodiments of the invention.
  • FIG. 11 illustrates a use of haptic effects to simulate a continuous effect, in accordance with one or more embodiments of the invention.
  • FIG. 12 illustrates a method for providing a game, according to one or more embodiments of the invention.
  • FIG. 13 illustrates an example of an interface for an instance of a game, in accordance with one or more embodiments of the invention.
  • FIG. 1 illustrates a system 10 configured to provide haptic stimulation to a user 12 .
  • the haptic stimulation is provided to user 12 in conjunction with the performance of one or more control gestures through which user 12 controls, for example, a game, a real world component or piece of equipment, and/or other entity.
  • the haptic stimulation is provided to user 12 such that as performance of a given control gesture continues, the haptic stimulation associated with the control gesture changes in accordance with the progression of the control gesture.
  • the haptic stimulation is provided to user 12 in conjunction with control of virtual equipment by user 12 .
  • the haptic stimulation corresponds to control inputs provided to system 10 in controlling the virtual equipment.
  • system 10 includes one or more user interfaces 14 , one or more actuators 16 , electronic storage 18 , one or more processors 20 , and/or other components.
  • haptic stimulation may be used to enhance the performance of complex control gestures by users, enhance the user experience, teach users to perform control gestures, and/or provide other enhancements over conventional real world control systems.
  • Implementations involving control of real world components or systems may include instances in which a user is present at the component being controlled (e.g., a control panel in a car, on a microwave, etc.), and/or instances in which a user is located remotely from the component being controlled.
  • Control of a remote real world component or system may be accompanied by other sensory stimulation informing the user of the state of the component being controlled.
  • Such other sensory stimulation may include, for example, real-time (or near real-time) video, audio, and/or still images provided to the user.
  • a multiple user embodiment may include the provision of haptic stimulation to “passive users” (e.g., users not performing a control gesture) related to the performance of a control gesture by an “active user” (e.g., the user performing the control gesture).
  • the haptic stimulation may further provide feedback to the passive and/or active users of other phenomena present in the multi-user environment.
  • a “control gesture” refers to a gesture made by a user that is a single and discrete control input having separate portions. The separate portions must be performed in a specific order and/or with a specific timing to effectively achieve the control input associated with the “control gesture.” Performance of the separate portions, on their own, will not result in the control input associated with the “control gesture” as a whole (e.g., a “control gesture” is not merely a combination of other gestures, each associated with its own control input).
  • a “control gesture” is an abstract gesture that does not correlate with exactness to the control input to which it corresponds.
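  • The ordering requirement described above can be illustrated with a small state machine; this is a sketch under assumed names, not the patent's algorithm. Portions observed in the required order advance the gesture; any out-of-order portion fails the control gesture as a whole.

      # Minimal sketch: a control gesture whose separate portions must be
      # performed in a specific order to achieve the associated control input.
      class ControlGestureMonitor:
          def __init__(self, portions):
              self.portions = list(portions)   # e.g. ["initial", "intermediate", "ending"]
              self.index = 0
              self.failed = False

          def observe(self, portion: str) -> str:
              if self.failed or self.index >= len(self.portions) or portion != self.portions[self.index]:
                  self.failed = True
                  return "failure"             # e.g. trigger a failure ("fizzle") stimulation
              self.index += 1
              return "complete" if self.index == len(self.portions) else "in_progress"

      monitor = ControlGestureMonitor(["initial", "intermediate", "ending"])
      print([monitor.observe(p) for p in ("initial", "intermediate", "ending")])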
  • the user interface 14 includes one or more input and/or output devices configured to communicate information to and/or receive information from user 12 .
  • the user interface 14 may include, for example, one or more content delivery devices that convey content to user 12 .
  • This content may include audio content, video content, still images, and/or other content.
  • the content delivery devices may include, for example, electronic displays (e.g., including touch-sensitive displays), audio speakers, and/or other content delivery devices.
  • the user interface 14 may include one or more control input devices configured to generate an output signal indicating input from the user to system 10 .
  • user interface 14 may include a game controller, a remote control, a keypad, a button, a switch, a keyboard, a knob, a lever, a microphone, a position detecting device (e.g., an image sensor, a pressure sensor, an optical position detector, an ultrasonic position detector, a touch sensitive surface, and/or other position detecting devices), an accelerometer, a gyroscope, a digital compass, and/or other control input devices.
  • the user interface 14 may be embodied in, or associated with, a computing device, and/or may be embodied in a control peripheral.
  • a computing device may include one or more of a desktop computer, a laptop computer, a handheld computer, a personal digital assistant, a Smartphone, a personal music player, a portable gaming console, a gaming console, and/or other computing devices.
  • user interface 14 includes a plurality of user interfaces.
  • the plurality of user interfaces 14 may be included in, carried by, and/or in contact with a single object or device.
  • the plurality of user interfaces 14 may include user interfaces included in and/or carried by a plurality of separate objects or devices.
  • the actuators 16 are configured to generate haptic stimulus for user 12 . As such, at least some of actuators 16 are in contact with the users, or in contact with objects that contact the users, during conveyance of the sensory content to the users by user interface 14 .
  • one or more of actuators 16 may be positioned in or on a floor surface supporting the users (e.g., installed in the floor, carried by a mat lying on the floor, etc.), one or more of actuators 16 may be carried by a brace or other wearable item worn by the users, one or more of the actuators 16 may be carried by objects that are carried by the users (e.g., carried by a controller), one or more of actuators 16 may be carried by furniture on which the users are seated or lying, one or more of actuators 16 may be carried by user interface 14 , and/or one or more of the actuators 16 may be carried by or disposed in or on other objects that contact the users.
  • haptic stimulus refers to tactile feedback that is applied to the users.
  • feedback may include one or more of vibrations, forces, and/or motions that are applied physically to the user by the actuators 16 and/or the objects with which both actuators 16 and the user are in contact.
  • the actuators 16 may include any device configured to generate such feedback for application to the users.
  • actuators 16 may include one or more of a piezoelectric actuator, a pneumatic actuator, a central mass actuator, an electroactive polymer actuator, an electrostatic surface actuator, macro-fiber composite actuator, and/or other actuators.
  • the actuators 16 may be configured to actuate a touch sensitive surface and/or other surfaces.
  • the surface actuated by actuators 16 may be rigid, semi-rigid, flexible, and/or deformable.
  • although actuators 16 are shown in FIG. 1 as a single entity, this is for illustrative purposes only.
  • actuators 16 include a plurality of actuators.
  • the plurality of actuators may be included in, carried by, and/or in contact with a single object or device.
  • the plurality of actuators may include actuators included in, carried by, and/or in contact with a plurality of separate objects or devices.
  • electronic storage 18 comprises electronic storage media that electronically stores information.
  • the electronic storage media of electronic storage 18 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 10 and/or removable storage that is removably connectable to system 10 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.).
  • Electronic storage 18 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
  • Electronic storage 18 may store software algorithms, information determined by processor 20 , information received via user interface 14 , and/or other information that enables system 10 to function properly.
  • Electronic storage 18 may be a separate component within system 10 , or electronic storage 18 may be provided integrally with one or more other components of system 10 (e.g., processor 20 ).
  • Processor 20 is configured to provide information processing capabilities in system 10 .
  • processor 20 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information.
  • although processor 20 is shown in FIG. 1 as a single entity, this is for illustrative purposes only.
  • processor 20 may include a plurality of processing units. These processing units may be physically located within the same device, or processor 20 may represent processing functionality of a plurality of devices operating in coordination.
  • processor 20 may be configured to execute one or more computer program modules.
  • the one or more computer program modules may include one or more of a content module 22 , a gesture module 24 , a stimulation module 26 , an equipment module 28 , an actuator control module 30 , and/or other modules.
  • Processor 20 may be configured to execute modules 22 , 24 , 26 , 28 , and/or 30 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 20 .
  • although modules 22 , 24 , 26 , 28 , and 30 are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor 20 includes multiple processing units, one or more of modules 22 , 24 , 26 , 28 , and/or 30 may be located remotely from the other modules.
  • the description of the functionality provided by the different modules 22 , 24 , 26 , 28 , and/or 30 described below is for illustrative purposes, and is not intended to be limiting, as any of modules 22 , 24 , 26 , 28 , and/or 30 may provide more or less functionality than is described.
  • one or more of modules 22 , 24 , 26 , 28 , and/or 30 may be eliminated, and some or all of its functionality may be provided by other ones of modules 22 , 24 , 26 , 28 , and/or 30 .
  • processor 20 may be configured to execute one or more additional modules that may perform some or all of the functionality attributed below to one of modules 22 , 24 , 26 , 28 , and/or 30 .
  • the content module 22 is configured to control the provision of content to user 12 via user interface 14 . If the content includes computer generated images (e.g., in a game, virtual world, simulation, etc.), content module 22 is configured to generate the images and/or views for display to user 12 through user interface 14 . If the content includes video and/or still images, content module 22 is configured to access the video and/or still images and to generate views of the video and/or still images for display on user interface 14 . If the content includes audio content, content module 22 is configured to generate the electronic signals that will drive user interface 14 to output the appropriate sounds. The content, or information from which the content is derived, may be obtained by content module 22 from electronic storage 18 .
  • the content provided by content module 22 is content associated with a game.
  • content module 22 is configured to render views of the game for display to user 12 via user interface 14 .
  • the content module 22 further provides audio associated with the views in accordance with the machine-readable program code associated with the game.
  • the gesture module 24 is configured to receive one or more output signals generated by user interface 14 indicating control inputs received from user 12 . Based on the received one or more output signals, gesture module 24 monitors performance of one or more control gestures by user 12 .
  • a control gesture being monitored by gesture module 24 includes an initial portion, one or more intermediate portions, and an ending portion.
  • an initial portion of a control gesture may include initiating contact with the touch-sensitive surface at one or more locations.
  • the control gesture may dictate a location (or locations for a multi-touch control gesture) at which contact is initiated, a pressure with which contact is initiated, and/or other parameters of the initial contact between user 12 and the touch sensitive surface.
  • the initial portion of the control gesture may include moving to one or more locations corresponding to the control gesture, maintaining contact (at specific location(s), or generally) with the touch-sensitive surface, motioning in one or more specific directions, making one or more specific shapes, ending contact at one or more locations on the touch-sensitive surface, and/or other actions.
  • the one or more intermediate portions of the control gesture may include one or more of maintaining contact at one or more points without moving, motioning in a specific direction, making a specific shape, halting motion, contacting the touch-sensitive surface at one or more additional locations, ending contact at one or more locations on the touch-sensitive surface, pressing harder or softer on the touch-sensitive surface, and/or other actions.
  • the ending portion may include one or more of ending contact at one or more locations on the touch-sensitive surface, halting motion at one or more locations, motioning in a specific direction, making a specific shape, contacting the touch-sensitive surface at one or more additional locations, pressing harder or softer on the touch-sensitive surface, and/or other actions.
  • One or more of the actions dictated by the initial portion, the one or more intermediate portions, and/or the ending portion of the control gesture that are location-based may be associated with static locations (e.g., at the same location every time), dynamic locations that do not change during the corresponding portion (e.g., the location may move between performances of the control gesture, but remains fixed while the portion corresponding to the location is performed), dynamic locations that change during the corresponding portion, and/or other types of locations.
  • one of the simplest examples of a control gesture would include an initial portion in which user 12 contacts the touch-sensitive surface.
  • the intermediate portion may include holding the contact made during the initial portion of the control gesture.
  • the ending portion may include removing the contact made during the initial portion of the control gesture, and maintained during the intermediate portion of the control gesture.
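  • The simple contact/hold/release gesture just described might be classified from touch events roughly as follows; the event names are assumptions rather than a real touch API.

      # Hypothetical sketch: initial portion = making contact, intermediate
      # portion = holding the contact, ending portion = releasing it.
      def classify_touch_event(event_type: str, contact_down: bool):
          """Return (portion, new_contact_state) for one touch event."""
          if event_type == "down" and not contact_down:
              return "initial", True
          if event_type == "hold" and contact_down:
              return "intermediate", True
          if event_type == "up" and contact_down:
              return "ending", False
          return None, contact_down            # event does not advance the gesture

      state = False
      for event in ("down", "hold", "hold", "up"):
          portion, state = classify_touch_event(event, state)
          print(event, "->", portion)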
  • an initial portion of a control gesture may include one or more of facing a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, positioning body parts with respect to each other in a specific manner (e.g., holding hands in a predetermined configuration, and/or other configurations of body parts), motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, and/or other actions.
  • the one or more intermediate portions may include one or more of changing orientation of the head and/or other body parts to a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, motioning with body parts to move in a specific relationship to each other, motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, halting motion or movement by one or more body parts, and/or other actions.
  • the ending portion of the control gesture may include one or more of changing orientation of the head and/or other body parts to a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, motioning with body parts to move in a specific relationship to each other, motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, halting motion or movement by one or more body parts, and/or other actions.
  • the gesture module 24 is configured to monitor performance of the control gestures by obtaining the output signal of user interface 14 indicating movement and/or motion of user 12 , and comparing the movement and/or motion of user 12 with the control gestures.
  • One or more of the control gestures may be a function of the content being conveyed to user 12 via user interface 14 by content module 22 .
  • one or more of the control gestures may be a function of a game being conveyed to user 12 via user interface 14 .
  • One or more of the control gestures may be independent from the content being provided to user 12 via user interface 14 .
  • one or more of the control gestures may control function of user interface 14 , processor 20 , and/or other components.
  • the stimulation module 26 is configured to receive information related to performance of control gestures from gesture module 24 , and to determine haptic stimulation to be generated for user 12 associated with the control gestures.
  • the haptic stimulation determined by stimulation module 26 includes haptic stimulation that is responsive to performance of the control gestures separate from the context in which they are performed. For example, in the context of a game, haptic stimulation responsive to performance of a control gesture includes haptic stimulation that is not dependent on variables within the game other than performance of the control gesture.
  • the haptic stimulation determined by stimulation module 26 is complex and rich beyond the haptic stimulation typically associated with control gestures.
  • haptic stimulation associated with control gestures tends to include a single stimulation that is provided during and/or after a control gesture. This haptic stimulation provides confirmation of the control gesture.
  • the haptic stimulation determined by stimulation module 26, by contrast, tracks more closely with performance of the control gesture to increase the immersive experience provided by the haptic stimulation.
  • the haptic stimulation determined by stimulation module 26 corresponding to a control gesture includes a first stimulation, a second stimulation, and/or other stimulations.
  • the first stimulation is different from the second stimulation with respect to one or more parameters of the haptic stimulation.
  • the one or more parameters may include, for example, periodicity, force, directionality, location, and/or other parameters of haptic stimulation.
  • Variation of the one or more parameters between the first stimulation and the second stimulation (and/or other stimulations) may be smooth, and/or may be discrete so as to create a distinct step in the parameter(s) of haptic stimulation between the first stimulation and the second stimulation.
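  • The smooth versus discrete variation of a parameter between the first and second stimulation can be sketched as two simple mappings over the progress of the control gesture; the numbers are placeholders.

      # Assumed illustration: vary the force parameter between a first and a
      # second stimulation either smoothly or as a distinct step.
      def smooth_force(first: float, second: float, progress: float) -> float:
          """Linearly interpolate force as gesture progress runs from 0.0 to 1.0."""
          return first + (second - first) * progress

      def discrete_force(first: float, second: float, progress: float) -> float:
          """Step from the first force to the second at the portion boundary."""
          return first if progress < 0.5 else second

      for p in (0.0, 0.25, 0.5, 0.75, 1.0):
          print(p, round(smooth_force(0.3, 0.9, p), 2), discrete_force(0.3, 0.9, p))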
  • the stimulation module 26 is configured to correlate provision of the first stimulation and the second stimulation with performance of the control gesture to which the first stimulation and the second stimulation correspond.
  • the first stimulation is determined by stimulation module 26 responsive to performance of the initial portion of the control gesture and the second stimulation is determined by stimulation module 26 responsive to performance of one or more intermediate portions of the control gesture.
  • responsive to performance of the ending portion of the control gesture, stimulation module 26 may determine another stimulation (e.g., a third stimulation), or may cease the provision of haptic stimulation associated with the control gesture.
  • stimulation module 26 determines haptic stimulation associated with a control gesture such that the stimulation is different between intermediate portions of the control gesture. This may result in smooth changes in the haptic stimulation as performance of the control gesture proceeds, and/or in discrete changes in the haptic stimulation as performance of the control gesture transitions between intermediate portions.
  • the haptic stimulation determined by stimulation module 26 may be provided to give user 12 some feedback about performance of a control gesture. This may include stimulation that informs user 12 that a portion of the control gesture has begun and/or is currently ongoing, stimulation that prompts user 12 to begin a next portion of the control gesture, and/or stimulation that provides other information about performance of the control gesture to user 12 .
  • stimulation module 26 provides haptic stimulation to user 12 that indicates to user 12 that performance of a control gesture has failed.
  • stimulation module 26 determines a stimulation for provision to user 12 that is indicative of the failure.
  • the failure stimulation may include a fizzle, and/or other stimulation.
  • control gestures and corresponding haptic stimulation are presented hereafter.
  • the examples given correspond to some classes of game characters commonly found in games that include combat between players and/or between players and non-player characters. Specifically, the examples given below correspond to “casters”, “close combat warriors”, and “ranged combat warriors.” It will be appreciated that these classes of characters are not limiting. Other classes of characters, and/or hybrids including characteristics of a plurality of the classes, may be implemented in accordance with the principles set forth herein.
  • a “caster” is a character that casts spells (or other similar attacks and/or defenses) during combat. Examples of caster characters include wizards, clerics, warlocks, engineers, and/or other characters.
  • to cast a given spell, user 12 may be required to perform a control gesture that corresponds to the given spell.
  • the haptic stimulation determined by stimulation module 26 corresponding to the control gesture may inform user 12 as to the progress of the control gesture.
  • the haptic stimulation associated with the control gesture may further provide information about the power of the spell being cast (e.g., greater force and/or more rapid periodicity may indicate a more powerful spell).
  • as performance of the control gesture proceeds, the force and/or rapidity of the haptic stimulation determined by stimulation module 26 may increase. This increase may be gradual and/or may occur in discrete steps (e.g., at transitions between the portions of the control gesture). This is not intended to be limiting, as other parameters of the haptic stimulation may be changed to indicate the progress of the control gesture.
  • the haptic stimulation may provide feedback about the (thus far) successful progress of the control gesture, and/or prompt user 12 with respect to future performance of the control gesture (e.g., to proceed to the next portion, to maintain the current portion, etc.).
  • stimulation module 26 may determine haptic stimulation that indicates the spell being successfully cast.
  • user 12 may perform portions of the control gesture that effectively “store” the spell for discharge. This may store the spell for an indefinite period, or for some predetermined maximum storage time.
  • stimulation module 26 may determine haptic stimulation that confirms the ongoing storage of the spell. If the storage period has a maximum storage time, the haptic stimulation may indicate the status of the storage period with respect to the maximum storage time.
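  • One way (an assumption, for illustration only) to express the caster feedback described above: stimulation intensity ramps up while the spell charges, and, once the spell is stored, reports how much of the maximum storage time remains.

      # Hedged sketch: the 2-second charge time and 5-second maximum storage
      # time are placeholder values, not values from the patent.
      def charge_intensity(elapsed_s: float, full_charge_s: float = 2.0) -> float:
          """Intensity ramps from 0 to 1 as the charging portion of the gesture proceeds."""
          return min(1.0, elapsed_s / full_charge_s)

      def storage_intensity(stored_s: float, max_storage_s: float = 5.0) -> float:
          """Intensity decays toward 0 as a stored spell nears its maximum storage time."""
          return max(0.0, 1.0 - stored_s / max_storage_s)

      print([round(charge_intensity(t), 2) for t in (0.5, 1.0, 2.0)])
      print([round(storage_intensity(t), 2) for t in (0.0, 2.5, 5.0)])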
  • a “close combat warrior” is a character configured to fight enemies at close quarters (e.g., hand-to-hand combat). Such a character may be armed with swinging weapons (e.g., clubs, maces, etc.), stabbing and/or slashing weapons (e.g., knife, axe, sword, etc.), and/or other close quarter weapons.
  • stimulation module 26 may determine haptic stimulation that mimics some real-world properties of such attacks. For example, as user 12 strikes into an opponent or object, a first feedback may mimic the striking and/or cutting sensation. As user 12 withdraws a weapon after such a blow, a second feedback may mimic the withdrawal of the weapon from the object or opponent.
  • a first feedback determined by stimulation module 26 may mimic the real world sensation that would be the result of such an activity in real life. Releasing the weapon from the ready position (e.g., striking an opponent or object with the swinging ball) may result in the determination by stimulation module 26 of a second stimulation that mimics the feel of such an attack.
  • the intensity of an attack may be determined based on an amount of time a control gesture (or some portion thereof) is performed, the range of motion of a control gesture, a speed of motion during a control gesture, pressure of the contact between the user and a touch screen included in user interface 14 , and/or other parameters of a control gesture (or some portion thereof).
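  • A hypothetical combination of the gesture parameters listed above into a single attack intensity; the equal weights are arbitrary placeholders, not values taken from the patent.

      # Assumed sketch: combine hold time, range of motion, speed, and contact
      # pressure (all normalized to 0..1) into a 0..1 attack intensity.
      def attack_intensity(hold, range_of_motion, speed, pressure):
          weights = {"hold": 0.25, "range": 0.25, "speed": 0.25, "pressure": 0.25}
          value = (weights["hold"] * hold
                   + weights["range"] * range_of_motion
                   + weights["speed"] * speed
                   + weights["pressure"] * pressure)
          return min(1.0, value)

      print(attack_intensity(hold=0.8, range_of_motion=0.9, speed=0.5, pressure=0.7))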
  • the user interface 14 includes a touch screen on which displays of the content associated with a game are presented by content module 22.
  • the location at which the control gesture is made impacts the manner in which the game is controlled and/or the haptic stimulation determined by stimulation module 26 (for the user inputting the gesture and/or other users).
  • a user may input an attacking control gesture in a manner that indicates a specific portion of an enemy at which the attack should be directed (e.g., a leg, an arm, the head, the torso, etc.).
  • a user may input a defense or blocking control gesture in a manner that indicates a specific portion of a character being controlled that should be defended or blocked (e.g., a leg, an arm, the head, the torso, etc.).
  • This location-based aspect of the control gesture may be taken into account in determining the success and/or impact of the corresponding control, and/or in determining the haptic stimulation that corresponds to the control gesture.
  • Multiple attacks may be triggered by touching multiple locations on the touch screen of user interface 14 .
  • a “ranged combat warrior” is a character that is configured and armed for attacking enemies from a range. This may include characters armed, for example, to release projectiles such as arrows, stones, bullets, rockets, and/or other projectiles.
  • user 12 may control a ranged combat warrior to fire a bow and arrow.
  • different haptic stimulation may be determined by stimulation module 26 for notching the arrow, drawing the bow (and varying the haptic stimulation to indicate increasing string tension), releasing the string, the bowstring striking the forearm of the character upon release, and/or other portions of the control gesture associated with firing an arrow.
  • the intensity and/or speed of an attack or projectile may be determined based on an amount of time a control gesture (or some portion thereof) is performed, the range of motion of a control gesture, a speed of motion during a control gesture, pressure of the contact between the user and a touch screen included in user interface 14 , and/or other parameters of a control gesture (or some portion thereof).
  • the target at which a ranged attack is directed and/or the path of a projectile emitted as part of a ranged attack may be determined based on a control gesture. For example, if user interface 14 includes a touch screen, the user performing a control gesture may contact a portion of the touch screen on which the target of the ranged attack is displayed (e.g., as part of the control gesture). Similarly, the user may trace on the touch screen a path of a projectile emitted during a ranged attack. The tracing of the path of the projectile may form at least part of the control gesture initiating the attack. The target and/or the path indicated by the control gesture may impact the determination of the corresponding haptic stimulation determined by stimulation module 26 .
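As an illustration of how attack intensity might be derived from the gesture parameters listed above (duration, range of motion, speed, contact pressure), a hedged sketch follows; the function name and normalization constants are assumptions for the example, not values from the patent.

```python
# Hypothetical sketch: deriving the intensity of a ranged or melee attack
# from measurable parameters of the control gesture. Weights/limits are
# arbitrary illustrative choices.

def attack_intensity(duration_s: float, motion_range: float,
                     speed: float, pressure: float) -> float:
    """Combine gesture parameters into a 0..1 intensity value."""
    terms = [
        min(duration_s / 2.0, 1.0),    # held up to ~2 s for a full "draw"
        min(motion_range / 1.0, 1.0),  # fraction of the available travel
        min(speed / 3.0, 1.0),         # e.g., screen-widths per second
        min(pressure / 1.0, 1.0),      # normalized contact pressure
    ]
    return sum(terms) / len(terms)     # simple average of normalized terms
```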
  • Game content may include boxes, rooms, gates, vehicles, and/or other objects or items that are “locked”, and must be opened through performance of some control gesture.
  • stimulation module 26 may determine haptic stimulation indicating to the user that the selected object or item is locked. Performance by the user of a control gesture to unlock the object or item may result in the determination of haptic stimulation by stimulation module 26 corresponding to the control gesture.
  • the haptic stimulation determined by stimulation module 26 for a given control gesture may vary based on characteristics of a character in a game being controlled by user 12 .
  • the haptic stimulation may vary as a function of character skill, character fatigue, character injury, weapon type, weapon disrepair, and/or other characteristics.
  • Changing the haptic stimulation based on character fatigue and/or injury may be a way of handicapping user 12 as a character that was fatigued or injured would be. This may include altering the force, periodicity, and/or other parameters of the haptic stimulation to inhibit precise control by user 12 (e.g., by increasing the force), to inhibit perception of cues and prompts in the haptic stimulation (e.g., by reducing force, increasing the force of portions of the haptic stimulation that do not provide cues and prompts, reducing response time to cues and/or prompts, etc.), and/or otherwise handicapping user 12 .
  • Varying the haptic stimulation based on character skill may provide a gradual and/or graduated “unlocking” of skills and abilities.
  • stimulation module 26 may determine haptic stimulation that provides a clearer and/or more easily followed guide to performing a control gesture corresponding to the given ability.
  • the haptic stimulation determined by stimulation module 26 for the character may provide more definite cues and/or prompts (e.g., see the examples above with respect to character fatigue) that guide user 12 through the control gesture. This enables user 12 to utilize the control gesture without being fully trained in the corresponding ability, but may impact the reproducibility of the control gesture for the user unless further training and/or other skill-building is sought.
  • Altering the haptic stimulation determined for a control gesture based on the equipment of a character may incentivize maintaining repair of equipment and/or obtaining upgraded equipment without making such activities mandatory. Instead of a digital determination of whether a control gesture is available to user 12 , the stimulation module 26 provides haptic stimulation that makes the control gesture easier and/or more enjoyable with “better” virtual equipment.
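One plausible way to vary stimulation with character skill, fatigue, and equipment disrepair, as described above, is to scale the strength of guiding cues while adding masking "noise". This sketch is illustrative only; the parameter names and scaling factors are invented for the example.

```python
# Sketch under stated assumptions: a guidance cue's perceptibility is scaled
# by character state so that fatigue/disrepair blur cues and prompts while
# higher skill sharpens them. All inputs are normalized 0..1.

def modulate_cue(base_magnitude: float, skill: float,
                 fatigue: float, disrepair: float) -> dict:
    """Return the cue magnitude plus a masking 'noise' level (both 0..1)."""
    cue = base_magnitude * (0.5 + 0.5 * skill) * (1.0 - 0.5 * disrepair)
    noise = 0.4 * fatigue  # non-cue stimulation that obscures prompts
    return {"cue_magnitude": min(cue, 1.0), "masking_noise": noise}
```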
  • the equipment module 28 is configured to provide control over virtual equipment for user 12 .
  • user 12 may control virtual equipment (e.g., directly or through a character that is using the virtual equipment).
  • content module 22 provides to user 12 , via user interface 14 , views of the virtual equipment.
  • the equipment module 28 is configured to determine the operating parameters of the virtual equipment, and to simulate its operation.
  • equipment module 28 determines the operating parameters of the virtual equipment and/or simulates operation of the virtual equipment based on selections by user 12 of user selectable sections of the views of the virtual equipment.
  • the user selectable sections of the virtual equipment may be located in the views of the virtual equipment that correspond to sections of the virtual equipment analogous to sections of equipment that would be engaged in real life to configure operating parameters of equipment and/or to operate equipment.
  • the operating parameters of the virtual equipment that are configurable by engaging user selectable sections of the views of the virtual equipment include one or more of an amount of loaded rounds of ammunition, a level of disrepair, a stopping power, a projectile velocity, an ammunition type, a noise level, and/or other operating parameters.
  • FIG. 2 shows a view of a piece of virtual equipment 32 (e.g., a gun).
  • piece of virtual equipment 32 includes a plurality of selectable sections 34 (illustrated in FIG. 2 as section 34 a , section 34 b , section 34 c , and section 34 d ).
  • through section 34 a , user 12 may configure the handle/stock of piece of virtual equipment 32 .
  • through section 34 b , user 12 may configure the ammunition of piece of virtual equipment 32 .
  • through section 34 c , user 12 may configure the sights of piece of virtual equipment 32 .
  • through section 34 d , user 12 may configure the barrel/muzzle of piece of virtual equipment 32 .
  • Selection of one of selectable sections 34 to configure piece of virtual equipment 32 may include simply selecting the desired selectable section 34 , and then selecting a new configuration for the selected section 34 .
  • reconfiguration is made through a touch and drag type of interaction. For example, user 12 may engage an area on user interface 14 that corresponds to ammunition and “drag” the ammunition to section 34 b to reload piece of virtual equipment 32 .
  • user 12 may select and drag a silencer (and/or other muzzle or barrel feature) to section 34 d of piece of virtual equipment 32 to change the sound characteristics of piece of virtual equipment 32 (and/or other characteristics).
  • FIG. 3 shows a view of a virtual firearm 36 having a plurality of selectable sections 38 (shown in FIG. 3 as section 38 a , section 38 b , section 38 c , and section 38 d ).
  • the selectable sections 38 of virtual firearm 36 may be selected in a fashion similar to the one discussed above with respect to FIG. 2 .
  • FIG. 4 shows a view of a piece of virtual equipment 40 configured to show the proximity of various items to a user (or to a character being controlled by a user) in a game.
  • the piece of virtual equipment 40 includes a plurality of selectable sections 42 (shown in FIG. 4 as section 42 a , section 42 b , section 42 c , section 42 d , and section 42 e ).
  • Individual ones of selectable sections 42 correspond to individual modes of piece of virtual equipment 40 such that selection of one of selectable sections 42 results in piece of virtual equipment 40 operating in accordance with the selected mode.
  • in a first mode, piece of virtual equipment 40 may provide an indication of the proximity of other characters;
  • in a second mode, piece of virtual equipment 40 may provide an indication of the proximity of enemies;
  • in a third mode, piece of virtual equipment 40 may provide an indication of the proximity of one or more resources (e.g., gold, food, ammunition, power-ups, etc.).
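Selecting a section of a piece of virtual equipment, as in FIGS. 2-4, amounts to hit-testing a touch location against regions of the displayed view and dispatching to the corresponding configuration or mode change. A minimal sketch follows, assuming a hypothetical rectangular layout; the region names and coordinates are placeholders.

```python
# Illustrative hit-test for user selectable sections of a virtual-equipment
# view. Coordinates are screen pixels; the layout below is invented.

SECTIONS = {  # name -> (x0, y0, x1, y1) rectangle in the equipment view
    "stock":  (0,   40, 120, 100),
    "ammo":   (120, 60, 200, 100),
    "sights": (200, 20, 280, 50),
    "muzzle": (280, 60, 400, 90),
}

def section_at(x: float, y: float):
    """Return the name of the selectable section under a touch, or None."""
    for name, (x0, y0, x1, y1) in SECTIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

A touch-and-drag reconfiguration (e.g., dragging ammunition onto the "ammo" region) could reuse the same test for the drop location.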
  • stimulation module 26 is configured to determine haptic stimulation to be generated for the user that is associated with the operating parameters of the virtual equipment and/or the simulated operation of the virtual equipment. This may include, for example, varying the haptic stimulation based on the current operating parameters of a piece of virtual equipment, determining haptic stimulation during the reconfiguration of a piece of virtual equipment to reflect changes being made in the piece of virtual equipment, and/or other aspects of configuration and/or operation of virtual equipment.
  • for example, if user 12 reloads a virtual firearm with new ammunition, stimulation module 26 determines haptic feedback indicating the reload.
  • if the new ammunition is different from previously used ammunition, stimulation module 26 determines haptic feedback upon firing of the firearm that is different from the haptic stimulation determined during firing of the previously used ammunition.
  • actuators 16 are configured to actuate the touch sensitive electronic display that presents the views of the virtual equipment to user 12 .
  • delivery of the haptic stimulation determined by stimulation module 26 to user 12 as user 12 engages the touch sensitive electronic display results in a more immersive virtual equipment control experience. For example, as user 12 engages the touch sensitive display to reload a gun, or to switch out a piece of equipment, haptic stimulation is provided to user 12 through the touch sensitive surface that corresponds to the control being exerted over the gun.
  • the actuator control module 30 is configured to control actuators 16 to generate the haptic stimulus determined by stimulation module 26 . This includes communicating the haptic stimulus to be generated from processor 20 to actuators 16 .
  • the haptic stimulus to be generated may be communicated over wired communication links, wireless communication links, and/or other communication links between processor 20 and actuators 16 .
  • FIG. 5 illustrates a method 44 of determining and/or providing haptic stimulation to a user.
  • the haptic stimulation may be determined and/or provided to the user in conjunction with a game, and/or in other contexts.
  • the operations of method 44 presented below are intended to be illustrative. In some embodiments, method 44 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 44 are illustrated in FIG. 5 and described below is not intended to be limiting.
  • method 44 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information).
  • the one or more processing devices may include one or more devices executing some or all of the operations of method 44 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 44 .
  • performance of an initial portion of a control gesture by a user is monitored. Performance of the initial portion of the control gesture may be monitored based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). In one embodiment, operation 46 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
  • a first haptic stimulation is determined.
  • the first haptic stimulation corresponds to the initial portion of the control gesture.
  • operation 48 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
  • the first haptic stimulation is generated for the user.
  • operation 50 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
  • at an operation 52 , a determination is made as to whether a first intermediate portion of the control gesture has been performed by the user. Responsive to the first intermediate portion of the control gesture having been performed, method 44 proceeds to an operation 54 .
  • operation 52 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
  • a second haptic stimulation is determined.
  • the second haptic stimulation corresponds to the first intermediate portion of the control gesture.
  • operation 54 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
  • at an operation 56 , the second haptic stimulation is generated for the user.
  • operation 56 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
  • responsive to the first intermediate portion of the control gesture not yet having been performed, method 44 proceeds to an operation 58 at which a determination is made as to whether the control gesture has failed. This may be determined based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). The determination of whether the control gesture has failed may be made, for example, based on an amount of time since performance of the initial portion of the control gesture, whether actions not included in the control gesture have been performed since performance of the initial portion of the control gesture, and/or other factors. In one embodiment, operation 58 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
  • responsive to a determination at operation 58 that the control gesture has not failed, method 44 returns to operation 52 . Responsive to a determination at operation 58 that the control gesture has failed, method 44 proceeds to an operation 60 .
  • at operation 60 , haptic stimulation is determined that indicates failure of the control gesture. In one embodiment, operation 60 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
  • at an operation 62 , the haptic stimulation determined at operation 60 is generated for the user. In one embodiment, operation 62 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
  • following generation of the second haptic stimulation at operation 56 , method 44 proceeds to an operation 64 .
  • at operation 64 , a determination is made as to whether an ending portion of the control gesture has been performed. Responsive to a determination that the control gesture has ended, method 44 ends the provision of haptic stimulation associated with the control gesture. In one embodiment, a final haptic stimulation associated with the completion of the control gesture is further determined and generated. In one embodiment, operation 64 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
  • responsive to a determination that the ending portion of the control gesture has not yet been performed, method 44 proceeds to an operation 66 .
  • at operation 66 , a determination is made as to whether the control gesture has failed. This may be determined based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). The determination of whether the control gesture has failed may be made, for example, based on an amount of time since performance of the first intermediate portion of the control gesture, whether actions not included in the control gesture have been performed since performance of the first intermediate portion of the control gesture, and/or other factors.
  • operation 66 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
  • responsive to a determination at operation 66 that the control gesture has not failed, method 44 returns to operation 64 . Responsive to a determination at operation 66 that the control gesture has failed, method 44 proceeds to operations 60 and 62 .
  • in some embodiments, the control gesture includes more than one intermediate portion.
  • in such embodiments, method 44 is expanded between operations 52 and 64 to monitor the additional intermediate portion(s) and generate corresponding haptic stimulation in a manner similar to that shown and described for the first intermediate portion.
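The portion-by-portion monitor/determine/generate loop of method 44 might be sketched as follows. The callback names are hypothetical, and the sketch condenses the separate failure branches of FIG. 5 into a single check; it is an illustration, not the patented implementation.

```python
# Sketch of the flow described for method 44 (FIG. 5), using hypothetical
# callbacks supplied by the caller:
#   detect(p)    -> bool   : has portion p of the gesture been performed?
#   determine(p) -> object : haptic stimulation corresponding to p (or "failure")
#   generate(s)  -> None   : drive the actuator(s) with stimulation s
#   has_failed() -> bool   : timeout or stray motion invalidating the gesture

def run_control_gesture(portions, detect, determine, generate, has_failed):
    for portion in portions:              # initial, intermediate(s), ending
        while not detect(portion):
            if has_failed():              # e.g., too much time elapsed
                generate(determine("failure"))
                return False
        generate(determine(portion))      # stimulation for this portion
    return True                           # ending portion reached
```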
  • FIG. 6 illustrates a method 68 of determining and/or providing haptic stimulation to a user.
  • the haptic stimulation may be determined and/or provided to the user in conjunction with a game, and/or in other contexts.
  • the operations of method 68 presented below are intended to be illustrative. In some embodiments, method 68 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 68 are illustrated in FIG. 6 and described below is not intended to be limiting.
  • method 68 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information).
  • the one or more processing devices may include one or more devices executing some or all of the operations of method 68 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 68 .
  • at an operation 70 , a view of virtual equipment is presented to the user via a touch sensitive electronic display.
  • the view includes selectable sections that are selectable by the user to interact with the virtual equipment shown in the view.
  • the view of the virtual equipment is determined by a content module similar to or the same as content module 22 (shown in FIG. 1 and described above).
  • at an operation 72 , selection of one of the selectable sections of the view of the virtual equipment is received.
  • the selection may be received through the touch sensitive electronic display through which the view is presented.
  • at an operation 74 , operation of the virtual equipment is simulated and/or one or more operating parameters of the virtual equipment are adjusted based on the selection received at operation 72 .
  • operation 74 is performed by an equipment module similar to or the same as equipment module 28 (shown in FIG. 1 and described above).
  • at an operation 76 , haptic stimulation associated with the operation of the virtual equipment and/or the adjustment of the operating parameters of the virtual equipment effected at operation 74 is determined.
  • operation 76 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
  • at an operation 78 , the haptic stimulation determined at operation 76 is generated for the user.
  • operation 78 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
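For illustration, the sequence of method 68 could be expressed as a short handler. The module objects and their method names below are assumptions made for the sketch, not the patent's API.

```python
# Sketch of the sequence described for method 68 (FIG. 6). The "content",
# "equipment", "stimulation", and "actuators" objects and their methods are
# hypothetical stand-ins for the modules described in the text.

def handle_equipment_touch(touch, content, equipment, stimulation, actuators):
    view = content.present_equipment_view()            # present view (operation 70)
    section = view.section_at(touch.x, touch.y)        # receive selection (operation 72)
    if section is None:
        return                                          # touch missed all sections
    change = equipment.apply_selection(section)        # simulate/adjust (operation 74)
    effect = stimulation.for_equipment_change(change)  # determine stimulation (operation 76)
    actuators.play(effect)                              # generate stimulation (operation 78)
```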
  • the description above of haptic stimulation in response to gesture-based control for a single user is not intended to be limiting.
  • Haptic stimulation generated in response to gesture-based control in a multi-user environment is also contemplated.
  • the haptic stimulation provided to a given user may include haptic feedback that is responsive to gesture-based controls of another user and/or other phenomena.
  • FIG. 7 is a diagram illustrating an example 100 of a portable computing device configured in accordance with one or more aspects of the present invention.
  • a tablet computer 102 is shown.
  • Tablet computer 102 includes a screen 104 mounted to a body 106 , with the top surface 108 representing a surface with which a user interacts.
  • top surface 108 is bordered by body 106 , but the screen area could extend all the way to the edges of the tablet.
  • Tablet computer 102 can use any number or type of touch-sensing technologies to determine when a user has touched on or near surface 108 .
  • surface 108 may include a resistance or capacitance-based touch sensing system and/or an optical touch sensing system.
  • any touch-enabled computing device can be used.
  • a smaller device such as a cellular telephone or media player may feature a touch-enabled display and can provide haptic outputs in accordance with the present subject matter.
  • a haptically-enabled display surface associated with and interfaced to another computing device (e.g., a desktop or server computer) may be used.
  • surface 108 may represent a larger touch-enabled surface such as a table upon which a screen image is projected from above or below.
  • a flat surface is depicted here, although the present subject matter could be applied for use in devices with curved surfaces and/or non-smooth surfaces.
  • Control gestures may include (with or without touch gestures) manipulation of a component or body (e.g., “tilt” controls), manipulation of one or more stick controls, manipulation of one or more buttons, manipulation of one or more switches, and/or manipulation of or interaction with other interface features.
  • FIG. 7 illustrates an example of the architecture of a computing device 102 at 110 .
  • Computing device 102 comprises one or more processors 112 configured to execute computer program modules, a memory 114 such as RAM, ROM, or other memory technology, a display interface 116 , a haptic interface 118 , I/O interface 120 , and network interface 122 . Any suitable display technology can be used. In some embodiments, an LCD display is used.
  • Haptic interface 118 can comprise suitable components for driving one or more actuators used to play back haptic effects so as to provide a physical sensation to a user of device 102 .
  • display 104 may include embedded actuators so that targeted physical output can be provided to a portion of the display to provide a physical effect where the user touches surface 108 .
  • Additional actuators may be used to provide haptic output via other surfaces of tablet 102 , such as its sides and the surface opposite surface 108 (i.e., the back of the device). It will be appreciated that the location of the actuator relative to the desired physical effect can vary.
  • to provide a physical effect at a first part of the screen, an actuator at a second part of the screen may be driven so that the properties of intervening components of the screen and/or device influence what is felt at the first part of the screen.
  • the tablet comprises an actuator having an eccentric rotating mass motor.
  • the actuator is coupled either directly or indirectly to a surface of the tablet housing. Powering the motor causes vibration on the surface that a user can feel.
  • an actuator may be used to raise or lower sections of the screen to create ridges, troughs, or other features.
  • an actuator can comprise a piezoelectric element.
  • a piezoelectric actuator can be embedded, at least partially, in an inorganic polymer matrix, such as silicone.
  • an actuator may comprise a macro-fiber composite actuator or piezocomposite actuator.
  • Such actuators may be formed as a thin layer of piezoelectric fibers suspended in a matrix (e.g., epoxy).
  • the fibers may communicate electrically with polyimide electrodes.
  • Many other types of actuators may be used, and so this exemplary description of actuators is not meant to be limiting.
  • I/O interface 120 can be used by processor(s) 112 to receive input and provide output using any suitable components.
  • I/O interface 120 may link to speakers and/or a microphone for receiving voice input and providing audio output.
  • I/O interface 120 may provide a connection to peripheral devices such as a mouse or stylus used to provide input to the device, or to an imaging sensor used to capture still images and/or video.
  • Network interface 122 can be used to link device 102 to a network using one or more networking technologies.
  • interface 122 may provide a connection to suitable components for connecting to an IEEE 802.11 (Wi-Fi) or 802.16 (WiMAX) network, or a connection using Bluetooth technology.
  • interface 122 may allow communication via a telephone, Ethernet, or other wired connection or may support other wireless technology such as communication via an IR port.
  • Computing device 102 can comprise additional components—for example, one or more storage components (e.g., magnetic or solid-state hard disk drives) can be included. If computing device 102 comprises a cellular telephone, appropriate RF components may be included as well.
  • Memory 114 tangibly embodies one or more program components that configure computing device 102 to operate in an intended manner.
  • memory 114 can include one or more applications, an operating system, and can also include stored data.
  • memory 114 also includes a program component 124 for providing an interactive game in accordance with one or more aspects noted below.
  • the game can configure computing device 102 to present a play area 126 via display 104 , track the movement of a virtual object 128 (e.g., a ball) in the play area, and respond to user interactions to launch and deflect the virtual object during play using paddle 130 .
  • the game can be configured to play back haptic effects as the virtual object moves through and encounters features of the play area.
  • the haptic effects can be selected to provide a sensation that differs based on the particular features that are encountered.
  • play area includes a top T, bottom B, left side L, and right side R.
  • the game can be configured so that the virtual object is deflected by paddle 130 prior to reaching left side L. If multiple players are involved, the virtual object may exit at one or more sides T, B, or R and pass to another user's screen as noted below.
  • FIG. 8A illustrates an example 200 A of the use of a game to support multi-user play.
  • the game program can support sending and receiving data to facilitate tracking the position of the virtual object in a play area that comprises multiple screens.
  • a first device 202 is interfaced to a second device 204 via a network 206 .
  • Network 206 may comprise a local area network, a wide area network, or may represent a direct connection between devices 202 and 204 .
  • haptic effects may include haptic effects determined in accordance with a control gesture input on another user's screen.
  • referring to the play area 126 of FIG. 7 , when virtual object 128 exits at right side R of a first device, it may enter the screen at right side R of the second device.
  • the devices may have mirror-image layouts; that is, device 202 may feature paddle 130 along left side L while device 204 includes paddle 130 along right side R. In that case, when the virtual object reaches right side R of the play area of the first device, it may enter the play area at left side L of the other device, headed towards right side R and paddle 130 in the play area of the other device.
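Passing the virtual object between screens, as described for FIG. 8A, reduces to mapping an exit edge of one play area to an entry edge of the other. A sketch under the stated assumptions (two side-by-side play areas, optional mirror-image layout) follows; the function and edge encoding are illustrative.

```python
# Illustrative mapping of an exit from one play area to an entry point in
# another. Assumes two side-by-side portrait play areas as in FIG. 8A; the
# mirrored variant flips the entry edge.

def map_exit_to_entry(exit_edge: str, y: float, mirrored: bool):
    """exit_edge: 'L' or 'R'; y: position along the shared edge (0..1)."""
    if mirrored:
        entry_edge = "L" if exit_edge == "R" else "R"
    else:
        entry_edge = exit_edge   # same-named edge on the other screen
    return entry_edge, y         # the position along the edge carries over
```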
  • Server 208 is depicted to illustrate that in some embodiments, multi-user play may be facilitated by a server. However, as noted above, in some embodiments the game program is configured to directly interface with other instances without need of a server.
  • FIG. 8B illustrates another example 200 B of a multi-user play.
  • three devices 202 , 204 , and 210 are interfaced via a first network 206 .
  • a second network 214 facilitates interaction with a fourth device 212 .
  • network 206 may comprise a local area network connection while network 214 may comprise a wide-area network connection.
  • Compass 216 is illustrated to show that in some embodiments the relative position of multiple players can be considered.
  • device 202 may include paddle 130 at left side L.
  • Device 204 may have its paddle positioned along top side T, since device 204 is “northmost.”
  • Device 210 may have its paddle positioned along bottom side B, while device 212 may have its paddle positioned along right side R.
  • the paddle may remain on the same side of the screen for each user but with appropriate mapping between edges to maintain the relative position of the users around the shared play area.
  • the game can be configured to dynamically adjust the behavior and connections between the play areas based on factors such as each device's orientation, the relative position of the players, and other considerations such as the relative size of screen areas.
  • the virtual object can move from one screen to another in other manners in addition to or instead of encountering the boundaries of the play area.
  • FIG. 9 illustrates an example of play areas 302 A and 302 B for two respective users A and B over a plurality of time intervals (I), (II), and (III).
  • Each play area 302 includes a respective paddle 304 .
  • as shown at time interval (I), virtual object 306 has been launched or deflected from paddle 304 B towards the boundary of play area 302 B.
  • exit point 308 from play area 302 B is mapped to an entry point 310 in play area 302 A.
  • the mapping may allow the virtual object to pass instantly between the play areas or there may be a delay based on the distance between the players (e.g., as determined by GPS and/or other triangulation or proximity detection techniques).
  • time interval (II) depicts virtual object 306 encountering paddle 304 A. For example, user A may have moved paddle 304 A by sliding his or her fingers along the display surface of his device to intercept virtual object 306 .
  • when the deflection occurs, haptic effect H1 is selected and played back.
  • haptic effect H1 is localized to the point at which user A touches paddle 304 A (and/or another part of play area 302 ).
  • the sensation can be generated by commanding one or more actuators to provide motion or another effect; the actuators may be located at the point at which the effect is intended to be felt and/or elsewhere.
  • FIG. 9 shows effect H1 as “(((H1)))” in play area 302 A and as “(H1)” in play area 302 B since the effect is also played back for player B and localized to player B's touch point.
  • the intensity of effect H1 differs as between players A and B.
  • the haptic effect can be selected based on the simulated physics of the game.
  • the paddles 304 may represent a hard surface, and so effect H1 may comprise a strong, sharp effect. Since a deflection is meant to represent a “hit,” effect H1 may be the strongest effect in the game.
  • the “hit” is played back to user B to alert user B that the virtual object will be returned and user B can prepare to deflect the incoming virtual object.
  • the hit may be played back to player B with a suitable indicator of direction—for instance, the effect may be designed to feel like it originated from the left, rather than the top or bottom; this may be helpful when three or more users are playing together.
  • the game can enhance the perception that the players are sharing a space even though the players cannot view one another's play areas. Thus, the players may become more immersed in the game and may have a more compelling game experience.
  • FIG. 10 illustrates another example of play and depicts the virtual object ricocheting off the border of a play area. Particularly, three time intervals (I), (II), and (III) are again shown. Play areas 402 A and 402 B correspond to players A and B, while paddles 404 and virtual object 406 are also illustrated. As shown at time interval (I), virtual object 406 is launched with a trajectory towards a point 408 at the top boundary of play area 402 B. Interval (II) illustrates when virtual object 406 encounters point 408 . A “bounce” haptic effect H2 is played back to players A and B, localized to their touch points at respective paddles 404 A and 404 B.
  • because the “bounce” occurs in play area 402 B and is at a closer distance to paddle 404 B than paddle 404 A, it is depicted as “((H2))” in play area 402 B and “(H2)” in play area 402 A since the bounce is “louder” for player B.
  • as shown at time interval (III), after the bounce the virtual object passes to play area 402 A. Alerted to the bounce, player A may attempt to intercept the virtual object and prevent it from reaching the goal area behind paddle 404 A.
  • FIG. 11 is a diagram illustrating another aspect of the present subject matter.
  • haptic effects can be played back to simulate a continuous effect.
  • play areas 502 for two players A and B are shown in a time interval (I) and a time interval (II).
  • play areas 502 are shown in a “landscape,” rather than “portrait” orientation.
  • Each play area features a respective paddle 504 as well, and virtual object 506 is depicted.
  • Each play area of this example also includes seams 508 represented by dotted lines.
  • seams 508 may represent boundaries between planks in a wooden surface depicted in the play area.
  • a continuous low rumble effect to correlate with the virtual object rolling across the surface can be combined with click effects to correlate with the virtual object encountering seams 508 .
  • This effect is shown as “H3” in FIG. 11 .
  • the effect is shown as “((H3))” for player B and “(H3)” for player A since the virtual object is closer to paddle 504 B than paddle 504 A.
  • at time interval (II), effect H3 is louder for paddle 504 A since virtual object 506 is moving towards player A.
  • a background effect could be included to simulate a surface alone (i.e. a continuous surface) or could vary as the simulated background surface changes (e.g., from a wood area to a metal area to a concrete area, etc.).
  • FIG. 12 is a flowchart illustrating illustrative steps in a method 600 for providing a game in accordance with the present subject matter.
  • Block 602 represents setting up one or more play areas. For example, if two users desire to play, respective play areas can be initialized and mappings between the shared boundaries (and/or other entry-exit points) can be determined.
  • Block 604 occurs while play continues.
  • At least one instance of the game can track the position and motion of the virtual object based on interaction with paddles, obstacles, and characteristics of the play area, using a model simulating physics of the game.
  • the model can provide for changes in the virtual object's speed and direction based on simulating momentum, mass, and material characteristics of the virtual object and the other items in the play area.
  • one or more haptic effects to play back can be determined. For example, if the virtual object encounters a boundary or other object in the play area, a haptic effect associated with the physical interaction between the virtual object and boundary/object can be selected for playback. Different boundaries/objects may result in different effects. For example, a border or paddle may result in a “hard” effect, while obstacles included in the play area may have “soft” effects. Simulated properties of the virtual object can be taken into account as well—the game may support a mode with a hard (e.g., steel) virtual object or a soft (e.g., rubber) virtual object with appropriate changes in the haptic output scheme.
  • the haptic effect can relate to a background effect.
  • a continuous haptic effect simulating passage of the virtual object over a simulated surface can be provided based on characteristics of that surface.
  • the surface may include material or an obstacle for the virtual object to pass through and a suitable haptic effect can be provided to simulate passage through the material/obstacle.
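Selecting effects by obstacle type and simulated material, as described above, can be thought of as a lookup. The sketch below uses placeholder obstacle kinds, materials, and effect parameters invented for illustration.

```python
# Illustrative lookup: choosing a haptic effect from the kind of object the
# virtual object encounters and the simulated material involved. Entries and
# values are placeholders, not values specified by the patent.

EFFECT_TABLE = {
    ("paddle", "steel"):  {"shape": "sharp", "magnitude": 1.0},  # "hit"
    ("border", "steel"):  {"shape": "sharp", "magnitude": 0.8},  # ricochet
    ("bumper", "rubber"): {"shape": "soft",  "magnitude": 0.5},  # has "give"
    ("seam",   "wood"):   {"shape": "click", "magnitude": 0.3},  # plank seam
}

def effect_for_collision(obstacle_kind: str, material: str):
    """Fall back to a faint, soft effect for unlisted combinations."""
    return EFFECT_TABLE.get((obstacle_kind, material),
                            {"shape": "soft", "magnitude": 0.2})
```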
  • the game determines the position of the virtual object relative to the haptic delivery point(s) to adjust how the haptic effect is to be output.
  • a haptic delivery point can include the point at which a user touches the screen of a device.
  • the “loudness” (i.e., intensity) of the haptic effect can be inversely proportional to the distance between the delivery point and the virtual object.
  • Directionality may also be applied. For example, if a ricochet occurs on another screen, the haptic effect that is presented may include a directional component or may otherwise be presented to give an indication of where the ricochet occurred.
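A simple way to realize the distance-based "loudness" and directionality just described is sketched below; the inverse falloff curve and the angle convention are illustrative choices, not requirements of the subject matter.

```python
# Sketch only: attenuating a haptic effect with distance from the user's
# touch point (the haptic delivery point) and tagging it with a rough
# direction toward the event that caused it.

import math

def localized_effect(base_intensity, touch_xy, event_xy, min_intensity=0.05):
    dx = event_xy[0] - touch_xy[0]
    dy = event_xy[1] - touch_xy[1]
    distance = math.hypot(dx, dy)
    intensity = base_intensity / (1.0 + distance)  # farther away = quieter
    direction = math.degrees(math.atan2(dy, dx))   # bearing of the event
    return max(intensity, min_intensity), direction  # never fully silent
```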
  • suitable signals are sent to the actuators to generate the haptic effect(s) having the desired volume.
  • the game can consult a library of signal patterns for use in generating different haptic effects and use the signal patterns to command one or more actuators embedded in the screen and/or other portions of the device.
  • the haptic effects may include sounds and/or visual elements as well.
  • each respective instance of the game can determine the virtual object's position and motion while in that instance's play area and pass that information to the other instances.
  • information regarding the virtual object's motion (e.g., a vector with direction and speed) may be passed between the instances of the game.
  • the haptic effect is selected by the instance of the game whose play area contains the virtual object when the haptic effect is to be triggered and that information is provided to the other instances of the game. For example, in a game involving player A and player B, if the virtual object collides with an obstacle, border, or paddle in player A's play area, the instance of the game on player A's device can provide player B's device the desired haptic effect along with information about the collision and position thereof for use by the instance of the game on player B's device in determining a volume or directionality for the effect.
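The exchange between instances might be carried in a small message such as the sketch below. The patent describes what is exchanged (object motion, the triggered effect, and the collision position for volume/directionality), not this particular encoding, which is an assumption for illustration.

```python
# Hypothetical message format for synchronizing two game instances.

import json

def make_sync_message(position, velocity, effect_id=None, event_pos=None):
    """position/velocity: [x, y] lists; effect_id/event_pos describe a
    haptic effect triggered in the sending instance's play area, if any."""
    return json.dumps({
        "object": {"pos": position, "vel": velocity},  # direction and speed
        "haptic": None if effect_id is None else {
            "effect": effect_id,     # which effect the receiver should play
            "event_pos": event_pos,  # used to set volume and directionality
        },
    })
```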
  • FIG. 13 is a diagram illustrating an example of an interface 700 for an instance of a game configured in accordance with aspects of the present subject matter.
  • a pinball-like game is presented in which the play area 702 includes a border 704 , 706 that extends inward from the screen boundaries.
  • the objective of the game is to prevent a virtual object (not shown), such as a ball or other object, from reaching area 708 by deflecting the virtual object using paddle 710 .
  • the virtual object may be launched from paddle 710 or may appear elsewhere in play area 702 .
  • Interface 700 includes a plurality of control buttons 712 , 714 , 716 , and 718 which may be used to provide inputs and access menus.
  • buttons 712 and 714 may comprise play and pause buttons
  • button 716 provides a “launch” command
  • button 718 exits the game or launches a menu for configuring, saving, or exiting the game.
  • control buttons 712 - 718 may be provided with suitable haptic effects to simulate pushing a physical button.
  • a plurality of simulated lights 720 can be provided to enhance the visual experience; the lights may or may not act as obstacles.
  • a bumper 722 may cause the virtual object to “bounce” in response to a collision in a manner different from a ricochet from border 704 , 706 .
  • border 704 , 706 may be presented as a simulated metal border which causes a sharp ricochet effect.
  • Bumper 722 may feature an initial amount of “give” before imparting force on the virtual object in a manner similar to that of a pinball machine bumper.
  • This example features metal bars 724 , 726 , and 728 , which may provide a still further response to collisions with a virtual object and may be assigned their own respective haptic effect.
  • Arrows 730 may comprise a visual effect and/or may result in an acceleration of the virtual object into xylophone structure 732 .
  • xylophone structure 732 comprises a plurality of keys ( 732 A, 732 B, 732 C identified), with each ascending key having its own associated haptic effect. For example, as the virtual object moves from key 732 A to 732 B to 732 C, the haptic effect may increase in pitch along with corresponding xylophone sound effects. At the same time, the haptic effect may decrease in intensity as the virtual object moves away.
  • point 734 represents an exit from the play area and into a second user's play area, which is identical to play area 702 .
  • when the virtual object enters play area 702 , it may be returned via chute 736 along with an accompanying “rattle” representing passage through the chute.
  • each haptic effect in one user's play area can also be played back in the other user's (or users') play area(s) but with a corresponding decrease in intensity based on the separation from the site of the event causing the haptic effect and the user's point of contact with the screen.
  • Vortex 738 can comprise a portion of play area 702 that attracts the virtual object toward opening 740 . If the virtual object reaches opening 740 , the virtual object may pass into another play area. When the virtual object initially contacts vortex 738 , a first haptic effect representing the “pull” of the vortex may be played back, with the effect becoming stronger until (and if) the virtual object reaches opening 740 . At that point, an “exit” effect may be played back to represent the virtual object's exit from the vortex in another play area. This can, for instance, alert the user of the play area receiving the virtual object to move his or her paddle 710 into position.
  • if the virtual object reaches area 708 , a haptic effect representing entry into area 708 is presented, such as absorption of the virtual object or an explosion.
  • the virtual object may be provided for launch again via paddle 710 for another round.
  • the game proceeds until one player reaches a predetermined score level (e.g., 7 goals) and/or score differential (e.g., ahead by 3 goals).
  • point values may be associated with hitting certain obstacles (e.g., bumper 722 , bars 724 , 726 , 728 , lights 720 ) or passing through all keys of xylophone structure 732 .
  • Embodiments may support moving or destroying obstacles (e.g., breaking through bricks) during the course of play, with suitable haptic effects provided based on motion or destruction of the obstacles.
  • a computing device may be sensitive to distance of a user's fingers or stylus from the touch screen and/or touch pressure. These features may be used during play of the game and/or in configuring the game application. For instance, if the device indicates that the user is not touching the screen or another haptically-enabled area, haptic effects can be turned off to reduce power consumption. As another example, the user may be able to hover to provide input. For instance, the area at which buttons 712 - 716 are presented may ordinarily appear as brushed metal, but the buttons may appear in response to a user hovering or touching the area.
  • in the examples above, game play proceeded based on movement of a paddle using touch.
  • gameplay may depend on tilt sensors and/or accelerometers.
  • users may be able to tilt or swing their devices to affect the movement of the virtual object and/or paddle position.
  • Haptic effects can be delivered at the point(s) at which the users grip their respective devices.
  • the game may use multiple paddles per user or may use no paddles at all, with all input based on tilt/acceleration.
  • single-user play is also supported.
  • a play area may be completely closed, with the virtual object returning towards the paddle and goal area.
  • single user play may proceed with one or more other players simulated, with the other players having a respective simulated play area and with corresponding haptic feedback when the virtual object enters the simulated play area.

Abstract

A system is configured to provide haptic stimulation to a user. In one embodiment, the haptic stimulation is provided to the user in conjunction with the performance of one or more control gestures through which the user controls, for example, a game, a real world component or piece of equipment, and/or other entity. In one embodiment, the haptic stimulation is provided to the user in conjunction with control of virtual equipment by the user.

Description

RELATED APPLICATIONS
This application is a continuation application of U.S. patent application Ser. No. 12/840,797, filed Jul. 21, 2010, issued as U.S. Pat. No. 8,469,806 on Jun. 25, 2013, and entitled “System And Method For Providing Complex Haptic Stimulation During Input Of Control Gestures, And Relating To Control Of Virtual Equipment,” which claims priority from U.S. Provisional Patent Application No. 61/227,645, filed Jul. 22, 2009, and entitled “Interactive Touch Screen Gaming Metaphors With Haptic Feedback.” Both U.S. patent application Ser. No. 12/840,797 and U.S. Provisional Patent Application No. 61/227,645 are hereby incorporated by reference in their entireties into the present application.
FIELD OF THE INVENTION
The invention relates to a system and method of providing haptic stimulation to a user during performance of a complex control gesture, and/or during the control of virtual equipment.
BACKGROUND OF THE INVENTION
Provision of haptic stimulation to users is known. Haptic stimulation provides a physical sensation to users. Haptic stimulation is used in the context of games, virtual worlds, and real world control systems. Such haptic stimulation may be generated to provide feedback to users that a control input has been received, that another user has input a command, that virtual or real objects have collided, exploded, or imploded, that an ambient force is present (e.g., simulated or real wind, rain, magnetism, and/or other virtual forces), and/or that other phenomena have occurred. In conventional systems, the parameters of such stimulation are typically static and provide a simple mechanism for instructing a user that a corresponding phenomenon has occurred (or will occur).
Conventional game and/or virtual world systems that enable a user to control virtual equipment are known. The control and feedback schemes for interacting with virtual equipment in these conventional systems tend to be limited, and to not correlate strongly with real world control and/or feedback of corresponding real world equipment.
Although basic haptic effects (e.g., vibrate) have been used in mobile and other computing devices, numerous challenges remain for developers to engage users and provide feedback to enhance the user experience.
SUMMARY
One aspect of the invention relates to a system configured to provide haptic stimulation to a user of a game. In one embodiment, the system comprises a user interface, an actuator, and one or more processors configured to execute computer program modules. The user interface is configured to generate output signals related to the gestures of a user. The actuator is configured to generate haptic stimulation to the user. The computer program modules comprise a gesture module, a stimulation module and an actuator control module. The gesture module is configured to monitor performance of a control gesture by the user based on the output signals of the user interface. The control gesture is a gesture associated with a command input to the game, and includes an initial portion, a first intermediate portion, and an ending portion. The stimulation module is configured to receive information related to performance of the control gesture from the gesture module, and to determine haptic stimulation to be generated for the user associated with the control gesture. The haptic stimulation includes a first stimulation determined responsive to performance of the initial portion of the control gesture, and a second stimulation that is different from the first stimulation and is determined responsive to performance of the first intermediate portion of the control gesture. The actuator control module is configured to control the actuator to generate the stimulation determined by the stimulation module.
Another aspect of the invention relates to a method of providing haptic stimulation to a user of a game. In one embodiment, the method comprises monitoring performance of a control gesture by a user, wherein the control gesture is a gesture associated with a command input to the game, and includes an initial portion, a first intermediate portion, and an ending portion; determining haptic stimulation associated with performance of the control gesture to be generated for the user, wherein the haptic stimulation includes a first stimulation determined responsive to performance of the initial portion of the control gesture, and a second stimulation that is different from the first stimulation and is determined responsive to performance of the first intermediate portion of the control gesture; and generating the determined stimulation during performance of the control gesture.
Yet another aspect of the invention relates to a system configured to provide stimulation to a user of a game. In one embodiment, the system comprises a touch sensitive electronic display, an actuator, and one or more processors configured to execute computer program modules. The touch sensitive electronic display has an interface surface that is accessible for engagement by the user, and the touch sensitive user interface is configured to generate output signals related to the position at which the interface surface is engaged, and to present views of the game to the user through the interface surface. The views presented through the interface surface include views of virtual equipment having user selectable sections that are selectable by the user to interact with the virtual equipment by engaging the interface surface at the user selectable sections of the views of the virtual equipment. The actuator is configured to generate haptic stimulation to the user. The computer program modules comprise an equipment module, a stimulation module, and an actuator control module. The equipment module is configured to determine the operating parameters of the virtual equipment in the views, and to simulate operation of the virtual equipment. The equipment module determines the operating parameters of the virtual equipment and/or simulates operation of the virtual equipment based on selections by the user of the user selectable sections of the views of the virtual equipment. The stimulation module is configured to determine haptic stimulation to be generated for the user associated with the operating parameters of the virtual equipment and/or simulated operation of the virtual equipment. The actuator control module is configured to control the actuator to generate the stimulation determined by the stimulation module.
Still another aspect of the invention relates to a method of providing stimulation to a user of a game. In one embodiment, the method comprises presenting views of a game through an interface surface of a touch sensitive electronic display that is accessible for engagement by a user, wherein the views presented through the interface surface include views of virtual equipment having user selectable sections that are selectable by the user to interact with the virtual equipment by engaging the interface surface at the user selectable sections of the views of the virtual equipment; receiving selection of one of the user selectable sections via engagement by the user of the selected user selectable section on the interface surface; determining the operating parameters of the virtual equipment in the views and/or simulating operation of the virtual equipment based on the received selection; determining, responsive to the received selection, haptic stimulation to be generated for the user associated with the operating parameters of the virtual equipment and/or simulated operation of the virtual equipment; and generating the determined haptic stimulation.
A still further aspect of the invention relates to a system and method for providing a game on one or more portable computing devices in which a virtual object (e.g., a ball) travels through views of the game displayed on the interfaces of the one or more portable computing devices. Haptic effects corresponding to the travel of the virtual object (or virtual objects) through the views are provided on the individual portable computing devices. The haptic effects may be determined based on one or more parameters of the travel of the virtual object (e.g., speed, direction, acceleration, etc.), one or more parameters of objects and/or features with which the virtual object interacts (e.g., walls, flippers, blockers, bumpers, etc.), and/or other parameters. The haptic effects may include haptic effects to be provided on portable computing devices that are not currently displaying the virtual object corresponding to the haptic effects. This may enhance the interactivity of the game for a group of users playing the game together on separate portable computing devices.
These and other objects, features, and characteristics of the present invention, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a system configured to provide haptic stimulation to a user, in accordance with one or more embodiments of the invention.
FIG. 2 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
FIG. 3 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
FIG. 4 illustrates a view of a piece of virtual equipment, in accordance with one or more embodiments of the invention.
FIG. 5 illustrates a method of providing haptic feedback to a user, according to one or more embodiments of the invention.
FIG. 6 illustrates a method of providing haptic feedback to a user, according to one or more embodiments of the invention.
FIG. 7 illustrates a portable computing device, in accordance with one or more embodiments of the invention.
FIG. 8A illustrates an example of the use of a game to support multi-user play, according to one or more embodiments of the invention.
FIG. 8B illustrates an example of the use of a game to support multi-user play, according to one or more embodiments of the invention.
FIG. 9 illustrates an example of play areas for two respective users over a plurality of time intervals, in accordance with one or more embodiments of the invention.
FIG. 10 illustrates an example of play and depicts the virtual object ricocheting off the border of a play area, according to one or more embodiments of the invention.
FIG. 11 illustrates a use of haptic effects to simulate a continuous effect, in accordance with one or more embodiments of the invention.
FIG. 12 illustrates a method for providing a game, according to one or more embodiments of the invention.
FIG. 13 illustrates an example of an interface for an instance of a game, in accordance with one or more embodiments of the invention.
DETAILED DESCRIPTION
FIG. 1 illustrates a system 10 configured to provide haptic stimulation to a user 12. In one embodiment, the haptic stimulation is provided to user 12 in conjunction with the performance of one or more control gestures through which user 12 controls, for example, a game, a real world component or piece of equipment, and/or other entity. The haptic stimulation is provided to user 12 such that as performance of a given control gesture continues, the haptic stimulation associated with the control gesture changes in accordance with the progression of the control gesture. In one embodiment, the haptic stimulation is provided to user 12 in conjunction with control of virtual equipment by user 12. The haptic stimulation corresponds to control inputs provided to system 10 in controlling the virtual equipment. In one embodiment, system 10 includes one or more user interfaces 14, one or more actuators 16, electronic storage 18, one or more processors 20, and/or other components.
Although this disclosure primarily describes the provision of haptic stimulation in conjunction with control gestures performed to control a game, this is not limiting. The provision of haptic stimulation in accordance with the principles set forth herein may be extended to other contexts in which a user provides input in the form of a control gesture (e.g., controlling a television, a computer, an automobile, a remote control vehicle or vessel, etc.). In implementations involving the control of real world components or systems, haptic stimulation may be used to enhance the performance of complex control gestures by users, enhance the user experience, teach users to perform control gestures, and/or provide other enhancement over conventional real world control systems. Implementations involving control of real world components or systems may include instances in which a user is present at the component being controlled (e.g., a control panel in a car, on a microwave, etc.), and/or instances in which a user is located remotely from the component being controlled. Control of a remote real world component or system may be accompanied by other sensory stimulation informing the user of the state of the component being controlled. Such other sensory stimulation may include, for example, real-time (or near real-time) video, audio, and/or still images provided to the user.
Several embodiments are described herein as though user 12 were the only user for whom haptic stimulus is being provided. This is not limiting. Expansion of the principles and embodiments described herein is within the ability of one of ordinary skill in the art, and the description of a single user embodiment would enable the person of ordinary skill in the art to make and/or use a multiple user embodiment providing the same features to a plurality of users. A multiple user embodiment may include the provision of haptic stimulation to “passive users” (e.g., users not performing a control gesture) related to the performance of a control gesture by an “active user” (e.g., the user performing the control gesture). The haptic stimulation may further provide feedback to the passive and/or active users of other phenomena present in the multi-user environment.
As used herein, a “control gesture” refers to a gesture made by a user that is a single and discrete control input having separate portions. The separate portions must be performed in specific order and/or with a specific timing to effectively achieve the control input associated with the “control gesture.” Performance of the separate portions, on their own, will not result in the control input associated with the “control gesture” as a whole (e.g., a “control gesture” is not merely a combination of other gestures, each associated with its own control input). In some examples, a “control gesture” is an abstract gesture that does not correlate with exactness to the control input to which it corresponds. Some non-limiting examples of a “control gesture” are described below.
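By way of illustration only, the following Python sketch models a control gesture as an ordered sequence of portions that must be performed in order to constitute the single, discrete control input described above; the names (GesturePortion, ControlGesture, and the example "cast_fireball" gesture) are hypothetical and are chosen only to mirror the terminology used here, not to suggest any particular implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GesturePortion:
    """One portion of a control gesture (initial, intermediate, or ending)."""
    name: str
    max_duration_s: float  # time allowed for this portion before the gesture fails

@dataclass
class ControlGesture:
    """A single, discrete control input made up of ordered portions."""
    name: str
    portions: List[GesturePortion] = field(default_factory=list)
    completed: int = 0  # index of the next portion to be performed

    def advance(self) -> bool:
        """Mark the next portion as performed; True when the whole gesture is complete."""
        if self.completed < len(self.portions):
            self.completed += 1
        return self.completed == len(self.portions)

# Example: a hypothetical three-portion "cast fireball" gesture.
fireball = ControlGesture("cast_fireball", [
    GesturePortion("initial_contact", 1.0),
    GesturePortion("circular_motion", 2.0),
    GesturePortion("release", 0.5),
])
```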
The user interface 14 includes one or more input and/or output devices configured to communicate information to and/or receive information from user 12. The user interface 14 may include, for example, one or more content delivery devices that convey content to user 12. This content may include audio content, video content, still images, and/or other content. The content delivery devices may include, for example, electronic displays (e.g., including touch-sensitive displays), audio speakers, and/or other content delivery devices. The user interface 14 may include one or more control input devices configured to generate an output signal indicating input from the user to system 10. For example, user interface 14 may include a game controller, a remote control, a keypad, a button, a switch, a keyboard, a knob, a lever, a microphone, a position detecting device (e.g., an image sensor, a pressure sensor, an optical position detector, an ultrasonic position detector, a touch sensitive surface, and/or other position detecting devices), an accelerometer, a gyroscope, a digital compass, and/or other control input devices. The user interface 14 may be embodied in, or associated with, a computing device, and/or may be embodied in a control peripheral. A computing device may include one or more of a desktop computer, a laptop computer, a handheld computer, a personal digital assistant, a Smartphone, a personal music player, a portable gaming console, a gaming console, and/or other computing devices.
It will be appreciated that although user interface 14 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In one embodiment, user interface 14 includes a plurality of user interfaces. The plurality of user interfaces 14 may be included in, carried by, and/or in contact with a single object or device. Or, the plurality of user interfaces 14 may include user interfaces included in and/or carried by a plurality of separate objects or devices.
The actuators 16 are configured to generate haptic stimulus for user 12. As such, at least some of actuators 16 are in contact with the users, or in contact with objects that contact the users, during conveyance of the sensory content to the users by user interface 14. By way of non-limiting example, one or more of actuators 16 may be positioned in or on a floor surface supporting the users (e.g., installed in the floor, carried by a mat lying on the floor, etc.), one or more of actuators 16 may be carried by a brace or other wearable item worn by the users, one or more of the actuators 16 may be carried by objects that are carried by the users (e.g., carried by a controller), one or more of actuators 16 may be carried by furniture on which the users are seated or lying, one or more of actuators 16 may be carried by user interface 14, and/or one or more of the actuators 16 may be carried by or disposed in or on other objects that contact the users.
As used herein, the term “haptic stimulus” refers to tactile feedback that is applied to the users. For example, such feedback may include one or more of vibrations, forces, and/or motions that are applied physically to the user by the actuators 16 and/or the objects with which both actuators 16 and the user are in contact. The actuators 16 may include any device configured to generate such feedback for application to the users. For example, actuators 16 may include one or more of a piezoelectric actuator, a pneumatic actuator, a central mass actuator, an electroactive polymer actuator, an electrostatic surface actuator, a macro-fiber composite actuator, and/or other actuators. As another example, a touch sensitive surface (and/or other surfaces) may be actuated to move relative to a user by an electrostatic actuator. The surface actuated by actuators 16 may be rigid, semi-rigid, flexible, and/or deformable.
It will be appreciated that although actuators 16 are shown in FIG. 1 as a single entity, this is for illustrative purposes only. In one embodiment, actuators 16 include a plurality of actuators. The plurality of actuators may be included in, carried by, and/or in contact with a single object or device. Or, the plurality of actuators may include actuators included in, carried by, and/or in contact with a plurality of separate objects or devices.
In one embodiment, electronic storage 18 comprises electronic storage media that electronically stores information. The electronic storage media of electronic storage 18 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 10 and/or removable storage that is removably connectable to system 10 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 18 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 18 may store software algorithms, information determined by processor 20, information received via user interface 14, and/or other information that enables system 10 to function properly. Electronic storage 18 may be a separate component within system 10, or electronic storage 18 may be provided integrally with one or more other components of system 10 (e.g., processor 20).
Processor 20 is configured to provide information processing capabilities in system 10. As such, processor 20 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor 20 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor 20 may include a plurality of processing units. These processing units may be physically located within the same device, or processor 20 may represent processing functionality of a plurality of devices operating in coordination.
As is shown in FIG. 1, processor 20 may be configured to execute one or more computer program modules. The one or more computer program modules may include one or more of a content module 22, a gesture module 24, a stimulation module 26, an equipment module 28, an actuator control module 30, and/or other modules. Processor 20 may be configured to execute modules 22, 24, 26, 28, and/or 30 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 20.
It should be appreciated that although modules 22, 24, 26, 28, and 30 are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor 20 includes multiple processing units, one or more of modules 22, 24, 26, 28, and/or 30 may be located remotely from the other modules. The description of the functionality provided by the different modules 22, 24, 26, 28, and/or 30 described below is for illustrative purposes, and is not intended to be limiting, as any of modules 22, 24, 26, 28, and/or 30 may provide more or less functionality than is described. For example, one or more of modules 22, 24, 26, 28, and/or 30 may be eliminated, and some or all of its functionality may be provided by other ones of modules 22, 24, 26, 28, and/or 30. As another example, processor 20 may be configured to execute one or more additional modules that may perform some or all of the functionality attributed below to one of modules 22, 24, 26, 28, and/or 30.
The content module 22 is configured to control the provision of content to user 12 via user interface 14. If the content includes computer generated images (e.g., in a game, virtual world, simulation, etc.), content module 22 is configured to generate the images and/or views for display to user 12 through user interface 14. If the content includes video and/or still images, content module 22 is configured to access the video and/or still images and to generate views of the video and/or still images for display on user interface 14. If the content includes audio content, content module 22 is configured to generate the electronic signals that will drive user interface 14 to output the appropriate sounds. The content, or information from which the content is derived, may be obtained by content module 22 from electronic storage 18.
In one embodiment, the content provided by content module 22 is content associated with a game. In this embodiment, content module 22 is configured to render views of the game for display to user 12 via user interface 14. The content module 22 further provides audio associated with the views in accordance with the machine-readable program code associated with the game.
The gesture module 24 is configured to receive one or more output signals generated by user interface 14 indicating control inputs received from user 12. Based on the received one or more output signals, gesture module 24 monitors performance of one or more control gestures by user 12. In one embodiment, a control gesture being monitored by gesture module 24 includes an initial portion, one or more intermediate portions, and an ending portion.
In an embodiment in which user interface 14 includes a touch-sensitive surface through which input is received from user 12, an initial portion of a control gesture may include initiating contact with the touch-sensitive surface at one or more locations. The control gesture may dictate a location (or locations for a multi-touch control gesture) at which contact is initiated, a pressure with which contact is initiated, and/or other parameters of the initial contact between user 12 and the touch sensitive surface. If user 12 is already in contact with the touch-sensitive surface, the initial portion of the control gesture may include moving to one or more locations corresponding to the control gesture, maintaining contact (at specific location(s), or generally) with the touch-sensitive surface, motioning in one or more specific directions, making one or more specific shapes, ending contact at one or more locations on the touch-sensitive surface, and/or other actions. The one or more intermediate portions of the control gesture may include one or more of maintaining contact at one or more points without moving, motioning in a specific direction, making a specific shape, halting motion, contacting the touch-sensitive surface at one or more additional locations, ending contact at one or more locations on the touch-sensitive surface, pressing harder or softer on the touch-sensitive surface, and/or other actions. The ending portion may include one or more of ending contact at one or more locations on the touch-sensitive surface, halting motion at one or more locations, motioning in a specific direction, making a specific shape, contacting the touch-sensitive surface at one or more additional locations, pressing harder or softer on the touch-sensitive surface, and/or other actions.
One or more of the actions dictated by the initial portion, the one or more intermediate portions, and/or the ending portion of the control gesture that are location-based may be associated with static locations (e.g., at the same location every time), dynamic locations that do not change during the corresponding portion (e.g., the location may move between performances of the control gesture, but remains fixed while the portion corresponding to the location is performed), dynamic locations that change during the corresponding portion, and/or other types of locations.
In the embodiment in which user interface 14 includes a touch-sensitive surface, one of the simplest examples of a control gesture includes an initial portion in which user 12 contacts the touch-sensitive surface. The intermediate portion may include holding the contact made during the initial portion of the control gesture. The ending portion may include removing the contact made during the initial portion of the control gesture and maintained during the intermediate portion of the control gesture.
In an embodiment in which user interface 14 includes one or more sensors configured to monitor motion of user 12 in space (e.g., an imaging sensor, a pressure sensor, an accelerometer, and/or other sensors), an initial portion of a control gesture may include one or more of facing a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, positioning body parts with respect to each other in a specific manner (e.g., holding hands in a predetermined configuration, and/or other configurations of body parts), motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, and/or other actions. The one or more intermediate portions may include one or more of changing orientation of the head and/or other body parts to a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, motioning with body parts to move in a specific relationship to each other, motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, halting motion or movement by one or more body parts, and/or other actions. The ending portion of the control gesture may include one or more of changing orientation of the head and/or other body parts to a specific direction, motioning or moving in a specific direction, motioning with a specific one or more appendages, motioning with body parts to move in a specific relationship to each other, motioning or moving in a specific shape, motioning for a specific amount of time, motioning or moving at (or above, or below) a specific rate and/or acceleration, changing a direction of motion or movement in a specific manner, halting motion or movement by one or more body parts, and/or other actions.
The gesture module 24 is configured to monitor performance of the control gestures by obtaining the output signal of user interface 14 indicating movement and/or motion of user 12, and comparing the movement and/or motion of user 12 with the control gestures. One or more of the control gestures may be a function of the content being conveyed to user 12 via user interface 14 by content module 22. For example, one or more of the control gestures may be a function of a game being conveyed to user 12 via user interface 14. One or more of the control gestures may be independent from the content being provided to user 12 via user interface 14. For example, one or more of the control gestures may control functions of user interface 14, processor 20, and/or other components.
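By way of non-limiting example, the sketch below shows one way a gesture module might compare samples of the output signal of a user interface against the expected next portion of a control gesture; the GestureMonitor class, the sample dictionary keys, and the matcher predicates are hypothetical and purely illustrative, not a description of gesture module 24 as claimed.

```python
from typing import Callable, Dict

# Hypothetical predicates deciding whether a raw interface sample
# (e.g., touch location, pressure, motion vector) satisfies a given portion.
PortionMatcher = Callable[[dict], bool]

class GestureMonitor:
    """Tracks progress of one control gesture against interface output samples."""

    def __init__(self, portion_matchers: Dict[str, PortionMatcher]):
        self.portion_names = list(portion_matchers)  # ordered portion names
        self.matchers = portion_matchers
        self.index = 0  # next expected portion

    def on_sample(self, sample: dict) -> str:
        """Feed one output-signal sample; returns 'advanced', 'pending', or 'complete'."""
        if self.index >= len(self.portion_names):
            return "complete"
        name = self.portion_names[self.index]
        if self.matchers[name](sample):
            self.index += 1
            return "complete" if self.index == len(self.portion_names) else "advanced"
        return "pending"

# Example matchers for a simple press-hold-release gesture on a touch surface.
monitor = GestureMonitor({
    "press":   lambda s: s.get("touch_down", False),
    "hold":    lambda s: s.get("touching", False) and s.get("held_s", 0) >= 0.5,
    "release": lambda s: s.get("touch_up", False),
})
```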
The stimulation module 26 is configured to receive information related to performance of control gestures from gesture module 24, and to determine haptic stimulation to be generated for user 12 associated with the control gestures. The haptic stimulation determined by stimulation module 26 includes haptic stimulation that is responsive to performance of the control gestures separate from the context in which they are performed. For example, in the context of a game, haptic stimulation responsive to performance of a control gesture includes haptic stimulation that is not dependent on variables within the game other than performance of the control gesture.
The haptic stimulation determined by stimulation module 26 is complex and rich beyond the haptic stimulation typically associated with control gestures. For example, in conventional systems, haptic stimulation associated with control gestures tends to include a single stimulation that is provided during and/or after a control gesture. This haptic stimulation provides confirmation of the control gesture. The haptic stimulation determined by stimulation module 26, by contrast, tracks more closely with performance of the control gesture to increase the immersive experience provided by the haptic stimulation.
In one embodiment, the haptic stimulation determined by stimulation module 26 corresponding to a control gesture includes a first stimulation, a second stimulation, and/or other stimulations. The first stimulation is different from the second stimulation. This means that one or more parameters of the first stimulation are different from the corresponding parameters of the second stimulation. The one or more parameters may include one or more of, for example, periodicity, force, directionality, location, and/or other parameters of haptic stimulation. Variation of the one or more parameters between the first stimulation and the second stimulation (and/or other stimulations) may be smooth, and/or may be discrete so as to create a distinct step in the parameter(s) of haptic stimulation between the first stimulation and the second stimulation.
The stimulation module 26 is configured to correlate provision of the first stimulation and the second stimulation with performance of the control gesture to which the first stimulation and the second stimulation correspond. In one embodiment, the first stimulation is determined by stimulation module 26 responsive to performance of the initial portion of the control gesture and the second stimulation is determined by stimulation module 26 responsive to performance of one or more intermediate portions of the control gesture. Responsive to the ending portion of the control gesture, stimulation module 26 may determine another stimulation (e.g., a third stimulation), or may cease the provision of haptic stimulation associated with the control gesture.
In one embodiment, stimulation module 26 determines haptic stimulation associated with a control gesture such that the stimulation is different between intermediate portions of the control gesture. This may result in smooth changes in the haptic stimulation as performance of the control gesture proceeds, and/or in discrete changes in the haptic stimulation as performance of the control gesture transitions between intermediate portions.
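By way of illustration, the sketch below shows one possible mapping from gesture progress to a first stimulation and a second stimulation, with either smooth interpolation or a discrete step between them; the HapticStimulation structure, the parameter values, and the transition point are hypothetical assumptions made only for illustration.

```python
from dataclasses import dataclass

@dataclass
class HapticStimulation:
    """Illustrative stimulation parameters (force in arbitrary units, period in seconds)."""
    force: float
    period_s: float

# Distinct stimulations for the initial and intermediate portions of a gesture.
FIRST_STIMULATION = HapticStimulation(force=0.3, period_s=0.20)
SECOND_STIMULATION = HapticStimulation(force=0.7, period_s=0.08)

def stimulation_for_progress(progress: float, smooth: bool = True) -> HapticStimulation:
    """Return stimulation for gesture progress in [0, 1].

    When smooth is True the parameters are interpolated; otherwise a discrete
    step occurs at the transition between portions (here assumed at progress 0.5).
    """
    if smooth:
        t = max(0.0, min(1.0, progress))
        return HapticStimulation(
            force=FIRST_STIMULATION.force + t * (SECOND_STIMULATION.force - FIRST_STIMULATION.force),
            period_s=FIRST_STIMULATION.period_s + t * (SECOND_STIMULATION.period_s - FIRST_STIMULATION.period_s),
        )
    return FIRST_STIMULATION if progress < 0.5 else SECOND_STIMULATION
```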
The haptic stimulation determined by stimulation module 26 may be provided to give user 12 some feedback about performance of a control gesture. This may include stimulation that informs user 12 that a portion of the control gesture has begun and/or is currently ongoing, stimulation that prompts user 12 to begin a next portion of the control gesture, and/or stimulation that provides other information about performance of the control gesture to user 12.
In one embodiment, stimulation module 26 provides haptic stimulation to user 12 that indicates to user 12 that performance of a control gesture has failed. In this embodiment, responsive to failure by user 12 to perform a portion of a control gesture, or to properly transition from one portion of the control gesture to another portion of the control gesture, stimulation module 26 determines a stimulation for provision to user 12 that is indicative of the failure. For example, the failure stimulation may include a fizzle, and/or other stimulation.
Some illustrative examples of control gestures and corresponding haptic stimulation are presented hereafter. The examples given correspond to some classes of game characters commonly found in games that include combat between players and/or between players and non-player characters. Specifically, the examples given below correspond to “casters”, “close combat warriors”, and “ranged combat warriors.” It will be appreciated that these classes of characters are not limiting. Other classes of characters, and/or hybrids including characteristics of a plurality of the classes, may be implemented in accordance with the principles set forth herein.
A “caster” is a character that casts spells (or other similar attacks and/or defenses) during combat. Examples of caster characters include wizards, priests, warlocks, engineers, and/or other characters. To control a caster to cast a given spell, user 12 may be required to perform a control gesture that corresponds to the given spell. As user 12 progresses through the portions of the control gesture, the haptic stimulation determined by stimulation module 26 corresponding to the control gesture may inform user 12 as to the progress of the control gesture. The haptic stimulation associated with the control gesture may further provide information about the power of the spell being cast (e.g., greater force and/or more rapid periodicity may indicate a more powerful spell).
In one embodiment, as the control gesture proceeds between the portions of the control gesture, the force and/or rapidity of the haptic stimulation determined by stimulation module 26 may increase. This increase may be gradual as the control gesture proceeds, and/or may occur in discrete steps (e.g., at transitions between the portions of the control gesture). This is not intended to be limiting, as other parameters of the haptic stimulation may be changed to indicate the progress of the control gesture. The haptic stimulation may provide feedback about the (thus far) successful progress of the control gesture, and/or prompt user 12 with respect to future performance of the control gesture (e.g., to proceed to the next portion, to maintain the current portion, etc.). Upon performance of the final portion of the control gesture, stimulation module 26 may determine haptic stimulation that indicates that the spell has been successfully cast. In one embodiment, user 12 may perform portions of the control gesture that effectively “store” the spell for discharge. This may store the spell for an indefinite period, or for some predetermined maximum storage time. During storage, stimulation module 26 may determine haptic stimulation that confirms the ongoing storage of the spell. If the storage period has a maximum storage time, the haptic stimulation may indicate the status of the storage period with respect to the maximum storage time.
A “close combat warrior” is a character configured to fight enemies at close quarters (e.g., hand-to-hand combat). Such a character may be armed with swinging weapons (e.g., clubs, maces, etc.), stabbing and/or slashing weapons (e.g., knife, axe, sword, etc.), and/or other close quarter weapons. As user 12 controls the character to, for example, deliver a blow, stimulation module 26 may determine haptic stimulation that mimics some real-world properties of such attacks. For example, as user 12 strikes into an opponent or object, a first feedback may mimic the striking and/or cutting sensation. As user 12 withdraws a weapon after such a blow, a second feedback may mimic the withdrawal of the weapon from the object or opponent. As another example, as user 12 swings a weapon in a ready position (e.g., swinging a ball on a chain), a first feedback determined by stimulation module 26 may mimic the real-world sensation that would result from such an activity. Releasing the weapon from the ready position (e.g., striking an opponent or object with the swinging ball) may result in the determination by stimulation module 26 of a second stimulation that mimics the feel of such an attack. The intensity of an attack may be determined based on an amount of time a control gesture (or some portion thereof) is performed, the range of motion of a control gesture, a speed of motion during a control gesture, pressure of the contact between the user and a touch screen included in user interface 14, and/or other parameters of a control gesture (or some portion thereof).
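By way of non-limiting example, the following sketch combines the gesture parameters named above into a single attack intensity; the equal weighting, the assumed two-second maximum wind-up, and the pre-normalized inputs are assumptions made only for illustration and would be tuned in any real game.

```python
def attack_intensity(hold_time_s: float, range_of_motion: float,
                     speed: float, pressure: float) -> float:
    """Combine gesture parameters into an attack intensity in [0, 1].

    range_of_motion, speed, and pressure are assumed to be pre-normalized to
    [0, 1]; hold_time_s is clamped against an assumed 2-second maximum wind-up.
    """
    hold = min(hold_time_s / 2.0, 1.0)
    # Equal weighting is purely illustrative.
    raw = 0.25 * hold + 0.25 * range_of_motion + 0.25 * speed + 0.25 * pressure
    return max(0.0, min(1.0, raw))

# Example: a fast, full-range swing with firm pressure held for one second.
print(attack_intensity(hold_time_s=1.0, range_of_motion=0.9, speed=0.8, pressure=0.7))
```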
In some implementations, the user interface 14 includes a touch screen on which displays of the content associated with a game are presented by content module 22. In such implementations, the location at which the control gesture is made impacts the manner in which the game is controlled and/or the haptic stimulation determined by stimulation module 26 (for the user inputting the gesture and/or other users). For example, in controlling a close combat warrior, a user may input an attacking control gesture in a manner that indicates a specific portion of an enemy at which the attack should be directed (e.g., a leg, an arm, the head, the torso, etc.). Similarly, in controlling a close combat warrior, a user may input a defense or blocking control gesture in a manner that indicates a specific portion of a character being controlled that should be defended or blocked (e.g., a leg, an arm, the head, the torso, etc.). This location-based aspect of the control gesture may be taken into account in determining the success and/or impact of the corresponding control, and/or in determining the haptic stimulation that corresponds to the control gesture. Multiple attacks may be triggered by touching multiple locations on the touch screen of user interface 14.
A “ranged combat warrior” is a character that is configured and armed for attacking enemies from a range. This may include characters armed, for example, to release projectiles such as arrows, stones, bullets, rockets, and/or other projectiles. By way of non-limiting example, user 12 may control a ranged combat warrior to fire a bow and arrow. In this example, different haptic stimulation may be determined by stimulation module 26 for notching the arrow, drawing the bow (and varying the haptic stimulation to indicate increasing string tension), releasing the string, the bowstring striking the forearm of the character upon release, and/or other portions of the control gesture associated with firing an arrow. The intensity and/or speed of an attack or projectile may be determined based on an amount of time a control gesture (or some portion thereof) is performed, the range of motion of a control gesture, a speed of motion during a control gesture, pressure of the contact between the user and a touch screen included in user interface 14, and/or other parameters of a control gesture (or some portion thereof).
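By way of illustration, the following sketch returns different stimulation parameters for portions of a bow-firing control gesture, with force rising and period shortening as the string is drawn; the portion names and numeric values are hypothetical and not a description of stimulation module 26 as claimed.

```python
def bow_stimulation(portion: str, draw_fraction: float = 0.0) -> dict:
    """Illustrative stimulation parameters for portions of a bow-firing gesture.

    draw_fraction is the fraction of full draw in [0, 1]; it only affects the
    "draw" portion, where force rises with increasing string tension.
    """
    d = max(0.0, min(1.0, draw_fraction))
    if portion == "notch":
        return {"force": 0.2, "period_s": 0.0}          # single short click
    if portion == "draw":
        return {"force": 0.2 + 0.7 * d, "period_s": 0.15 - 0.10 * d}
    if portion == "release":
        return {"force": 1.0, "period_s": 0.0}          # sharp snap
    if portion == "string_strike":
        return {"force": 0.5, "period_s": 0.0}          # bowstring hitting the forearm
    raise ValueError(f"unknown portion: {portion}")
```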
The target at which a ranged attack is directed and/or the path of a projectile emitted as part of a ranged attack may be determined based on a control gesture. For example, if user interface 14 includes a touch screen, the user performing a control gesture may contact a portion of the touch screen on which the target of the ranged attack is displayed (e.g., as part of the control gesture). Similarly, the user may trace on the touch screen a path of a projectile emitted during a ranged attack. The tracing of the path of the projectile may form at least part of the control gesture initiating the attack. The target and/or the path indicated by the control gesture may impact the corresponding haptic stimulation determined by stimulation module 26.
Game content may include boxes, rooms, gates, vehicles, and/or other objects or items that are “locked”, and must be opened through performance of some control gesture. In some implementations, in response to a user selecting such an object or item, stimulation module 26 may determine haptic stimulation indicating to the user that the selected object or item is locked. Performance by the user of a control gesture to unlock the object or item may result in the determination of haptic stimulation by stimulation module 26 corresponding to the control gesture.
In one embodiment, the haptic stimulation determined by stimulation module 26 for a given control gesture may vary based on characteristics of a character in a game being controlled by user 12. For example, the haptic stimulation may vary as a function of character skill, character fatigue, character injury, weapon type, weapon disrepair, and/or other characteristics.
Changing the haptic stimulation based on character fatigue and/or injury may be a way of handicapping user 12 as a character that was fatigued or injured would be. This may include altering the force, periodicity, and/or other parameters of the haptic stimulation to inhibit precise control by user 12 (e.g., by increasing the force), to inhibit perception of cues and prompts in the haptic stimulation (e.g., by reducing force, increasing the force of portions of the haptic stimulation that do not provide cues and prompts, reducing response time to cues and/or prompts, etc.), and/or otherwise handicapping user 12.
Varying the haptic stimulation based on character skill may provide a gradual and/or graduated “unlocking” of skills and abilities. For a character with more training and/or skill in a given ability, stimulation module 26 may determine haptic stimulation that provides a clearer and/or more easily followed guide to performing a control gesture corresponding to the given ability. For example, as the skill of the character increases, the haptic stimulation determined by stimulation module 26 for the character may provide more definite cues and/or prompts (e.g., see the examples above with respect to character fatigue) that guide user 12 through the control gesture. This enables user 12 to utilize the control gesture without being fully trained in the corresponding ability, but may impact the reproducibility of the control gesture for the user unless further training and/or other skill-building is sought.
Altering the haptic stimulation determined for a control gesture based on the equipment of a character may incentivize maintaining repair of equipment and/or obtaining upgraded equipment without making such activities mandatory. Instead of a digital determination of whether a control gesture is available to user 12, the stimulation module 26 provides haptic stimulation that makes the control gesture easier and/or more enjoyable with “better” virtual equipment.
The equipment module 28 is configured to provide control over virtual equipment for user 12. Within the context of a game being provided to user 12 by system 10, user 12 may control virtual equipment (e.g., directly or through a character that is using the virtual equipment). In such cases, content module 22 provides to user 12, via user interface 14, views of the virtual equipment. The equipment module 28 is configured to determine the operating parameters of the virtual equipment, and to simulate its operation.
In one embodiment in which user interface 14 includes a touch sensitive electronic display, equipment module 28 determines the operating parameters of the virtual equipment and/or simulates operation of the virtual equipment based on selections by user 12 of user selectable sections of the views of the virtual equipment. The user selectable sections of the virtual equipment may be located in the views of the virtual equipment that correspond to sections of the virtual equipment analogous to sections of equipment that would be engaged in real life to configure operating parameters of equipment and/or to operate equipment. The operating parameters of the virtual equipment that are configurable by engaging user selectable sections of the views of the virtual equipment include one or more of an amount of loaded rounds of ammunition, a level of disrepair, a stopping power, a projectile velocity, an ammunition type, a noise level, and/or other operating parameters.
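As a simplified illustration in code, the sketch below represents a piece of virtual equipment as a set of named operating parameters that can be adjusted by engaging selectable sections; the VirtualEquipment class, the parameter names, and the numeric values are hypothetical and purely illustrative.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class VirtualEquipment:
    """Illustrative piece of virtual equipment with adjustable operating parameters."""
    name: str
    operating_params: Dict[str, float] = field(default_factory=dict)

    def configure(self, section: str, value: float) -> None:
        """Set an operating parameter by engaging the corresponding selectable section."""
        self.operating_params[section] = value

rifle = VirtualEquipment("rifle", {
    "loaded_rounds": 0,
    "disrepair": 0.1,
    "stopping_power": 0.6,
    "projectile_velocity": 0.8,
    "noise_level": 0.9,
})
rifle.configure("loaded_rounds", 30)   # e.g., dragging ammunition onto an ammunition section
rifle.configure("noise_level", 0.3)    # e.g., dragging a silencer onto a barrel/muzzle section
```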
By way of illustration, FIG. 2 shows a view of a piece of virtual equipment 32 (e.g., a gun). In the view shown in FIG. 2, piece of virtual equipment 32 includes a plurality of selectable sections 34 (illustrated in FIG. 2 as section 34 a, section 34 b, section 34 c, and section 34 d). By selecting section 34 a, user 12 may configure the handle/stock of piece of virtual equipment 32. By selecting section 34 b, user 12 may configure the ammunition of piece of virtual equipment 32. By selecting section 34 c, user 12 may configure the sights of piece of virtual equipment 32. By selecting section 34 d, user 12 may configure the barrel/muzzle of piece of virtual equipment 32.
Selection of one of selectable sections 34 to configure piece of virtual equipment 32 may include simply selecting the desired selectable section 34, and then selecting a new configuration for the selected section 34. In one embodiment, reconfiguration is made through a touch and drag type of interaction. For example, user 12 may engage an area on user interface 14 that corresponds to ammunition and “drag” the ammunition to section 34 b to reload piece of virtual equipment 32. Similarly, user 12 may select and drag a silencer (and/or other muzzle or barrel feature) to section 34 d of piece of virtual equipment 32 to change the sound characteristics of piece of virtual equipment 32 (and/or other characteristics).
As another illustrative example, FIG. 3 shows a view of a virtual firearm 36 having a plurality of selectable sections 38 (shown in FIG. 3 as section 38 a, section 38 b, section 38 c, and section 38 d). The selectable sections 38 of virtual firearm 36 may be selected in a fashion similar to the one discussed above with respect to FIG. 2.
As yet another illustrative example, FIG. 4 shows a view of a piece of virtual equipment 40 configured to show the proximity of various items to a user (or to a character being controlled by a user) in a game. The piece of virtual equipment 40 includes a plurality of selectable sections 42 (shown in FIG. 4 as section 42 a, section 42 b, section 42 c, section 42 d, and section 42 e). Individual ones of selectable sections 42 correspond to individual modes of piece of virtual equipment 40 such that selection of one of selectable sections 42 results in piece of virtual equipment 40 operating in accordance with the selected mode. For example, in a first mode piece of virtual equipment 40 may provide an indication of the proximity of other characters, in a second mode piece of virtual equipment 40 may provide an indication of the proximity of enemies, and in a third mode piece of virtual equipment 40 may provide an indication of the proximity of one or more resources (e.g., gold, food, ammunition, power-ups, etc.).
Returning now to FIG. 1, in one embodiment, stimulation module 26 is configured to determine haptic stimulation to be generated for the user that is associated with the operating parameters of the virtual equipment and/or the simulated operation of the virtual equipment. This may include, for example, varying the haptic stimulation based on the current operating parameters of a piece of virtual equipment, determining haptic stimulation during the reconfiguration of a piece of virtual equipment to reflect changes being made in the piece of virtual equipment, and/or other aspects of configuration and/or operation of virtual equipment.
By way of non-limiting example, if a virtual firearm is reloaded by user 12, stimulation module 26 determines haptic feedback indicating the reload. If the new ammunition is different from previously used ammunition, stimulation module 26 determines haptic feedback upon firing of the firearm that is different from the haptic stimulation determined during firing of the previously used ammunition.
In one embodiment, actuators 16 are configured to actuate the touch sensitive electronic display that presents the views of the virtual equipment to user 12. In this embodiment, delivery of the haptic stimulation determined by stimulation module 26 to user 12 as user 12 engages the touch sensitive electronic display results in a more immersive virtual equipment control experience. For example, as user 12 engages the touch sensitive display to reload a gun, or to switch out a piece of equipment, haptic stimulation is provided to user 12 through the touch sensitive surface that corresponds to the control being exerted over the gun.
The actuator control module 30 is configured to control actuators 16 to generate the haptic stimulus determined by stimulation module 26. This includes communicating the haptic stimulus to be generated from processor 20 to actuators 16. The haptic stimulus to be generated may be communicated over wired communication links, wireless communication links, and/or other communication links between processor 20 and actuators 16.
FIG. 5 illustrates a method 44 of determining and/or providing haptic stimulation to a user. The haptic stimulation may be determined and/or provided to the user in conjunction with a game, and/or in other contexts. The operations of method 44 presented below are intended to be illustrative. In some embodiments, method 44 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 44 are illustrated in FIG. 5 and described below is not intended to be limiting.
In some embodiments, method 44 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 44 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 44.
At an operation 46, performance of an initial portion of a control gesture by a user is monitored. Performance of the initial portion of the control gesture may be monitored based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). In one embodiment, operation 46 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
At an operation 48, responsive to performance of the initial portion of the control gesture, a first haptic stimulation is determined. The first haptic stimulation corresponds to the initial portion of the control gesture. In one embodiment, operation 48 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
At an operation 50, the first haptic stimulation is generated for the user. In one embodiment, operation 50 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
At an operation 52, a determination is made as to whether a first intermediate portion of the control gesture has been performed by the user. Responsive to the first intermediate portion of the control gesture having been performed, method 44 proceeds to an operation 54. In one embodiment, operation 52 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
At operation 54, a second haptic stimulation is determined. The second haptic stimulation corresponds to the first intermediate portion of the control gesture. In one embodiment, operation 54 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
At an operation 56, the second haptic stimulation is generated for the user. In one embodiment, operation 56 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
Referring back to operation 52, responsive to the first intermediate portion of the control gesture not having been performed, method 44 proceeds to an operation 58 at which a determination is made as to whether the control gesture has failed. This may be determined based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). The determination of whether the control gesture has failed may be made, for example, based on an amount of time since performance of the initial portion of the control gesture, if actions not included in the control gesture have been performed since performance of the initial portion of the control gesture, and/or other factors. In one embodiment, operation 58 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
Responsive to a determination at operation 58 that the control gesture has not failed, method 44 returns to operation 52. Responsive to a determination at operation 58 that the control gesture has failed, method 44 proceeds to an operation 60. At operation 60, haptic stimulation is determined that indicates failure of the control gesture. In one embodiment, operation 60 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
Upon determination of the haptic stimulation at operation 60, the determined haptic stimulation is generated for the user at operation 62. In one embodiment, operation 62 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
Referring back to operation 52, further responsive to determination that the first intermediate portion of the control gesture has been performed, method 44 proceeds to an operation 64. At operation 64, a determination is made as to whether an ending portion of the control gesture has been performed. Responsive to a determination that the control gesture has ended, method 44 ends the provision of haptic stimulation associated with the control gesture. In one embodiment, a final haptic stimulation associated with the completion of the control gesture is further determined and generated. In one embodiment, operation 64 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
Responsive to a determination at operation 64 that the ending portion of the control gesture has not yet been performed, method 44 proceeds to an operation 66. At operation 66, a determination is made as to whether the control gesture has failed. This may be determined based on the output signal of a user interface similar to or the same as user interface 14 (shown in FIG. 1 and described above). The determination of whether the control gesture has failed may be made, for example, based on an amount of time since performance of the first intermediate portion of the control gesture, if actions not included in the control gesture have been performed since performance of the first intermediate portion of the control gesture, and/or other factors. In one embodiment, operation 66 is performed by a gesture module similar to or the same as gesture module 24 (shown in FIG. 1 and described above).
Responsive to a determination at operation 66 that the control gesture has not failed, method 44 returns to operation 64. Responsive to a determination at operation 66 that the control gesture has failed, method 44 proceeds to operations 60 and 62.
In one embodiment, the control gesture includes more than one intermediate portion. In this embodiment, method 44 is expanded between operations 52 and 64 to monitor the additional intermediate portion(s) and generate corresponding haptic stimulation in a manner similar to that shown and described for the first intermediate portion.
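By way of illustration only, the following sketch traces the flow of method 44 for a gesture with one intermediate portion; the monitor and stimulate hooks are hypothetical stand-ins for a gesture module, stimulation module, and actuators, and the two-second timeout is an assumption introduced only to make the failure path concrete.

```python
import time

def run_control_gesture(monitor, stimulate, timeout_s=2.0):
    """Illustrative driver loosely following the flow of method 44.

    monitor.next_status() is assumed to return "performed", "pending", or
    "other_action" for the portion currently being watched; stimulate(kind) is
    assumed to hand a stimulation request to the actuators. Both are hypothetical.
    """
    stimulations = {"initial": "first", "intermediate": "second", "ending": "final"}
    for portion in ("initial", "intermediate", "ending"):
        started = time.monotonic()
        while True:
            status = monitor.next_status()
            if status == "performed":                  # operations 46/52/64
                stimulate(stimulations[portion])       # operations 48-50, 54-56
                break
            # operations 58/66: failure by an out-of-gesture action or timeout
            if status == "other_action" or time.monotonic() - started > timeout_s:
                stimulate("failure")                   # operations 60-62
                return False
    return True
```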
FIG. 6 illustrates a method 68 of determining and/or providing haptic stimulation to a user. The haptic stimulation may be determined and/or provided to the user in conjunction with a game, and/or in other contexts. The operations of method 68 presented below are intended to be illustrative. In some embodiments, method 68 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 68 are illustrated in FIG. 6 and described below is not intended to be limiting.
In some embodiments, method 68 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 68 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 68.
At an operation 70, a view of virtual equipment is presented to the user via a touch sensitive electronic display. The view includes selectable sections that are selectable by the user to interact with the virtual equipment shown in the view. In one embodiment, the view of the virtual equipment is determined by a content module similar to or the same content module 22 (shown in FIG. 1 and described above).
At an operation 72, selection of one of the selectable sections of the view of the virtual equipment is received. The selection may be received through the touch sensitive electronic display through which the view is presented.
At an operation 74, operation of the virtual equipment is simulated and/or one or more parameters of the virtual equipment is adjusted based on the selection received at operation 72. In one embodiment, operation 74 is performed by an equipment module similar to or the same as equipment module 28 (shown in FIG. 1 and described above).
At an operation 76, haptic stimulation associated with the operation of the virtual equipment and/or the adjustment of the operating parameters of the virtual equipment effected at operation 74 is determined. In one embodiment, operation 76 is performed by a stimulation module similar to or the same as stimulation module 26 (shown in FIG. 1 and described above).
At an operation 78, the haptic stimulation determined at operation 76 is generated for the user. In one embodiment, operation 78 is performed by one or more actuators similar to or the same as actuators 16 (shown in FIG. 1 and described above).
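By way of non-limiting example, the following sketch traces operations 72 through 78 of method 68 for a single touch selection; the equipment.apply hook, the stimulation_table mapping, and the generate callback are hypothetical stand-ins for the equipment module, stimulation module, and actuators, respectively.

```python
def handle_equipment_selection(equipment, section, stimulation_table, generate):
    """Illustrative flow of method 68 for one selection (operations 72-78).

    equipment.apply(section) is assumed to simulate operation or adjust an
    operating parameter and return a short label for what changed;
    stimulation_table maps that label to stimulation parameters; generate()
    is assumed to drive the actuators. All hooks are hypothetical.
    """
    change = equipment.apply(section)            # operation 74
    stimulation = stimulation_table.get(change)  # operation 76
    if stimulation is not None:
        generate(stimulation)                    # operation 78
    return change
```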
As was mentioned above, the description of complex haptic stimulation in response to gesture-based control for a single user is not intended to be limiting. Haptic stimulation generated in response to gesture-based control in a multi-user environment is also contemplated. The haptic stimulation provided to a given user may include haptic feedback that is responsive to gesture-based controls of another user and/or other phenomena.
FIG. 7 is a diagram illustrating an example 100 of a portable computing device configured in accordance with one or more aspects of the present invention. Particularly, a tablet computer 102 is shown. Tablet computer 102 includes a screen 104 mounted to a body 106, with the top surface 108 representing a surface with which a user interacts. In this example, top surface 108 is bordered by body 106, but the screen area could extend all the way to the edges of the tablet. Tablet computer 102 can use any number or type of touch-sensing technologies to determine when a user has touched on or near surface 108. For example, surface 108 may include a resistance or capacitance-based touch sensing system and/or an optical touch sensing system.
Although a tablet computer is shown in this example, it can be understood that any touch-enabled computing device can be used. For instance, a smaller device such as a cellular telephone or media player may feature a touch-enabled display and can provide haptic outputs in accordance with the present subject matter. As another example, a haptically-enabled display surface associated with and interfaced to another computing device (e.g., a desktop or server computer) can be used. For example, surface 108 may represent a larger touch-enabled surface such as a table upon which a screen image is projected from above or below. A flat surface is depicted here, although the present subject matter could be applied for use in devices with curved surfaces and/or non-smooth surfaces. Gestures input to the computer (or other computing device) via touch may be control gestures. It will be appreciated that the description of gestures involving touch is not intended to limit the scope of control gestures discussed herein. Control gestures may include (with or without touch gestures) manipulation of a component or body (e.g., “tilt” controls), manipulation of one or more stick controls, manipulation of one or more buttons, manipulation of one or more switches, and/or manipulation of or interaction with other interface features.
FIG. 7 illustrates an example of the architecture of a computing device 102 at 110. Computing device 102 comprises one or more processors 112 configured to execute computer program modules, a memory 114 such as RAM, ROM, or other memory technology, a display interface 116, a haptic interface 118, I/O interface 120, and network interface 122. Any suitable display technology can be used. In some embodiments, an LCD display is used.
Haptic interface 118 can comprise suitable components for driving one or more actuators used to play back haptic effects so as to provide a physical sensation to a user of device 102. For example, some or all of display 104 may include embedded actuators so that targeted physical output can be provided to a portion of the display to provide a physical effect where the user touches surface 108. Additional actuators may be used to provide haptic output via other surfaces of tablet 102, such as its sides and the surface opposite surface 108 (i.e., the back of the device). It will be appreciated that the location of the actuator relative to the desired physical effect can vary. For example, in order to produce an effect at a first part of the screen, an actuator at a second part of the screen (or elsewhere in the device) may be driven so that the properties of intervening components of the screen and/or device influence what is felt at the first part of the screen.
In one embodiment, the tablet comprises an actuator having an eccentric rotating mass motor. The actuator is coupled either directly or indirectly to a surface of the tablet housing. Powering the motor causes vibration on the surface that a user can feel. By varying the magnitude and frequency of the signal sent to the actuator, various effects are possible. As another example, an actuator may be used to raise or lower sections of the screen to create ridges, troughs, or other features. As a further example, an actuator can comprise a piezoelectric element. For example, a piezoelectric actuator can be embedded, at least partially, in an inorganic polymer matrix, such as silicone. As yet a further example, an actuator may comprise a macro-fiber composite actuator or piezocomposite actuator. Such actuators may be formed as a thin layer of piezoelectric fibers suspended in a matrix (e.g., epoxy). The fibers may communicate electrically with polyimide electrodes. Many other types of actuators may be used, and so this exemplary description of actuators is not meant to be limiting.
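By way of illustration, the sketch below computes a drive level whose magnitude and pulse rate can be varied to produce differently textured vibration effects; it is a deliberate simplification that ignores motor spin-up and spin-down time, voltage limits, and other characteristics of a real eccentric rotating mass actuator, and the sampling values are assumptions.

```python
def erm_drive_level(t: float, magnitude: float, pulse_hz: float) -> float:
    """Drive level (0..1) for an eccentric-rotating-mass motor at time t (seconds).

    The motor is driven at `magnitude` during the "on" half of each pulse period
    and released during the "off" half, so varying magnitude and pulse rate
    changes the perceived strength and texture of the vibration.
    """
    period = 1.0 / pulse_hz
    on = (t % period) < (period / 2.0)
    return magnitude if on else 0.0

# Example: a 60%-strength buzz pulsed at 8 Hz, sampled every 10 ms for 0.5 s.
levels = [erm_drive_level(n * 0.01, 0.6, 8.0) for n in range(50)]
```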
I/O interface 120 can be used by processor(s) 112 to receive input and provide output using any suitable components. For example, I/O interface 120 may link to speakers and/or a microphone for receiving voice input and providing audio output. As another example, I/O interface 120 may provide a connection to peripheral devices such as a mouse or stylus used to provide input to the device, or to an imaging sensor used to capture still images and/or video.
Network interface 122 can be used to link device 102 to a network using one or more networking technologies. For instance, interface 122 may provide a connection to suitable components for connecting to an IEEE 802.11 (Wi-Fi) or 802.16 (WiMAX) network, or a connection using Bluetooth technology. As another example, interface 122 may allow communication via a telephone, Ethernet, or other wired connection, or may support other wireless technology such as communication via an IR port.
Computing device 102 can comprise additional components—for example, one or more storage components (e.g., magnetic or solid-state hard disk drives) can be included. If computing device 102 comprises a cellular telephone, appropriate RF components may be included as well.
Memory 114 tangibly embodies one or more program components that configure computing device 102 to operate in an intended manner. For example, memory 114 can include one or more applications, an operating system, and can also include stored data. As illustrated, memory 114 also includes a program component 124 for providing an interactive game in accordance with one or more aspects noted below.
Generally, the game can configure computing device 102 to present a play area 126 via display 104, track the movement of a virtual object 128 (e.g., a ball) in the play area, and respond to user interactions to launch and deflect the virtual object during play using paddle 130. Additionally, the game can be configured to play back haptic effects as the virtual object moves through and encounters features of the play area. The haptic effects can be selected to provide a sensation that differs based on the particular features that are encountered. In this example, the play area includes a top side T, a bottom side B, a left side L, and a right side R. The game can be configured so that the virtual object is deflected by paddle 130 prior to reaching left side L. If multiple players are involved, the virtual object may exit at one or more sides T, B, or R and pass to another user's screen as noted below.
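As a minimal sketch of the state such a game might track for play area 126, paddle 130, and virtual object 128, the following data structures could be used; the class and field names are illustrative assumptions rather than a prescribed data model.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:   # e.g., ball 128
    x: float
    y: float
    vx: float
    vy: float

@dataclass
class Paddle:          # e.g., paddle 130
    side: str          # "L", "R", "T", or "B"
    offset: float      # position along that side

@dataclass
class PlayArea:        # e.g., play area 126
    width: float
    height: float
    paddle: Paddle
    ball: VirtualObject
```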
FIG. 8A illustrates an example 200A of the use of a game to support multi-user play. The game program can support sending and receiving data to facilitate tracking the position of the virtual object in a play area that comprises multiple screens. In this example a first device 202 is interfaced to a second device 204 via a network 206. Network 206 may comprise a local area network, a wide area network, or may represent a direct connection between devices 202 and 204. By interfacing multiple devices running an instance of the game, a user can experience haptic effects when the virtual object encounters features in another user's screen. Such haptic effects may include haptic effects determined in accordance with a control gesture input on another user's screen.
For example, if the play area 126 of FIG. 7 is used, then when virtual object 128 exits at right side R of a first device, it may enter the screen at right side R of the second device. As another example, the devices may have mirror-image layouts; that is, device 202 may feature paddle 130 along left side L while device 204 includes paddle 130 along right side R. In that case, when the virtual object reaches right side R of the play area of the first device, it may enter the play area at left side L of the other device, headed towards right side R and paddle 130 in the play area of the other device.
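For the mirror-image layout just described, the hand-off between devices could be expressed as a simple edge mapping, sketched below under assumed names; it is not the only way such a mapping might be implemented.

```python
# Exit edge on the sending device mapped to the entry edge on the receiving
# device for mirror-image layouts (right maps to left; top and bottom are shared).
MIRROR_EDGE_MAP = {"R": "L", "L": "R", "T": "T", "B": "B"}

def hand_off(exit_edge, exit_offset, velocity):
    """Return the entry edge, the offset along that edge, and the velocity for
    the receiving device; the velocity passes through unchanged because the
    virtual object keeps moving in the same direction across the shared space."""
    return MIRROR_EDGE_MAP[exit_edge], exit_offset, velocity
```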
Server 208 is depicted to illustrate that in some embodiments, multi-user play may be facilitated by a server. However, as noted above, in some embodiments the game program is configured to interface directly with other instances without the need for a server.
FIG. 8B illustrates another example 200B of multi-user play. In this example, three devices 202, 204, and 210 are interfaced via a first network 206. A second network 214 facilitates interaction with a fourth device 212. For example, network 206 may comprise a local area network connection while network 214 may comprise a wide area network connection. Compass 216 is illustrated to show that in some embodiments the relative position of multiple players can be considered. For example, as the “westmost” player, device 202 may include paddle 130 at left side L. Device 204 may have its paddle positioned along top side T, since device 204 is “northmost.” Device 210 may have its paddle positioned along bottom side B, while device 212 may have its paddle positioned along right side R. Alternatively, the paddle may remain on the same side of the screen for each user but with appropriate mapping between edges to maintain the relative position of the users around the shared play area.
It will be understood that in various embodiments the game can be configured to dynamically adjust the behavior and connections between the play areas based on factors such as each device's orientation, the relative position of the players, and other considerations such as the relative size of screen areas. As will be noted below, the virtual object can move from one screen to another in other manners in addition to or instead of encountering the boundaries of the play area.
FIG. 9 illustrates an example of play areas 302A and 302B for two respective users A and B over a plurality of time intervals (I), (II), and (III). Each play area 302 includes a respective paddle 304. At time interval (I), virtual object 306 has been launched or deflected from paddle 304B towards the boundary of play area 302B. As shown at time interval (II), exit point 308 from play area 302B is mapped to an entry point 310 in play area 302A. The mapping may allow the virtual object to pass instantly between the play areas, or there may be a delay based on the distance between the players (e.g., as determined by GPS and/or other triangulation or proximity detection techniques). In any event, time interval (III) depicts virtual object 306 encountering paddle 304A. For example, user A may have moved paddle 304A by sliding his or her fingers along the display surface of his or her device to intercept virtual object 306.
When virtual object 306 encounters paddle 304A, a haptic effect H1 is selected and played back. As illustrated, haptic effect H1 is localized to the point at which user A touches paddle 304A (and/or another part of play area 302). As was noted above, the sensation can be generated by commanding one or more actuators to provide motion or another effect; the actuators may be located at the point at which the effect is intended to be felt and/or elsewhere. FIG. 9 shows effect H1 as “(((H1)))” in play area 302A and as “(H1)” in play area 302B since the effect is also played back for player B and localized to player B's touch point. However, as indicated, the intensity of effect H1 differs between players A and B.
The haptic effect can be selected based on the simulated physics of the game. For example, the paddles 304 may represent a hard surface, and so effect H1 may comprise a strong, sharp effect. Since a deflection is meant to represent a “hit,” effect H1 may be the strongest effect in the game. The “hit” is played back to user B to alert user B that the virtual object will be returned so that user B can prepare to deflect the incoming virtual object. The hit may be played back to player B with a suitable indicator of direction; for instance, the effect may be designed to feel like it originated from the left, rather than the top or bottom. This may be helpful when three or more users are playing together.
By playing back the haptic effect even for a collision (or other event) occurring in a different player's screen, the game can enhance the perception that the players are sharing a space even though the players cannot view one another's play areas. Thus, the players may become more immersed in the game and may have a more compelling game experience.
FIG. 10 illustrates another example of play and depicts the virtual object ricocheting off the border of a play area. Particularly, three time intervals (I), (II), and (III) are again shown. Play areas 402A and 402B correspond to players A and B, while paddles 404 and virtual object 406 are also illustrated. As shown at time interval (I), virtual object 406 is launched with a trajectory towards a point 408 at the top boundary of play area 402B. Interval (II) illustrates when virtual object 406 encounters point 408. A “bounce” haptic effect H2 is played back to players A and B, localized to their touch points at respective paddles 404A and 404B.
Since the “bounce” occurs in play area 402B and is closer to paddle 404B than to paddle 404A, it is depicted as “((H2))” in play area 402B and “(H2)” in play area 402A: the bounce is “louder” for player B. As shown at time interval (III), after the bounce the virtual object passes to play area 402A. Alerted to the bounce, player A may attempt to intercept the virtual object and prevent it from reaching the goal area behind paddle 404A.
FIG. 11 is a diagram illustrating another aspect of the present subject matter. In addition to or instead of haptic effects played in response to events changing the virtual object trajectory or other “discrete” events, haptic effects can be played back to simulate a continuous effect. In this example, play areas 502 for two players A and B are shown in a time interval (I) and a time interval (II). For purposes of illustration, play areas 502 are shown in a “landscape,” rather than “portrait” orientation. Each play area features a respective paddle 504 as well, and virtual object 506 is depicted.
Each play area of this example also includes seams 508 represented by dotted lines. For example, seams 508 may represent boundaries between planks in a wooden surface depicted in the play area. To simulate a wood panel background, a continuous low rumble effect correlated with the virtual object rolling across the surface can be combined with click effects correlated with the virtual object encountering seams 508. This effect is shown as “H3” in FIG. 11. At time interval (I), the effect is shown as “((H3))” for player B and “(H3)” for player A since the virtual object is closer to paddle 504B than paddle 504A. At time interval (II), effect H3 is louder for paddle 504A since virtual object 506 is moving towards player A. Although the background effect is shown in conjunction with seams 508, a background effect could be included to simulate a surface alone (i.e., a continuous surface) or could vary as the simulated background surface changes (e.g., from a wood area to a metal area to a concrete area, etc.).
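One non-limiting way to combine the continuous rumble with seam clicks is sketched below; the rumble scaling and seam tolerance are assumptions chosen for illustration.

```python
def background_effects(ball_x, speed, seam_positions, seam_tolerance=2.0):
    """Return a continuous low rumble scaled by the ball's speed plus a short
    'click' whenever the ball is crossing one of the seams 508."""
    effects = [("rumble", min(1.0, 0.1 + 0.02 * speed))]
    for seam_x in seam_positions:
        if abs(ball_x - seam_x) < seam_tolerance:
            effects.append(("click", 1.0))
            break
    return effects
```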
FIG. 12 is a flowchart illustrating steps in a method 600 for providing a game in accordance with the present subject matter. Block 602 represents setting up one or more play areas. For example, if two users desire to play, respective play areas can be initialized and mappings between the shared boundaries (and/or other entry-exit points) can be determined.
Block 604 occurs while play continues. At least one instance of the game can track the position and motion of the virtual object based on interaction with paddles, obstacles, and characteristics of the play area, using a model that simulates the physics of the game. For example, the model can provide for changes in the virtual object's speed and direction based on simulating momentum, mass, and material characteristics of the virtual object and the other items in the play area.
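A rough sketch of such a model is given below, assuming an object with x, y, vx, and vy fields like the VirtualObject sketched earlier; the friction and restitution constants are illustrative assumptions.

```python
def step(ball, dt, friction=0.999):
    """Advance the virtual object one frame under a simple momentum model."""
    ball.x += ball.vx * dt
    ball.y += ball.vy * dt
    ball.vx *= friction
    ball.vy *= friction

def deflect(ball, nx, ny, restitution=0.9):
    """Reflect the velocity about a unit surface normal (nx, ny), losing a
    little energy to model the material of the paddle or boundary."""
    dot = ball.vx * nx + ball.vy * ny
    ball.vx -= (1.0 + restitution) * dot * nx
    ball.vy -= (1.0 + restitution) * dot * ny
```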
At block 606, based on the position and motion of the virtual object at, before, and/or after an event, one or more haptic effects to play back can be determined. For example, if the virtual object encounters a boundary or other object in the play area, a haptic effect associated with the physical interaction between the virtual object and boundary/object can be selected for playback. Different boundaries/objects may result in different effects. For example, a border or paddle may result in a “hard” effect, while obstacles included in the play area may have “soft” effects. Simulated properties of the virtual object can be taken into account as well—the game may support a mode with a hard (e.g., steel) virtual object or a soft (e.g., rubber) virtual object with appropriate changes in the haptic output scheme.
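The selection step could amount to a lookup keyed on the feature encountered and the simulated material of the virtual object, as in the sketch below; the table contents and effect names are illustrative only.

```python
# (feature encountered, virtual object material) -> named haptic effect.
EFFECT_TABLE = {
    ("paddle",   "steel"):  "sharp_strong",
    ("paddle",   "rubber"): "sharp_soft",
    ("border",   "steel"):  "hard_ricochet",
    ("border",   "rubber"): "muted_thud",
    ("obstacle", "steel"):  "soft_bump",
    ("obstacle", "rubber"): "soft_bump_quiet",
}

def select_effect(feature, material):
    return EFFECT_TABLE.get((feature, material), "default_tick")
```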
Additionally or alternatively, the haptic effect can relate to a background effect. For instance, as was noted above, a continuous haptic effect simulating passage of the virtual object over a simulated surface can be provided based on characteristics of that surface. As another example, the surface may include material or an obstacle for the virtual object to pass through, and a suitable haptic effect can be provided to simulate passage through the material/obstacle.
At block 608, the game determines the position of the virtual object relative to the haptic delivery point(s) to adjust how the haptic effect is to be output. For instance, a haptic delivery point can include the point at which a user touches the screen of a device. The “loudness” (i.e., intensity) of the haptic effect can be inversely proportional to the distance between the delivery point and the virtual object. Directionality may also be applied. For example, if a ricochet occurs on another screen, the haptic effect that is presented may include a directional component or may otherwise be presented to give an indication of where the ricochet occurred.
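A sketch of the intensity and direction computation follows; the falloff constant and the coarse four-way direction cue are assumptions made for illustration.

```python
import math

def effect_intensity(event_pos, touch_pos, max_intensity=1.0, falloff=0.01):
    """Intensity falls off with the distance between the event and the haptic
    delivery point; the '1.0 +' term avoids division by zero at the event itself."""
    d = math.hypot(event_pos[0] - touch_pos[0], event_pos[1] - touch_pos[1])
    return max_intensity / (1.0 + falloff * d)

def effect_direction(event_pos, touch_pos):
    """Coarse cue for where the event occurred relative to the touch point."""
    dx = event_pos[0] - touch_pos[0]
    dy = event_pos[1] - touch_pos[1]
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"
```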
At block 610, suitable signals are sent to the actuators to generate the haptic effect(s) having the desired volume. For example, the game can consult a library of signal patterns for use in generating different haptic effects and use the signal patterns to command one or more actuators embedded in the screen and/or other portions of the device. The haptic effects may include sounds and/or visual elements as well.
For multi-user play, each respective instance of the game can determine the virtual object's position and motion while in that instance's play area and pass that information to the other instances. When the virtual object exits the play area, information regarding the virtual object's motion (e.g., a vector with direction and speed) can be used to continue tracking by the instance of the game whose play area is to receive the virtual object.
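The information passed between instances might be serialized along the following lines; the message fields are illustrative assumptions rather than a required format.

```python
import json

def serialize_handoff(exit_edge, position, velocity):
    """Package the exit point and motion vector (direction and speed) so the
    receiving instance can continue tracking the virtual object."""
    return json.dumps({"exit_edge": exit_edge,
                       "position": list(position),
                       "velocity": list(velocity)})

def deserialize_handoff(message):
    data = json.loads(message)
    return data["exit_edge"], data["position"], data["velocity"]
```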
In some embodiments, when an event occurs and/or when background effects are provided, the haptic effect is selected by the instance of the game whose play area contains the virtual object when the haptic effect is to be triggered and that information is provided to the other instances of the game. For example, in a game involving player A and player B, if the virtual object collides with an obstacle, border, or paddle in player A's play area, the instance of the game on player A's device can provide player B's device the desired haptic effect along with information about the collision and position thereof for use by the instance of the game on player B's device in determining a volume or directionality for the effect.
FIG. 13 is a diagram illustrating an example of an interface 700 for an instance of a game configured in accordance with aspects of the present subject matter. In this example, a pinball-like game is presented in which the play area 702 includes a border 704, 706 that extends inward from the screen boundaries. The objective of the game is to prevent a virtual object (not shown), such as a ball or other object, from reaching goal area 708 by deflecting the virtual object using paddle 710. To begin play, the virtual object may be launched from paddle 710 or may appear elsewhere in play area 702.
Interface 700 includes a plurality of control buttons 712, 714, 716, and 718 which may be used to provide inputs and access menus. For example, buttons 712 and 714 may comprise play and pause buttons, while button 716 provides a “launch” command and button 718 exits the game or launches a menu for configuring, saving, or exiting the game. In some embodiments, control buttons 712-718 may be provided with suitable haptic effects to simulate pushing a physical button.
Turning to play area 702, a plurality of simulated lights 720 can be provided to enhance the visual experience; the lights may or may not act as obstacles. A bumper 722 may cause the virtual object to “bounce” in response to a collision in a manner different from a ricochet from border 704, 706. For example, border 704, 706 may be presented as a simulated metal border which causes a sharp ricochet effect. Bumper 722 may feature an initial amount of “give” before imparting force on the virtual object in a manner similar to that of a pinball machine bumper. Accordingly, when the virtual object encounters border 704, 706 or bumper 722, different respective haptic effects may be played back at the point(s) at which the user contacts the screen to provide different sensations in accordance with the simulated physical response of the borders/bumper. Additionally, as was noted above, the intensity of the effect can depend on distance from the point(s) at which the user contacts the screen.
This example features metal bars 724, 726, and 728, which may provide a still further response to collisions with a virtual object and may be assigned their own respective haptic effect. Arrows 730 may comprise a visual effect and/or may result in an acceleration of the virtual object into xylophone structure 732. In this example, xylophone structure 732 comprises a plurality of keys (732A, 732B, 732C identified), with each ascending key having its own associated haptic effect. For example, as the virtual object moves from key 732A to 732B to 732C, the haptic effect may increase in pitch along with corresponding xylophone sound effects. At the same time, the haptic effect may decrease in intensity as the virtual object moves away.
In some embodiments, point 734 represents an exit from the play area and into a second user's play area, which is identical to play area 702. When the virtual object enters play area 702, it may be returned via chute 736 along with an accompanying “rattle” representing passage through the chute. As was noted above, each haptic effect in one user's play area can also be played back in the other user's (or users') play area(s) but with a corresponding decrease in intensity based on the separation from the site of the event causing the haptic effect and the user's point of contact with the screen.
Some embodiments feature one or more instances of a vortex as shown at 738. Vortex 738 can comprise a portion of play area 702 that attracts the virtual object toward opening 740. If the virtual object reaches opening 740, the virtual object may pass into another play area. When the virtual object initially contacts vortex 738, a first haptic effect representing the “pull” of the vortex may be played back, with the effect becoming stronger until (and if) the virtual object reaches opening 740. At that point, an “exit” effect may be played back to represent the virtual object's exit from the vortex in another play area. This can, for instance, alert the user of the play area receiving the virtual object to move his or her paddle 710 into position.
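How the “pull” could strengthen toward the opening is sketched below; the radii and the linear ramp are assumptions for illustration only.

```python
import math

def vortex_effect(ball_pos, center, vortex_radius, opening_radius=5.0):
    """Return no effect outside vortex 738, a 'pull' that ramps up from the rim
    toward opening 740, and an 'exit' effect once the opening is reached."""
    d = math.hypot(ball_pos[0] - center[0], ball_pos[1] - center[1])
    if d > vortex_radius:
        return None
    if d <= opening_radius:
        return ("exit", 1.0)
    strength = 1.0 - (d - opening_radius) / (vortex_radius - opening_radius)
    return ("pull", strength)
```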
In some embodiments, if the virtual object is not deflected away from goal area 708, a haptic effect representing entry into area 708 is presented, such as absorption of the virtual object or an explosion. At that point, depending upon the rules of the game, the virtual object may be provided for launch again via paddle 710 for another round. In some embodiments, the game proceeds until one player reaches a predetermined score level (e.g., 7 goals) and/or score differential (e.g., ahead by 3 goals). As another example, point values may be associated with hitting certain obstacles (e.g., bumper 722, bars 724, 726, 728, lights 720) or passing through all keys of xylophone structure 732. Embodiments may support moving or destroying obstacles (e.g., breaking through bricks) during the course of play, with suitable haptic effects provided based on motion or destruction of the obstacles.
In some embodiments, a computing device may be sensitive to distance of a user's fingers or stylus from the touch screen and/or touch pressure. These features may be used during play of the game and/or in configuring the game application. For instance, if the device indicates that the user is not touching the screen or another haptically-enabled area, haptic effects can be turned off to reduce power consumption. As another example, the user may be able to hover to provide input. For instance, the area at which buttons 712-716 are presented may ordinarily appear as brushed metal, but the buttons may appear in response to a user hovering or touching the area.
In several examples above, game play proceeded based on movement of a paddle using touch. Additionally or alternatively, gameplay may depend on tilt sensors and/or accelerometers. For example, users may be able to tilt or swing their devices to affect the movement of the virtual object and/or paddle position. Haptic effects can be delivered at the point(s) at which the users grip their respective devices. In some embodiments, the game may use multiple paddles per user or may use no paddles at all, with all input based on tilt/acceleration.
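As a non-limiting sketch, a roll angle reported by a tilt sensor or accelerometer might be mapped to a paddle position as follows; the sensitivity constant is an assumption.

```python
def paddle_offset_from_tilt(roll_degrees, side_length, sensitivity=4.0):
    """Map a device roll angle to a paddle position along one side of the
    play area, clamped so the paddle stays within the play area."""
    center = side_length / 2.0
    return max(0.0, min(side_length, center + roll_degrees * sensitivity))
```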
Several examples of multi-user play were provided. In some embodiments, single-user play is also supported. For example, a play area may be completely closed, with the virtual object returning towards the paddle and goal area. As another example, single user play may proceed with one or more other players simulated, with the other players having a respective simulated play area and with corresponding haptic feedback when the virtual object enters the simulated play area.
Although the invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.

Claims (19)

What is claimed is:
1. A method of producing a haptic effect, comprising:
monitoring a motion of a body part of a user received from a user interface device with a sensor;
comparing the motion of the body part to a control gesture with a gesture module executable by a processor;
generating a first haptic effect with a haptic output device if the motion of the body part corresponds to the control gesture; and
generating a second haptic effect with the haptic output device if the motion of the body part does not correspond to the control gesture, the second haptic effect being different from the first haptic effect.
2. The method of claim 1, wherein the body part is an appendage of the user.
3. The method of claim 1, wherein the body part is a head of the user.
4. The method of claim 1, wherein the motion of the body part comprises changing a direction of motion of the body part.
5. The method of claim 1, wherein the motion of the body part comprises changing an orientation of the body part.
6. The method of claim 1, wherein monitoring the motion of the body part comprises monitoring an acceleration of the body part.
7. The method of claim 1, further comprising determining the first haptic effect to be generated based on content being provided to the user via the user interface device with a content module executable by the processor.
8. A haptic effect enabled system, comprising:
a user interface device;
a sensor configured to monitor a motion of a body part of a user received from the user interface device;
a haptic actuator configured to generate a haptic effect; and
a processor in operative communication with the user interface, the sensor and the haptic actuator, the processor comprising
a gesture module electronically coupled to the sensor, the gesture module configured to compare the motion of the body part with a control gesture, and
a stimulation module electronically coupled to the gesture module and the haptic device, the stimulation module configured to determine the haptic effect to be generated by the haptic actuator if the motion of the body part corresponds to the control gesture.
9. The system of claim 8, wherein the stimulation module is further configured to determine a second haptic effect to be generated by the haptic actuator that is different from the haptic effect if the motion of the body part does not correspond to the control gesture.
10. The system of claim 8, wherein the body part is an appendage of the user and the user interface device is configured to be carried by the appendage of the user.
11. The system of claim 8, wherein the body part is a head of the user and the user interface device is configured to be carried by the head of the user.
12. The system of claim 8, wherein the motion of the body part comprises changing a direction of motion of the body part.
13. The system of claim 8, wherein the motion of the body part comprises changing an orientation of the body part.
14. The system of claim 8, wherein the sensor is configured to monitor an acceleration of the body part.
15. The system of claim 8, wherein the motion of the body part comprises an initial portion, an intermediate portion and an ending portion, wherein the haptic effect is generated if the initial portion of the motion corresponds to an initial portion of the control gesture, and wherein a second haptic effect that is different from the haptic effect is generated if the intermediate portion of the motion corresponds to an intermediate portion of the control gesture.
16. The system of claim 15, wherein a third haptic effect that is different from the haptic effect and the second haptic effect is generated if the ending portion of the motion corresponds to an ending portion of the control gesture.
17. The system of claim 8, wherein the user interface device is configured to provide content to the user, and wherein the stimulation module is further configured to determine the haptic effect to be generated based on the content provided to the user.
18. A method of producing a haptic effect, comprising:
monitoring a motion of a body part of a user received from a user interface device with a sensor, wherein the motion of the body part comprises an initial portion, an intermediate portion and an ending portion;
comparing the motion of the body part to a control gesture with a gesture module executable by a processor;
generating a first haptic effect with a haptic output device if the initial portion of the motion corresponds to an initial portion of the control gesture; and
generating a second haptic effect with the haptic output device if the intermediate portion of the motion corresponds to an intermediate portion of the control gesture, wherein the second haptic effect is different from the first haptic effect.
19. The method of claim 18, further comprising:
generating a third haptic effect with the haptic output device if the ending portion of the motion corresponds to an ending portion of the control gesture, wherein the third haptic effect is different from the first haptic effect and the second haptic effect.
US13/924,084 2009-07-22 2013-06-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment Active 2031-02-17 US9235969B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US13/924,084 US9235969B2 (en) 2009-07-22 2013-06-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US14/955,694 US9671866B2 (en) 2009-07-22 2015-12-01 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US15/600,522 US10139911B2 (en) 2009-07-22 2017-05-19 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US16/178,711 US20190073037A1 (en) 2009-07-22 2018-11-02 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US22764509P 2009-07-22 2009-07-22
US12/840,797 US8469806B2 (en) 2009-07-22 2010-07-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US13/924,084 US9235969B2 (en) 2009-07-22 2013-06-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/840,797 Continuation US8469806B2 (en) 2009-07-22 2010-07-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/955,694 Continuation US9671866B2 (en) 2009-07-22 2015-12-01 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Publications (2)

Publication Number Publication Date
US20140009273A1 US20140009273A1 (en) 2014-01-09
US9235969B2 true US9235969B2 (en) 2016-01-12

Family

ID=42931900

Family Applications (8)

Application Number Title Priority Date Filing Date
US12/840,797 Active 2031-04-18 US8469806B2 (en) 2009-07-22 2010-07-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US12/841,027 Active 2031-07-04 US8502651B2 (en) 2009-07-22 2010-07-21 Interactive touch screen gaming metaphors with haptic feedback
US13/924,084 Active 2031-02-17 US9235969B2 (en) 2009-07-22 2013-06-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US13/954,398 Active 2030-12-30 US9373233B2 (en) 2009-07-22 2013-07-30 Interactive touch screen metaphors with haptic feedback
US14/955,694 Active US9671866B2 (en) 2009-07-22 2015-12-01 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US15/186,688 Active US9921655B2 (en) 2009-07-22 2016-06-20 Interactive application with haptic feedback
US15/600,522 Active US10139911B2 (en) 2009-07-22 2017-05-19 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US16/178,711 Abandoned US20190073037A1 (en) 2009-07-22 2018-11-02 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US12/840,797 Active 2031-04-18 US8469806B2 (en) 2009-07-22 2010-07-21 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US12/841,027 Active 2031-07-04 US8502651B2 (en) 2009-07-22 2010-07-21 Interactive touch screen gaming metaphors with haptic feedback

Family Applications After (5)

Application Number Title Priority Date Filing Date
US13/954,398 Active 2030-12-30 US9373233B2 (en) 2009-07-22 2013-07-30 Interactive touch screen metaphors with haptic feedback
US14/955,694 Active US9671866B2 (en) 2009-07-22 2015-12-01 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US15/186,688 Active US9921655B2 (en) 2009-07-22 2016-06-20 Interactive application with haptic feedback
US15/600,522 Active US10139911B2 (en) 2009-07-22 2017-05-19 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US16/178,711 Abandoned US20190073037A1 (en) 2009-07-22 2018-11-02 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Country Status (6)

Country Link
US (8) US8469806B2 (en)
EP (2) EP2457141B1 (en)
JP (6) JP5613236B2 (en)
KR (9) KR20170026642A (en)
CN (4) CN102473034B (en)
WO (2) WO2011011552A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11204645B2 (en) 2017-05-11 2021-12-21 Samsung Electronics Co., Ltd. Method for providing haptic feedback, and electronic device for performing same

Families Citing this family (164)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7765333B2 (en) 2004-07-15 2010-07-27 Immersion Corporation System and method for ordering haptic effects
BRPI0804355A2 (en) * 2008-03-10 2009-11-03 Lg Electronics Inc terminal and control method
JP4498448B2 (en) * 2008-08-06 2010-07-07 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP4600548B2 (en) * 2008-08-27 2010-12-15 ソニー株式会社 REPRODUCTION DEVICE, REPRODUCTION METHOD, AND PROGRAM
US9737796B2 (en) 2009-07-08 2017-08-22 Steelseries Aps Apparatus and method for managing operations of accessories in multi-dimensions
US8719714B2 (en) 2009-07-08 2014-05-06 Steelseries Aps Apparatus and method for managing operations of accessories
CN102473034B (en) 2009-07-22 2015-04-01 意美森公司 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
KR101657963B1 (en) * 2009-12-08 2016-10-04 삼성전자 주식회사 Operation Method of Device based on a alteration ratio of touch area And Apparatus using the same
US20110149042A1 (en) * 2009-12-18 2011-06-23 Electronics And Telecommunications Research Institute Method and apparatus for generating a stereoscopic image
CA2719659C (en) * 2010-11-05 2012-02-07 Ibm Canada Limited - Ibm Canada Limitee Haptic device with multitouch display
CN106943742A (en) * 2011-02-11 2017-07-14 漳州市爵晟电子科技有限公司 One kind action amplification system
WO2012121961A1 (en) 2011-03-04 2012-09-13 Apple Inc. Linear vibrator providing localized and generalized haptic feedback
US9218727B2 (en) 2011-05-12 2015-12-22 Apple Inc. Vibration in portable devices
US9710061B2 (en) * 2011-06-17 2017-07-18 Apple Inc. Haptic feedback device
US8830188B2 (en) 2011-06-21 2014-09-09 Microsoft Corporation Infrastructural haptics on wall scale interactive displays
US9220977B1 (en) * 2011-06-30 2015-12-29 Zynga Inc. Friend recommendation system
DE102011107279B4 (en) * 2011-07-15 2013-10-31 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Hallstadt Error prevention in the gesture-controlled opening of a motor vehicle control element
US9778737B1 (en) * 2011-08-31 2017-10-03 Amazon Technologies, Inc. Game recommendations based on gesture type
US10522452B2 (en) * 2011-10-18 2019-12-31 Taiwan Semiconductor Manufacturing Company, Ltd. Packaging methods for semiconductor devices including forming trenches in workpiece to separate adjacent packaging substrates
US9582178B2 (en) * 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US20130124327A1 (en) * 2011-11-11 2013-05-16 Jumptap, Inc. Identifying a same user of multiple communication devices based on web page visits
KR102024006B1 (en) 2012-02-10 2019-09-24 삼성전자주식회사 Apparatus and method for controlling vibration flow between vibration devices
US8493354B1 (en) 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US8711118B2 (en) 2012-02-15 2014-04-29 Immersion Corporation Interactivity model for shared feedback on mobile devices
TW201334843A (en) * 2012-02-20 2013-09-01 Fu Li Ye Internat Corp Game control method with touch panel media and game play media
US20140015773A1 (en) * 2012-02-24 2014-01-16 Thomson Licensing Haptic sensation for touch-screen interfaces
US8570296B2 (en) 2012-05-16 2013-10-29 Immersion Corporation System and method for display of multiple data channels on a single haptic display
US9703378B2 (en) * 2012-06-13 2017-07-11 Immersion Corporation Method and apparatus for representing user interface metaphors as physical changes on a shape-changing device
US9245428B2 (en) * 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US9280206B2 (en) * 2012-08-20 2016-03-08 Samsung Electronics Co., Ltd. System and method for perceiving images with multimodal feedback
US9292136B2 (en) * 2012-10-02 2016-03-22 At&T Intellectual Property I, L.P. Notification system for providing awareness of an interactive surface
US9330544B2 (en) * 2012-11-20 2016-05-03 Immersion Corporation System and method for simulated physical interactions with haptic effects
US9836150B2 (en) 2012-11-20 2017-12-05 Immersion Corporation System and method for feedforward and feedback with haptic effects
US9202350B2 (en) * 2012-12-19 2015-12-01 Nokia Technologies Oy User interfaces and associated methods
US9195382B2 (en) 2013-01-29 2015-11-24 Google Inc. Intelligent window sizing and control
US9884257B2 (en) * 2013-03-06 2018-02-06 Tencent Technology (Shenzhen) Company Limited Method for preventing misoperations of intelligent terminal, and intelligent terminal
US9866924B2 (en) * 2013-03-14 2018-01-09 Immersion Corporation Systems and methods for enhanced television interaction
US9604147B2 (en) 2013-03-15 2017-03-28 Steelseries Aps Method and apparatus for managing use of an accessory
JP2014180572A (en) 2013-03-15 2014-09-29 Immersion Corp Programmable haptic peripheral
US9423874B2 (en) 2013-03-15 2016-08-23 Steelseries Aps Gaming accessory with sensory feedback device
US9687730B2 (en) 2013-03-15 2017-06-27 Steelseries Aps Gaming device with independent gesture-sensitive areas
US9415299B2 (en) 2013-03-15 2016-08-16 Steelseries Aps Gaming device
US9041647B2 (en) * 2013-03-15 2015-05-26 Immersion Corporation User interface device provided with surface haptic sensations
US9409087B2 (en) 2013-03-15 2016-08-09 Steelseries Aps Method and apparatus for processing gestures
US9939900B2 (en) * 2013-04-26 2018-04-10 Immersion Corporation System and method for a haptically-enabled deformable surface
US10295823B2 (en) 2013-07-02 2019-05-21 Pine Development Corporation Systems and methods for eliciting cutaneous sensations using electromagnetic radiation
USD723625S1 (en) 2013-08-27 2015-03-03 Steelseries Aps Gaming device
US9576445B2 (en) 2013-09-06 2017-02-21 Immersion Corp. Systems and methods for generating haptic effects associated with an envelope in audio signals
US9619980B2 (en) 2013-09-06 2017-04-11 Immersion Corporation Systems and methods for generating haptic effects associated with audio signals
US9652945B2 (en) 2013-09-06 2017-05-16 Immersion Corporation Method and system for providing haptic effects based on information complementary to multimedia content
US9164587B2 (en) 2013-11-14 2015-10-20 Immersion Corporation Haptic spatialization system
US9619029B2 (en) * 2013-11-14 2017-04-11 Immersion Corporation Haptic trigger control system
US9639158B2 (en) * 2013-11-26 2017-05-02 Immersion Corporation Systems and methods for generating friction and vibrotactile effects
FR3015383B1 (en) * 2013-12-19 2017-01-13 Dav CONTROL DEVICE FOR MOTOR VEHICLE AND CONTROL METHOD
US9248840B2 (en) 2013-12-20 2016-02-02 Immersion Corporation Gesture based input system in a vehicle with haptic feedback
US9875019B2 (en) 2013-12-26 2018-01-23 Visteon Global Technologies, Inc. Indicating a transition from gesture based inputs to touch surfaces
US9817489B2 (en) 2014-01-27 2017-11-14 Apple Inc. Texture capture stylus and method
JP6264542B2 (en) * 2014-01-30 2018-01-24 任天堂株式会社 Information processing apparatus, information processing program, information processing system, and information processing method
JP6319328B2 (en) * 2014-02-14 2018-05-09 富士通株式会社 Educational tactile sensation providing apparatus and system
US9396629B1 (en) 2014-02-21 2016-07-19 Apple Inc. Haptic modules with independently controllable vertical and horizontal mass movements
CN103885708A (en) * 2014-02-28 2014-06-25 宇龙计算机通信科技(深圳)有限公司 Touch interaction method and touch interaction system
FR3018122A1 (en) * 2014-02-28 2015-09-04 Orange METHOD FOR CONTROLLING ACCESS BY HAPTIC RETURN
US10067566B2 (en) * 2014-03-19 2018-09-04 Immersion Corporation Systems and methods for a shared haptic experience
US9594429B2 (en) 2014-03-27 2017-03-14 Apple Inc. Adjusting the level of acoustic and haptic output in haptic devices
US9449477B2 (en) 2014-04-02 2016-09-20 Pine Development Corporation Applications of systems and methods for eliciting cutaneous sensations by electromagnetic radiation
US20170098350A1 (en) * 2015-05-15 2017-04-06 Mick Ebeling Vibrotactile control software systems and methods
US20230351868A1 (en) * 2014-05-16 2023-11-02 Not Impossible, Llc Vibrotactile control systems and methods
US10379614B2 (en) 2014-05-19 2019-08-13 Immersion Corporation Non-collocated haptic cues in immersive environments
US10133351B2 (en) * 2014-05-21 2018-11-20 Apple Inc. Providing haptic output based on a determined orientation of an electronic device
US20160202760A1 (en) * 2014-06-06 2016-07-14 Microsoft Technology Licensing Llc Systems and methods for controlling feedback for multiple haptic zones
US9588586B2 (en) * 2014-06-09 2017-03-07 Immersion Corporation Programmable haptic devices and methods for modifying haptic strength based on perspective and/or proximity
US9715279B2 (en) 2014-06-09 2017-07-25 Immersion Corporation Haptic devices and methods for providing haptic effects via audio tracks
JP6341417B2 (en) * 2014-06-10 2018-06-13 任天堂株式会社 Vibration generation system, vibration generation program, and vibration generation method
EP3105669B1 (en) 2014-06-24 2021-05-26 Apple Inc. Application menu for video system
EP2963387B1 (en) * 2014-06-30 2019-07-31 STMicroelectronics Srl Micro-electro-mechanical device with compensation of errors due to disturbance forces, such as quadrature components
US9886090B2 (en) 2014-07-08 2018-02-06 Apple Inc. Haptic notifications utilizing haptic input devices
SG11201701255VA (en) * 2014-08-20 2017-03-30 Touchgram Pty Ltd A system and a method for sending a touch message
US9690381B2 (en) 2014-08-21 2017-06-27 Immersion Corporation Systems and methods for shape input and output for a haptically-enabled deformable surface
JP6390277B2 (en) 2014-09-02 2018-09-19 ソニー株式会社 Information processing apparatus, control method, and program
FR3025902B1 (en) * 2014-09-16 2017-12-08 E-Concept MULTI-DIMENSIONAL MOUSE
US10185396B2 (en) 2014-11-12 2019-01-22 Immersion Corporation Haptic trigger modification system
US9400570B2 (en) 2014-11-14 2016-07-26 Apple Inc. Stylus with inertial sensor
US9535550B2 (en) 2014-11-25 2017-01-03 Immersion Corporation Systems and methods for deformation-based haptic effects
KR101666532B1 (en) * 2014-12-01 2016-10-14 김정훈 Vibration mouse that provide real time vibration feedback
US9919208B2 (en) 2014-12-11 2018-03-20 Immersion Corporation Video gameplay haptics
US9575573B2 (en) 2014-12-18 2017-02-21 Apple Inc. Stylus with touch sensor
US9600076B2 (en) 2014-12-19 2017-03-21 Immersion Corporation Systems and methods for object manipulation with haptic feedback
US9658693B2 (en) 2014-12-19 2017-05-23 Immersion Corporation Systems and methods for haptically-enabled interactions with objects
US20160189427A1 (en) * 2014-12-31 2016-06-30 Immersion Corporation Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications
WO2016148182A1 (en) * 2015-03-18 2016-09-22 株式会社ニコン Electronic device and program
US10613629B2 (en) 2015-03-27 2020-04-07 Chad Laurendeau System and method for force feedback interface devices
WO2016172209A1 (en) * 2015-04-21 2016-10-27 Immersion Corporation Dynamic rendering of etching input
US20160317909A1 (en) * 2015-04-30 2016-11-03 Barry Berman Gesture and audio control of a pinball machine
CN104932691A (en) * 2015-06-19 2015-09-23 中国航天员科研训练中心 Real-time gesture interaction system with tactile perception feedback
JP2017010387A (en) * 2015-06-24 2017-01-12 キヤノン株式会社 System, mixed-reality display device, information processing method, and program
US20170024010A1 (en) 2015-07-21 2017-01-26 Apple Inc. Guidance device for the sensory impaired
EP3349096B1 (en) * 2015-09-08 2023-08-23 Sony Group Corporation Information processing apparatus, method, and computer program
US10013060B2 (en) * 2015-09-18 2018-07-03 Immersion Corporation Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device
WO2017086508A1 (en) * 2015-11-19 2017-05-26 엘지전자 주식회사 Mobile terminal and control method therefor
CN105498205B (en) * 2015-12-10 2020-04-24 联想(北京)有限公司 Electronic game control equipment and control method
US10200332B2 (en) * 2015-12-14 2019-02-05 Immersion Corporation Delivery of haptics to select recipients of a message
US9895607B2 (en) * 2015-12-15 2018-02-20 Igt Canada Solutions Ulc Haptic feedback on a gaming terminal display
WO2017136830A1 (en) * 2016-02-05 2017-08-10 Prizm Labs, Inc. Physical/virtual game system and methods for manipulating virtual objects within a virtual game environment
CN105536249B (en) * 2016-02-18 2023-09-01 高创(苏州)电子有限公司 game system
JP2017134802A (en) * 2016-03-04 2017-08-03 望月 玲於奈 User interface program
US10772394B1 (en) 2016-03-08 2020-09-15 Apple Inc. Tactile output for wearable device
JP6591924B2 (en) * 2016-03-31 2019-10-16 日本電信電話株式会社 Skin sense presentation system
US9898142B2 (en) * 2016-04-01 2018-02-20 Ford Global Technologies, Llc Touch detection on a curved surface
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
US9829981B1 (en) 2016-05-26 2017-11-28 Apple Inc. Haptic output device
US10671186B2 (en) 2016-06-15 2020-06-02 Microsoft Technology Licensing, Llc Autonomous haptic stylus
US10649529B1 (en) 2016-06-28 2020-05-12 Apple Inc. Modification of user-perceived feedback of an input device using acoustic or haptic output
US10845878B1 (en) 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback
US10372214B1 (en) 2016-09-07 2019-08-06 Apple Inc. Adaptable user-selectable input area in an electronic device
US11794094B2 (en) * 2016-10-17 2023-10-24 Aquimo Inc. Method and system for using sensors of a control device for control of a game
US10078370B2 (en) * 2016-11-23 2018-09-18 Immersion Corporation Devices and methods for modifying haptic effects
JP2017157195A (en) * 2016-12-19 2017-09-07 望月 玲於奈 User interface program
US10437359B1 (en) 2017-02-28 2019-10-08 Apple Inc. Stylus with external magnetic influence
EP3396478B1 (en) * 2017-04-28 2023-06-14 Deere & Company Apparatus, method and computer programme for controlling a machine
US10437336B2 (en) 2017-05-15 2019-10-08 Microsoft Technology Licensing, Llc Haptics to identify button regions
US10471347B2 (en) * 2017-05-24 2019-11-12 Nintendo Co., Ltd. Information processing system, information processing apparatus, storage medium storing information processing program, and information processing method
US10743805B2 (en) 2017-06-02 2020-08-18 International Business Machines Corporation Haptic interface for generating preflex stimulation
JP6613267B2 (en) * 2017-06-02 2019-11-27 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
WO2019013044A1 (en) * 2017-07-10 2019-01-17 シャープ株式会社 Input device
US10915174B1 (en) * 2017-07-20 2021-02-09 Apple Inc. Electronic devices with directional haptic output
US10775889B1 (en) 2017-07-21 2020-09-15 Apple Inc. Enclosure with locally-flexible regions
US10768747B2 (en) 2017-08-31 2020-09-08 Apple Inc. Haptic realignment cues for touch-input displays
TWI651120B (en) * 2017-09-01 2019-02-21 玩猴遊戲股份有限公司 Game control method and electronic device
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US10556252B2 (en) 2017-09-20 2020-02-11 Apple Inc. Electronic device having a tuned resonance haptic actuation system
US10768738B1 (en) 2017-09-27 2020-09-08 Apple Inc. Electronic device having a haptic actuator with magnetic augmentation
US10747404B2 (en) * 2017-10-24 2020-08-18 Microchip Technology Incorporated Touchscreen including tactile feedback structures and corresponding virtual user interface elements
GB2568923B (en) * 2017-11-30 2020-09-30 Cyrex Ltd Electrical stimulator apparatus with contactless feedback from a user
JP6888558B2 (en) 2018-01-19 2021-06-16 豊田合成株式会社 Tactile presentation device
US10996755B2 (en) * 2018-02-28 2021-05-04 Google Llc Piezoelectric haptic feedback module
US10353579B1 (en) * 2018-03-28 2019-07-16 Disney Enterprises, Inc. Interpreting user touch gestures to generate explicit instructions
US20190324538A1 (en) * 2018-04-20 2019-10-24 Immersion Corporation Haptic-enabled wearable device for generating a haptic effect in an immersive reality environment
JP6548776B1 (en) * 2018-04-20 2019-07-24 株式会社Cygames Program, electronic device, method, and system
JP7155613B2 (en) * 2018-05-29 2022-10-19 富士フイルムビジネスイノベーション株式会社 Information processing device and program
US10942571B2 (en) 2018-06-29 2021-03-09 Apple Inc. Laptop computing device with discrete haptic regions
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US10613678B1 (en) 2018-09-17 2020-04-07 Apple Inc. Input device with haptic feedback
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
US11143523B2 (en) 2018-10-09 2021-10-12 International Business Machines Corporation Providing raised patterns and haptic feedback for mapping applications
US10678264B2 (en) * 2018-10-10 2020-06-09 Midea Group Co., Ltd. Method and system for providing remote robotic control
US20200192480A1 (en) * 2018-12-18 2020-06-18 Immersion Corporation Systems and methods for providing haptic effects based on a user's motion or environment
JP6560840B1 (en) * 2019-04-19 2019-08-14 株式会社Cygames Program, electronic apparatus, method, and system
CN110109726B (en) * 2019-04-30 2022-08-23 网易(杭州)网络有限公司 Virtual object receiving processing method, virtual object transmitting method, virtual object receiving processing device and virtual object transmitting device, and storage medium
USD913017S1 (en) * 2019-06-03 2021-03-16 Nazar Kamangar Interactive multimedia table
CN111113414B (en) * 2019-12-19 2022-08-30 长安大学 Robot three-dimensional space scale prompting method and system based on screen identification
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly
US20210402292A1 (en) * 2020-06-25 2021-12-30 Sony Interactive Entertainment LLC Method of haptic responses and interacting
CN111782435B (en) * 2020-07-02 2021-08-06 重庆紫光华山智安科技有限公司 Method and system for recovering and processing cascade exception of video monitoring management platform
KR102419901B1 (en) * 2020-09-16 2022-07-13 주식회사 익센트릭게임그루 Immersive contents provision system
US20220111290A1 (en) * 2020-10-09 2022-04-14 Contact Control Interfaces, LLC Haptic engine for spatial computing
US11531400B2 (en) * 2020-12-31 2022-12-20 Snap Inc. Electronic communication interface with haptic feedback response
KR20230128063A (en) * 2020-12-31 2023-09-01 스냅 인코포레이티드 Real-time video communication interface with haptic feedback
US20220206584A1 (en) * 2020-12-31 2022-06-30 Snap Inc. Communication interface with haptic feedback response
US11517812B2 (en) 2021-02-19 2022-12-06 Blok Party, Inc. Application of RFID gamepieces for a gaming console
US11199903B1 (en) * 2021-03-26 2021-12-14 The Florida International University Board Of Trustees Systems and methods for providing haptic feedback when interacting with virtual objects
US20220317773A1 (en) * 2021-03-31 2022-10-06 Snap Inc. Real-time communication interface with haptic and audio feedback response
JP7219862B2 (en) * 2021-06-02 2023-02-09 株式会社Nttコノキュー Communication system, communication method and communication program
US11567575B2 (en) * 2021-06-14 2023-01-31 Microsoft Technology Licensing, Llc Haptic response control
WO2023215975A1 (en) * 2022-05-09 2023-11-16 D-Box Technologies Inc. Method and system for adaptive motion simulation in gaming

Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09164270A (en) 1995-10-09 1997-06-24 Nintendo Co Ltd Controller pack
WO1999038064A2 (en) 1998-01-23 1999-07-29 Koninklijke Philips Electronics N.V. Multiperson tactual virtual environment
JP2001265520A (en) 2000-03-17 2001-09-28 Fuji Electric Co Ltd Input device of terminal
JP2001352414A (en) 2000-06-07 2001-12-21 Toshiba Corp Radio telephone system provided with fishing game function, and computer-readable recording medium
JP2003058321A (en) 2001-08-17 2003-02-28 Fuji Xerox Co Ltd Touch panel device
JP2004290685A (en) 2000-03-16 2004-10-21 Sega Corp Server device, contents delivery method and game program
CN1578964A (en) 2001-10-30 2005-02-09 英默森公司 Methods and apparatus for providing haptic feedback in interacting with virtual pets
JP2005267080A (en) 2004-03-17 2005-09-29 Sony Corp Input device with tactile function, information input method and electronic equipment
US20050212760A1 (en) 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US20050212749A1 (en) * 2004-03-23 2005-09-29 Marvit David L Motion sensor engagement for a handheld device
US20050245302A1 (en) 2004-04-29 2005-11-03 Microsoft Corporation Interaction between objects and a virtual environment display
JP2005332063A (en) 2004-05-18 2005-12-02 Sony Corp Input device with tactile function, information inputting method, and electronic device
JP2006068210A (en) 2004-09-01 2006-03-16 Nintendo Co Ltd Game device, and game program
US20070150826A1 (en) 2005-12-23 2007-06-28 Anzures Freddy A Indication of progress towards satisfaction of a user input condition
JP2007221413A (en) 2006-02-16 2007-08-30 Nagoya Institute Of Technology Tactile media transfer system
US20070265096A1 (en) 2006-05-15 2007-11-15 Tsutomu Kouno Game control program, game control method, and game apparatus
US20070279392A1 (en) 1995-12-01 2007-12-06 Rosenberg Louis B Networked applications including haptic feedback
US20080024459A1 (en) 2006-07-31 2008-01-31 Sony Corporation Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement
US20080059138A1 (en) * 1995-11-30 2008-03-06 Immersion Corporation Tactile feedback man-machine interface device
US20080150905A1 (en) 2006-12-21 2008-06-26 Grivna Edward L Feedback mechanism for user detection of reference location on a sensing device
US20080300055A1 (en) * 2007-05-29 2008-12-04 Lutnick Howard W Game with hand motion control
JP2009080720A (en) 2007-09-27 2009-04-16 Kyocera Mita Corp Information input device
JP2009087359A (en) 2007-09-18 2009-04-23 Senseg Oy Method and apparatus for sensory stimulation
WO2009071750A1 (en) 2007-12-07 2009-06-11 Nokia Corporation A user interface
US20110050601A1 (en) * 2009-09-01 2011-03-03 Lg Electronics Inc. Mobile terminal and method of composing message using the same
US20140201666A1 (en) * 2013-01-15 2014-07-17 Raffi Bedikian Dynamic, free-space user interactions for machine control
US20150062086A1 (en) * 2013-08-29 2015-03-05 Rohildev Nattukallingal Method and system of a wearable ring device for management of another computing device
US20150070263A1 (en) * 2013-09-09 2015-03-12 Microsoft Corporation Dynamic Displays Based On User Interaction States

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3784903T2 (en) * 1986-12-18 1993-06-24 Michael Anthony Smithard LEARNING DEVICE.
JP3537238B2 (en) 1995-09-22 2004-06-14 株式会社ナムコ GAME DEVICE AND GAME DEVICE CONTROL METHOD
JP2000005445A (en) * 1998-06-26 2000-01-11 Sony Corp Network game device
JP3949912B2 (en) 2000-08-08 2007-07-25 株式会社エヌ・ティ・ティ・ドコモ Portable electronic device, electronic device, vibration generator, notification method by vibration and notification control method
EP1330811B1 (en) * 2000-09-28 2012-08-22 Immersion Corporation Directional tactile feedback for haptic feedback interface devices
JP2002123840A (en) * 2000-10-17 2002-04-26 Nippon Telegr & Teleph Corp <Ntt> Processing method and processor for providing presence type virtual reality
JP2003180896A (en) * 2001-12-17 2003-07-02 Kazuyoshi Tsukamoto Virtual sport system
TW543493U (en) 2002-04-01 2003-07-21 Lite On Technology Corp The game apparatus for a personal digital assistant
JP4447823B2 (en) * 2002-06-14 2010-04-07 ソニー株式会社 Portable information equipment
JP2005317041A (en) * 2003-02-14 2005-11-10 Sony Corp Information processor, information processing method, and program
CN1280069C (en) * 2003-11-01 2006-10-18 中国科学院合肥智能机械研究所 Flexible tactile sensor and method for detecting infomation of tactile sensation
JP2007531113A (en) * 2004-03-23 2007-11-01 富士通株式会社 Identification of mobile device tilt and translational components
SE528188C8 (en) * 2004-10-25 2006-10-31 Vibrosense Dynamics Ab Apparatus for identification of bibrotactile thresholds on mechanoreceptors in the skin
WO2006090197A1 (en) 2005-02-24 2006-08-31 Nokia Corporation Motion-input device for a computing terminal and method of its operation
JP4167683B2 (en) 2005-10-19 2008-10-15 株式会社タイトー Game device, game server device
WO2007059172A2 (en) 2005-11-14 2007-05-24 Immersion Corporation Systems and methods for editing a model of a physical system for a simulation
JP4119917B2 (en) * 2005-12-27 2008-07-16 株式会社タイトー Gun-type controller, game device using gun-type controller
KR20150044979A (en) * 2006-09-13 2015-04-27 임머숀 코퍼레이션 Systems and methods for casino gaming haptics
US7890863B2 (en) * 2006-10-04 2011-02-15 Immersion Corporation Haptic effects with proximity sensing
JP4926799B2 (en) * 2006-10-23 2012-05-09 キヤノン株式会社 Information processing apparatus and information processing method
JP4925817B2 (en) * 2006-12-28 2012-05-09 株式会社コナミデジタルエンタテインメント Shooting toy
US8621348B2 (en) * 2007-05-25 2013-12-31 Immersion Corporation Customizing haptic effects on an end user device
JP4739302B2 (en) * 2007-09-14 2011-08-03 独立行政法人科学技術振興機構 Penetrating tactile sense presentation device
CN100511104C (en) * 2007-11-22 2009-07-08 上海交通大学 Visual hallucination emulation positioning system based on touch
CN102473034B (en) 2009-07-22 2015-04-01 意美森公司 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09164270A (en) 1995-10-09 1997-06-24 Nintendo Co Ltd Controller pack
US20080059138A1 (en) * 1995-11-30 2008-03-06 Immersion Corporation Tactile feedback man-machine interface device
US20070279392A1 (en) 1995-12-01 2007-12-06 Rosenberg Louis B Networked applications including haptic feedback
WO1999038064A2 (en) 1998-01-23 1999-07-29 Koninklijke Philips Electronics N.V. Multiperson tactual virtual environment
JP2004290685A (en) 2000-03-16 2004-10-21 Sega Corp Server device, contents delivery method and game program
JP2001265520A (en) 2000-03-17 2001-09-28 Fuji Electric Co Ltd Input device of terminal
JP2001352414A (en) 2000-06-07 2001-12-21 Toshiba Corp Radio telephone system provided with fishing game function, and computer-readable recording medium
JP2003058321A (en) 2001-08-17 2003-02-28 Fuji Xerox Co Ltd Touch panel device
CN1578964A (en) 2001-10-30 2005-02-09 英默森公司 Methods and apparatus for providing haptic feedback in interacting with virtual pets
JP2005267080A (en) 2004-03-17 2005-09-29 Sony Corp Input device with tactile function, information input method and electronic equipment
US20050212760A1 (en) 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US20050212749A1 (en) * 2004-03-23 2005-09-29 Marvit David L Motion sensor engagement for a handheld device
US20050245302A1 (en) 2004-04-29 2005-11-03 Microsoft Corporation Interaction between objects and a virtual environment display
JP2005332063A (en) 2004-05-18 2005-12-02 Sony Corp Input device with tactile function, information inputting method, and electronic device
JP2006068210A (en) 2004-09-01 2006-03-16 Nintendo Co Ltd Game device, and game program
US20070150826A1 (en) 2005-12-23 2007-06-28 Anzures Freddy A Indication of progress towards satisfaction of a user input condition
JP2007221413A (en) 2006-02-16 2007-08-30 Nagoya Institute Of Technology Tactile media transfer system
US20070265096A1 (en) 2006-05-15 2007-11-15 Tsutomu Kouno Game control program, game control method, and game apparatus
JP2007301270A (en) 2006-05-15 2007-11-22 Sony Computer Entertainment Inc Game control program, game control method, and game apparatus
US20080024459A1 (en) 2006-07-31 2008-01-31 Sony Corporation Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement
CN101118469A (en) 2006-07-31 2008-02-06 索尼株式会社 Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement
US20080150905A1 (en) 2006-12-21 2008-06-26 Grivna Edward L Feedback mechanism for user detection of reference location on a sensing device
US20080300055A1 (en) * 2007-05-29 2008-12-04 Lutnick Howard W Game with hand motion control
JP2009087359A (en) 2007-09-18 2009-04-23 Senseg Oy Method and apparatus for sensory stimulation
JP2009080720A (en) 2007-09-27 2009-04-16 Kyocera Mita Corp Information input device
WO2009071750A1 (en) 2007-12-07 2009-06-11 Nokia Corporation A user interface
US20110050601A1 (en) * 2009-09-01 2011-03-03 Lg Electronics Inc. Mobile terminal and method of composing message using the same
US20140201666A1 (en) * 2013-01-15 2014-07-17 Raffi Bedikian Dynamic, free-space user interactions for machine control
US20150062086A1 (en) * 2013-08-29 2015-03-05 Rohildev Nattukallingal Method and system of a wearable ring device for management of another computing device
US20150070263A1 (en) * 2013-09-09 2015-03-12 Microsoft Corporation Dynamic Displays Based On User Interaction States

Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
International Preliminary Report on Patentability, including the Written Opinion, as issued for International Application No. PCT/US2010/042795, dated Feb. 2, 2012.
International Preliminary Report on Patentability, including the Written Opinion, as issued for International Application No. PCT/US2010/042805, dated Feb. 2, 2012.
International Search Report as issued for International Application No. PCT/US2010/042795, dated Jan. 12, 2011.
International Search Report as issued for International Application No. PCT/US2010/042805, dated Oct. 29, 2010.
Non-Final Office Action as issued in Japanese Patent Application No. 2012-521762, dated Mar. 24, 2015.
Non-Final Office Action as issued in Japanese Patent Application No. 2012-521762, dated May 13, 2014.
Non-Final Office Action as issued in Japanese Patent Application No. 2012-521764, dated Apr. 22, 2014.
Notification of First Office Action as issued in Chinese Patent Application No. 201080032035.3, dated Apr. 21, 2014.
Notification of First Office Action as issued in Chinese Patent Application No. 201080032051.2, dated Feb. 12, 2014.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11204645B2 (en) 2017-05-11 2021-12-21 Samsung Electronics Co., Ltd. Method for providing haptic feedback, and electronic device for performing same

Also Published As

Publication number Publication date
KR101993848B1 (en) 2019-06-28
JP2015018562A (en) 2015-01-29
CN104503578A (en) 2015-04-08
US8469806B2 (en) 2013-06-25
KR20120051710A (en) 2012-05-22
JP5613236B2 (en) 2014-10-22
JP6181726B2 (en) 2017-08-16
KR101875354B1 (en) 2018-07-05
JP5932917B2 (en) 2016-06-08
US20110021272A1 (en) 2011-01-27
US20190073037A1 (en) 2019-03-07
JP5823962B2 (en) 2015-11-25
EP2457142A1 (en) 2012-05-30
EP2457142B1 (en) 2019-12-25
CN104679247B (en) 2018-07-24
EP2457141A1 (en) 2012-05-30
CN102473034B (en) 2015-04-01
US20110018697A1 (en) 2011-01-27
US9921655B2 (en) 2018-03-20
CN102473035B (en) 2015-01-21
US9373233B2 (en) 2016-06-21
KR101962081B1 (en) 2019-03-25
KR20120053004A (en) 2012-05-24
US9671866B2 (en) 2017-06-06
US20170168573A1 (en) 2017-06-15
JP2016026361A (en) 2016-02-12
US20170357323A1 (en) 2017-12-14
KR101713358B1 (en) 2017-03-07
KR20180049186A (en) 2018-05-10
WO2011011546A1 (en) 2011-01-27
US20140009273A1 (en) 2014-01-09
KR20190032632A (en) 2019-03-27
JP2021180873A (en) 2021-11-25
US20160231814A1 (en) 2016-08-11
EP2457141B1 (en) 2020-05-06
WO2011011552A1 (en) 2011-01-27
KR20170026642A (en) 2017-03-08
US10139911B2 (en) 2018-11-27
JP6877249B2 (en) 2021-05-26
KR101755051B1 (en) 2017-07-06
KR20170104652A (en) 2017-09-15
JP2013500516A (en) 2013-01-07
JP2013500517A (en) 2013-01-07
CN102473035A (en) 2012-05-23
JP2017201533A (en) 2017-11-09
CN104503578B (en) 2018-02-06
KR20170081287A (en) 2017-07-11
JP2019096347A (en) 2019-06-20
CN104679247A (en) 2015-06-03
US8502651B2 (en) 2013-08-06
KR20190077113A (en) 2019-07-02
KR101873943B1 (en) 2018-07-04
KR20180077309A (en) 2018-07-06
CN102473034A (en) 2012-05-23
JP7186635B2 (en) 2022-12-09
US20130328814A1 (en) 2013-12-12

Similar Documents

Publication Publication Date Title
US10139911B2 (en) System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US20180341332A1 (en) Systems and Methods for a Shared Haptic Experience
KR101741376B1 (en) Multiple actuation handheld device
JP7469266B2 (en) System and method for providing complex tactile stimuli during input of manipulation gestures and in conjunction with manipulation of a virtual device
JP4172652B2 (en) Video shooting game device

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GRANT, DANNY A.;HEUBEL, ROBERT W.;BIRNBAUM, DAVID M.;AND OTHERS;SIGNING DATES FROM 20100831 TO 20100907;REEL/FRAME:030663/0813

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8