US20100167820A1 - Human interface device - Google Patents

Human interface device

Info

Publication number
US20100167820A1
Authority
US
United States
Prior art keywords
communication
micro-controller
sensor
interface device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/649,277
Inventor
Houssam Barakat
Bradford R. Lilly
Krishna Shenai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Toledo
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US12/649,277
Assigned to THE UNIVERSITY OF TOLEDO (assignment of assignors' interest; see document for details). Assignors: LILLY, BRADFORD R.; SHENAI, KRISHNA; BARAKAT, HOUSSAM
Publication of US20100167820A1
Status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/25: Output arrangements for video game devices
    • A63F13/28: Output arrangements responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012: Input arrangements involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A63F2300/1037: Input arrangements specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/64: Methods for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/80: Features specially adapted for executing a specific type of game
    • A63F2300/8082: Virtual reality

Definitions

  • This invention relates in general to human interaction with computer-simulated environments. More specifically, this invention relates to multimodal devices having an inherent built-in feedback mechanism that can be used to remotely interface human interaction with computer-simulated environments, such as generated by computer-based gaming and virtual reality systems.
  • Gaming and virtual reality systems allow a user or a number of users to interact with a computer-simulated environment.
  • A typical system includes a computer that establishes the computer-simulated environment.
  • The computer-simulated environment can be “virtual reality” or based on a real environment such as, for example, simulations for pilot or combat training.
  • The computer-simulated environment can also be for gaming or based on an imagined environment such as, for example, imaginary interplanetary worlds.
  • Most computer-simulated environments are primarily visual experiences, displayed either on a computer screen or through special or stereoscopic displays, but some computer-simulated environments established by computers include additional sensory experiences, such as sound through speakers or headphones or vibration through user input devices such as controllers.
  • Users can typically interact with such computer-simulated environments through the use of standard input devices such as a keyboard and a mouse, or through multimodal devices such as, for example, controllers, wired gloves, joysticks or steering wheels.
  • A human interface device with an inherent built-in feedback mechanism is provided for use by a user to remotely interface with a computer-simulated environment.
  • The human interface device comprises at least one sensor configured to sense a condition within the action of the computer-simulated environment and also operable to generate a communication concerning the sensed condition.
  • At least one micro-controller is positioned within the human interface device and configured to receive the communication concerning the sensed condition from the at least one sensor.
  • The at least one micro-controller is further configured to generate communication in reaction to the communication from the sensor.
  • At least one actuator is configured to receive the communication from the at least one micro-controller and provide a sensory experience in reaction to the communication.
  • A method of providing a sensory experience to a user using the human interface device described above comprises the steps of sensing a condition within the action of the gaming or virtual reality system established by the computer, generating a communication concerning the sensed condition, sending the communication to a human interface device for processing, generating a communication within the human interface device in reaction to the received communication, sending the human interface device communication to at least one actuator, and providing a sensory experience in reaction to the received human interface device communication.
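The claimed method's steps can be sketched as a short simulation: sense a condition, generate a communication, process it inside the human interface device, and drive an actuator. All function names and the condition/response values here are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of the claimed method. Each function is one step of the
# claim; the "trigger_pulled" condition and "recoil" response are illustrative.

def sense_condition():
    return "trigger_pulled"                 # step 1: sense a condition

def generate_communication(condition):
    return {"sensed": condition}            # step 2: communicate what was sensed

def hid_process(communication):
    # steps 3-4: the HID receives the communication and generates its own
    # communication in reaction (here, a simple lookup table).
    responses = {"trigger_pulled": "recoil"}
    return responses[communication["sensed"]]

def actuator(response):
    # steps 5-6: the actuator receives the HID communication and provides
    # a sensory experience in reaction.
    return f"sensory experience: {response}"

experience = actuator(hid_process(generate_communication(sense_condition())))
print(experience)   # sensory experience: recoil
```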
  • FIG. 1 is a schematic view of the major components of a conventional gaming system.
  • FIG. 2 is a schematic view of the major components of a first embodiment of the invention.
  • FIG. 3 is a side view, in elevation, of a first embodiment of the human interface device of FIG. 2 .
  • FIG. 4 is a side view, in perspective, of a second embodiment of a human interface device.
  • FIG. 5 is a side view, in perspective, of a third embodiment of a human interface device.
  • FIG. 6 is a schematic view of the components of the human interface device shown in FIG. 2 .
  • The terms “gaming system” and “virtual reality system” will be used interchangeably and are defined to include any system, structure(s), or device(s) incorporating a technology which allows a user to interact with a computer-simulated environment.
  • FIG. 1 illustrates a gaming system, indicated generally at 10, that is conventional in the art.
  • A gaming system is one example of a system operable to establish a computer-simulated environment, and the invention is not limited to gaming systems.
  • The basic structure and mode of operation of the gaming system 10 are well known in the art, and only those portions of the gaming system 10 that are necessary for a complete understanding of the invention will be described.
  • Some examples of commercially available gaming systems 10 include the Wii™ marketed by Nintendo, the PlayStation® marketed by Sony Corporation, and the Xbox® marketed by Microsoft Corporation.
  • A typical gaming system 10 includes at least one human interface device (HID) 12, a plurality of sensors 14, at least one host system 16, and a display (not shown).
  • The HID 12 is an input device used to affect the gaming system 10 or govern the movement or actions of an entity within the computer-simulated game/environment. While the HID 12 is typically connected via wires to the host system 16, the HID 12 can also be operated in a wireless mode. As will be explained in more detail below, the HID 12 can take many different physical forms.
  • The HID 12 typically includes, among other things, a micro-controller 18 and at least one actuator 20.
  • The micro-controller 18 is configured for processing of information and communication of information to a variety of other devices.
  • The actuator 20 is configured to induce a sensory-based reaction by the user of the HID 12.
  • Examples of actuators 20 include vibration-inducing or motion-causing motors, heat-causing elements, sound-causing elements, and motion-causing solenoids.
  • The actuator 20 is electrically connected to the micro-controller 18 and configured to respond to signals from the micro-controller 18.
  • The sensors 14 are configured to provide input of the status of game play or game conditions.
  • The sensors 14 can sense a condition within the action of the gaming system in that the sensors 14 can sense some physical condition associated with the user, such as the activity of the user, the physical condition of the user, or the physical environment in which the user is participating in the computer-simulated environment.
  • One example of a sensor 14 is an accelerometer, which can measure the magnitude, direction and force of movement of the user.
  • Other types of sensors 14 can be used to provide input regarding user-initiated actions, such as pulling a trigger and movement of a virtual character.
  • Still other sensors 14 can be used to provide input regarding other conditions, such as, for example, the heart rate of the user, the temperature and humidity of the gaming environment, and the mental consciousness of the user.
  • While the plurality of sensors 14 illustrated in FIG. 1 is shown as being positioned exterior to the HID 12, it should be understood that the plurality of sensors 14 can be positioned in various locations, including internal to the HID 12. While the embodiment shown in FIG. 1 illustrates a quantity of four sensors 14, it should be understood that more or fewer than four sensors can be used.
  • The host system 16 typically includes, among other things, a host central processing unit (CPU) 22 and is configured to control the overall functions of the gaming system. Examples of overall system functions include loading of game software, start up, and shut down.
  • The host CPU 22 typically processes input information from a variety of sources and controls the play of the gaming system.
  • The sensors 14 sense a condition potentially affecting the play of the gaming system 10 and generate a communication concerning the sensed condition.
  • The sensors 14 can communicate with the micro-controller 18 positioned within the HID 12 or with the host CPU 22.
  • The micro-controller 18 can be configured to receive the communications generated by the individual sensors 14.
  • The communication of the various sensors 14 with the micro-controller 18 is shown in FIG. 1 as communications C1a, and the communication of the various sensors 14 with the host CPU 22 is shown as communications C1b.
  • The micro-controller 18 processes the information and communicates the information from the sensors 14 to the host CPU 22 positioned within the host system 16, as shown by communication C2.
  • The host CPU 22 processes the sensor information and determines an appropriate action and/or response.
  • The action and/or response is communicated from the host CPU 22 to the micro-controller 18 positioned within the HID 12 by communication C3.
  • The micro-controller 18 receives the communication C3 and determines the appropriate actuator 20 response.
  • The micro-controller 18 then communicates with the appropriate actuator 20, as shown by communication C4.
  • The actuator 20 can be configured to receive the communication C4 and provide a sensory experience in reaction to the communication.
  • The sensory experience can be a feeling of vibration or recoil.
  • The communication C4 instructs the actuator 20 to initiate and perform an action, such as generating vibrations for a specified period of time. This cycle is repeated as additional communications C1a and C1b are received from the sensors 14.
  • Additional communication of a general nature, indicated as CG, occurs between the host CPU 22 and the micro-controller 18 positioned within the HID 12.
  • The communication CG can include typical game play instructions.
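The conventional communication cycle described above (C1a through C4) can be sketched as follows; the point is that the actuator response depends on a round trip through the host CPU. The "hit_detected" condition and "vibrate" response are hypothetical examples, not drawn from the patent.

```python
# Hypothetical sketch of the conventional cycle of FIG. 1:
# sensor -> micro-controller 18 (C1a) -> host CPU 22 (C2) ->
# micro-controller 18 (C3) -> actuator 20 (C4).

comm_log = []        # records each communication leg as it occurs
actuator_log = []    # records what the actuator was told to do

def host_cpu_22(condition):
    # The host processes the sensor information and picks a response.
    return "vibrate" if condition == "hit_detected" else "none"

def actuator_20(response):
    if response != "none":
        actuator_log.append(response)

def micro_controller_18(condition):
    comm_log.append("C1a")              # sensor -> micro-controller
    comm_log.append("C2")               # micro-controller -> host CPU
    response = host_cpu_22(condition)
    comm_log.append("C3")               # host CPU -> micro-controller
    comm_log.append("C4")               # micro-controller -> actuator
    actuator_20(response)

micro_controller_18("hit_detected")
print(comm_log)       # ['C1a', 'C2', 'C3', 'C4']
print(actuator_log)   # ['vibrate']
```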
  • As shown in FIG. 2, the improved gaming system 40 includes at least one HID 42, a plurality of sensors 44, and at least one host system 46.
  • The host system 46 can be substantially similar to the host system 16 illustrated in FIG. 1.
  • The plurality of sensors 44 shown in FIG. 2 can be substantially similar to the plurality of sensors 14 illustrated in FIG. 1.
  • The HID 42 shown in FIG. 2 includes, among other things, a micro-controller 50 and at least one actuator 48.
  • The actuator 48 shown in FIG. 2 can be substantially similar to the actuator 20 illustrated in FIG. 1.
  • The micro-controller 50 differs from the micro-controller 18 illustrated in FIG. 1 in that it performs the additional processing of the sensor information C1a without communicating the sensor information C1a to the host system 46. Accordingly, the HID 42 may include different components and different electronic circuitry than the HID 12 illustrated in FIG. 1.
  • The improved gaming system 40 differs from the traditional gaming system 10 shown in FIG. 1 in several ways.
  • The HID 42 does not communicate with the host system 46 as to the processing of the sensor information C1a. Rather, the HID 42 performs the steps of receiving the sensor information C1a, internally processing the sensor information C1a, determining the appropriate actuator response, and communicating with the appropriate actuator, without communicating the sensor information C1a to the host system 46.
  • The micro-controller 50 is thus capable of independently determining the response of the actuator 48 that is appropriate or desirable for the condition sensed by any of the sensors 44.
  • The actuator 48 is configured to respond solely to sensor information C1a processed by the micro-controller 50. Accordingly, the actuator 48 does not require communication from the host system 46 for operation.
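The improved flow can be contrasted with the conventional cycle: the micro-controller 50 processes sensor information C1a internally and drives the actuator 48 directly, with no C2/C3 round trip to the host system 46. The condition-to-response mapping below is an illustrative assumption.

```python
# Hypothetical sketch of the improved HID 42 of FIG. 2: only C1a and C4
# occur; the host system 46 is never consulted for the actuator response.

comm_log = []
actuator_log = []

def actuator_48(response):
    actuator_log.append(response)

def micro_controller_50(condition):
    comm_log.append("C1a")                          # sensor -> micro-controller
    response = {"trigger_pulled": "recoil"}.get(condition)
    if response is not None:
        comm_log.append("C4")                       # micro-controller -> actuator
        actuator_48(response)                       # no host communication occurs

micro_controller_50("trigger_pulled")
print(comm_log)       # ['C1a', 'C4']
print(actuator_log)   # ['recoil']
```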
  • The human interface device 42 can have various physical embodiments.
  • Referring now to FIG. 3, a first physical embodiment of a HID 146 is illustrated.
  • The HID 146 is in the form of a firearm.
  • The firearm HID 146 is configured to interact with the gaming system or virtual reality system such that the firearm HID 146 simulates the firing of a weapon. Accordingly, in an effort to make the gaming system as realistic as possible, the firearm HID 146 has all of the typical firearm components, including a stock 160, a muzzle 161, a magazine 162, a trigger 163, a receiver 164, a sight 165 and a butt end 166.
  • The firearm HID 146 also includes a recoil mechanism 170, a sensor switch 171, a mode switch 172, a power supply 173, a transmitter/receiver 174, and a motion control 175. While not illustrated in FIG. 3, the firearm HID 146 also includes the micro-controller 50 shown in FIG. 2 and associated circuitry.
  • The recoil mechanism 170 is the same device as the actuator 48 shown in FIG. 2.
  • The recoil mechanism 170 is configured to provide a realistic “backward kick” or force resulting from the act of firing the firearm HID 146.
  • In the illustrated embodiment, the recoil mechanism 170 is a solenoid.
  • However, the recoil mechanism can be other suitable actuator devices.
  • A sensor 144, illustrated as positioned internal to the firearm HID 146, senses the motion of the trigger 163 and sends information to the micro-controller 50 as shown in FIG. 2.
  • The micro-controller receives the sensor information, internally processes the sensor information, and communicates with the recoil mechanism 170, thereby producing the recoil motion. Accordingly, the recoil mechanism 170 produces an immediate, real-time action in the firearm HID 146 in response to sensor input without interaction with the host system 46. While the actuator shown in FIG. 3 is a recoil mechanism 170, the actuator can be other suitable devices, and non-limiting examples are provided below.
  • The sensor switch 171 is configured to enable or disable communication from the sensor 144 to the micro-controller 50.
  • The sensor switch 171 is an on/off rocker switch, which is conventional in the art.
  • However, the sensor switch 171 can be other suitable devices, such as, for example, a slide switch, capable of enabling or disabling the communication from the sensor 144 to the micro-controller 50.
  • The mode switch 172 is configured to provide the firearm HID 146 with various modes of game play.
  • The mode switch 172 includes settings for single-shot, semi-automatic, and fully automatic firing rates. In operation, as each mode is selected, the sensor 144 and recoil mechanism 170 react accordingly.
  • The mode switch 172 can be configured to provide various modes of game play in accordance with the nature of the virtual reality scenario. For example, in a scenario in which the virtual reality game involves fishing, the HID 42 can be configured as a fishing pole and the mode switch 172 could be configured to set the amount of bait used.
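The firing-mode behavior of the mode switch 172 can be sketched as a simple function mapping a trigger event to a number of recoil pulses. The pulse counts and the cyclic rate are illustrative assumptions; the patent does not specify them.

```python
# Hypothetical sketch of the mode switch 172: each firing mode changes how
# the recoil mechanism 170 reacts to a single trigger pull.

def recoil_pulses(mode, trigger_held_ms, cyclic_rate_ms=100):
    """Return how many recoil pulses one trigger event produces."""
    if mode == "single":
        return 1                      # one pulse, then the weapon must be cycled
    if mode == "semi":
        return 1                      # one pulse per pull, ready immediately
    if mode == "auto":
        # pulses repeat at the cyclic rate for as long as the trigger is held
        return max(1, trigger_held_ms // cyclic_rate_ms)
    raise ValueError(f"unknown mode: {mode}")

print(recoil_pulses("single", 500))   # 1
print(recoil_pulses("auto", 500))     # 5
```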
  • The power supply 173 is configured to provide sufficient power to the firearm HID 146 such that the firearm HID 146 is independent of the host system 46 for power. As shown in FIG. 3, the power supply 173 is a rechargeable battery pack. However, the power supply 173 can be other suitable devices sufficient to provide power to the firearm HID 146. In other embodiments, the power supply 173 can be connected to a conventional electrical outlet (not shown).
  • The transmitter/receiver 174 is configured to transmit and receive information to and from the host system 46.
  • The transmitted and received information can include general communications CG, as shown in FIG. 2, as well as other necessary information.
  • The transmitter/receiver 174 operates on radio frequencies, although other methods of operation are possible.
  • The motion control 175 is an input device used to affect the gaming system 40 or govern the movement or actions of an entity within the computer-simulated game.
  • The motion control 175 can be used to move an entity within the game forward, backward or sideways.
  • The motion control 175 is a knob having 360° of available motion.
  • However, the motion control 175 can be any mechanism or device suitable to govern the movement or actions of an entity within the computer-simulated game.
  • The firearm HID 146 can also project images onto the display of the gaming system.
  • The images can include any suitable visual effect, such as, for example, a cross-hair and bullet tracers.
  • Referring now to FIG. 4, a second physical embodiment of a HID 246 is illustrated.
  • The HID 246 is in the form of a joystick.
  • For example, the joystick 246 can be used in gaming systems involving the control of helicopters or antique aircraft.
  • The HID 246 illustrated in FIG. 4 includes the sensor, micro-controller, and feedback mechanism, and these components operate in the same manner as described above.
  • Referring now to FIG. 5, a third physical embodiment of a HID 346 is illustrated.
  • The HID 346 is in the form of a steering wheel.
  • The steering wheel 346 can be used in gaming systems involving the control of race cars or boats.
  • The HID 346 illustrated in FIG. 5 includes all of the components and operates in the same manner as described above.
  • While the illustrated human interface devices include a firearm, a joystick, and a steering wheel, the human interface devices can represent any control mechanism, such as, for example, a fishing pole, a tennis racket, or a surf board, suitable to affect the gaming system 10 or govern the movement or actions of an entity within the computer-simulated game.
  • As shown in FIG. 6, the HID 42 includes the micro-controller 50, the feedback mechanism (or actuator) 48, a receiver 80, a transmitter 82, voltage conversion circuitry 84, a power supply 86, memory 88, a time device 89, a variable signal generator 90, various user switches 92a and 92b, and sensor output 94.
  • The HID 42 is activated through a switch, such as, for example, user switch 92a. Activating the HID 42 enables the micro-controller 50.
  • The micro-controller 50 then awaits a user event, a sensor event, or sensor input. All sensor outputs 94 and inputs from user switches 92a and 92b interact directly with, and only with, the micro-controller 50.
  • After receiving input, the micro-controller 50 communicates with the host system 46.
  • The communications from the micro-controller 50 to the host system 46 are transmitted via industry-standard communication protocols and devices, such as, for example, the receiver 80 and the transmitter 82.
  • The HID 42 does not require custom protocols, additional or special hardware, or custom or special drivers in addition to the standard protocols, hardware, and drivers currently residing on the host system 46.
  • The micro-controller 50 can interpret certain sensor outputs 94 and certain inputs from user switches 92a and 92b as input that requires user feedback through the feedback mechanism 48. In those situations, the micro-controller 50 enables the power supply 86 to supply power to the feedback mechanism 48. The micro-controller 50 further directs the feedback mechanism 48 to provide a level of feedback as limited according to the user setting 92a. As described above, the feedback mechanism 48 can apply the feedback to the user in any desired form. In one embodiment, the user input 92a can be channeled to the micro-controller 50, causing the micro-controller 50 to enable the variable signal generator 90.
  • The signal generator 90 communicates a signal to a power amplifier, which amplifies the signal so that it may vary the state of the feedback mechanism 48 accordingly.
  • The user's settings 92a and 92b thus have an effect on the type of feedback the user receives.
  • The user can adjust the variable signal generator 90 through user switch 92b to get a different form of signal wave and, accordingly, a different form of feedback. While the illustrated embodiment shown in FIG. 6 provides one example of suitable internal components and circuitry within the HID 42, it should be appreciated that the HID 42 could include different internal components and different circuitry that operate and function in a different manner.
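The feedback path of FIG. 6 can be sketched as two stages: the variable signal generator 90 produces a waveform selected via user switch 92b, and an amplifier stage scales it, with the level capped by the user setting 92a. The specific waveforms and the 0-to-1 gain range are illustrative assumptions.

```python
# Hypothetical sketch of the FIG. 6 feedback path: signal generator 90
# (waveform chosen via switch 92b) followed by an amplifier whose gain is
# limited by the user setting 92a, driving the feedback mechanism 48.

import math

def signal_generator_90(waveform, t):
    """Sample the selected waveform at time t (period normalized to 1)."""
    if waveform == "sine":
        return math.sin(2 * math.pi * t)
    if waveform == "square":
        return 1.0 if math.sin(2 * math.pi * t) >= 0 else -1.0
    raise ValueError(f"unknown waveform: {waveform}")

def feedback_sample(waveform, t, user_level):
    """Amplify the generated signal, capped by the user's setting (92a)."""
    gain = min(max(user_level, 0.0), 1.0)   # clamp setting to [0, 1]
    return gain * signal_generator_90(waveform, t)

print(feedback_sample("square", 0.1, user_level=0.5))   # 0.5
```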
  • The improved HID 42 provides many benefits over a traditional HID 12.
  • A common HID can be used with different host systems through adaptors.
  • The HID 42 is customizable to communicate with various types of sensors and provide various types of sensory feedback.
  • The HID can have a wide variety of physical embodiments. Other advantages are also apparent from a reading of the specification and claims and from a study of the Figures.
  • In another embodiment, a sensor can communicate directly with the feedback mechanism, bypassing the micro-controller.
  • For example, a photo-sensor could trigger the feedback mechanism when the photo-sensor is exposed to light.
  • Similarly, a pressure-sensor button can cause feedback to occur upon being pushed.
  • In this embodiment, the sensor and feedback mechanism can be connected by circuitry.
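The bypass variant above amounts to a direct sensor-to-actuator connection with no processing in between; a minimal sketch follows, where the light threshold is an illustrative assumption.

```python
# Hypothetical sketch of the bypass variant: a photo-sensor wired directly
# to the feedback mechanism, with no micro-controller in the loop.

def direct_feedback(light_level, threshold=0.5):
    """Return True when the photo-sensor reading should trigger feedback."""
    return light_level > threshold

print(direct_feedback(0.8))   # True
print(direct_feedback(0.2))   # False
```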

Abstract

A human interface device with an inherent built-in feedback mechanism for use by a user to remotely interface with a computer-simulated environment is disclosed herein. The human interface device comprises at least one sensor configured to sense a condition within the action of the computer-simulated environment and also operable to generate a communication concerning the sensed condition. At least one micro-controller is positioned within the human interface device and configured to receive the communication concerning the sensed condition from the at least one sensor. The at least one micro-controller is further configured to generate communication in reaction to the communication from the sensor. At least one actuator is configured to receive the communication from the at least one micro-controller and provide a sensory experience in reaction to the communication.

Description

    BACKGROUND OF THE INVENTION
  • This invention relates in general to human interaction with computer-simulated environments. More specifically, this invention relates to multimodal devices having an inherent built-in feedback mechanism that can be used to remotely interface human interaction with computer-simulated environments, such as generated by computer-based gaming and virtual reality systems.
  • Gaming and virtual reality systems allow a user or a number of users to interact with a computer-simulated environment. A typical system includes a computer that establishes the computer-simulated environment. The computer-simulated environment can be “virtual reality” or based on a real environment such as, for example, simulations for pilot or combat training. The computer-simulated environment can also be for gaming or based on an imagined environment such as, for example, imaginary interplanetary worlds. Most computer-simulated environments are primarily visual experiences, displayed either on a computer screen or through special or stereoscopic displays, but some computer-simulated environments established by computers include additional sensory experiences, such as sound through speakers or headphones or vibration through user input devices such as controllers. Users can typically interact with such computer-simulated environments through the use of standard input devices such as a keyboard and a mouse, or through multimodal devices such as, for example, controllers, wired gloves, joysticks or steering wheels.
  • SUMMARY OF THE INVENTION
  • According to this invention, there is provided a human interface device with an inherent built-in feedback mechanism for use by a user to remotely interface with a computer-simulated environment. The human interface device comprises at least one sensor configured to sense a condition within the action of the computer-simulated environment and also operable to generate a communication concerning the sensed condition. At least one micro-controller is positioned within the human interface device and configured to receive the communication concerning the sensed condition from the at least one sensor. The at least one micro-controller is further configured to generate communication in reaction to the communication from the sensor. At least one actuator is configured to receive the communication from the at least one micro-controller and provide a sensory experience in reaction to the communication.
  • According to this invention, there is also provided a method of providing a sensory experience to a user using the human interface device described above. The method comprises the steps of sensing a condition within the action of the gaming or virtual reality system established by the computer, generating a communication concerning the sensed condition, sending the communication to a human interface device for processing, generating a communication within the human interface device in reaction to the received communication, sending the human interface device communication to at least one actuator; and providing a sensory experience in reaction to the received human interface device communication.
  • BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of the major components of a conventional gaming system.
FIG. 2 is a schematic view of the major components of a first embodiment of the invention.
FIG. 3 is a side view, in elevation, of a first embodiment of the human interface device of FIG. 2.
FIG. 4 is a side view, in perspective, of a second embodiment of a human interface device.
FIG. 5 is a side view, in perspective, of a third embodiment of a human interface device.
FIG. 6 is a schematic view of the components of the human interface device shown in FIG. 2.
DETAILED DESCRIPTION OF THE INVENTION
For purposes of this patent application, the terms "gaming system" and "virtual reality system" will be used interchangeably and are defined to include any system, structure(s), or device(s) incorporating a technology which allows a user to interact with a computer-simulated environment.
Referring now to the drawings, there is illustrated in FIG. 1 a gaming system, indicated generally at 10, that is conventional in the art. A gaming system is one example of a system operable to establish a computer-simulated environment, and the invention is not limited to gaming systems. The basic structure and mode of operation of the gaming system 10 are well known in the art, and only those portions of the gaming system 10 that are necessary for a complete understanding of the invention will be described. Some examples of commercially available gaming systems 10 include the Wii™ marketed by Nintendo, the PlayStation® marketed by Sony Corporation, and the Xbox® marketed by Microsoft Corporation.
As shown in FIG. 1, a typical gaming system 10 includes at least one human interface device (HID) 12, a plurality of sensors 14, at least one host device or system 16, and a display (not shown).
The HID 12 is an input device used to affect the gaming system 10 or govern the movement or actions of an entity within the computer-simulated game/environment. While the HID 12 is typically connected via wires to the host system 16, the HID 12 can also be operated in a wireless mode. As will be explained in more detail below, the HID 12 can take many different physical forms.
As shown in FIG. 1, the HID 12 typically includes, among other things, a micro-controller 18 and at least one actuator 20. As will be explained in more detail below, the micro-controller 18 is configured to process information and communicate information to a variety of other devices. The actuator 20 is configured to induce a sensory-based reaction in the user of the HID 12. Examples of actuators 20 include vibration-inducing or motion-causing motors, heat-generating elements, sound-generating elements and motion-causing solenoids. As shown in FIG. 1, the actuator 20 is electrically connected to the micro-controller 18 and configured to respond to signals from the micro-controller 18.
The sensors 14 are configured to provide input on the status of game play or game conditions. The sensors 14 can sense a condition within the action of the gaming system in that the sensors 14 can sense some physical condition associated with the user, such as the activity of the user, the physical condition of the user, or the physical environment in which the user is participating in the computer-simulated environment. For example, one example of a sensor 14 is an accelerometer, which can measure the magnitude, direction and force of movement of the user. Other types of sensors 14 can be used to provide input regarding user-initiated actions, such as pulling a trigger or moving a virtual character. Still other sensors 14 can be used to provide input regarding other conditions, such as, for example, the heart rate of the user, the temperature and humidity of the gaming environment, and the mental consciousness of the user. While the plurality of sensors 14 illustrated in FIG. 1 is shown as being positioned exterior to the HID 12, it should be understood that the plurality of sensors 14 can be positioned in various locations, including internal to the HID 12. While the embodiment shown in FIG. 1 illustrates a quantity of four sensors 14, it should be understood that more or fewer than four sensors can be used.
The host system 16 typically includes, among other things, a host central processing unit (CPU) 22 and is configured to control the overall functions of the gaming system. Examples of overall system functions include loading of game software, start-up, and shut-down. The host CPU 22 typically processes input information from a variety of sources and controls the play of the gaming system.
In operation, the sensors 14 sense a condition potentially affecting the play of the gaming system 10 and generate a communication concerning the sensed condition. The sensors 14 can communicate with the micro-controller 18 positioned within the HID 12 or with the host CPU 22. The micro-controller 18 can be configured to receive the communications generated by the individual sensors 14. The communication of the various sensors 14 with the micro-controller 18 is shown in FIG. 1 as communications C1a, and the communication of the various sensors 14 with the host CPU 22 is shown as communications C1b. The micro-controller 18 processes the information and communicates the information from the sensors 14 to the host CPU 22 positioned within the host system 16, as shown by communication C2. The host CPU 22 processes the sensor information and determines an appropriate action and/or response. The action and/or response is communicated from the host CPU 22 to the micro-controller 18 positioned within the HID 12 by communication C3. The micro-controller 18 receives the communication C3 and determines the appropriate actuator 20 response. The micro-controller 18 then communicates with the appropriate actuator 20, as shown by communication C4. The actuator 20 can be configured to receive the communication C4 and provide a sensory experience in reaction to the communication. For example, the sensory experience can be a feeling of vibration or recoil. The communication C4 instructs the actuator 20 to initiate and perform an action, such as generating vibrations for a specified period of time. This cycle is repeated as additional communications C1a and C1b are received from the sensors 14.
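The conventional round-trip cycle can be sketched as code, with C1a, C2, C3 and C4 modeled as method calls. The class names, dictionary keys, and the 100 ms value are invented for illustration; the point is only that every sensor event must cross to the host CPU and back before the actuator fires:

```python
# Illustrative model of the conventional cycle of FIG. 1:
# C1a (sensor -> HID micro-controller), C2 (micro-controller -> host CPU),
# C3 (host CPU -> micro-controller), C4 (micro-controller -> actuator).

class HostCPU:
    def process(self, sensor_info: dict) -> dict:
        # C2 in, C3 out: the host alone decides the response.
        if sensor_info.get("trigger"):
            return {"action": "vibrate", "ms": 100}
        return {"action": "none", "ms": 0}

class MicroController:
    def __init__(self, host: HostCPU):
        self.host = host

    def handle_sensor(self, sensor_info: dict) -> str:
        # C1a arrives; the required C2/C3 round trip to the host follows.
        response = self.host.process(sensor_info)
        return self.drive_actuator(response)      # C4

    def drive_actuator(self, response: dict) -> str:
        if response["action"] == "vibrate":
            return f"actuator: vibrate {response['ms']} ms"
        return "actuator: idle"

mc = MicroController(HostCPU())
result = mc.handle_sensor({"trigger": True})
```

Note that `handle_sensor` cannot complete without `HostCPU.process`, which is exactly the dependency the improved device described below removes.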
As further shown in FIG. 1, additional communication of a general nature, indicated as CG, occurs between the host CPU 22 and the micro-controller 18 positioned within the HID 12. The communication CG can include typical game-play instructions.
Referring now to FIG. 2, there is illustrated an improved gaming system, indicated generally at 40, in accordance with a first embodiment of the invention. In the illustrated embodiment as shown in FIG. 2, the gaming system 40 includes at least one HID 42, a plurality of sensors 44, and at least one host device or system 46.
As shown in FIG. 2, the host system 46 can be substantially similar to the host system 16 illustrated in FIG. 1. Similarly, the plurality of sensors 44 shown in FIG. 2 can be substantially similar to the plurality of sensors 14 illustrated in FIG. 1.
The HID 42 shown in FIG. 2 includes, among other things, a micro-controller 50 and at least one actuator 48. The actuator 48 shown in FIG. 2 can be substantially similar to the actuator 20 illustrated in FIG. 1. The micro-controller 50 differs from the micro-controller 18 illustrated in FIG. 1 in that it performs the additional processing of the sensor information C1a without communicating that information to the host system 46. Accordingly, the HID 42 may include different components and different electronic circuitry than the HID 12 illustrated in FIG. 1.
Generally, the improved gaming system 40 differs from the traditional gaming system 10 shown in FIG. 1 in several ways. First, the HID 42 does not communicate with the host system 46 as to the processing of the sensor information C1a. Rather, the HID 42 receives the sensor information C1a, internally processes it, determines the appropriate actuator response, and communicates with the appropriate actuator, all without communicating the sensor information C1a to the host system 46. The micro-controller 50 is thus capable of independently determining the response of the actuator 48 that is appropriate or desirable for the condition sensed by any of the sensors 44. Second, while the actuator 48 shown in FIG. 2 can be substantially similar to the actuator 20 shown in FIG. 1, the actuator 48 is configured to respond solely to sensor information C1a processed by the micro-controller 50. Accordingly, the actuator 48 does not require communication from the host system 46 for operation.
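The improved flow of FIG. 2 can be sketched the same way: the micro-controller 50 resolves sensor events entirely inside the HID, and only general traffic CG would ever reach the host. All names and values are again illustrative, not prescribed by the specification:

```python
# Sketch of the improved flow: sensor information C1a is processed
# entirely inside the HID, with no host round trip before actuation.

class ImprovedMicroController:
    def __init__(self):
        self.host_messages = []           # only general communications CG

    def handle_sensor(self, sensor_info: dict) -> str:
        # C1a arrives and is processed locally; C4 follows immediately.
        if sensor_info.get("trigger"):
            return "actuator: recoil"
        return "actuator: idle"

    def send_general(self, message: str) -> None:
        # CG: ordinary game-play traffic still flows to the host.
        self.host_messages.append(message)

mc = ImprovedMicroController()
feedback = mc.handle_sensor({"trigger": True})
```

The contrast with the conventional cycle is that `handle_sensor` never touches `host_messages`: the actuator response is computed with zero host involvement, which is what makes the feedback immediate and the HID host-independent.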
Referring now to FIGS. 3-5, the human interface device 42 can have various physical embodiments. As shown in FIG. 3, a first physical embodiment of a HID 146 is illustrated. In this embodiment, the HID 146 is in the form of a firearm. The firearm HID 146 is configured to interact with the gaming system or virtual reality system such that the firearm HID 146 simulates the firing of a weapon. Accordingly, in an effort to make the gaming system as realistic as possible, the firearm HID 146 has all of the typical firearm components, including a stock 160, a muzzle 161, a magazine 162, a trigger 163, a receiver 164, a sight 165 and a butt end 166. In the illustrated embodiment, the firearm HID 146 also includes a recoil mechanism 170, a sensor switch 171, a mode switch 172, a power supply 173, a transmitter/receiver 174, and a motion control 175. While not illustrated in FIG. 3, the firearm HID 146 also includes the micro-controller 50 shown in FIG. 2 and associated circuitry.
As illustrated in FIG. 3 and with reference to FIG. 2, the recoil mechanism 170 is the same device as the actuator 48 shown in FIG. 2. The recoil mechanism 170 is configured to provide a realistic "backward kick," or force, resulting from the act of firing the firearm HID 146. In the illustrated embodiment, the recoil mechanism 170 is a solenoid. However, the recoil mechanism 170 can be any other suitable actuator device. In operation, as the trigger 163 is pulled, a sensor 144, illustrated as positioned internal to the firearm HID 146, senses the motion of the trigger 163 and sends information to the micro-controller 50 as shown in FIG. 2. The micro-controller 50 receives the sensor information, internally processes it, and communicates with the recoil mechanism 170, thereby producing the recoil motion. Accordingly, the recoil mechanism 170 produces an immediate, real-time action in the firearm HID 146 in response to sensor input without interaction with the host system 46. While the actuator shown in FIG. 3 is a recoil mechanism 170, the actuator can be other suitable devices; non-limiting examples are provided below.
The sensor switch 171 is configured to enable or disable communication from the sensor 144 to the micro-controller 50. In the illustrated embodiment, the sensor switch 171 is an on/off rocker switch that is conventional in the art. However, the sensor switch 171 can be other suitable devices, such as for example a slide switch, capable of enabling or disabling the communication from the sensor 144 to the micro-controller 50.
The mode switch 172 is configured to provide the firearm HID 146 with various modes of game play. The mode switch 172 includes settings for single-shot, semi-automatic and fully automatic firing rates. In operation, as each mode is selected, the sensor 144 and recoil mechanism 170 react accordingly. In other embodiments, the mode switch 172 can be configured to provide various modes of game play in accordance with the nature of the virtual reality scenario. For example, in a scenario in which the virtual reality game involves fishing, the HID 42 can be configured as a fishing pole and the mode switch 172 could be configured to select the amount of bait used.
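One way the mode switch 172 might map firing modes to recoil behavior is as a lookup from mode to a pulse schedule for the solenoid. The pulse counts and intervals below are invented purely for illustration; the patent does not specify any timing values:

```python
# Hypothetical mapping from mode-switch setting to recoil solenoid
# behavior. Pulse counts and intervals are illustrative only.

FIRING_MODES = {
    "single":    {"pulses": 1,  "interval_ms": 0},
    "semi-auto": {"pulses": 3,  "interval_ms": 150},
    "full-auto": {"pulses": 10, "interval_ms": 80},
}

def recoil_schedule(mode: str) -> list:
    """Return the times (in ms) at which the recoil solenoid would fire."""
    setting = FIRING_MODES[mode]
    return [i * setting["interval_ms"] for i in range(setting["pulses"])]
```

Because this table lives in the HID's own micro-controller, switching modes changes the felt recoil pattern without any change to, or communication with, the host system.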
Referring again to FIGS. 2 and 3, the power supply 173 is configured to provide sufficient power to the firearm HID 146 such that the firearm HID 146 is independent of the host system 46 for power. As shown in FIG. 3, the power supply 173 is a rechargeable battery pack. However, the power supply 173 can be other suitable devices sufficient to provide power to the firearm HID 146. In other embodiments, the power supply 173 can be connected to a conventional electrical outlet (not shown).
The transmitter/receiver 174 is configured to transmit and receive information to and from the host system 46. The transmitted and received information can include general communications CG, as shown in FIG. 2, as well as other necessary information. In the illustrated embodiment, the transmitter/receiver 174 operates on radio frequencies, although other methods of operation are possible.
The motion control 175 is an input device used to affect the gaming system 40 or govern the movement or actions of an entity within the computer-simulated game. As one example, the motion control 175 can be used to move an entity within the game forward, backward or sideways. In the illustrated embodiment, the motion control 175 is a knob having 360° of available motion. However, the motion control 175 can be any mechanism or device suitable to govern the movement or actions of an entity within the computer-simulated game.
While not illustrated in FIG. 3, it should be understood that the firearm HID 146 can project images onto the display of the gaming system. The images can include any suitable visual effect, such as for example a cross-hair and bullet tracers.
As shown in FIG. 4, a second physical embodiment of a HID 246 is illustrated. In this embodiment, the HID 246 is in the form of a joystick. As non-limiting examples, the joystick 246 can be used in gaming systems involving the control of helicopters or antique aircraft. The HID 246 illustrated in FIG. 4 includes the sensor, micro-controller, and feedback mechanism, and these components operate in the same manner as described above.
As shown in FIG. 5, a third physical embodiment of a HID 346 is illustrated. In this embodiment, the HID 346 is in the form of a steering wheel. As non-limiting examples, the steering wheel 346 can be used in gaming systems involving the control of race cars or boats. The HID 346 illustrated in FIG. 5 includes all of the components and operates in the same manner as described above.
While the illustrated embodiments of the human interface devices include a firearm, a joystick, and a steering wheel, it should be understood that the human interface devices can represent any control mechanism suitable to affect the gaming system 10 or govern the movement or actions of an entity within the computer-simulated game, such as, for example, a fishing pole, a tennis racket, or a surfboard.
One example of suitable internal components and circuitry within the HID 42 is shown in FIG. 6. In the illustrated embodiment, the HID 42 includes the micro-controller 50, the feedback mechanism (or actuator) 48, a receiver 80, a transmitter 82, voltage conversion circuitry 84, a power supply 86, memory 88, a time device 89, a variable signal generator 90, various user switches 92a and 92b, and sensor output 94.
The HID 42 is activated through a switch, such as for example the user switch 92a. Activating the HID 42 enables the micro-controller 50. The micro-controller 50 then awaits a user event, a sensor event or sensor input. All sensor outputs 94 and inputs from the user switches 92a and 92b interact directly with, and only with, the micro-controller 50.
After receiving input, the micro-controller 50 communicates with the host system 46. The communications from the micro-controller 50 to the host system 46 are transmitted via industry-standard communication protocols and devices, such as for example the receiver 80 and the transmitter 82. The HID 42 does not require custom protocols, additional or special hardware, or custom or special drivers beyond the standard protocols, hardware and drivers currently residing on the host system 46.
The micro-controller 50 can interpret certain sensor outputs 94 and certain inputs from the user switches 92a and 92b as inputs that require user feedback through the feedback mechanism 48. In those situations, the micro-controller 50 enables the power supply 86 to supply power to the feedback mechanism 48. The micro-controller 50 further directs the feedback mechanism 48 to provide a level of feedback limited according to the user setting 92a. As described above, the feedback mechanism 48 can apply the feedback to the user in any desired form. In one embodiment, the user input 92a can be channeled to the micro-controller 50, causing the micro-controller 50 to enable the variable signal generator 90. The signal generator 90 communicates a signal to a power amplifier, which amplifies the signal so that the signal may vary the state of the feedback mechanism 48 accordingly. The user's settings 92a and 92b thus affect the type of feedback the user receives. In one embodiment, the user can adjust the variable signal generator 90 through the user switch 92b to get a different form of signal wave and, accordingly, a different form of feedback. While the illustrated embodiment shown in FIG. 6 provides one example of suitable internal components and circuitry within the HID 42, it should be appreciated that the HID 42 could include different internal components and different circuitry that operate and function in a different manner.
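A sketch of how the variable signal generator 90 might shape feedback: one user switch selects the waveform, and the user's feedback-level setting caps the output amplitude. The waveform math and parameter names are illustrative assumptions, not taken from the specification:

```python
# Hypothetical model of the variable signal generator 90: switch 92b
# selects the waveform, and the feedback-level setting (92a) limits
# the amplitude of the signal driving the feedback mechanism 48.

import math

def feedback_signal(waveform: str, level: float, samples: int = 8) -> list:
    """Generate one cycle of the drive signal, scaled to the user level."""
    level = min(max(level, 0.0), 1.0)      # user setting limits the output
    out = []
    for i in range(samples):
        phase = i / samples
        if waveform == "sine":
            value = math.sin(2 * math.pi * phase)
        elif waveform == "square":
            value = 1.0 if phase < 0.5 else -1.0
        else:  # "off" or an unknown setting: no feedback
            value = 0.0
        out.append(round(level * value, 3))
    return out
```

Changing the `waveform` argument stands in for flipping the user switch 92b (a different signal wave, hence a different feel), while `level` stands in for the 92a feedback-level limit.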
The improved HID 42 provides many benefits over a traditional HID 12. First, because the sensor information processing is performed within the HID 42 and communication with the host system 46 is not necessary for processing the sensor information, the HID 42 is compatible with any gaming or virtual reality system without software or hardware changes to the host system 46. For example, a common HID can be used with different host systems through adaptors. Second, because the HID 42 is independent of the host system 46, any number of HIDs 42 can be used simultaneously with the host system 46. Third, the HID 42 is customizable to communicate with various types of sensors and provide various types of sensory feedback. Fourth, the HID 42 can have a wide variety of physical embodiments. Other advantages are also apparent from a reading of the specification and claims and from a study of the Figures.
It is also noted that a sensor can communicate directly with the feedback mechanism, bypassing the micro-controller. For example, a photo-sensor could trigger the feedback mechanism when the photo-sensor is exposed to light. Alternatively, a pressure-sensor button can cause feedback to occur upon being pushed. In such an arrangement, the sensor and feedback mechanism can be connected by circuitry.
The principle and mode of operation of this invention have been described in its preferred embodiments. However, it should be noted that this invention may be practiced otherwise than as specifically illustrated and described without departing from its scope.

Claims (6)

1. A human interface device for use by a user to interact with a computer-simulated environment comprising:
at least one sensor configured to sense a condition within the computer-simulated environment and generate a first communication concerning the sensed condition;
at least one micro-controller configured to receive the first communication concerning the sensed condition from the at least one sensor, the at least one micro-controller further configured to generate a second communication in reaction to the first communication from the sensor; and
at least one actuator configured to receive the second communication from the at least one micro-controller and provide a sensory experience in reaction to the second communication, wherein the at least one micro-controller is operable to independently determine the response of the at least one actuator appropriate to the condition sensed by the at least one sensor.
2. The human interface device of claim 1 wherein the at least one actuator is configured to respond solely to sensor information processed by the at least one micro-controller.
3. A gaming system comprising:
a host device having a central processing unit configured to control overall functions of the gaming system; and
a human interface device for use by a user of the gaming system and comprising:
at least one sensor configured to sense a condition within the gaming system and generate a first communication concerning the sensed condition;
at least one micro-controller positioned within the human interface device and configured to receive the first communication concerning the sensed condition from the at least one sensor, the at least one micro-controller further configured to independently process the first communication and generate a second communication in reaction to the first communication from the sensor; and
at least one actuator configured to receive the second communication from the at least one micro-controller and provide a sensory experience in reaction to the second communication.
4. The gaming system of claim 3 wherein the at least one micro-controller does not require communication from the host device for operation.
5. The gaming system of claim 3 wherein the at least one micro-controller does not communicate with the host device as to the processing of the first communication.
6. A method of operating a human interface device to remotely interface with a computer, the method comprising the steps of:
sensing a condition within the action of a virtual reality system;
generating a first communication concerning the sensed condition;
sending the first communication to a human interface device for processing;
generating a second communication solely within the human interface device in reaction to the received communication;
sending the second communication to at least one actuator positioned within the human interface device; and
providing a sensory experience with the at least one actuator in reaction to the received human interface device communication.
US12/649,277 2008-12-29 2009-12-29 Human interface device Abandoned US20100167820A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14105208P 2008-12-29 2008-12-29
US12/649,277 US20100167820A1 (en) 2008-12-29 2009-12-29 Human interface device

Publications (1)

Publication Number US20100167820A1, published 2010-07-01

Family

ID=42285629


US5805140A (en) * 1993-07-16 1998-09-08 Immersion Corporation High bandwidth force feedback interface using voice coils and flexures
US5880714A (en) * 1993-07-16 1999-03-09 Immersion Corporation Three-dimensional cursor control interface with force feedback
USRE36387E (en) * 1994-01-26 1999-11-09 Immersion Corporation Percussion input device for personal computer systems
US6323837B1 (en) * 1994-07-14 2001-11-27 Immersion Corporation Method and apparatus for interfacing an elongated object with a computer system
US6654000B2 (en) * 1994-07-14 2003-11-25 Immersion Corporation Physically realistic computer simulation of medical procedures
US6037927A (en) * 1994-07-14 2000-03-14 Immersion Corporation Method and apparatus for providing force feedback to the user of an interactive computer simulation
USRE37528E1 (en) * 1994-11-03 2002-01-22 Immersion Corporation Direct-drive manipulator for pen-based force display
US20080287181A1 (en) * 1994-12-19 2008-11-20 Legal Igaming, Inc. Universal gaming engine
US6201533B1 (en) * 1995-01-18 2001-03-13 Immersion Corporation Method and apparatus for applying force in force feedback devices using friction
US6154198A (en) * 1995-01-18 2000-11-28 Immersion Corporation Force feedback interface apparatus including backlash and for generating feel sensations
US6271828B1 (en) * 1995-01-18 2001-08-07 Immersion Corporation Force feedback interface devices providing resistance forces using a fluid
US6437771B1 (en) * 1995-01-18 2002-08-20 Immersion Corporation Force feedback device including flexure member between actuator and user object
US6246390B1 (en) * 1995-01-18 2001-06-12 Immersion Corporation Multiple degree-of-freedom mechanical interface to a computer system
US6400352B1 (en) * 1995-01-18 2002-06-04 Immersion Corporation Mechanical and force transmission for force feedback devices
US6856873B2 (en) * 1995-06-07 2005-02-15 Automotive Technologies International, Inc. Vehicular monitoring systems using image processing
US6486872B2 (en) * 1995-06-09 2002-11-26 Immersion Corporation Method and apparatus for providing passive fluid force feedback
US6015473A (en) * 1995-08-07 2000-01-18 Immersion Corporation Method for producing a precision 3-D measuring apparatus
US6661403B1 (en) * 1995-09-27 2003-12-09 Immersion Corporation Method and apparatus for streaming force values to a force feedback device
US5907487A (en) * 1995-09-27 1999-05-25 Immersion Corporation Force feedback device with safety feature
US6348911B1 (en) * 1995-09-27 2002-02-19 Immersion Corporation Force feedback device including safety switch and force magnitude ramping
US5929607A (en) * 1995-09-27 1999-07-27 Immersion Corporation Low cost force feedback interface with efficient power sourcing
US5999168A (en) * 1995-09-27 1999-12-07 Immersion Corporation Haptic accelerator for force feedback computer peripherals
US6342880B2 (en) * 1995-09-27 2002-01-29 Immersion Corporation Force feedback system including multiple force processors
US6166723A (en) * 1995-11-17 2000-12-26 Immersion Corporation Mouse interface device providing force feedback
US6639581B1 (en) * 1995-11-17 2003-10-28 Immersion Corporation Flexure mechanism for interface device
US6191774B1 (en) * 1995-11-17 2001-02-20 Immersion Corporation Mouse interface for providing force feedback
US6100874A (en) * 1995-11-17 2000-08-08 Immersion Corporation Force feedback mouse interface
US6061004A (en) * 1995-11-26 2000-05-09 Immersion Corporation Providing force feedback using an interface device including an indexing function
US6424333B1 (en) * 1995-11-30 2002-07-23 Immersion Corporation Tactile feedback man-machine interface device
US6366272B1 (en) * 1995-12-01 2002-04-02 Immersion Corporation Providing interactions between simulated objects using force feedback
US6169540B1 (en) * 1995-12-01 2001-01-02 Immersion Corporation Method and apparatus for designing force sensations in force feedback applications
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US6147674A (en) * 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US5959613A (en) * 1995-12-01 1999-09-28 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US6278439B1 (en) * 1995-12-01 2001-08-21 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US6353850B1 (en) * 1995-12-13 2002-03-05 Immersion Corporation Force feedback provided in web pages
US6078308A (en) * 1995-12-13 2000-06-20 Immersion Corporation Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
US6317116B1 (en) * 1995-12-13 2001-11-13 Immersion Corporation Graphical click surfaces for force feedback applications to provide selection of functions using cursor interaction with a trigger position of a graphical object
US5956484A (en) * 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
US6101530A (en) * 1995-12-13 2000-08-08 Immersion Corporation Force feedback provided over a computer network
US6161126A (en) * 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US6050718A (en) * 1996-03-28 2000-04-18 Immersion Corporation Method and apparatus for providing high bandwidth force feedback with improved actuator feel
US6374255B1 (en) * 1996-05-21 2002-04-16 Immersion Corporation Haptic authoring
US6125385A (en) * 1996-08-01 2000-09-26 Immersion Corporation Force feedback implementation in web pages
US6024576A (en) * 1996-09-06 2000-02-15 Immersion Corporation Hemispherical, high bandwidth mechanical interface for computer systems
US6411276B1 (en) * 1996-11-13 2002-06-25 Immersion Corporation Hybrid control of haptic feedback for host computer and interface device
US6636161B2 (en) * 1996-11-26 2003-10-21 Immersion Corporation Isometric haptic feedback interface
US6232891B1 (en) * 1996-11-26 2001-05-15 Immersion Corporation Force feedback interface device having isometric functionality
US6636197B1 (en) * 1996-11-26 2003-10-21 Immersion Corporation Haptic feedback effects for control knobs and other interface devices
US6259382B1 (en) * 1996-11-26 2001-07-10 Immersion Corporation Isotonic-isometric force feedback interface
US6154201A (en) * 1996-11-26 2000-11-28 Immersion Corporation Control knob with multiple degrees of freedom and force feedback
US6020876A (en) * 1997-04-14 2000-02-01 Immersion Corporation Force feedback interface with selective disturbance filter
US6310605B1 (en) * 1997-04-14 2001-10-30 Immersion Corporation Force feedback interface with selective disturbance filter
US6292170B1 (en) * 1997-04-25 2001-09-18 Immersion Corporation Designing compound force sensations for computer applications
US6285351B1 (en) * 1997-04-25 2001-09-04 Immersion Corporation Designing force sensations for computer applications including sounds
US6252579B1 (en) * 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6288705B1 (en) * 1997-08-23 2001-09-11 Immersion Corporation Interface device and method for providing indexed cursor control with force feedback
US6292174B1 (en) * 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
US6102802A (en) * 1997-10-01 2000-08-15 Armstrong; Brad A. Game controller with analog pressure sensor(s)
US6343991B1 (en) * 1997-10-01 2002-02-05 Brad A. Armstrong Game control with analog pressure sensor
US6380925B1 (en) * 1997-10-31 2002-04-30 Immersion Corporation Force feedback device with spring selection mechanism
US6104382A (en) * 1997-10-31 2000-08-15 Immersion Corporation Force feedback transmission mechanisms
US6020875A (en) * 1997-10-31 2000-02-01 Immersion Corporation High fidelity mechanical transmission system and interface device
US6281651B1 (en) * 1997-11-03 2001-08-28 Immersion Corporation Haptic pointing devices
US6448977B1 (en) * 1997-11-14 2002-09-10 Immersion Corporation Textures and other spatial sensations for a relative haptic interface device
US6300936B1 (en) * 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6343349B1 (en) * 1997-11-14 2002-01-29 Immersion Corporation Memory caching for force feedback effects
US6252583B1 (en) * 1997-11-14 2001-06-26 Immersion Corporation Memory and force output management for a force feedback system
US6256011B1 (en) * 1997-12-03 2001-07-03 Immersion Corporation Multi-function control device with force feedback
US6304091B1 (en) * 1998-02-10 2001-10-16 Immersion Corporation Absolute position sensing by phase shift detection using a variable capacitor
US6128006A (en) * 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6067077A (en) * 1998-04-10 2000-05-23 Immersion Corporation Position sensing for force feedback devices
US6300938B1 (en) * 1998-04-13 2001-10-09 Immersion Corporation Multiple-cylinder control device for computers and other electronic apparatus
US6353427B1 (en) * 1998-06-23 2002-03-05 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6211861B1 (en) * 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
US6563487B2 (en) * 1998-06-23 2003-05-13 Immersion Corporation Haptic feedback for directional control pads
US6243078B1 (en) * 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
US6469692B2 (en) * 1998-06-23 2002-10-22 Immersion Corporation Interface device with tactile feedback button
US6088019A (en) * 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6429846B2 (en) * 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6344791B1 (en) * 1998-07-24 2002-02-05 Brad A. Armstrong Variable sensor with tactile feedback
US6424356B2 (en) * 1999-05-05 2002-07-23 Immersion Corporation Command of force sensations in a forceback system using force effect suites
US6564168B1 (en) * 1999-09-14 2003-05-13 Immersion Corporation High-resolution optical encoder with phased-array photodetectors
US20030157984A1 (en) * 2001-03-30 2003-08-21 Myung-Gwan Kim Method for game using low frequency and thereof device
US7413510B2 (en) * 2002-02-08 2008-08-19 Igt Gaming device having a related symbol selection game
US20090233710A1 (en) * 2007-03-12 2009-09-17 Roberts Thomas J Feedback gaming peripheral
US20100228156A1 (en) * 2009-02-16 2010-09-09 Valero-Cuevas Francisco J Dexterity device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Communication Tutorial 1: Human-Interface Devices, 11 May 2008. *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170361216A1 (en) * 2015-03-26 2017-12-21 Bei Jing Xiao Xiao Niu Creative Technologies Ltd. Method and system incorporating real environment for virtuality and reality combined interaction
US10661157B2 (en) * 2015-03-26 2020-05-26 Bei Jing Xiao Xiao Niu Creative Technologies Ltd. Method and system incorporating real environment for virtuality and reality combined interaction
CN107930110A (en) * 2017-10-31 2018-04-20 深圳华侨城卡乐技术有限公司 A kind of virtual reality device and its control method based on sea rover of playing
US11325030B2 (en) * 2018-09-20 2022-05-10 Protubevr Game pad holder for virtual reality video games, comprising a mechanical force feedback device
US20230398440A1 (en) * 2020-12-17 2023-12-14 Sinden Technology Ltd Improvements in or relating to gaming controllers
US11904236B2 (en) * 2020-12-17 2024-02-20 Sinden Technology Ltd In or relating to gaming controllers

Similar Documents

Publication Publication Date Title
CN110325947B (en) Haptic interaction method, tool and system
EP1711239B1 (en) Control apparatus for use with a computer or video game system
US9498705B2 (en) Video game system having novel input devices
US7896733B2 (en) Method and apparatus for providing interesting and exciting video game play using a stability/energy meter
US8096863B2 (en) Emotion-based game character manipulation
US8740708B2 (en) Gun handle attachment for game controller
US20180050267A1 (en) Tactile feedback systems and methods for augmented reality and virtual reality systems
CN201267713Y (en) Gun type multifunctional game input handle
US20080096657A1 (en) Method for aiming and shooting using motion sensing controller
US20080096654A1 (en) Game control using three-dimensional motions of controller
EP1388357A2 (en) Group behavioural modification using external stimuli
US10569165B2 (en) Tactile feedback systems and methods for augmented reality and virtual reality systems
US20020107060A1 (en) Entertainment system, entertainment apparatus, recording medium, and program
US20080125224A1 (en) Method and apparatus for controlling simulated in flight realistic and non realistic object effects by sensing rotation of a hand-held controller
JP2019017475A (en) Game system, game program, game device and game processing method
US20130324249A1 (en) Computer readable storage medium, game apparatus, game system, and game processing method
US10697996B2 (en) Accelerometer sensing and object control
EP2533132A1 (en) Computer, and recording medium
US20100167820A1 (en) Human interface device
KR100871442B1 (en) FPS game control device
US20230277936A1 (en) Information processing system, non-transitory computer-readable storage medium having stored therein information processing program, information processing method, and information processing apparatus
JP2024039732A (en) Program, information processing method, and information processing device
KR200212384Y1 (en) A mouse type input device which generates vibration in order to realize computer game
CN114042322A (en) Animation display method and device, computer equipment and storage medium
JP2007069011A (en) Game system and game information storage medium used therefor

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE UNIVERSITY OF TOLEDO, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BARAKAT, HOUSSAM;LILLY, BRADFORD R.;SHENAI, KRISHNA;SIGNING DATES FROM 20100119 TO 20100120;REEL/FRAME:023851/0552

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION