US20100009753A1 - User Interface for Functionally Limited Users - Google Patents

User Interface for Functionally Limited Users

Info

Publication number
US20100009753A1
Authority
US
United States
Prior art keywords
interface
user interface
user
processor
wearable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/501,248
Inventor
Eddo Stern
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/501,248 priority Critical patent/US20100009753A1/en
Publication of US20100009753A1 publication Critical patent/US20100009753A1/en
Abandoned legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/06
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/30: Interconnection arrangements between game servers and game devices; between game devices; between game servers
    • A63F13/35: Details of game servers
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Output control involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533: Output control for prompting the player, e.g. by displaying a game menu
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012: Input arrangements involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A63F2300/30: Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/64: Data processing for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car

Definitions

  • This disclosure relates to a user interface for functionally limited users.
  • the present disclosure generally relates to devices and methods for creating multi-sensory user interfaces.
  • the present disclosure also generally relates to interactive user headsets for use with a computer system and computer programs, such as computer or electronic games.
  • Serious gaming deals with the educational, medical, economic, and political applications of games. Examples include news games (games based on news events that function as a tool for better understanding our society), virtual store fronts (games for training employees on customer interactions and relations), and virtual assembly lines (games for introducing new assembly techniques).
  • This disclosure describes technologies relating to a wearable user device for imparting sensory information that allows the user to determine the identity of, and characterizing information about, specific entities used in a computer program.
  • a wearable user interface including a hub or core, a processor, a potentiometer, one or more flexible supports, and one or more pulse motors.
  • the processor can be configured to convert source information and vector information (e.g., distance, direction, and identification of an object) used as part of an electronic game into pulsed patterns.
  • the pulsed patterns can vary in direction, duration, intensity, pattern type, pulse source, etc.
  • the processor can be coupled to a potentiometer to increase or decrease signal intensity from the processor.
  • the processor can be coupled to one or more pulse motors.
  • the pulse motors are configured to receive the pulsed patterns generated by the processor.
  • the pulse motors convert the electronic pulsed pattern information generated by the processor into a pulsed sensory pattern.
  • the pulsed sensory pattern can be a vibratory pattern.
  • the pulsed sensory pattern can be a light pattern.
  • the pulsed sensory pattern can be an auditory pattern.
  • a method includes: exporting location information relevant to various aspects of a computer gaming system; processing the location information into multidirectional metered pulses; and activating one or more sensory signal generators to correspond with the multidirectional metered pulse.
  • Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
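The three method steps above (exporting location information, processing it into metered pulses, and activating signal generators) can be sketched as a single pass over nearby entities. All names, the linear intensity falloff, and the crude two-way direction mapping are illustrative assumptions, not part of the disclosure:

```python
def process_location(entity_positions, player_pos):
    """Sketch of the claimed method: export location information,
    process it into metered pulses, and emit activation commands
    for sensory signal generators."""
    commands = []
    for obj_id, (x, y) in entity_positions.items():
        # Relative offset of the entity from the player's character.
        dx, dy = x - player_pos[0], y - player_pos[1]
        distance = (dx ** 2 + dy ** 2) ** 0.5
        # Crude two-way direction; a real headset would use many generators.
        direction = "Right" if dx >= 0 else "Left"
        # Closer entities produce stronger pulses on a 0-255 scale.
        intensity = max(0, 255 - int(25 * distance))
        commands.append((obj_id, direction, intensity))
    return commands
```

Each resulting command names an entity, the side of the body on which to pulse, and how strongly, which is the shape of information the pulse generators described below consume.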
  • Particular implementations can convey sensory information including distance and direction of virtual and real world objects to a user without the use of visual or verbal signals.
  • identifying information such as source, type, affiliation, etc., can be conveyed to a user without the use of visual or verbal signals.
  • FIG. 1 depicts an implementation of a user headset.
  • FIG. 2 shows a system diagram for an implementation of an interactive wearable interface.
  • FIG. 3 is a flowchart of an implementation of a process of a game interacting with the headset.
  • FIG. 4 depicts an implementation of a wearable user interface.
  • Limiting conditions can be functional limitations, or disabilities—such as blindness, deafness, or mobility limitations.
  • the field of game accessibility deals with the accessibility of electronic games (computer games, console games, etc.) for disabled gamers. Game accessibility allows a user to play or otherwise participate in an electronic game even when functioning under limiting conditions.
  • a wearable user interface and associated software provides an electronic game interface for users who operate under limiting conditions, such as blindness or deafness.
  • the wearable user interface functions to translate a three-dimensional or two-dimensional virtual game environment into haptic (vibratory), visual, or audible feedback provided to the user through the wearable interface, such as a headset.
  • the three-dimensional or two-dimensional game environment can include static and dynamic three-dimensional rendered entities (i.e., game objects), such as other players, computer-controlled players, structural geometry such as buildings and terrain, and other static or dynamic objects.
  • the wearable user interface can be any type of article that is worn on the person's body.
  • the wearable user interface is similar to an article of clothing (e.g., a vest or shirt).
  • the wearable user interface can be a headset, helmet, headband, glove, bracelet, or a combination of different articles.
  • the wearable user interface can be distinguished from a conventional handheld game controller.
  • the software component and associated processors transform relative position and identification data of various entities found in an electronic game into a vector.
  • the position and identification data is transmitted from a computer or over a network and received by a processor housed in the wearable interface.
  • the processor or microcontroller in the wearable interface, such as a headset, translates the position and identification data into an array of pulses produced by pulse generators positioned about the headset or other wearable interface.
  • the pulsed information can be haptic (vibratory), pulsed light, or audible pulses.
  • the pulsed information signals to the user the direction, distance, and identifying information of the entities present in the virtual gaming environment.
  • the software and processors can be configured to filter the position and identification information received so as to establish a maximum “seeing distance.” This filtering aspect allows the user to perceive virtual entities that are of relevance to the user's position, affiliation, or other aspects that may be of importance to the electronic game.
  • distance can be represented by the intensity of the pulse of a vibration generated, for example by an electric motor.
  • the intensity can be measured on a scale, for example using pulse-width modulation driven through a Darlington array.
  • the scale can be for example, 0 to 255, wherein 255 represents an intense or maximum strength, signaling close proximity. Lower numbers represent further distances or distances out of the viewable distance filtered through the processor of the wearable device.
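The 0-to-255 scale above can be sketched as a mapping from game-world distance to a pulse intensity. The linear falloff and the 20-unit maximum "seeing distance" are illustrative assumptions:

```python
def distance_to_intensity(distance, seeing_distance=20.0):
    """Map a game-world distance to a pulse intensity on the 0-255
    scale described above: 255 signals close proximity, and entities
    at or beyond the maximum "seeing distance" are filtered out."""
    if distance >= seeing_distance:
        return 0          # out of the viewable distance
    if distance <= 0:
        return 255        # directly on top of the player
    return round(255 * (1 - distance / seeing_distance))
```

The returned value could feed a pulse-width-modulated output driving a motor through a Darlington array, as the text suggests.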
  • Two-dimensional direction can be represented by specific pulse generators activated around the user. For example, in one implementation, four, six, eight, ten, twelve, sixteen, or more pulse generators can be positioned about the user to represent directions around a periphery.
  • three-dimensional direction can be represented by activating specific pulse generators positioned in an array across the top of the user's head.
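Mapping a two-dimensional direction onto a ring of evenly spaced pulse generators can be sketched as follows. The eight-generator default matches one of the counts mentioned above, while the angle convention is an assumption:

```python
def generator_for_angle(angle_deg, num_generators=8):
    """Return the index of the pulse generator nearest to a 2-D
    direction, with generators evenly spaced around the periphery.
    Angle convention (assumed): 0 degrees = straight ahead, clockwise."""
    sector = 360.0 / num_generators
    return round((angle_deg % 360) / sector) % num_generators
```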
  • unique haptic identification or source information is represented by a haptic or vibratory signature that is unique to each entity in the virtual environment of the electronic game.
  • Each entity or entity type in the game has a certain pulse sequence associated with it, thereby allowing users not only to sense the position (direction and distance) of game entities through their headset, but also to identify which entities are located where. For example, all trees can have a 500 ms pulse followed by a 100 ms pause associated with them. All structures can have a 100 ms pulse with a 50 ms pause.
  • Player1 can have a repeatable pattern of 200 ms on, 50 ms off, 300 ms on, 100 ms off.
  • Player2 can have a repeatable pattern of 600 ms on, 50 ms off, 50 ms on, 50 ms off, 50 ms on, 100 ms off.
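The haptic signatures above can be encoded as lists of on/off segments. The table mirrors the examples in the text; the expansion helper is an illustrative assumption about how a microcontroller might step a single pulse motor through a signature:

```python
# (on_ms, off_ms) segments taken from the examples above.
HAPTIC_SIGNATURES = {
    "Tree":      [(500, 100)],
    "Structure": [(100, 50)],
    "Player1":   [(200, 50), (300, 100)],
    "Player2":   [(600, 50), (50, 50), (50, 100)],
}

def signature_schedule(entity, cycles=1):
    """Expand a signature into the flat on/off command list a
    microcontroller could step through to drive one pulse motor."""
    schedule = []
    for _ in range(cycles):
        for on_ms, off_ms in HAPTIC_SIGNATURES[entity]:
            schedule.append(("on", on_ms))
            schedule.append(("off", off_ms))
    return schedule
```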
  • FIG. 1 shows an example implementation of a wearable user interface 10 that comprises a core structure 15 , multiple flexible supports 18 , and multiple pulse generators 20 .
  • Core structure 15 can house a processor, memory, a power supply, a network interface or connection interface for communicating with a network or computer, and can serve as a structural base or hub for the multiple flexible supports 18 .
  • Flexible supports 18 can be configured to fit around a specified body part, such as the user's head. Flexible supports 18 can be configured to have an elastic quality or a self-adjusting quality to ensure proper fit and maintain the fit when the wearable user interface 10 is in operation. The number of flexible supports 18 can vary based on the intended application. For example, in FIG. 2 , the wearable user interface 10 has eight flexible supports 18 . In other implementations, the wearable user interface 10 can have four flexible supports 18 .
  • the flexible supports 18 can be arranged around the user wearable interface 10 in any configuration.
  • FIG. 2 illustrates an implementation of the wearable user interface 10 in which eight flexible supports 18 are arranged around the wearable user interface 10 so that the individual flexible supports 18 are each separated by the same distance.
  • the flexible supports 18 can have a near-random arrangement or a non-symmetric arrangement.
  • the length of the flexible supports 18 can be adjustable or can be fixed lengths. In some implementations, the flexible supports 18 can have varying lengths such that the flexible supports 18 are in contact with the user's body at varying vertical positions. For example, in an implementation with four flexible supports 18 , two flexible supports 18 can have lengths equal to six inches and the other two flexible supports 18 can have lengths equal to four inches so they come in contact with the user's body at different points.
  • Pulse generators 20 can be positioned at any point along the flexible supports 18 .
  • One or more pulse generators 20 can be included on any given flexible support 18 .
  • a pulse generator 20 is positioned at a distal end of each flexible support 18 .
  • multiple pulse generators 20 can be positioned along the length of each flexible support 18.
  • pulse generators 20 can be near the core structure 15 or on the core structure 15 such that the pulse generators 20 are near the top of the user's head.
  • the pulse generators 20 can be in direct contact with the user's body.
  • the pulse generators 20 are arranged such that they are in indirect contact with the user's body (i.e., the pulse generators 20 and the user's body are separated by a wall or barrier).
  • the pulse generators 20 are coupled to the processor housed in the core structure 15 and configured to receive electronic signals from the processor housed in the core structure 15 .
  • the pulse generators 20 are configured to convert electronic signals into pulsed sensory information including pulsed vibratory information, pulsed light or visual information, or pulsed auditory information.
  • Pulse generator 20 can vary the duration, intensity, pattern, and type of pulsed information as an output to the user.
  • One or more pulse generators 20 can cooperatively operate to convey changing duration, intensity, pattern and type of the pulsed information, thereby giving the user a sense of direction, distance, and identifying information.
  • FIG. 2 shows an implementation of a computer system incorporating the wearable user interface of the present invention.
  • the computer system comprises a computer terminal 30 , a server 32 or a mobile device 34 , a network 40 and a wearable interface 50 similar to the wearable user interface 10 described above in connection with FIG. 1 .
  • the computer terminal 30 can be a mobile computer, a game console or a mobile gaming device.
  • the network 40 can be any type of network.
  • network 40 can be a LAN, WAN or a wireless network.
  • Wearable interface 50 comprises a processor 52 for receiving position and identification information from the computer 30 or server 32 via network 40 .
  • the processor 52 can receive position and identification information over a wireless network or a hardwired network connection via the connector 54 .
  • the connector 54 can be any type of network interface or connector.
  • the connector 54 can be a RS-232 connector, a USB connector, a wireless internet receiver/transmitter or can be an Ethernet connector.
  • processor 52 can receive position and identification information directly from computer 30 or server 32 .
  • Processor 52 can be configured to translate position and identification information into a plurality of signals 55 .
  • the plurality of signals 55 include instructions to multiple pulse generators 58 .
  • a potentiometer 56 can be included to boost or otherwise adjust the strength of signals 55 before the signals reach the pulse generators 58 .
  • processor 52 can be coupled to or otherwise in communication with peripheral devices 60 to trigger other signals to the user.
  • the processor 52 can be coupled to a game controller 60 that can also provide pulsed information such as vibrations.
  • the processor 52 can be coupled to or arranged to communicate with memory 59 .
  • the memory 59 can be any type of memory such as RAM, EEPROM, or a flash memory.
  • the memory 59 can be used to store software, haptic signatures or game information.
  • FIG. 3 is a flowchart of an exemplary process 100 of a game operating on a game console/computer terminal 30 interacting with a player using the wearable user interface 10.
  • the game first determines whether any game objects (e.g., another player, an enemy character, a bullet or obstacles) are in the area around the player's on-screen character (i.e., scan the area around the player) (block 102 ).
  • the game periodically scans the area around the player's on-screen character.
  • the game may be configured to scan the area around the character every half second.
  • the scan period (i.e., the amount of time between each scan) can vary between implementations.
  • the user can trigger a scan by pressing a keyboard/game controller button or using some other form of input, such as a voice command.
  • the game can periodically scan the area around the player's character and accept inputs from the user to trigger a scan of the area.
  • the area around the player's on-screen character to be scanned can be determined by the game, and in other implementations, the character zone can be determined by the user through game options or different user inputs.
  • the character zone can be defined to be the entire screen or any portion of the screen around the user's character.
  • the character zone can be defined using any form of measurement or distance.
  • the character zone can be an area around the player's character measured by pixels, relative distances within the game (i.e., distances as measured in the game environment), or by actual on-screen distances (i.e., distances as measured on the screen).
  • the game and the user wearable interface 10 are both configured to interpret the character zone in the same measurement units.
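The scan of block 102 can be sketched as a distance filter over the character zone. The 5-unit radius and the (x, y) coordinate scheme are illustrative assumptions; the zone could equally be measured in pixels or on-screen distances as described above:

```python
import math

def scan_zone(player_pos, objects, radius=5.0):
    """Return game objects inside the character zone (block 102),
    nearest first. `objects` maps an object identifier to its (x, y)
    position in game-world units; the radius is an assumed default."""
    hits = []
    for obj_id, (x, y) in objects.items():
        d = math.hypot(x - player_pos[0], y - player_pos[1])
        if d <= radius:
            hits.append((obj_id, d))
    return sorted(hits, key=lambda hit: hit[1])
```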
  • if the game determines that there is a game object in the area around the player, the game prepares a message to transmit to the user wearable interface 10 (block 106).
  • the message can be a vector including relative position and identification data of the object(s) found in the area around the player.
  • a message can be a vector comprising: 1) the distance the game object is from the user's character; 2) the direction the object is from the user's character; 3) the relative velocity of the object; and 4) an identifier for the object (“an object identifier”). For example, if the game performs a scan and determines a tree is two feet to the right of the player's character in the game environment, the game can prepare a message such as (2, Right, 0, Tree).
  • the game can prepare a message such as (1, Up, 0.5, Bullet).
  • the object identifiers are predetermined and are common to the game and the user wearable interface 10 .
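The four-component message above can be modeled as a small record. The comma-separated wire format below is an assumption, since the disclosure does not fix an encoding for the RS-232/USB link:

```python
from collections import namedtuple

# (distance, direction, relative velocity, object identifier),
# matching the example messages (2, Right, 0, Tree) and (1, Up, 0.5, Bullet).
Message = namedtuple("Message", ["distance", "direction", "velocity", "object_id"])

def encode(msg):
    """Serialize a message for transmission to the wearable interface
    (assumed comma-separated text; link-level framing is out of scope)."""
    return f"{msg.distance},{msg.direction},{msg.velocity},{msg.object_id}"

def decode(wire):
    """Parse a wire string back into a Message on the headset side."""
    d, direction, v, obj = wire.split(",")
    return Message(float(d), direction, float(v), obj)
```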
  • the message is then transmitted to the user wearable interface 10 by the game console/computer 30 (block 108 ).
  • the message can be transmitted using any existing communication protocol, such as RS232 or USB, or can use a custom communication protocol similar to the communication protocol between a Sony Playstation and the Sony Playstation controllers.
  • the message can be communicated to the user wearable interface 10 via the network 40 .
  • the message can be communicated directly to the user wearable interface 10 .
  • in some implementations, the user wearable interface 10 is physically connected to the game console/computer 30.
  • the user wearable interface 10 is directly connected to the game console/computer 30 using some wireless communication protocol such as Bluetooth or an infrared communication protocol.
  • the user wearable interface 10 receives the message from the game and then decodes the message (block 110 ).
  • the user wearable interface 10 decodes the message into the various components and translates the information into an array of pulses to be produced by the pulse generators 20 (i.e., haptic signature).
  • the user wearable interface 10 can access a database that maps the object identifier to the haptic signature for each game object and directional information to specific pulse generators 20 .
  • the Tree identifier can be stored in the database as a 500 ms pulse followed by a 100 ms pause or the Bullet identifier can be mapped to a repeatable pattern of a 50 ms pulse and a 100 ms pause.
  • the user wearable interface 10 then transmits the haptic signature to one or more pulse generators 20, which are then actuated (block 112). For example, if the user wearable interface 10 receives the message (2, Right, 0, Tree), the processor 52 can access the database and determine that the haptic signature is a 500 ms pulse followed by a 100 ms pause that will be produced by a pulse generator 20 located on the right side of the user wearable interface 10.
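The headset-side handling of blocks 110-112 can be sketched as a decode-and-dispatch step. The signature table reuses the database examples above; the direction-to-generator mapping is an illustrative assumption:

```python
# Example database entries from the text, plus an assumed mapping from
# a direction label to a pulse generator index on the headset.
SIGNATURES = {"Tree": [(500, 100)], "Bullet": [(50, 100)]}
GENERATOR_FOR_DIRECTION = {"Front": 0, "Right": 2, "Back": 4, "Left": 6}

def dispatch(message):
    """Decode a (distance, direction, velocity, object_id) message and
    return which generator to drive and with which haptic signature."""
    distance, direction, velocity, object_id = message
    return GENERATOR_FOR_DIRECTION[direction], SIGNATURES[object_id]
```

For the example message (2, Right, 0, Tree), this selects the right-side generator and the tree signature of a 500 ms pulse with a 100 ms pause.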
  • Process 100 then returns to scanning the area around the user's character (block 102 ).
  • the process 100 operates in parallel to a process to receive input from the user.
  • process 100 can operate in parallel to a process that determines if the user is moving the character or shooting an object.
  • Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
  • data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few.
  • Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • Implementations of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • the wearable user interface can be similar to a headband, as shown in FIG. 4.
  • the wearable user interface 100 can include a central core 115 , a band 118 and a plurality of pulse generators 120 .
  • the central core 115 is similar to the central core 15 described above in connection with FIG. 1
  • the pulse generators 120 are similar to the pulse generators 20 described above in connection with FIG. 1 .
  • the band 118 can be configured to fit around a specified body part, such as the user's head or arm.
  • the band 118 can also be configured to have an elastic quality or self-adjusting quality to ensure proper fit and maintain the fit when the wearable user interface 100 is in operation.
  • the band 118 can be user-adjustable similar to a belt worn with clothing.
  • the pulse generators 120 can be positioned at any point along the surface of the band 118 . Any number of pulse generators 120 can be placed at any position along the band 118 . For example, one pulse generator 120 can be placed over the right ear, one pulse generator can be placed over the left ear, one pulse generator 120 can be placed on the user's forehead and the last pulse generator 120 can be placed on the back of the user's forehead. In some implementations, the pulse generators 120 are placed within the band such that the pulse generators 120 are not in direct contact with the user's body. In some implementations, the pulse generators 120 are placed along the surface of the band 118 that is intended to be in contact with the user's body but is covered by fabric or other material.
  • the central core 115 can be configured such that it is located near the back of the user's head or near the user's neck. In some implementations, the central core 115 can be placed away from the user's body and coupled to the band 118 and/or pulse generators 120 via wires or other type of connector. For example, the central core 115 can be worn on the user's waistband and connected to the band 118 via a wire extending from the central core 115 to the band 118 .

Abstract

A gaming interface that includes a wearable user interface. The wearable user interface is configured to be worn on a user's body and includes a central body and a plurality of flexible members extending from the central body. A pulse generator is arranged along each of the plurality of flexible members.

Description

    CLAIM OF PRIORITY
  • This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 61/080,163, filed Jul. 11, 2008, which is incorporated by reference in its entirety.
  • TECHNICAL FIELD
  • This disclosure relates to a user interface for functionally limited users.
  • BACKGROUND
  • The present disclosure generally relates to devices and methods for creating multi-sensory user interfaces. The present disclosure also generally relates to interactive user headsets for use with a computer system and computer programs, such as computer or electronic games.
  • Games have been growing in depth and complexity over the past years and have become one of the most popular forms of entertainment, with yearly sales exceeding 7.4 billion dollars. Many innovative developments in new media arise from the game industry. At the same time, there is a huge increase in game-oriented studies at universities and academies.
  • Electronic games have evolved beyond entertainment and have found their way into other applications, such as, for example, mixed-media educational programs. The number of games not intended for entertainment is growing fast and has led to a completely new branch in the game industry, sometimes referred to as Serious Gaming. Serious gaming deals with the educational, medical, economic, and political application of games. Examples include news games—games that are based on news events and that function as a tool for a better understanding of our society; virtual store fronts—games for training employees on customer interactions and relations; and virtual assembly lines—games for introducing new assembly techniques.
  • SUMMARY
  • This disclosure describes technologies relating to a wearable user device for imparting sensory information to the user, conveying the identity and characteristics of specific entities used in a computer program.
  • In general, and in accordance with one implementation, a wearable user interface is provided including a hub or core, a processor, a potentiometer, one or more flexible supports, and one or more pulse motors. The processor can be configured to convert source information and vector information (e.g., distance, direction, and identification of an object) used as part of an electronic game into pulsed patterns. The pulsed patterns can vary in direction, duration, intensity, pattern type, pulse source, etc. The processor can be coupled to a potentiometer to increase or decrease signal intensity from the processor. The processor can be coupled to one or more pulse motors. The pulse motors are configured to receive the pulsed patterns generated by the processor. The pulse motors convert the electronic pulsed pattern information generated by the processor into a pulsed sensory pattern. The pulsed sensory pattern can be a vibratory pattern. The pulsed sensory pattern can be a light pattern. The pulsed sensory pattern can be an auditory pattern.
  • In one aspect, a method includes: exporting location information relevant to various aspects of a computer gaming system; processing the location information into multidirectional metered pulses; and activating one or more sensory signal generators to correspond with the multidirectional metered pulse. Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
  • Particular implementations can convey sensory information including distance and direction of virtual and real world objects to a user without the use of visual or verbal signals. In addition, identifying information such as source, type, affiliation, etc., can be conveyed to a user without the use of visual or verbal signals.
  • The details of one or more implementations of the invention are set forth in the accompanying drawings and the description below. Other features and aspects will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts an implementation of a user headset.
  • FIG. 2 shows a system diagram for an implementation of an interactive wearable interface.
  • FIG. 3 is a flowchart of an implementation of a process of a game interacting with the headset.
  • FIG. 4 depicts an implementation of a wearable user interface.
  • DETAILED DESCRIPTION
  • Introduction
  • The vast majority of modern computer games do not meet the needs of gamers or users who function under limiting conditions. Limiting conditions can be functional limitations or disabilities—such as blindness, deafness, or mobility limitations. The field of game accessibility deals with the accessibility of electronic games (computer games, console games, etc.) for disabled gamers. Game accessibility allows users to play or otherwise participate in an electronic game even when functioning under limiting conditions.
  • Description of Some Implementations
  • A wearable user interface and associated software provides an electronic game interface for users who operate under limiting conditions, such as blindness or deafness. The wearable user interface functions to translate a three-dimensional or two-dimensional virtual game environment into haptic (vibratory), visual, or audible feedback provided to the user through the wearable interface, such as a headset. The three or two dimensional game environment can include static and dynamic three-dimensional rendered entities (i.e., game objects), such as other players, computer controlled players, and structural geometry such as buildings, terrain and other static or dynamic objects.
  • The wearable user interface can be any type of article that is worn on the person's body. For example, in some implementations, the wearable user interface is similar to an article of clothing (e.g., a vest or shirt). In other implementations, the wearable user interface can be a headset, helmet, headband, glove, bracelet, or a combination of different articles. As another example, the wearable user interface can be distinguished from a conventional handheld game controller.
  • The software component and associated processors transform relative position and identification data of various entities found in an electronic game into a vector comprising: (1) distance; (2) direction; and (3) identification. In an implementation, the position and identification data is transmitted from a computer or over a network and received by a processor housed in the wearable interface. The processor or microcontroller in the wearable interface, such as a headset, translates the position and identification data into an array of pulses produced by pulse generators positioned about the headset or other wearable interface. The pulsed information can be haptic (vibratory), pulsed light, or audible pulses. The pulsed information signals to the user the direction, distance, and identifying information of the entities present in the virtual gaming environment. The software and processors can be configured to filter the position and identification information received so as to establish a maximum “seeing distance.” This filtering aspect allows the user to perceive virtual entities that are of relevance to the user's position, affiliation, or other aspects that may be of importance to the electronic game.
  • In an implementation, distance can be represented by the intensity of the pulse of a vibration generated, for example, by an electric motor. The intensity can be measured on a scale, for example 0 to 255 (using pulse width modulation through a Darlington array), wherein 255 represents an intense or maximum strength, signaling close proximity. Lower numbers represent greater distances or distances outside the viewable distance filtered through the processor of the wearable device.
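The 0-to-255 intensity scale and seeing-distance filter described above can be sketched as follows; the linear falloff and the function name are illustrative assumptions, not details from the disclosure:

```python
def distance_to_intensity(distance, seeing_distance, max_intensity=255):
    """Map a game-world distance to a pulse-width-modulation intensity.

    255 signals close proximity; objects at or beyond the seeing
    distance are filtered out (intensity 0). The linear falloff is an
    illustrative assumption.
    """
    if distance >= seeing_distance:
        return 0  # beyond the "seeing distance" filter
    return round(max_intensity * (1 - distance / seeing_distance))
```

For example, an object 2 units away under a 10-unit seeing distance would pulse at intensity 204, while an object at or beyond 10 units would produce no pulse at all.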
  • Two dimensional direction can be represented by specific pulse generators activated around the user. For example, in one implementation, four, six, eight, ten, twelve, sixteen or more pulse generators can be positioned about the user to represent directions around a periphery.
  • In an implementation, three dimensional direction can be represented by activating specific pulse generators positioned in an array across the top of the user's head.
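A minimal sketch of how a two-dimensional direction could select one of several evenly spaced pulse generators around the periphery; the clockwise indexing convention and the function name are assumptions for illustration:

```python
def direction_to_generator(angle_degrees, num_generators=8):
    """Select the pulse generator nearest a direction around the user.

    Assumes generators are evenly spaced around the periphery, with
    generator 0 directly ahead and indices increasing clockwise; the
    nearest generator to the given bearing is activated.
    """
    sector = 360 / num_generators
    return round((angle_degrees % 360) / sector) % num_generators
```

With eight generators, a bearing of 90 degrees (directly right) would activate generator 2, and a bearing just shy of 360 degrees wraps back to generator 0.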
  • In some implementations, unique haptic identification or source information is represented by a haptic or vibratory signature that is unique to each entity in the virtual environment of the electronic game. Each entity or entity type in the game has a certain pulse sequence associated with it, thereby allowing users not only to sense the position (direction and distance) of game entities in their headset, but also to identify which entities are located where. For example: all trees can have a 500 ms pulse followed by a 100 ms pause associated with them. All structures can have a 100 ms pulse with a 50 ms pause. Player1 can have a repeatable pattern of 200 ms on, 50 ms off, 300 ms on, 100 ms off. Player2 can have a repeatable pattern of 600 ms on, 50 ms off, 50 ms on, 50 ms off, 50 ms on, 100 ms off.
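The example signatures above can be encoded as repeating (on, off) millisecond segments; the dictionary layout and helper function are a hypothetical encoding, not a structure taken from the disclosure:

```python
# Haptic signatures as (on_ms, off_ms) segments, using the example
# timings from the text; the encoding itself is an assumption.
HAPTIC_SIGNATURES = {
    "Tree":      [(500, 100)],
    "Structure": [(100, 50)],
    "Player1":   [(200, 50), (300, 100)],
    "Player2":   [(600, 50), (50, 50), (50, 100)],
}

def signature_period_ms(entity):
    """Duration of one repetition of an entity's haptic signature."""
    return sum(on + off for on, off in HAPTIC_SIGNATURES[entity])
```

One repetition of the Tree signature thus lasts 600 ms, while Player2's more distinctive pattern takes 900 ms per cycle.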
  • FIG. 1 shows an example implementation of a wearable user interface 10 that comprises a core structure 15, multiple flexible supports 18, and multiple pulse generators 20. Core structure 15 can house a processor, memory, a power supply, a network interface or connection interface for communicating with a network or computer, and can serve as a structural base or hub for the multiple flexible supports 18.
  • Flexible supports 18 can be configured to fit around a specified body part, such as the user's head. Flexible supports 18 can be configured to have an elastic quality or a self-adjusting quality to ensure proper fit and maintain the fit when the wearable user interface 10 is in operation. The number of flexible supports 18 can vary based on the intended application. For example, in FIG. 1, the wearable user interface 10 has eight flexible supports 18. In other implementations, the wearable user interface 10 can have four flexible supports 18.
  • In addition, the flexible supports 18 can be arranged around the wearable user interface 10 in any configuration. For example, FIG. 1 illustrates an implementation of the wearable user interface 10 in which eight flexible supports 18 are arranged around the wearable user interface 10 so that the individual flexible supports 18 are each separated by the same distance. In other implementations, the flexible supports 18 can have a near-random arrangement or a non-symmetric arrangement.
  • The length of the flexible supports 18 can be adjustable or can be fixed lengths. In some implementations, the flexible supports 18 can have varying lengths such that the flexible supports 18 are in contact with the user's body at varying vertical positions. For example, in an implementation with four flexible supports 18, two flexible supports 18 can have lengths equal to six inches and the other two flexible supports 18 can have lengths equal to four inches so they come in contact with the user's body at different points.
  • Pulse generators 20 can be positioned at any point along the flexible supports 18. One or more pulse generators 20 can be included on any given flexible support 18. For example, as shown in FIG. 1, a pulse generator 20 is positioned at a distal end of each flexible support 18. Alternatively, multiple pulse generators 20 can be positioned along the length of each flexible support 18. In some implementations, pulse generators 20 can be near the core structure 15 or on the core structure 15 such that the pulse generators 20 are near the top of the user's head. The pulse generators 20 can be in direct contact with the user's body. In some implementations, the pulse generators 20 are arranged such that they are in indirect contact with the user's body (i.e., the pulse generators 20 and the user's body are separated by a wall or barrier).
  • The pulse generators 20 are coupled to the processor housed in the core structure 15 and configured to receive electronic signals from the processor housed in the core structure 15. The pulse generators 20 are configured to convert electronic signals into pulsed sensory information including pulsed vibratory information, pulsed light or visual information, or pulsed auditory information. Pulse generator 20 can vary the duration, intensity, pattern, and type of pulsed information as an output to the user. One or more pulse generators 20 can cooperatively operate to convey changing duration, intensity, pattern and type of the pulsed information, thereby giving the user a sense of direction, distance, and identifying information.
  • FIG. 2 shows an implementation of a computer system incorporating the wearable user interface of the present invention. The computer system comprises a computer terminal 30, a server 32 or a mobile device 34, a network 40 and a wearable interface 50 similar to the wearable user interface 10 described above in connection with FIG. 1. In some implementations, the computer terminal 30 can be a mobile computer, a game console or a mobile gaming device. The network 40 can be any type of network. For example, network 40 can be a LAN, WAN or a wireless network.
  • Wearable interface 50 comprises a processor 52 for receiving position and identification information from the computer 30 or server 32 via network 40. In an implementation, the processor 52 can receive position and identification information over a wireless network or a hardwired network connection via the connector 54. The connector 54 can be any type of network interface or connector. For example, the connector 54 can be an RS-232 connector, a USB connector, a wireless Internet receiver/transmitter, or an Ethernet connector. In some implementations, processor 52 can receive position and identification information directly from computer 30 or server 32.
  • Processor 52 can be configured to translate position and identification information into a plurality of signals 55. The plurality of signals 55 include instructions to multiple pulse generators 58. A potentiometer 56 can be included to boost or otherwise adjust the strength of signals 55 before the signals reach the pulse generators 58. It will be appreciated that processor 52 can be coupled to or otherwise in communication with peripheral devices 60 to trigger other signals to the user. For example, the processor 52 can be coupled to a game controller 60 that can also provide pulsed information such as vibrations.
  • In addition, the processor 52 can be coupled to or arranged to communicate with memory 59. The memory 59 can be any type of memory such as RAM, EEPROM, or a flash memory. The memory 59 can be used to store software, haptic signatures or game information.
  • FIG. 3 is a flowchart of an exemplary process 100 of a game operating on a game console/computer terminal 30 interacting with a player using the wearable user interface 10. The game first determines whether any game objects (e.g., another player, an enemy character, a bullet or obstacles) are in the area around the player's on-screen character (i.e., scan the area around the player) (block 102). In some implementations, the game periodically scans the area around the player's on-screen character. For example, the game may be configured to scan the area around the character every half second. The scan period (i.e., the amount of time between each scan) can be adjusted for each game by the game developer, the user, or can dynamically change based on the events of the game. In some implementations, the user can trigger a scan by pressing a keyboard/game controller button or using some other form of input, such as a voice command. In some implementations, the game can periodically scan the area around the player's character and accept inputs from the user to trigger a scan of the area.
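The periodic scan described above might be structured as a simple loop; `scan_area` and `send_message` are hypothetical callbacks standing in for the game's object query and the transmission to the wearable interface:

```python
import time

def run_scan_loop(scan_area, send_message, period_s=0.5, max_scans=None):
    """Periodically scan the area around the player's character.

    scan_area() returns the game objects currently near the character;
    send_message(obj) forwards each hit toward the wearable interface.
    max_scans bounds the loop (None runs indefinitely). Both callbacks
    are hypothetical stand-ins for game-engine hooks.
    """
    scans = 0
    while max_scans is None or scans < max_scans:
        for obj in scan_area():
            send_message(obj)
        scans += 1
        time.sleep(period_s)  # the half-second scan period from the text
```

A user-triggered scan (button press or voice command) could simply call `scan_area` once outside this loop.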
  • In some implementations, the area around the player's on-screen character to be scanned (i.e., a character zone or seeing distance) can be determined by the game, and in other implementations, the character zone can be determined by the user through game options or different user inputs. The character zone can be defined to be the entire screen or any portion of the screen around the user's character. In some implementations, the character zone can be defined using any form of measurement or distance. For example, the character zone can be an area around the player's character measured by pixels, relative distances within the game (i.e., distances as measured in the game environment), or by actual on-screen distances (i.e., distances as measured on the screen). In implementations using a character zone, the game and the user wearable interface 10 are both configured to interpret the character zone in the same measurement units.
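As a sketch of the character-zone test, assuming the zone is a circle of agreed-upon radius around the player (the circular shape is an illustrative assumption; the text allows other zone definitions):

```python
def in_character_zone(player_xy, obj_xy, zone_radius):
    """Return True if an object lies within the character zone.

    Distances are in whatever units the game and wearable interface
    have agreed upon (pixels, in-game distance, or on-screen distance).
    """
    dx = obj_xy[0] - player_xy[0]
    dy = obj_xy[1] - player_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= zone_radius
```

The key constraint from the text is only that both sides interpret the radius in the same measurement units.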
  • If the game determines that there is a game object in the area around the player, the game prepares a message to transmit to the user wearable interface 10 (block 106). The message can be a vector including relative position and identification data of the object(s) found in the area around the player. In some implementations, a message can be a vector comprising: 1) the distance the game object is from the user's character; 2) the direction the object is from the user's character; 3) the relative velocity of the object; and 4) an identifier for the object (“an object identifier”). For example, if the game performs a scan and determines a tree is two feet to the right of the player's character in the game environment, the game can prepare a message such as (2, Right, 0, Tree). As another example, if the game detects a bullet located one on-screen inch to the left of the user's character traveling at 0.5 inches/sec in the direction of the user's character, the game can prepare a message such as (1, Left, 0.5, Bullet). The object identifiers are predetermined and are common to the game and the user wearable interface 10.
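The message vector can be sketched as a plain tuple with a guard on the shared identifier list; the particular identifiers and the `ValueError` check are illustrative assumptions:

```python
# Identifiers are predetermined and common to the game and the
# wearable interface; this particular set is illustrative only.
KNOWN_IDENTIFIERS = {"Tree", "Structure", "Player1", "Player2", "Bullet"}

def prepare_message(distance, direction, velocity, identifier):
    """Build the (distance, direction, velocity, identifier) vector,
    mirroring examples such as (2, "Right", 0, "Tree")."""
    if identifier not in KNOWN_IDENTIFIERS:
        raise ValueError(f"unknown object identifier: {identifier}")
    return (distance, direction, velocity, identifier)
```

Rejecting unknown identifiers at the sender keeps the game and the interface from silently disagreeing about the shared vocabulary.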
  • The message is then transmitted to the user wearable interface 10 by the game console/computer 30 (block 108). The message can be transmitted using any existing communication protocol, such as RS232 or USB, or can use a custom communication protocol similar to the communication protocol between a Sony Playstation and the Sony Playstation controllers. In addition, the message can be communicated to the user wearable interface 10 via the network 40. In some implementations, the message can be communicated directly to the user wearable interface 10. For example, in some implementations, the user wearable interface 10 is physically connected to the game console/computer 30, in other implementations, the user wearable interface 10 is directly connected to the game console/computer 30 using some wireless communication protocol such as Bluetooth or an infrared communication protocol.
  • The user wearable interface 10 receives the message from the game and then decodes the message (block 110). The user wearable interface 10 decodes the message into the various components and translates the information into an array of pulses to be produced by the pulse generators 20 (i.e., haptic signature). The user wearable interface 10 can access a database that maps the object identifier to the haptic signature for each game object and directional information to specific pulse generators 20. For example, the Tree identifier can be stored in the database as a 500 ms pulse followed by a 100 ms pause or the Bullet identifier can be mapped to a repeatable pattern of a 50 ms pulse and a 100 ms pause.
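Decoding on the wearable side could look up the haptic signature and target generator as follows; the in-memory dictionaries stand in for the database described above, and the direction-to-generator indices are assumptions:

```python
# Stand-ins for the database mapping object identifiers to haptic
# signatures ((on_ms, off_ms) segments, per the examples in the text)
# and directions to pulse-generator indices (indices are assumed).
SIGNATURE_DB = {
    "Tree":   [(500, 100)],
    "Bullet": [(50, 100)],
}
GENERATOR_BY_DIRECTION = {"Up": 0, "Right": 2, "Down": 4, "Left": 6}

def decode_message(message):
    """Split a (distance, direction, velocity, identifier) message into
    the pulse-generator index and the haptic signature to actuate."""
    distance, direction, velocity, identifier = message
    return GENERATOR_BY_DIRECTION[direction], SIGNATURE_DB[identifier]
```

So the message (2, "Right", 0, "Tree") would actuate a right-side generator with the 500 ms on / 100 ms off tree signature.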
  • The user wearable interface 10 then transmits the haptic signature to one or more pulse generators 20, which then actuates the pulse generators 20 (block 112). For example, if the user wearable interface 10 receives the message (2, Right, 0, Tree), the processor can access the database and determine that the haptic signature is a 500 ms pulse followed by a 100 ms pause that will be produced by a pulse generator 20 located on the right side of the user wearable interface 10.
  • Process 100 then returns to scanning the area around the user's character (block 102). In some implementations, the process 100 operates in parallel to a process to receive input from the user. For example, process 100 can operate in parallel to a process that determines if the user is moving the character or shooting an object.
  • Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • Implementations of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • Other implementations are within the scope of the following claims. For example, the wearable user interface can be similar to a headband, as shown in FIG. 4.
  • As shown in FIG. 4, the wearable user interface 100 can include a central core 115, a band 118 and a plurality of pulse generators 120. The central core 115 is similar to the central core 15 described above in connection with FIG. 1, and the pulse generators 120 are similar to the pulse generators 20 described above in connection with FIG. 1.
  • The band 118 can be configured to fit around a specified body part, such as the user's head or arm. The band 118 can also be configured to have an elastic quality or self-adjusting quality to ensure proper fit and maintain the fit when the wearable user interface 100 is in operation. In some implementations, the band 118 can be user-adjustable similar to a belt worn with clothing.
  • The pulse generators 120 can be positioned at any point along the surface of the band 118. Any number of pulse generators 120 can be placed at any position along the band 118. For example, one pulse generator 120 can be placed over the right ear, one pulse generator can be placed over the left ear, one pulse generator 120 can be placed on the user's forehead, and the last pulse generator 120 can be placed on the back of the user's head. In some implementations, the pulse generators 120 are placed within the band such that the pulse generators 120 are not in direct contact with the user's body. In some implementations, the pulse generators 120 are placed along the surface of the band 118 that is intended to be in contact with the user's body but is covered by fabric or other material.
  • As shown in FIG. 4, the central core 115 can be configured such that it is located near the back of the user's head or near the user's neck. In some implementations, the central core 115 can be placed away from the user's body and coupled to the band 118 and/or pulse generators 120 via wires or other type of connector. For example, the central core 115 can be worn on the user's waistband and connected to the band 118 via a wire extending from the central core 115 to the band 118.
  • Thus, particular implementations of the invention have been described.

Claims (24)

1. A system comprising:
a wearable user interface device comprising a processor and a plurality of pulse generators; and
one or more computers operable to interact with the user interface device.
2. The system of claim 1, wherein the one or more computers comprise a server operable to interact with the user interface device through a data communication network, and the user interface device is operable to interact with the server as a client.
3. The system of claim 1, wherein the user interface device comprises a personal computer running a web browser.
4. The system of claim 1, wherein the one or more computers comprises one personal computer, and the personal computer comprises the user interface device.
5. A gaming interface comprising:
a wearable user interface configured to be worn on a user's body comprising:
a central body;
a plurality of flexible members extending from the central body; and
a pulse generator coupled to each of the plurality of flexible members; and
one or more computers operable to interact with the wearable user interface.
6. The gaming interface of claim 5 wherein the wearable user interface further comprises:
a processor housed in the central body and
a network interface coupled to the processor and the one or more computers.
7. The gaming interface of claim 6 wherein the processor is configured to decode position and identification information received through the network interface.
8. The gaming interface of claim 6 wherein the processor is configured to actuate the pulse generator of each of the plurality of flexible members.
9. The gaming interface of claim 5 wherein the wearable user interface is configured to be worn on a user's head.
10. The gaming interface of claim 5 wherein the plurality of flexible members are elastic and configured to maintain contact with the user's body.
11. The gaming interface of claim 9 wherein the plurality of flexible members are configured to maintain contact with the user's head.
12. The gaming interface of claim 10 wherein the pulse generator of each of the plurality of flexible members indirectly contacts the user's body.
13. The gaming interface of claim 5, wherein the pulse generator is connected to a distal end of each of the plurality of flexible members.
14. A method for conveying directional information to a person using a wearable user interface comprising:
scanning a proximal area of an on-screen representation of the person for a game object;
preparing a message wherein the message comprises identification information associated with the game object and location information associated with the game object, wherein the location information comprises directional information relative to the on-screen representation of the person;
transmitting the message to the wearable user interface; and
actuating at least one pulse generator coupled to the wearable user interface in response to the message, wherein the wearable user interface comprises a plurality of pulse generators.
15. The method of claim 14 wherein the proximal area of the on-screen representation of the person is the entire screen.
16. The method of claim 14 wherein the proximal area of the on-screen representation of the person is a portion of the screen surrounding the on-screen representation of the person, wherein the portion is defined using distance.
17. The method of claim 14 wherein the directional information comprises three-dimensional directional information.
18. The method of claim 14 wherein the directional information comprises two-dimensional directional information.
19. The method of claim 14 further comprising:
decoding the message to determine a haptic signature and the location information;
using the location information to determine a particular pulse generator to actuate; and
transmitting the haptic signature to the particular pulse generator.
20. A gaming interface comprising:
a wearable user interface configured to be worn on a user's body comprising:
a central body;
a band coupled to the central body; and
a plurality of pulse generators coupled to the band, wherein the plurality of pulse generators indirectly contact the user's body; and
one or more computers operable to interact with the wearable user interface.
21. The gaming interface of claim 20 wherein the band is elastic.
22. The gaming interface of claim 20 wherein the wearable user interface further comprises:
a processor housed in the central body and
a network interface coupled to the processor and the one or more computers.
23. The gaming interface of claim 22 wherein the processor is configured to decode position and identification information received through the network interface.
24. The gaming interface of claim 22 wherein the processor is configured to actuate the plurality of pulse generators.
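The message-handling method of claims 14 and 19 — decoding identification, directional, and haptic-signature information, selecting the pulse generator nearest the indicated direction, and routing the signature to it — can be sketched in code. This is an illustrative sketch only, not the patented implementation; the names (`GameObjectMessage`, `PULSE_GENERATOR_ANGLES`, `select_pulse_generator`, `handle_message`) and the four-generator layout are assumptions chosen to match the example placement over the ears, forehead, and back of the head.

```python
from dataclasses import dataclass

@dataclass
class GameObjectMessage:
    """Hypothetical decoded message (cf. claim 14): identification plus
    directional location information relative to the on-screen representation."""
    object_id: str          # identification information for the game object
    angle_deg: float        # direction of the object, 0 = straight ahead
    haptic_signature: str   # vibration pattern the chosen generator should play

# Assumed four-generator band layout: forehead (0), right ear (90),
# back of head (180), left ear (270).
PULSE_GENERATOR_ANGLES = {0: 0.0, 1: 90.0, 2: 180.0, 3: 270.0}

def select_pulse_generator(angle_deg: float) -> int:
    """Pick the generator angularly closest to the target direction."""
    def angular_distance(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(PULSE_GENERATOR_ANGLES,
               key=lambda gid: angular_distance(PULSE_GENERATOR_ANGLES[gid],
                                                angle_deg))

def handle_message(msg: GameObjectMessage) -> tuple[int, str]:
    """Decode the message and route the haptic signature (cf. claim 19).
    In hardware, the signature would be transmitted to the chosen generator."""
    return select_pulse_generator(msg.angle_deg), msg.haptic_signature

# A game object at 100 degrees (just behind the user's right) routes its
# signature to the right-ear generator.
print(handle_message(GameObjectMessage("enemy-7", 100.0, "double-pulse")))
```

The nearest-angle selection here is one plausible reading of "using the location information to determine a particular pulse generator to actuate"; an actual implementation might instead blend intensity across adjacent generators for finer directional resolution.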
US12/501,248 2008-07-11 2009-07-10 User Interface for Functionally Limited Users Abandoned US20100009753A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/501,248 US20100009753A1 (en) 2008-07-11 2009-07-10 User Interface for Functionally Limited Users

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US8016308P 2008-07-11 2008-07-11
US12/501,248 US20100009753A1 (en) 2008-07-11 2009-07-10 User Interface for Functionally Limited Users

Publications (1)

Publication Number Publication Date
US20100009753A1 true US20100009753A1 (en) 2010-01-14

Family

ID=41505634

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/501,248 Abandoned US20100009753A1 (en) 2008-07-11 2009-07-10 User Interface for Functionally Limited Users

Country Status (1)

Country Link
US (1) US20100009753A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4713651A (en) * 1985-03-29 1987-12-15 Meir Morag Information display system
US8290192B2 (en) * 2005-02-03 2012-10-16 Nokia Corporation Gaming headset vibrator


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Bringing Haptics to Second Life for Visually Impaired People", Maurizio de Pascale, Sara Mulatto, Domenico Prattichizzo, 6th International Conference, EuroHaptics 2008 Madrid, Spain, June 10-13, 2008 Proceedings *
Enemy Engaged: Apache Havoc game manual released circa 1999 *
M. de Pascale, S. Mulatto and D. Prattichizzo. "Bringing Haptics to Second Life" published in Proc. of the Workshop on Haptic in Ambient Systems 2008 (HAS2008) Quebec City, Feb. 14 2008 HAS2008 Slides [pdf] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180239428A1 (en) * 2016-12-31 2018-08-23 Vasuyantra Corp., A Delaware Corporation Remote perception of depth and shape of objects and surfaces
US10782780B2 (en) * 2016-12-31 2020-09-22 Vasuyantra Corp. Remote perception of depth and shape of objects and surfaces

Similar Documents

Publication Publication Date Title
US20170131775A1 (en) System and method of haptic feedback by referral of sensation
US9968846B2 (en) Information processing apparatus, storage medium having stored therein information processing program, information processing system, and information processing method
US10449445B2 (en) Feedback for enhanced situational awareness
JP2021193575A (en) System and method for haptically-enabled neural interface
Chatzidimitris et al. SoundPacman: Audio augmented reality in location-based games
WO2017170146A1 (en) Control method, virtual reality experience providing device, and program
EP3160606B1 (en) Interactive play sets
JP6306442B2 (en) Program and game system
CN108121441A (en) Targetedly tactile projects
US9755848B2 (en) System and method for simulating a user presence
JP2005182843A (en) Operation method of virtual reality circuit
Stoll et al. Navigating from a depth image converted into sound
US11347312B1 (en) Ultrasonic haptic output devices
US20190357771A1 (en) Systems and methods for delivering, eliciting, and modifying tactile sensations using electromagnetic radiation
Hu et al. StereoPilot: A wearable target location system for blind and visually impaired using spatial audio rendering
US20090237355A1 (en) Head tracking for virtual reality displays
Kim et al. Body-penetrating tactile phantom sensations
Jones et al. Localization and pattern recognition with tactile displays
Valkov et al. Vibro-tactile feedback for real-world awareness in immersive virtual environments
KR20120046578A (en) Serious game system for old-aged person, and method for the same
US20100009753A1 (en) User Interface for Functionally Limited Users
Batmaz et al. Effects of different auditory feedback frequencies in virtual reality 3D pointing tasks
US11809629B1 (en) Wearable electronic device for inducing transient sensory events as user feedback
Afonso et al. How to not hit a virtual wall: aural spatial awareness for collision avoidance in virtual environments
Kim et al. Rendering vibrotactile flow on backside of the head: Initial study

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION