US20080153591A1 - Teleportation Systems and Methods in a Virtual Environment - Google Patents
- Publication number
- US20080153591A1
- Authority
- US
- United States
- Prior art keywords
- user
- virtual environment
- teleportation
- directional input
- create
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Definitions
- the present disclosure is generally related to virtual technology and, more particularly, is related to systems and methods for providing user interaction in a virtual environment.
- IVEs: Immersive Virtual Environments
- input devices such as handheld and fixed station user input devices as well as environment specific devices such as, for example, a virtual reality snowboard.
- the utilization of these input devices is awkward or unnatural and may require extensive training, especially if the device offers many degrees of freedom.
- some of the previous input devices have required the user to memorize and perform specific coded gestures or sequences of gestures to make virtual environmental changes such as a direction or mode change. In such a device having many degrees of freedom, the user is tasked with memorizing and performing many potentially unnatural tasks and gestures to travel and navigate within a large scale IVE.
- Embodiments of the present disclosure provide a system and method for teleportation in a virtual environment.
- a head mounted display configured to provide an immersive virtual environment
- a teleportation device configured to provide navigation in the virtual environment
- at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment
- a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user
- a computing device configured to receive the plurality of input signals and control the at least one feedback device.
- Embodiments of the present disclosure can also be viewed as methods for providing teleportation in a virtual environment.
- one embodiment of such a method can be broadly summarized by the following steps: delivering a video signal, corresponding to a virtual environment, to a user; delivering an audio signal, corresponding to the virtual environment, to the user; receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features; providing a vibratory feedback, corresponding to the virtual environment, to the user; and directing air towards the user to create a motion sensation.
- FIG. 1 is a schematic diagram of an embodiment of a system for teleportation in a virtual environment.
- FIG. 2 is a schematic diagram of an alternative embodiment of a system for teleportation in a virtual environment.
- FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment.
- FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
- FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment.
- FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
- FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.
- FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein.
- FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system.
- FIG. 10 is a block diagram illustrating an embodiment of a method for providing teleportation in a virtual environment.
- FIG. 1 is a schematic diagram of an embodiment of a system 100 for teleportation in an immersive virtual environment.
- An immersive virtual environment includes multiple sources of feedback for a user to create the sensation that the user is fully immersed in the virtual environment.
- the system 100 includes a teleportation device 104 that provides for general purpose navigation in virtual environments. The navigation activities can include, for example, traveling from one place to another for exploring and searching within the virtual environment.
- a user 108 can rotate himself/herself and the teleportation device 104 , physically move forward and backward (and up and down), and change the speed of travel.
- the system 100 also includes a computing device 102 , which can include a processor, memory, and one or more input/output devices, all communicatively coupled via one or more data buses.
- the computing device 102 is configured to provide data to a head mounted display 114 .
- the head mounted display 114 is configured to communicate video and audio signals to a user 108 using one or more displays and audio output components.
- the computing device 102 is also configured to receive user position data from user position sensors 112 proximate to different user physiological features. Examples of user physiological features that might provide useful position data include, but are not limited to, the head, hands, arms, feet, and legs.
- the embodiment of FIG. 1 includes user position sensors 112 at the user's head and hands. In addition to providing three-dimensional position data, the user position sensors 112 can also be used to provide orientation data to the computing device 102 .
- the computing device 102 is also configured to receive position and orientation data from one or more teleportation device position sensors 116 that are mounted to the teleportation device 104 . In this manner, the computing device can render the virtual environment based on the position and orientation of the teleportation device 104 .
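Rendering from the tracked pose can be sketched as a coordinate transform: world-space content is mapped into the teleportation device's local frame using the position and yaw reported by the device's sensors. The function below is a simplified 2-D illustration under assumed conventions (single-axis yaw, device at a planar position); it is not the disclosure's implementation:

```python
import math

def world_to_device(point, device_pos, yaw_rad):
    """Map a world-space 2-D point into the teleportation device's local
    frame, given the device position and yaw reported by a position
    sensor (cf. sensors 116): translate, then rotate by -yaw."""
    dx = point[0] - device_pos[0]
    dy = point[1] - device_pos[1]
    c, s = math.cos(-yaw_rad), math.sin(-yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

With the device at the origin facing along +x (yaw 0), a point one unit ahead stays at (1, 0); after the device rotates 90 degrees, a point that was to its side maps back in front of it.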
- the teleportation device 104 includes a base 118 configured to optionally support all or a portion of the user 108 .
- the base 118 is attached to a directional input component 110 through a moveable coupling 120 .
- the moveable coupling 120 of this embodiment includes one or more springs configured in modes of compression, tension, or some combination thereof.
- the teleportation device 104 also includes a vibratory feedback device 106 configured to be controlled by the computing device 102 .
- the vibratory feedback device 106 is used to deliver sound and/or vibration to the user 108 to simulate varying rates of movement within the virtual environment. In this manner, the sound and/or vibration of the teleportation device 104 in motion is simulated.
- the vibratory feedback device 106 may be configured to operate at a low frequency and output level when the teleportation device 104 is moving through the virtual environment at a slow speed. Accordingly, the output level and frequency might be increased as the speed of the teleportation device 104 is increased.
- the vibratory feedback device 106 can be configured as a subwoofer speaker, for example.
- alternatives to the vibratory feedback device 106 can be implemented as vibrotactile devices mounted at a variety of points on the teleportation device 104 .
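The speed-to-vibration mapping described above can be sketched as a simple linear interpolation from a low-speed setting to a top-speed setting. The specific frequency and level ranges below are illustrative assumptions, not values from the disclosure:

```python
def vibration_params(speed, max_speed, f_range=(20.0, 80.0),
                     level_range=(0.1, 1.0)):
    """Map the teleportation device's virtual speed to a drive frequency
    (Hz) and output level for the vibratory feedback device (cf. device
    106), interpolating linearly between low-speed and top-speed settings."""
    t = max(0.0, min(1.0, speed / max_speed))  # clamp to [0, 1]
    freq = f_range[0] + t * (f_range[1] - f_range[0])
    level = level_range[0] + t * (level_range[1] - level_range[0])
    return freq, level
```

At rest the device idles at the low frequency and level; both rise together as the simulated speed increases, matching the behavior described above.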
- FIG. 2 is a schematic diagram of an alternative embodiment of a system 122 for teleportation in a virtual environment.
- the system 122 also includes a position interface 124 , configured to communicate with the position sensors 112 , 116 . Communication between the position interface 124 and the position sensors 112 , 116 can be accomplished using any one of a variety of wired or wireless communication technologies.
- the position interface 124 , also referred to as a 3-D tracker, reports the position and orientation of each of the position sensors 112 , 116 to the computing device 102 .
- the system 122 also includes one or more fans 128 for generating a wind simulation.
- the fan or fans 128 can be controlled by the computing device 102 through an output device controller 130 .
- the output device controller 130 can include, for example, relays and/or electronic speed controllers to vary the speed and direction of the simulated wind.
- the system 122 can also optionally include a status interface system 125 configured to maintain the status of one or more of the peripheral devices external to the computing device 102 .
- the status interface system 125 can be implemented to replace or supplement either or both of the position interface 124 and the output device controller 130 . Additionally, the status interface system 125 includes the functionality to detect the operation of user input devices such as buttons or switches.
- the status interface system 125 may be implemented in separate units, or as a single unit (e.g., a unit with two cards, one corresponding to the switching function of a relay controller and the other having functionality to detect button presses and releases).
- the status interface system 125 when implemented as a single unit, may have additional cards corresponding to analog-to-digital conversion (ADC) and digital-to-analog conversion (DAC) to control, for example, fan speed.
- FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment.
- the system 138 includes a teleportation device 104 having a base 118 and a directional input component 110 , also referred to as a steering wheel or handle bar.
- the teleportation device 104 includes a vibratory feedback device 106 and one or more user interface devices configured to allow the user to cause or trigger an operation within the virtual environment.
- the user interface devices can include switches and buttons, among others. Alternative embodiments may include user interface devices using one or more touch screens.
- the user interface devices can include an UP button 140 and a DOWN button 142 for causing the teleportation device 104 to move up or down within the virtual environment.
- the UP and DOWN functions could be combined into one multiple-position switch, for example a three-position, center-return switch.
- User interface devices can also be implemented as a STOP button 144 configured to cause the teleportation device 104 to stop within the virtual environment.
- a FLY/DRIVE switch 150 is also included. The FLY/DRIVE switch 150 can be toggled between a fly mode and a drive mode.
- an INC button 154 and a DEC button 156 configured to cause the teleportation device to increase speed or decrease speed, respectively.
- the INC and DEC functions can alternatively be combined into a multiple position switch such as a toggle switch.
- Other alternative embodiments can include throttle and/or handbrake structures that can generate, for example, analog signals to increase or decrease the speed, respectively.
- the analog signals from a throttle and/or a handbrake may be processed using, for example, analog-to-digital conversion hardware and/or software.
- a throttle and/or handbrake can also be configured to generate digital signals. For example, devices providing a quadrature pulse output in conjunction with a counter can be used for increasing and decreasing the speed.
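The quadrature-pulse-plus-counter arrangement mentioned above can be sketched as a small state machine: each valid transition of the two-bit (A, B) input moves a count up or down by one, and the count drives the speed setting. This is a generic quadrature decoder written for illustration, not code from the disclosure:

```python
# Valid Gray-code transitions for a quadrature encoder, keyed on
# (previous, current) two-bit states; +1 for one rotation direction,
# -1 for the other. Invalid or repeated states contribute nothing.
_DELTA = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b01, 0b00): -1, (0b11, 0b01): -1, (0b10, 0b11): -1, (0b00, 0b10): -1,
}

class QuadratureCounter:
    """Count quadrature pulses from a throttle or handbrake encoder to
    increase or decrease the simulated speed (cf. the counter-based
    digital input described in the text)."""
    def __init__(self, initial_state=0b00):
        self.state = initial_state
        self.count = 0

    def feed(self, a, b):
        new = (a << 1) | b
        self.count += _DELTA.get((self.state, new), 0)
        self.state = new
        return self.count
```

Stepping through one full forward sequence (00, 01, 11, 10, 00) advances the count by four; the reverse sequence brings it back, so twisting a throttle one way raises the speed and the other way lowers it.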
- the teleportation device 104 can also include a DEBUG button 146 configured to allow the user to debug one or more applications running on the computing device 102 . For example, a user may experience a situation where he or she cannot move in the virtual environment due to a collision with multiple objects, such as might occur during a glitch in an application's implementation. A user can activate the DEBUG button 146 and disable collision detection temporarily to enable testing of other parts of the application.
- the teleportation device 104 can also include a JUMP button 152 to permit the vehicle to jump over obstacles in the virtual environment when in drive mode.
- the system 138 also includes an example arrangement of fans 128 .
- the fans 128 used independently or in selective combination can be used to simulate wind that corresponds to the motion within the virtual environment. For example, where the teleportation device 104 is traveling to one side or another, the corresponding fan 128 would be activated to simulate wind commensurate with that motion. Also, when the teleportation device 104 is turned or rotated, a three dimensional position sensor 116 can detect which direction the teleportation device 104 is facing and operate one or more fans 128 corresponding to movement in the new direction.
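The fan selection described above can be sketched as comparing the motion direction against the device's heading and activating the fan in the matching sector. The 90-degree sectors and fan names are illustrative assumptions; the disclosure does not specify the mapping:

```python
def select_fan(heading_deg, motion_deg):
    """Choose which fan to activate so the simulated wind matches the
    direction of motion relative to where the teleportation device is
    facing (cf. fans 128 and position sensor 116)."""
    rel = (motion_deg - heading_deg) % 360.0
    if rel <= 45.0 or rel >= 315.0:
        return "front"
    if rel <= 135.0:
        return "right"
    if rel >= 225.0:
        return "left"
    return None  # motion from behind: no rear fan in this sketch
```

Note that rotating the device rotates the sectors with it: moving along the same world direction triggers the front fan when the device faces the motion, and a side fan after the device turns.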
- FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
- the teleportation device 104 includes a base 118 moveably coupled to a directional input component 110 .
- by way of example, when a user rotates the directional input component 110 clockwise, the teleportation device 104 will turn to the right in the virtual environment. Similarly, when a user rotates the directional input component 110 counter-clockwise, the teleportation device 104 will turn to the left in the virtual environment. To cause an upward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pulled or tilted towards the user.
- to cause a downward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pushed or tilted away from the user.
- Alternative embodiments may use a directional input component 110 mounted to a telescopic shaft where the up and down motions are accomplished by manipulating the directional input component in a substantially vertical up and down motion.
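The input mapping just described — rotation steers left or right, tilt toward or away from the user moves up or down — can be sketched as two gain functions with a small deadzone. The gains and deadzone are illustrative assumptions, not parameters from the disclosure:

```python
def steering_to_motion(rotation_deg, tilt_deg, deadzone_deg=2.0,
                       turn_gain=1.0, vertical_gain=0.5):
    """Map deflection of the directional input component (cf. component
    110) to motion commands: positive rotation (clockwise) turns right,
    positive tilt (toward the user) moves up."""
    def shaped(angle, gain):
        if abs(angle) < deadzone_deg:
            return 0.0  # ignore small unintended deflections
        return gain * angle

    turn_rate = shaped(rotation_deg, turn_gain)      # + = turn right
    vertical_rate = shaped(tilt_deg, vertical_gain)  # + = move up
    return turn_rate, vertical_rate
```

The deadzone keeps the teleportation device steady when the user rests on the handle without meaning to steer.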
- FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment.
- an arrangement of multiple fans in an embodiment includes an over-the-head fan 210 for simulating, for example, upward movement in the virtual environment.
- the arrangement includes a right side fan 212 and a left side fan 214 for simulating right and left motion, respectively.
- a left ground fan 218 and a right ground fan 216 can be used to simulate left and right downward movement, respectively.
- a front of face fan 220 can be used to simulate forward motion.
- Each of the fans can be driven at varying speeds to create the sensation of changing speeds within the virtual environment. Additionally, the fans can be used alone or in combination to create varying degrees of speed and directional simulation.
- FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
- the teleportation device 104 includes a base 118 coupled to a directional input component 110 via a moveable coupling 120 .
- to direct a downward movement, the user 108 pushes or tilts the directional input component 110 away from himself/herself.
- to direct an upward movement, the user 108 pulls or tilts the directional input device 110 towards himself/herself.
- Alternative embodiments can feature a telescopic arrangement such that the directional input device is moved substantially vertically up and down to direct the upward or downward movement of the teleportation device 104 within the virtual environment.
- FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.
- the teleportation device 104 includes a base 118 attached to a directional input component 110 through a moveable coupling 120 .
- the moveable coupling 120 is a spring. Additional springs 230 are included to provide force feedback through additional resistance. Multiple springs or other biasing elements can be used independently or in combination to achieve a desired level of force feedback in all or selected axes.
- FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein.
- the computer 160 (herein, computer or host computer) communicates with a 3-D tracker 162 , a fan/relay controller 176 , an eye tracking controller 182 , and the status interface 166 . Note that in some embodiments, fewer or more components and/or functionality can be implemented.
- the 3-D tracker 162 provides the position and orientation of the user's head, hands, the teleportation device, etc. to perform the following functionality:
- the teleportation system comprises a physics component 196 to simulate gravity, so that the user stays on the ground and not in the middle of the air when operating in the “drive” (as opposed to “fly”) mode. For example, when the user jumps over an obstacle in a virtual environment, he/she lands on the ground in the virtual environment.
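The gravity behavior of the physics component — a jump in "drive" mode rises, falls, and ends back on the virtual ground — can be sketched with a simple explicit integrator. The timestep and gravity constant are ordinary defaults, and the function is an illustration of the described behavior, not the disclosure's physics code:

```python
def simulate_jump(v0, g=9.8, dt=0.02, max_steps=10000):
    """Integrate a vertical jump in 'drive' mode: upward velocity v0
    decays under gravity until the user lands back on the virtual
    ground plane (cf. physics component 196). Returns (final height,
    peak height)."""
    y, v, peak = 0.0, v0, 0.0
    for _ in range(max_steps):
        y += v * dt
        v -= g * dt
        peak = max(peak, y)
        if y <= 0.0:  # landed: clamp to the ground plane
            y, v = 0.0, 0.0
            break
    return y, peak
```

Any positive launch velocity produces a positive peak but always terminates at height zero, which is the "lands on the ground" guarantee the text describes.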
- the output of the physics component 196 is fed to a vibrator controller 198 that simulates vibrations; the physics component also provides input to a graphics generator 190 that drives the 3-D output graphics on a fully immersive head mounted display 202 .
- the graphics generator 190 may retrieve environment data from an environment storage 192 .
- the head mounted display 202 comprises a display and headphones or speakers.
- the headphones can be used to hear things or events in the virtual environment, such as a bouncing ball, etc.
- the teleportation system can simulate circumstances such as when a user collides with another object by activating one or more vibration units 200 to provide tactile simulation.
- the host computer 160 controls the wind generator units 180 (on/off) and their speed (how much air they blow).
- the wind generator units 180 can be driven through a fan speed controller 178 and a fan/relay controller 176 .
- the host computer 160 also drives a sound generator 172 that simulates the noise generated by the teleportation device and can also serve as a secondary vibration mechanism.
- the sound generator 172 can be used to drive sound output units 174 using data in a sound data storing unit 170 .
- the status interface 166 can use a switch polling facility 168 to detect button presses (e.g., from user interface devices coupled to activation devices or switches) on user input devices that are attached to the teleportation device and send that information to the host computer 160 .
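The polling approach above amounts to comparing successive snapshots of switch states and reporting the edges as press/release events. A minimal sketch (button names and the snapshot representation are assumptions):

```python
def poll_switches(previous, current):
    """Compare two polled snapshots of switch states (cf. switch polling
    facility 168) and return the press/release events to forward to the
    host computer. States are 0 (open) or 1 (closed)."""
    events = []
    for name, state in current.items():
        old = previous.get(name, 0)
        if state and not old:
            events.append((name, "press"))
        elif old and not state:
            events.append((name, "release"))
    return events
```

Only transitions generate events, so a held STOP button reports one press rather than repeating on every polling cycle.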
- the teleportation system may also comprise a speech recognizer 164 that recognizes commands that a user verbally issues.
- the eye tracking controller 182 can communicate with a separate computer coupled to the host computer through, for example, an output interface 184 .
- a head mounted display 202 comprises a camera that tracks the user's eye.
- the eye tracking controller 182 determines the coordinates of the eye and further determines what the user is observing in the virtual environment. Such a feature may be useful in games. For example, as a missile from the enemy is coming at the user, the user can look at the missile and press a button located on the teleportation device and activate a missile interceptor to destroy the incoming missile.
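Determining "what the user is observing" can be sketched as picking the object whose direction most closely matches the gaze ray, within an angular threshold. This is a simplified stand-in for the eye tracking controller; the threshold, object set, and viewer-at-origin convention are assumptions:

```python
import math

def gaze_target(gaze_dir, objects, max_angle_deg=5.0):
    """Return the name of the object the user is looking at: the object
    whose direction from the viewer (at the origin) is closest to the
    gaze ray and within max_angle_deg of it (cf. eye tracking
    controller 182), or None if nothing is near the gaze ray."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    g = norm(gaze_dir)
    best, best_cos = None, math.cos(math.radians(max_angle_deg))
    for name, pos in objects.items():
        cos_a = sum(a * b for a, b in zip(g, norm(pos)))
        if cos_a > best_cos:  # closer to the gaze ray than any prior hit
            best, best_cos = name, cos_a
    return best
```

In the missile-interceptor example, the button press would query this function and fire at whichever object the gaze currently selects.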
- FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system.
- the control computer generally includes a processor 240 , memory 242 , and one or more input and/or output (I/O) devices 250 (or peripherals) that are communicatively coupled via a local interface 244 .
- the local interface 244 may be, for example, one or more buses or other wired or wireless connections.
- the local interface 244 may have additional elements such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communication. Further, the local interface 244 may include address, control, and/or data connections that enable appropriate communication among the aforementioned components.
- the processor 240 is a hardware device for executing software, particularly that which is stored in memory.
- the processor 240 may be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing device, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.
- the memory 242 may include any one or combination of volatile memory elements (e.g., random access memory (RAM)) and nonvolatile memory elements (e.g., ROM, hard drive, etc.). Moreover, the memory 242 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 242 may have a distributed architecture in which various components are situated remotely from one another but may be accessed by the processor 240 .
- the software in memory 242 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions, such as the logical functions shown in FIG. 8 .
- the software in the memory 242 includes control software 246 for providing one or more of the functionality shown in FIG. 8 according to an embodiment.
- the memory 242 may also comprise a suitable operating system (O/S) 248 .
- the operating system 248 essentially controls the execution of other computer programs, such as the control software, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
- the control software 246 is a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
- the control software 246 can be implemented, in one embodiment, as a distributed network of modules, where one or more of the modules can be accessed by one or more applications or programs or components thereof. In some embodiments, the control software 246 can be implemented as a single module with all of the functionality of the aforementioned modules.
- the control software 246 is a source program, then the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory, so as to operate properly in connection with the operating system 248 .
- control software 246 can be written with (a) an object oriented programming language, which has classes of data and methods, or (b) a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada.
- the I/O devices 250 may include input devices such as, for example, a keyboard, mouse, scanner, microphone, sensor(s), etc. Furthermore, the I/O devices 250 may also include output devices such as, for example, a printer, display, audio devices, vibration devices, etc. Finally, the I/O devices 250 may further include devices that communicate both inputs and outputs such as, for instance, a modulator/demodulator (modem for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc.
- when the control computer is in operation, the processor 240 is configured to execute software stored within the memory 242 , to communicate data to and from the memory 242 , and to generally control operations of the control computer pursuant to the software.
- the control software 246 and the operating system 248 in whole or in part, but typically the latter, are read by the processor 240 , perhaps buffered within the processor 240 , and then executed.
- control software 246 can be stored on any computer-readable medium for use by or in connection with any computer-related system or method.
- a computer-readable medium is an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method.
- the control software 246 can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
- control software 246 can be implemented with any or a combination of the following technologies, which are each well known in the art: discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.; or can be implemented with other technologies now known or later developed.
- FIG. 10 is a block diagram illustrating an embodiment of a method 300 for providing teleportation in a virtual environment.
- the method 300 includes the step of delivering a video signal to the user in block 310 .
- the video signal may be delivered using, for example, one or more displays configured in a head mounted device.
- the video signal provides the user with the visual information corresponding to the virtual environment.
- the method 300 also includes the step of delivering an audio signal to a user in block 320 .
- the audio signal can be delivered through, for example, headphones or speakers.
- the audio signal can be used to communicate sounds within the virtual environment that correspond to objects or events.
- the method 300 also includes the step of receiving position inputs relating to user physiological features and the teleportation device in block 330 .
- the three-dimensional position and orientation of the hands and head of the user can serve to ensure that the user's position and video signal correspond to the virtual environment.
- the computer controlling the virtual environment can correctly render the teleportation device in the virtual environment.
- a user is provided vibratory feedback in block 340 .
- a user can experience the sounds and vibrations corresponding to different rates of speed and events such as collisions in the virtual environment.
- air is directed towards the user in block 350 .
- the air is directed at varying rates and from different directions to create the sensation of moving at different speeds and in different directions.
- Air can be directed using multiple wind generation devices including for example, fans or blowers. Each wind generation device can be driven independently or in combination at one or more preset speeds or at any speed over a range of speeds. Controlling the wind generation units can be accomplished using relays, electronic speed controllers, electronic motor drives, or any combination thereof.
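The relay-plus-speed-controller scheme above can be sketched as translating a desired airflow intensity into a relay state and a controller setting for one wind generation unit. The cutoff threshold and 8-bit PWM scale are illustrative assumptions, not parameters from the disclosure:

```python
def wind_command(intensity, min_intensity=0.05, max_pwm=255):
    """Translate a desired airflow intensity in [0, 1] into a relay
    state and an electronic-speed-controller setting for one wind
    generation unit (cf. the relays and speed controllers above)."""
    intensity = max(0.0, min(1.0, intensity))
    if intensity < min_intensity:
        return {"relay": False, "speed": 0}  # unit fully off
    return {"relay": True, "speed": round(intensity * max_pwm)}
```

Driving each fan independently through a command like this allows the combinations of direction and speed described above, from a faint breeze on one side fan to full airflow on all units at once.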
Abstract
Provided are systems and methods for teleportation in a virtual environment. One embodiment of such a system can be implemented as: a head mounted display configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; and at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment. The system also includes a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user and a computing device configured to receive the plurality of input signals and control the at least one feedback device.
Description
- This application claims priority to copending U.S. provisional application entitled, “TELEPORTATION SYSTEMS AND METHODS,” having Ser. No. 60/659,283, filed Mar. 7, 2005, which is entirely incorporated herein by reference.
- The present disclosure is generally related to virtual technology and, more particularly, is related to systems and methods for providing user interaction in a virtual environment.
- Large scale Immersive Virtual Environments (IVEs) are common in current research. Some of the major problems in large scale IVEs, however, are traveling and navigation. These problems have been addressed by input devices such as handheld and fixed station user input devices as well as environment specific devices such as, for example, a virtual reality snowboard. The utilization of these input devices, however, is awkward or unnatural and may require extensive training, especially if the device offers many degrees of freedom. For example, some of the previous input devices have required the user to memorize and perform specific coded gestures or sequences of gestures to make virtual environmental changes such as a direction or mode change. In such a device having many degrees of freedom, the user is tasked with memorizing and performing many potentially unnatural tasks and gestures to travel and navigate within a large scale IVE.
- Embodiments of the present disclosure provide a system and method for teleportation in a virtual environment. Briefly described one embodiment of the system, among others, can be implemented as follows: a head mounted display configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment; a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and a computing device configured to receive the plurality of input signals and control the at least one feedback device.
- Embodiments of the present disclosure can also be viewed as methods for providing teleportation in a virtual environment. In this regard, one embodiment of such a method, among others, can be broadly summarized by the following steps: delivering a video signal, corresponding to a virtual environment, to a user; delivering an audio signal, corresponding to the virtual environment, to the user; receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features; providing a vibratory feedback, corresponding to the virtual environment, to the user; and directing air towards the user to create a motion sensation.
- Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
- Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
-
FIG. 1 is a schematic diagram of an embodiment of a system for teleportation in a virtual environment. -
FIG. 2 is a schematic diagram of an alternative embodiment of a system for teleportation in a virtual environment. -
FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment. -
FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. -
FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment. -
FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. -
FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device. -
FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein. -
FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system. -
FIG. 10 is a block diagram illustrating an embodiment of a method for providing teleportation in a virtual environment. - Having summarized various aspects of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings, there is no intent to limit it to the embodiment or embodiments disclosed herein. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims.
- Reference is first made to
FIG. 1, which is a schematic diagram of an embodiment of a system 100 for teleportation in an immersive virtual environment. An immersive virtual environment includes multiple sources of feedback for a user to create the sensation that the user is fully immersed in the virtual environment. The system 100 includes a teleportation device 104 that provides for general purpose navigation in virtual environments. The navigation activities can include, for example, traveling from one place to another for exploring and searching within the virtual environment. A user 108 can rotate himself/herself and the teleportation device 104, physically move forward and backward (and up and down), and change the speed of travel. - The
system 100 also includes a computing device 102, which can include a processor, memory, and one or more input/output devices, all communicatively coupled via one or more data buses. The computing device 102 is configured to provide data to a head mounted display 114. The head mounted display 114 is configured to communicate video and audio signals to a user 108 using one or more displays and audio output components. The computing device 102 is also configured to receive user position data from user position sensors 112 proximate to different user physiological features. Examples of user physiological features that might provide useful position data include, but are not limited to, the head, hands, arms, feet, and legs. The embodiment of FIG. 1 includes user position sensors 112 at the user's head and hands. In addition to providing three-dimensional position data, the user position sensors 112 can also be used to provide orientation data to the computing device 102. - The
computing device 102 is also configured to receive position and orientation data from one or more teleportation device position sensors 116 that are mounted to the teleportation device 104. In this manner, the computing device can render the virtual environment based on the position and orientation of the teleportation device 104. - The
teleportation device 104 includes a base 118 configured to optionally support all or a portion of the user 108. The base 118 is attached to a directional input component 110 through a moveable coupling 120. The moveable coupling 120 of this embodiment includes one or more springs configured in modes of compression, tension, or some combination thereof. - The
teleportation device 104 also includes a vibratory feedback device 106 configured to be controlled by the computing device 102. The vibratory feedback device 106 is used to deliver sound and/or vibration to the user 108 to simulate varying rates of movement within the virtual environment. In this manner, the sound and/or vibration of the teleportation device 104 in motion is simulated. For example, the vibratory feedback device 106 may be configured to operate at a low frequency and output level when the teleportation device 104 is moving through the virtual environment at a slow speed. Accordingly, the output level and frequency might be increased as the speed of the teleportation device 104 is increased. In some embodiments, the vibratory feedback device 106 can be configured as a subwoofer speaker, for example. Alternatively, or in addition, the vibratory feedback device 106 can be implemented as vibrotactile devices mounted at a variety of points on the teleportation device 104. - Reference is now made to
FIG. 2, which is a schematic diagram of an alternative embodiment of a system 122 for teleportation in a virtual environment. In addition to the components of the system 100 described above in reference to FIG. 1, the system 122 also includes a position interface 124 configured to communicate with the position sensors. The position interface 124, also referred to as a 3-D tracker, reports the position and orientation of each of the position sensors to the computing device 102. - The
system 122 also includes one or more fans 128 for generating a wind simulation. The fan or fans 128 can be controlled by the computing device 102 through an output device controller 130. The output device controller 130 can include, for example, relays and/or electronic speed controllers to vary the speed and direction of the simulated wind. - The
system 122 can also optionally include a status interface system 125 configured to maintain the status of one or more of the peripheral devices external to the computing device 102. The status interface system 125 can be implemented to replace or supplement either or both of the position interface 124 and the output device controller 130. Additionally, the status interface system 125 includes the functionality to detect the operation of user input devices such as buttons or switches. The status interface system 125 may be implemented in separate units, or as a single unit (e.g., with two cards in it, one corresponding to the switching action function of a relay controller and the other having functionality to detect button presses and releases). The status interface system 125, when implemented as a single unit, may have additional cards corresponding to analog-to-digital conversion (ADC) and digital-to-analog conversion (DAC) to control, for example, fan speed. - Reference is now made to
FIG. 3, which is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment. The system 138 includes a teleportation device 104 having a base 118 and a directional input component 110, also referred to as a steering wheel or handle bar. The teleportation device 104 includes a vibratory feedback device 106 and one or more user interface devices configured to allow the user to cause or trigger an operation within the virtual environment. The user interface devices can include switches and buttons, among others. Alternative embodiments may include user interface devices using one or more touch screens. - The user interface devices can include an
UP button 140 and a DOWN button 142 for causing the teleportation device 104 to move up or down within the virtual environment. Alternatively, the UP and DOWN functions could be combined into one multiple position switch, for example a three position center return switch. User interface devices can also be implemented as a STOP button 144 configured to cause the teleportation device 104 to stop within the virtual environment. A FLY/DRIVE switch 150 is also included. The FLY/DRIVE switch 150 can be toggled between a fly mode and a drive mode. - Also included are an
INC button 154 and a DEC button 156 configured to cause the teleportation device to increase speed or decrease speed, respectively. Like the UP and DOWN functions, the INC and DEC functions can alternatively be combined into a multiple position switch such as a toggle switch. Other alternative embodiments can include throttle and/or handbrake structures that can generate, for example, analog signals to increase or decrease the speed, respectively. The analog signals from a throttle and/or a handbrake may be processed using, for example, analog-to-digital conversion hardware and/or software. A throttle and/or handbrake can also be configured to generate digital signals. For example, devices providing a quadrature pulse output in conjunction with a counter can be used for increasing and decreasing the speed. - Other user interface devices can be included such as a
LIGHTS button 148 for adjusting the lighting levels in the virtual environment. Some embodiments may feature a simple on and off control for the lighting. Other embodiments may include incremental changes in the lighting levels through actuation of the LIGHTS button 148. The teleportation device 104 can also include a DEBUG button 146 configured to allow the user to debug one or more applications running on the computing device 102. For example, a user may experience a situation where he or she cannot move in the virtual environment due to a collision with multiple objects, such as might occur during a glitch in an application's implementation. A user can activate the DEBUG button 146 and disable collision detection temporarily to enable testing of other parts of the application. The teleportation device 104 can also include a JUMP button 152 to permit the vehicle to jump over obstacles in the virtual environment when in drive mode. - The
system 138 also includes an example arrangement of fans 128. The fans 128, used independently or in selective combination, can simulate wind that corresponds to the motion within the virtual environment. For example, where the teleportation device 104 is traveling to one side or another, the corresponding fan 128 would be activated to simulate wind commensurate with that motion. Also, when the teleportation device 104 is turned or rotated, a three-dimensional position sensor 116 can detect which direction the teleportation device 104 is facing and operate one or more fans 128 corresponding to movement in the new direction. - Brief reference is now made to
FIG. 4, which is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118 moveably coupled to a directional input component 110. By way of example, when a user rotates the directional input component 110 clockwise, the teleportation device 104 will turn to the right in the virtual environment. Similarly, when a user rotates the directional input component 110 counter-clockwise, the teleportation device 104 will turn to the left in the virtual environment. To cause an upward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pulled or tilted towards the user. Similarly, to cause a downward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pushed or tilted away from the user. Alternative embodiments may use a directional input component 110 mounted to a telescopic shaft where the up and down motions are accomplished by manipulating the directional input component in a substantially vertical up and down motion. - Brief reference is now made to
FIG. 5, which is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment. An arrangement of multiple fans of an embodiment includes an over-the-head fan 210 for simulating, for example, upward movement in the virtual environment. Similarly, the arrangement includes a right side fan 212 and a left side fan 214 for simulating right and left motion, respectively. A left ground fan 218 and a right ground fan 216 can be used to simulate left and right downward movement, respectively. Similarly, a front-of-face fan 220 can be used to simulate forward motion. Each of the fans can be driven at varying speeds to create the sensation of changing speeds within the virtual environment. Additionally, the fans can be used alone or in combination to create varying degrees of speed and directional simulation. - Brief reference is made to
FIG. 6, which is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118 coupled to a directional input component 110 via a moveable coupling 120. To direct the teleportation device 104 to move down, the user 108 pushes or tilts the directional input component 110 away from himself/herself. Similarly, to direct the teleportation device 104 to move up, the user 108 pulls or tilts the directional input device 110 towards himself/herself. Alternative embodiments can feature a telescopic arrangement such that the directional input device is moved substantially vertically up and down to direct the upward or downward movement of the teleportation device 104 within the virtual environment. - Brief reference is made to
FIG. 7, which is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device. The teleportation device 104 includes a base 118 attached to a directional input component 110 through a moveable coupling 120. In this embodiment, the moveable coupling 120 is a spring. Additional springs 230 are included to provide force feedback through additional resistance. Multiple springs or other biasing elements can be used independently or in combination to achieve a desired level of force feedback in all or selected axes. - Reference is now made to
FIG. 8, which is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein. The computer 160 (herein, computer or host computer) communicates with a 3-D tracker 162, a fan/relay controller 176, an eye tracking controller 182, and the status interface 166. Note that in some embodiments, fewer or more components and/or functionality can be implemented. The 3-D tracker 162 provides the position and orientation of the user's head, hands, the teleportation device, etc. to perform the following functionality: -
- 1. Enable a glove interface 186 (used to manipulate 3-D objects in a virtual environment) and a
gesture recognizer 188 to recognize gestures and manipulate 3-D objects in a virtual environment. - 2. Enable a
collision detector 194 to detect collisions between the user, the teleportation device, and a 3-D virtual environment.
- As shown in
FIG. 8, the teleportation system comprises a physics component 196 to simulate gravity, so that the user stays on the ground rather than in mid-air when operating in the "drive" (as opposed to "fly") mode. For example, when the user jumps over an obstacle in a virtual environment, he/she lands on the ground in the virtual environment. - The output of the
physics component 196 is fed to a vibrator controller 198 that simulates vibrations, and it also provides input to a graphics generator 190 that drives the 3-D output graphics on a head mounted display 202 that is fully immersive. The graphics generator 190 may retrieve environment data from an environment storage 192. In one embodiment, the head mounted display 202 is used and comprises a display and headphones or speakers. The headphones can be used to hear things or events in the virtual environment, such as a bouncing ball, etc. For example, the teleportation system can simulate circumstances such as when a user collides with another object by activating one or more vibration units 200 to provide tactile simulation. That is, if the user takes a turn at 100 miles per hour or 5 miles per hour in the virtual environment, he/she feels the difference in the wind blowing at him/her, the vibration from the subwoofer, and perhaps the vibration from the vibrotactile devices. - In one embodiment, the
host computer 160, the status interface 166, or both, controls the wind generator units 180 (on/off) and their speed (how much air they blow). The wind generator units 180 can be driven through a fan speed controller 178 and a fan/relay controller 176. In one embodiment, the host computer 160 also drives a sound generator 172 that simulates the noise generated by the teleportation device and can also serve as a secondary vibration mechanism. The sound generator 172 can be used to drive sound output units 174 using data in a sound data storing unit 170. The status interface 166 can use a switch polling facility 168 to detect button presses (e.g., user interface devices coupled to activation devices or switches) from user input devices that are attached to the teleportation device and sends that information to the host computer 160. In some embodiments, the teleportation system may also comprise a speech recognizer 164 that recognizes commands that a user verbally issues. - In one embodiment, the
eye tracking controller 182 can communicate with a separate computer coupled to the host computer through, for example, an output interface 184. A head mounted display 202 comprises a camera that tracks the user's eye. The eye tracking controller 182 determines the coordinates of the eye and further determines what the user is observing in the virtual environment. Such a feature may be useful in games. For example, as a missile from the enemy is coming at the user, the user can look at the missile and press a button located on the teleportation device and activate a missile interceptor to destroy the incoming missile. -
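Gaze-based target selection of the kind just described can be sketched as a ray cast from the eye: the tracked gaze direction is intersected with scene objects and the nearest hit is taken as what the user is observing. The ray-sphere test below and every name in it are illustrative assumptions, not the eye tracking controller's actual algorithm.

```python
def ray_sphere_hit(origin, direction, center, radius):
    # direction is assumed to be a unit vector. Returns the distance
    # along the ray to the sphere's closest approach when the ray
    # passes within the sphere, otherwise None.
    to_center = [c - o for o, c in zip(origin, center)]
    t = sum(a * b for a, b in zip(to_center, direction))
    if t < 0.0:
        return None  # sphere is behind the viewer
    nearest = [o + t * d for o, d in zip(origin, direction)]
    miss_sq = sum((n - c) ** 2 for n, c in zip(nearest, center))
    return t if miss_sq <= radius ** 2 else None

def gaze_target(eye, gaze_dir, scene_objects):
    # Return the name of the nearest object the gaze ray hits, or None.
    hits = [(t, name)
            for name, center, radius in scene_objects
            if (t := ray_sphere_hit(eye, gaze_dir, center, radius)) is not None]
    return min(hits)[1] if hits else None

# Looking straight ahead (+z) from the origin:
target = gaze_target((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                     [("missile", (0.1, 0.0, 10.0), 0.5),
                      ("tower", (0.0, 0.0, 30.0), 2.0)])
# target → "missile"
```

Taking the minimum-distance hit matches the intuitive behavior for the missile example: the nearer object along the gaze line wins even when several objects sit on the same ray.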
FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system. The control computer generally includes a processor 240, memory 242, and one or more input and/or output (I/O) devices 250 (or peripherals) that are communicatively coupled via a local interface 244. The local interface 244 may be, for example, one or more buses or other wired or wireless connections. The local interface 244 may have additional elements such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communication. Further, the local interface 244 may include address, control, and/or data connections that enable appropriate communication among the aforementioned components. - The
processor 240 is a hardware device for executing software, particularly that which is stored in memory. The processor 240 may be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing device, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions. - The
memory 242 may include any one or combination of volatile memory elements (e.g., random access memory (RAM)) and nonvolatile memory elements (e.g., ROM, hard drive, etc.). Moreover, the memory 242 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 242 may have a distributed architecture in which various components are situated remotely from one another but may be accessed by the processor 240. - The software in
memory 242 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions, such as the logical functions shown in FIG. 8. In the example of FIG. 9, the software in the memory 242 includes control software 246 for providing one or more of the functions shown in FIG. 8 according to an embodiment. The memory 242 may also comprise a suitable operating system (O/S) 248. The operating system 248 essentially controls the execution of other computer programs, such as the control software, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services. - The
control software 246 is a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. The control software 246 can be implemented, in one embodiment, as a distributed network of modules, where one or more of the modules can be accessed by one or more applications or programs or components thereof. In some embodiments, the control software 246 can be implemented as a single module with all of the functionality of the aforementioned modules. When the control software 246 is a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory, so as to operate properly in connection with the operating system 248. Furthermore, the control software 246 can be written with (a) an object oriented programming language, which has classes of data and methods, or (b) a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada. - The I/O devices 250 may include input devices such as, for example, a keyboard, mouse, scanner, microphone, sensor(s), etc. Furthermore, the I/O devices 250 may also include output devices such as, for example, a printer, display, audio devices, vibration devices, etc. Finally, the I/O devices 250 may further include devices that communicate both inputs and outputs such as, for instance, a modulator/demodulator (modem for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc. - When the control computer is in operation, the
processor 240 is configured to execute software stored within the memory 242, to communicate data to and from the memory 242, and to generally control operations of the control computer pursuant to the software. The control software 246 and the operating system 248, in whole or in part, but typically the latter, are read by the processor 240, perhaps buffered within the processor 240, and then executed. - It should be noted that the
control software 246 can be stored on any computer-readable medium for use by or in connection with any computer-related system or method. In the context of this document, a computer-readable medium is an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer-related system or method. The control software 246 can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. - In an alternative embodiment, where the functionality of the
control software 246 is implemented in hardware, or as a combination of software and hardware, the functionality of the control software 246 can be implemented with any or a combination of the following technologies, which are each well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.; or can be implemented with other technologies now known or later developed. - Reference is now made to
FIG. 10, which is a block diagram illustrating an embodiment of a method 300 for providing teleportation in a virtual environment. The method 300 includes the step of delivering a video signal to the user in block 310. The video signal may be delivered using, for example, one or more displays configured in a head mounted device. The video signal provides the user with the visual information corresponding to the virtual environment. The method 300 also includes the step of delivering an audio signal to a user in block 320. The audio signal can be delivered through, for example, headphones or speakers. The audio signal can be used to communicate sounds within the virtual environment that correspond to objects or events. - The
method 300 also includes the step of receiving position inputs relating to user physiological features and the teleportation device in block 330. For example, the three-dimensional position and orientation of the hands and head of the user can serve to ensure that the user's position and video signal correspond to the virtual environment. Similarly, by receiving the three-dimensional position and orientation data for the teleportation device, the computer controlling the virtual environment can correctly render the teleportation device in the virtual environment. - A user is provided vibratory feedback in
block 340. By providing the vibratory feedback, a user can experience the sounds and vibrations corresponding to different rates of speed and events such as collisions in the virtual environment. Additionally, to further enhance the sensation of motion, air is directed towards the user in block 350. The air is directed at varying rates and from different directions to create the sensation of moving at different speeds and in different directions. Air can be directed using multiple wind generation devices including, for example, fans or blowers. Each wind generation device can be driven independently or in combination at one or more preset speeds or at any speed over a range of speeds. Controlling the wind generation units can be accomplished using relays, electronic speed controllers, electronic motor drives, or any combination thereof. - Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of an embodiment of the present disclosure in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
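The air-direction step in block 350 can be sketched as a simple mixing rule: each fan has a fixed direction along which it blows toward the user, and it is driven in proportion to how closely that direction aligns with the current direction of travel, scaled by speed. The fan layout, the top-speed constant, and the duty-cycle output below are illustrative assumptions, not the disclosed control scheme.

```python
def fan_duty_cycles(velocity, fans, top_speed=30.0):
    # velocity: the user's velocity vector in the virtual environment.
    # fans: list of (name, unit_direction) pairs, where unit_direction
    # is the direction of travel that the fan's airflow simulates.
    speed = sum(v * v for v in velocity) ** 0.5
    if speed == 0.0:
        return {name: 0.0 for name, _ in fans}
    heading = [v / speed for v in velocity]
    scale = min(speed / top_speed, 1.0)  # clamp to full output
    duties = {}
    for name, direction in fans:
        # A fan runs only when the motion it simulates aligns with
        # the current heading; opposed fans stay off.
        alignment = max(0.0, sum(h * d for h, d in zip(heading, direction)))
        duties[name] = round(alignment * scale, 3)
    return duties

# Traveling straight ahead (+z) at half of top speed: only the
# front-of-face fan should run, at half duty.
duties = fan_duty_cycles((0.0, 0.0, 15.0),
                         [("front", (0.0, 0.0, 1.0)),
                          ("right", (1.0, 0.0, 0.0)),
                          ("left", (-1.0, 0.0, 0.0)),
                          ("overhead", (0.0, -1.0, 0.0))])
# duties → {"front": 0.5, "right": 0.0, "left": 0.0, "overhead": 0.0}
```

In a real system each duty cycle would be handed to the output device controller, with 0.0 mapping to off (relay open) and 1.0 to full fan speed on an electronic speed controller.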
- It should be emphasized that the above-described embodiments of the present disclosure, particularly, any illustrated embodiments, are merely possible examples of implementations, set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) of the disclosure without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Claims (23)
1. A system for teleportation in a virtual environment, comprising:
a head mounted display configured to provide an immersive virtual environment;
a teleportation device configured to provide navigation in the virtual environment;
at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment;
a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and
a computing device configured to receive the plurality of input signals and control the at least one feedback device.
2. The system of claim 1 , wherein the head mounted display comprises:
a video display configured to provide a video signal corresponding to the virtual environment; and
an audio device configured to provide an audio signal corresponding to the virtual environment.
3. The system of claim 1 , wherein the at least one feedback device comprises a fan directed to the user and configured to create a motion sensation.
4. The system of claim 3 , further comprising a plurality of fans configured to create the motion sensation in a plurality of directions.
5. The system of claim 1 , wherein the at least one feedback device comprises a speaker configured to generate information in the form of an audio signal and a vibratory signal to the user, the audio and vibratory signals configured to create a motion sensation corresponding to changes in the virtual environment.
6. The system of claim 1 , wherein the plurality of input devices comprise a plurality of user position sensors configured to provide three-dimensional location data corresponding to a plurality of user physiological features.
7. The system of claim 6 , wherein the plurality of user physiological features are selected from the group consisting of: hands, arms, head, and torso.
8. The system of claim 6 , wherein one of the plurality of sensors comprises a teleportation device position sensor configured to provide three dimensional location data corresponding to the teleportation device.
9. The system of claim 1 , wherein one of the plurality of input devices comprises a user interface device configured to trigger an operation within the virtual environment.
10. The system of claim 9 , wherein the user interface device is an electrical switch.
11. The system of claim 9 , wherein the operation is selected from the group consisting of: flight mode, lights, stop, move up, move down, and jump.
12. The system of claim 1 , further comprising a position interface configured to receive a position sensor input and transmit position and orientation data to the computing device.
13. The system of claim 1 , wherein the teleportation device comprises:
a base configured to support at least a portion of the user; and
a directional input portion coupled to the base using a moveable coupling and configured to simulate a directional input member of a personal vehicle.
14. The system of claim 13 , wherein the moveable coupling comprises a biasing element.
15. The system of claim 13 , wherein the directional input portion is configured to tilt away from the user to cause an upward movement in the virtual environment and wherein the directional input portion is configured to tilt toward the user to cause a downward movement in the virtual environment.
16. The system of claim 1 , further comprising a means for controlling a plurality of fans with the computing device.
17. A method for providing teleportation in a virtual environment, comprising:
delivering a video signal, corresponding to a virtual environment, to a user;
delivering an audio signal, corresponding to the virtual environment, to the user;
receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features;
providing a vibratory feedback, corresponding to the virtual environment, to the user; and
directing air towards the user to create a motion sensation.
18. The method of claim 17 , wherein the directing comprises varying a fan output to create the motion sensation corresponding to a plurality of velocities.
19. The method of claim 17 , further comprising receiving user interface inputs configured to trigger an operation within the virtual environment.
20. The method of claim 19 , wherein the operation is selected from the group consisting of: flight mode, lights, stop, move up, move down, and jump.
21. The method of claim 17 , further comprising supporting a portion of the user in a configuration consistent with a personal vehicle.
22. The method of claim 21 , wherein the personal vehicle comprises a scooter.
23. A system for teleportation in a virtual environment, comprising:
a head mounted display configured to provide a video signal and an audio signal to a user;
a teleportation device configured to support a portion of the user, the teleportation device comprising a base moveably coupled to a directional input component;
a plurality of user position sensors configured to transmit three-dimensional position and orientation data corresponding to a plurality of user physiological features;
a directional input component sensor configured to transmit three-dimensional position and orientation data corresponding to the directional input component of the teleportation device;
a low frequency driver attached to the teleportation device and configured to provide vibratory feedback to the user corresponding to motion in the virtual environment;
a plurality of fans directed at the user and configured to create a motion sensation in a plurality of directions by controlling the output of each of the plurality of fans independently;
a computing device configured to receive a plurality of input signals and generate a plurality of output commands to control a plurality of output devices;
an output device controller, configured to receive a portion of the plurality of output commands and control a portion of the plurality of output devices; and
a position interface device configured to receive signals from the plurality of user position sensors and transmit three-dimensional position and orientation data to the computing device.
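The claims above describe two control mappings in functional terms: the directional input portion tilting away from or toward the user to move up or down (claim 15), and a bank of fans whose outputs are varied independently to create a directional motion sensation at different velocities (claims 4, 16, 18, and 23). The patent itself discloses no code; the following Python sketch is purely illustrative, and the function names, the linear tilt gain, and the cosine falloff used to weight each fan are assumptions, not the patented method.

```python
import math

def vertical_velocity(tilt_deg: float, gain: float = 0.05) -> float:
    """Map directional-input tilt to vertical velocity in the virtual
    environment: tilting away from the user (positive angle) moves the
    user up, tilting toward the user (negative angle) moves the user down."""
    return gain * tilt_deg

def fan_outputs(speed: float, heading_deg: float,
                n_fans: int = 4, max_speed: float = 10.0) -> list:
    """Compute an independent output level (0.0 to 1.0) for each of
    n_fans spaced evenly around the user, so the strongest airflow
    arrives from the direction of apparent motion and overall output
    scales with virtual speed."""
    base = min(abs(speed) / max_speed, 1.0)
    outputs = []
    for i in range(n_fans):
        fan_angle = 360.0 * i / n_fans  # fan position around the user
        # Cosine falloff: the fan aligned with the motion heading blows
        # hardest; fans behind the user are switched off entirely.
        alignment = max(0.0, math.cos(math.radians(fan_angle - heading_deg)))
        outputs.append(round(base * alignment, 3))
    return outputs
```

For example, at full speed with a heading of 0 degrees, only the fan directly in front of the user runs at full output; at zero speed all fans are off, matching the claimed sensation of motion proportional to velocity.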
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/816,968 US20080153591A1 (en) | 2005-03-07 | 2006-03-07 | Teleportation Systems and Methods in a Virtual Environment |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US65928305P | 2005-03-07 | 2005-03-07 | |
PCT/US2006/008264 WO2006096776A2 (en) | 2005-03-07 | 2006-03-07 | Teleportation systems and methods in a virtual environment |
US11/816,968 US20080153591A1 (en) | 2005-03-07 | 2006-03-07 | Teleportation Systems and Methods in a Virtual Environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080153591A1 true US20080153591A1 (en) | 2008-06-26 |
Family
ID=36954010
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/816,968 Abandoned US20080153591A1 (en) | 2005-03-07 | 2006-03-07 | Teleportation Systems and Methods in a Virtual Environment |
Country Status (2)
Country | Link |
---|---|
US (1) | US20080153591A1 (en) |
WO (1) | WO2006096776A2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE202016103302U1 (en) | 2016-06-22 | 2016-07-11 | Stefan Zimmermann | For a virtual reality glasses certain guidance of an electrical line |
CN110335511A (en) * | 2019-05-30 | 2019-10-15 | 桂林蓝港科技有限公司 | A kind of student side virtual reality head-mounted display apparatus control system and method |
2006
- 2006-03-07: WO PCT/US2006/008264, published as WO2006096776A2 (active, Application Filing)
- 2006-03-07: US 11/816,968, published as US20080153591A1 (not active, Abandoned)
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7098891B1 (en) * | 1992-09-18 | 2006-08-29 | Pryor Timothy R | Method for providing human input to a computer |
US6591250B1 (en) * | 1998-02-23 | 2003-07-08 | Genetic Anomalies, Inc. | System and method for managing virtual property |
US6184847B1 (en) * | 1998-09-22 | 2001-02-06 | Vega Vista, Inc. | Intuitive control of portable data displays |
US6972734B1 (en) * | 1999-06-11 | 2005-12-06 | Canon Kabushiki Kaisha | Mixed reality apparatus and mixed reality presentation method |
US20010041328A1 (en) * | 2000-05-11 | 2001-11-15 | Fisher Samuel Heyward | Foreign language immersion simulation process and apparatus |
US6952716B1 (en) * | 2000-07-12 | 2005-10-04 | Treehouse Solutions, Inc. | Method and system for presenting data over a network based on network user choices and collecting real-time data related to said choices |
US7860942B2 (en) * | 2000-07-12 | 2010-12-28 | Treehouse Solutions, Inc. | Method and system for presenting data over a network based on network user choices and collecting real-time data related to said choices |
US7300352B2 (en) * | 2001-09-27 | 2007-11-27 | Igt | Method and apparatus for graphically portraying gaming environment and information regarding components thereof |
US7753785B2 (en) * | 2003-05-06 | 2010-07-13 | Nintendo Co., Ltd. | Game apparatus, storing medium that stores control program of virtual camera, and control method of virtual camera |
US7828657B2 (en) * | 2003-05-20 | 2010-11-09 | Turbine, Inc. | System and method for enhancing the experience of participant in a massively multiplayer game |
US7584082B2 (en) * | 2003-08-07 | 2009-09-01 | The Mathworks, Inc. | Synchronization and data review system |
US7850525B2 (en) * | 2004-05-10 | 2010-12-14 | Sega Corporation | Mechanism of generating a sound radar image in a video game device |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8376844B2 (en) * | 2006-06-19 | 2013-02-19 | Ambx Uk Limited | Game enhancer |
US20090280896A1 (en) * | 2006-06-19 | 2009-11-12 | Ambx Uk Limited | Game enhancer |
US9254438B2 (en) | 2009-09-29 | 2016-02-09 | International Business Machines Corporation | Apparatus and method to transition between a media presentation and a virtual environment |
US9256347B2 (en) * | 2009-09-29 | 2016-02-09 | International Business Machines Corporation | Routing a teleportation request based on compatibility with user contexts |
US20110078170A1 (en) * | 2009-09-29 | 2011-03-31 | International Business Machines Corporation | Routing a Teleportation Request Based on Compatibility with User Contexts |
US20130023342A1 (en) * | 2011-07-18 | 2013-01-24 | Samsung Electronics Co., Ltd. | Content playing method and apparatus |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US20150290533A1 (en) * | 2014-04-14 | 2015-10-15 | International Business Machines Corporation | Simulation based on audio signals |
US9393490B2 (en) * | 2014-04-14 | 2016-07-19 | International Business Machines Corporation | Simulation based on audio signals |
US20200174809A1 (en) * | 2014-09-08 | 2020-06-04 | Wirepath Home Systems, Llc | Method for electronic device virtualization and management |
US11861385B2 (en) * | 2014-09-08 | 2024-01-02 | Snap One, Llc | Method for electronic device virtualization and management |
DE102014013961A1 (en) | 2014-09-19 | 2016-03-24 | Audi Ag | Virtual reality glasses, system with virtual reality glasses and method of operating a virtual reality glasses |
US20190163274A1 (en) * | 2015-03-17 | 2019-05-30 | Whirlwind VR, Inc. | System and Method for Modulating a Peripheral Device Based on an Unscripted Feed Using Computer Vision |
US10466790B2 (en) * | 2015-03-17 | 2019-11-05 | Whirlwind VR, Inc. | System and method for processing an audio and video input in a point of view program for haptic delivery |
US10768704B2 (en) * | 2015-03-17 | 2020-09-08 | Whirlwind VR, Inc. | System and method for modulating a peripheral device based on an unscripted feed using computer vision |
US11023048B2 (en) * | 2015-03-17 | 2021-06-01 | Whirlwind VR, Inc. | System and method for modulating a light-emitting peripheral device based on an unscripted feed using computer vision |
US10825350B2 (en) * | 2017-03-28 | 2020-11-03 | Wichita State University | Virtual reality driver training and assessment system |
US20180286268A1 (en) * | 2017-03-28 | 2018-10-04 | Wichita State University | Virtual reality driver training and assessment system |
US10777008B2 (en) | 2017-08-31 | 2020-09-15 | Disney Enterprises, Inc. | Drones generating various air flow effects around a virtual reality or augmented reality user |
US20190192965A1 (en) * | 2017-12-26 | 2019-06-27 | Disney Enterprises, Inc. | Directed wind effect for ar/vr experience |
US10898798B2 (en) * | 2017-12-26 | 2021-01-26 | Disney Enterprises, Inc. | Directed wind effect for AR/VR experience |
US10960303B2 (en) * | 2018-04-20 | 2021-03-30 | Korea Advanced Institute Of Science And Technology (Kaist) | Kinesthetic-feedback wearable apparatus for virtual reality and augmented reality and method for controlling the same |
US20220154964A1 (en) * | 2020-11-16 | 2022-05-19 | Mumarba LLC | Artificial Breeze System |
Also Published As
Publication number | Publication date |
---|---|
WO2006096776A2 (en) | 2006-09-14 |
WO2006096776A3 (en) | 2007-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080153591A1 (en) | Teleportation Systems and Methods in a Virtual Environment | |
US6147674A (en) | Method and apparatus for designing force sensations in force feedback computer applications | |
US10322336B2 (en) | Haptic braille output for a game controller | |
US6864877B2 (en) | Directional tactile feedback for haptic feedback interface devices | |
US5803738A (en) | Apparatus for robotic force simulation | |
JP4441179B2 (en) | Tactile remote control device for toys | |
US8737035B2 (en) | Magnetically movable objects over a display of an electronic device | |
JP2020030845A (en) | Non-collocated haptic cues in immersive environments | |
US20090325699A1 (en) | Interfacing with virtual reality | |
WO2006121533A2 (en) | Manifold compatibility electronic omni axis human interface | |
Rahman et al. | Motion-path based in car gesture control of the multimedia devices | |
WO2005050427A1 (en) | Tactile force sense information display system and method | |
KR20140112352A (en) | Systems and methods for haptic remote control gaming | |
JP2010061667A (en) | Method and apparatus for controlling force feedback interface utilizing host computer | |
CN102335510A (en) | Human-computer interaction system | |
US20200057501A1 (en) | System, devices, and methods for remote projection of haptic effects | |
US20170348594A1 (en) | Device, System, and Method for Motion Feedback Controller | |
CN210845261U (en) | Multi-functional immersive VR motion platform device | |
Borst et al. | Touchpad-driven haptic communication using a palm-sized vibrotactile array with an open-hardware controller design | |
WO2021240601A1 (en) | Virtual space body sensation system | |
KR20180105285A (en) | Haptic sensible apparatus and system | |
WO2001026089A1 (en) | Cursor positioning device with tactile output capability (the 'living mouse') | |
TWI479364B (en) | Portable device with magnetic controlling touch feedback function and magnetic controlling touch feedback device | |
JP2023148749A (en) | Input-output device | |
CN114984563A (en) | Cloud game control system, gamepad, and control method and device of gamepad |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: UNIVERSITY OF GEORGIA RESEARCH FOUNDATION, INC., G; Assignor: DELIGIANNIDIS, LEONIDAS; Reel/Frame: 019738/0456; Effective date: 20070822 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |