US20020052724A1 - Hybrid vehicle operations simulator - Google Patents

Hybrid vehicle operations simulator

Info

Publication number
US20020052724A1
US20020052724A1 (application US10/001,362)
Authority
US
United States
Prior art keywords
vehicle
operator
scene
environment
operation simulator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/001,362
Inventor
Thomas Sheridan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/001,362 priority Critical patent/US20020052724A1/en
Publication of US20020052724A1 publication Critical patent/US20020052724A1/en
Priority to US11/454,601 priority patent/US7246050B2/en
Abandoned legal-status Critical Current

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/04: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B 9/05: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles, the view from a vehicle being simulated

Definitions

  • This invention relates to vehicle simulation, and more particularly to methods and apparatus for simulating the human operation of a moving vehicle under selected operating conditions.
  • Simulators have been used for a number of purposes, including research, training, and vehicle engineering. Simulators have become increasingly prevalent and useful for reproducing the experience of operating aircraft, motor vehicles, trains, spacecraft, and other vehicles. Aviation simulators have become particularly prevalent, with nearly every airline now using simulators for training and research; simulator training is now so thorough that the first time a commercial pilot flies a new aircraft, it is often filled with passengers. The military services use simulators extensively for training personnel in the operation of ships, tanks, aircraft, and other vehicles.
  • Simulators, especially those providing the operator(s) with motion cues, have primarily remained the tools of large organizations with significant financial resources.
  • Motor vehicle (i.e., driving) simulators have not yet progressed beyond manufacturers, suppliers, government agencies (including the military), and academic institutions, largely because of their cost.
  • Nevertheless, driving simulators have a number of valuable safety applications.
  • Simulators can be used to study driver response behavior: highways are becoming populated by more vehicles, moving at greater speeds, with a greater proportion of drivers who are older adults with reduced sensory and response capabilities.
  • Simulators can provide an improved means for training and evaluating drivers. Most driver training is conducted either in classrooms or in automobiles in normal traffic, which rarely exposes trainees to unexpected hazards. Devices that would allow trainees to experience potential collision situations, visibility hazards, or other unusual driving situations, without actual exposure to risk, would provide useful training.
  • Simulators provide manufacturers and suppliers useful data from which to further develop their products. Vehicle manufacturers, suppliers, and car/truck fleet owners usually perform developmental tests in actual vehicles, but this is limited to experiences not involving collision or other hazards. The use of simulators to perform these functions is costly, particularly for programming and measuring motion and for configuring the simulator to represent the appropriate vehicle, limiting the usefulness of these simulators for most research applications.
  • Simulators are primarily tasked with recreating the sensory cues their “real-world experience” counterparts offer.
  • Most state-of-the-art simulators do a credible job recreating visual and audible stimuli, but only the most expensive provide credible cues for the vestibular senses (controlled by the semicircular canals in the inner ear, which sense rotary acceleration, and the otolith organs, which sense translational acceleration) and the muscle and joint sensors of motion.
  • Motor vehicle simulators, in particular, struggle to provide a faithful motion representation without exorbitant cost.
  • The present invention overcomes the cost and motion fidelity issues of other vehicle operation simulators with a novel combination of existing devices.
  • The operator is carried by and operates an actual vehicle.
  • Vehicle examples include, but are not limited to, an automobile, motorcycle, aircraft, wheelchair, bicycle, skis, and a ship.
  • The vehicle is operated in a “natural environment” using its normal vehicle controls, and in accordance with visual and audible cues provided by a virtual reality device.
  • The natural environment may be an open space (e.g., a large field or a parking lot), a track, an unused or seldom-used roadway, a snow-covered mountain slope, air space, or other environment appropriate for the mobile vehicle being used.
  • The virtual reality device takes advantage of recent advances in computer processing and image generation speeds to create a realistic vehicle operation environment, including hazards which are not present in the actual/natural environment.
  • The invention provides realistic motion cues at reasonable cost, thereby creating training, research, and product development opportunities not previously possible.
  • The present invention is operated in a large, open area free of obstacles and hazards. Since its intent is to provide a realistic vehicle operation experience, these areas provide the greatest opportunity to simulate all types of uninterrupted operation experiences. For example, if the invention were to simulate the operation of an automobile, it would be difficult to simulate driving on a highway for any useful period if the invention were used in an urban area, or even in a small field. However, for certain uses, it is envisioned that the invention may be operated on certain less-trafficked roads or streets.
  • The present invention provides both apparatus for a vehicle operation simulator and a method for simulating vehicle operation.
  • The apparatus includes a mobile vehicle having vehicle controls, a scene generator, and a scene display that receives input from the scene generator and presents at least a partial virtual environment view to the operator of the vehicle.
  • A method for use includes the scene generator creating at least one element within an environment view, transmitting an electronic signal comprising the at least one element within the environment view to the scene display, the scene display presenting the environment view to the vehicle operator, and, based on resulting operator actuation of vehicle control or vehicle movement, regenerating the environment view to provide continually updated cues to the operator, including visual and/or audible cues.
  • The scene display may present the operator an environment view consisting of artificial elements (produced by the scene generator) or a combination of artificial and natural elements.
  • The components of the scene display may, for example, include a head-mounted unit, a projector, and/or a projection screen which is either partially transparent (e.g., half-silvered) or opaque (e.g., a flat screen).
  • The environment view may consist of a viewable image presented within the head-mounted unit worn by the operator, an image projected onto flat or curved screens shaped like the windshield and/or side and rear windows, or images projected onto a semi-transparent screen so as to superimpose artificial elements on the operator's view of the surrounding natural environment.
  • The scene generator transmits an electronic signal to the scene display comprising at least one element within the environment view, which includes the location of natural and artificial images within the display.
  • The scene generator continually regenerates the environment view for display to the vehicle operator via the scene display.
  • The scene generator may alter artificial images within the environment view in response to vehicle movement, operator actuation of vehicle controls, and predetermined artificial image movement.
  • Components of the scene generator may include a general-purpose programmed computer, and a means for transmitting a signal to the scene display.
  • The environment view may be presented to the vehicle operator to suggest behavior—for example, a velocity—different from the actual behavior exhibited by the vehicle.
  • A mechanism may be employed to allow the vehicle to respond to control actuation as though the vehicle were behaving as shown in the environment view.
  • The environment view might be presented to the operator of an automobile as though it were traveling at 70 miles per hour, when the vehicle actually is traveling at only 35 miles per hour.
  • This mechanism might alter the operator's actuation of the steering wheel, for example, to cause a much sharper turn than under normal operation at 35 miles per hour, or at least to provide a simulated view of such a sharper turn.
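As a rough sketch of how such a mechanism might scale steering, the following assumes a kinematic bicycle model and an arbitrary wheelbase; the function name, the model, and the constants are illustrative assumptions, not taken from the patent:

```python
import math

WHEELBASE_M = 2.7  # assumed wheelbase for a typical passenger car

def scaled_steer_angle(operator_angle_rad: float,
                       simulated_speed_mps: float,
                       actual_speed_mps: float) -> float:
    """Map the operator's steering input to an actual front-wheel angle
    so that the yaw rate at the actual speed matches the yaw rate the
    same input would produce at the simulated speed.

    Kinematic bicycle model: yaw_rate = v * tan(delta) / wheelbase.
    """
    target_yaw_rate = simulated_speed_mps * math.tan(operator_angle_rad) / WHEELBASE_M
    return math.atan(target_yaw_rate * WHEELBASE_M / actual_speed_mps)
```

At a simulated 70 miles per hour (about 31.3 m/s) and an actual 35 miles per hour (about 15.65 m/s), a small operator input is roughly doubled at the front wheels, producing the "much sharper turn" described above.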
  • The scene generator may take one or more forms of vehicle and operator movement and/or position data as input. These may include acceleration data from an accelerometer or gyroscopic inertial measurement unit, velocity data from a unit which measures either translational or rotational velocity in up to six degrees of freedom, or position data from a positional measurement unit.
  • A mechanism may be used to maintain equivalent light brightness between the natural environment seen outside the vehicle and an image projected on that natural environment.
  • This mechanism may include a component mounted on the exterior of the vehicle to take light brightness measurements of the natural environment. These measurements are provided to the scene generator, which continually incorporates any shifting brightness of the environment into the generation of artificial images so that they remain realistic under changing conditions.
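One minimal way to realize such a brightness-equivalence adjustment is a scalar gain on artificial-image intensity, as sketched below; the reference level and clamping bounds are assumed calibration constants, not values from the patent:

```python
def brightness_gain(ambient_lux: float,
                    reference_lux: float = 1000.0,
                    min_gain: float = 0.05,
                    max_gain: float = 1.0) -> float:
    """Intensity scale factor for artificial images so that superimposed
    elements track the measured brightness of the natural scene.
    reference_lux is the (assumed) ambient level at which artificial
    images are rendered at full intensity."""
    gain = ambient_lux / reference_lux
    return max(min_gain, min(max_gain, gain))
```

The clamp keeps artificial elements visible at night and prevents over-driving the display in bright daylight.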
  • The vehicle may employ secondary vehicle controls to enhance operator safety, such that the vehicle responds exclusively to the secondary controls, or to both sets of controls when the secondary controls are actuated.
  • This secondary vehicle control might be used in instances where only a secondary operator can see the actual movement of the vehicle in relation to the natural environment, and might, in an automobile for example, include conventional dual controls used in driver training vehicles.
  • The vehicle may employ parameter-constraining apparatus that acts to restrict the movement of the vehicle or the actuation of vehicle control.
  • This apparatus might restrict movement to prevent the vehicle from exceeding certain speeds.
  • This apparatus might restrain control actuation to prevent a roll so sharp it would disorient the operator.
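A parameter-constraining apparatus of this kind might, in software form, simply clamp operator commands; the sketch below is an assumed illustration, with invented limits and names rather than values from the patent:

```python
def constrain_command(throttle: float, steer_rad: float, speed_mps: float,
                      max_speed_mps: float = 20.0,
                      max_steer_rad: float = 0.35):
    """Clamp operator commands: cut the throttle once the vehicle
    reaches the speed limit, and bound the steering angle to avoid
    disorienting maneuvers."""
    if speed_mps >= max_speed_mps:
        throttle = 0.0
    steer_rad = max(-max_steer_rad, min(max_steer_rad, steer_rad))
    return throttle, steer_rad
```

As the patent notes, such constraints may act on control actuation, on the vehicle's response, or on both.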
  • FIG. 1 is a block diagram of an illustrative embodiment of the invention.
  • FIG. 2 is a view from the operator's perspective of an illustrative embodiment.
  • FIG. 3 is a flowchart illustrating operation of the invention.
  • FIGS. 4A, 4B, 4C, and 4D are semi-block diagrams illustrating four different ways in which the invention may be implemented.
  • The vehicle operation simulator for an illustrative embodiment includes a mobile vehicle 100, which may be any vehicle that moves with at least one degree of freedom, for which movement represents an ordinary feature of operation, and which includes at least one component for regulation or control of said movement. Examples include, but are not limited to, an automobile, aircraft, ship, truck, railroad train, motorcycle, wheelchair, bicycle, snowboard, roller skates, and skis.
  • Mobile vehicle 100 may be operated in a natural environment; for example, in an open space appropriate to the mobile vehicle. This open space should be large and preferably free of other vehicles, potential hazards, and pedestrians. However, it is also contemplated that the invention be practiced on unused or seldom-used tracks, streets, air space, snow slopes, roadway, or other environment on which the particular vehicle 100 might normally be operated.
  • A scene generator 130 is provided which generates an electronic signal and transmits it to scene display 140, which presents an environment view 170 to human operator 120.
  • The scene generator includes a programmed general-purpose computer to generate images and sounds associated with a virtual environment, which may include obstacles and hazards including, but not limited to, other vehicles, animals, people, or fixed objects. These computer-generated elements typically exhibit behavior and characteristics of their real-world counterparts. For example, a computer-generated image of a person might be animated so as to appear to cross the street in front of the mobile vehicle 100. It should be noted that other equipment might also provide the functionality of the scene generator 130, including, for example, an array of projectable photographic or video images.
  • The environment view may be completely artificial.
  • The environment view may include a computer-generated artificial background, at least one computer-generated artificial element, and no elements taken from the natural environment surrounding the mobile vehicle.
  • Alternatively, the environment view may comprise a composite of natural elements and artificial elements.
  • Computer-generated artificial elements might be superimposed on a display screen that also allows the view of a natural environment to pass through.
  • The scene generator may instead superimpose computer-generated artificial elements against a backdrop of a color video signal of the actual natural environment, or of a selectively modified natural environment. For example, a simulation conducted during the day may be modified to simulate night driving.
  • The scene generator may also receive input on the state of the natural environment from, for example, vehicle-mounted cameras, and use this input in generating at least one element within an environment view that is related in predetermined ways to the actual environment.
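Superimposing artificial elements on a video backdrop, as described above, amounts to per-pixel alpha blending; a minimal sketch (the pixel format and names are assumptions, not from the patent):

```python
def composite_pixel(natural, artificial, alpha):
    """Alpha-blend one artificial-image pixel over the corresponding
    natural-environment video pixel (per-channel values 0-255).
    alpha=1.0 shows only the artificial element; alpha=0.0 passes the
    natural scene through unchanged."""
    return tuple(round(alpha * a + (1.0 - alpha) * n)
                 for a, n in zip(artificial, natural))
```

A night-driving modification would simply darken the natural channels before blending.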
  • The scene display 140 may take many forms.
  • In one form, the scene display is a head-mounted display that presents the environment view in the field of vision of the operator, and allows for a realistic field of vision no matter how the operator's head is oriented or where in his field of vision the operator's eyes are focused.
  • In another form, the scene display includes an electronic display and/or a projection unit affixed to the vehicle. Either the head-mounted or fixed display may include a half-silvered mirror, allowing the items projected onto the half-silvered mirror and the natural environment 180 behind it to comprise the environment view.
  • In yet another form, the scene display includes a projection unit and a flat screen, constructing an environment view 170 consisting entirely of projected elements.
  • The environment view may consist of images projected on a single surface or, where appropriate, multiple surfaces. For example, the simulation of the operation of a helicopter might require the use of multiple display surfaces to present the operator simulated views above the aircraft and on either side, as well as in front.
  • The operator's actuation of vehicle control 110 is input to a computerized mathematical model 135, which may run on the same computer as the scene generator.
  • This mathematical model may then provide input to scene generator 130 , causing the scene generator to alter the environment view presented on the scene display as appropriate to compensate for vehicle orientation and/or position, and the operating environment to be simulated.
  • Data on vehicle activity may also be provided to the scene generator via the measurement unit 150 .
  • This unit may measure the velocity of the vehicle (by measuring, for example, an automobile's wheel rotation and angle), measure its translational or rotational acceleration (for example, with an accelerometer, inertial acceleration measurement unit, or gyroscopic sensors), or measure changes in its position (using, for example, a global positioning system device, or laser triangulation in the operating area). Regardless of the measurement device used, however, this velocity, acceleration, or position data will encompass up to six degrees of freedom including translation and rotation. In one example, the measurement unit might discern a ship's velocity by combining measurements of water flow past a predetermined point on the ship's hull with measurements of the rudder angle over time.
  • The measurement unit might discern an automobile's acceleration or deceleration relative to the ground and/or gyroscopic changes in its heading over time.
  • The use of inertial, position, and velocity measurement units is well known to those skilled in the art.
  • Data from any of these measurement units may supplement or replace input from vehicle control 110 to a mathematical model and/or the scene generator.
  • The scene generator may then alter the environment view as appropriate given the mobile vehicle's movement (i.e., changes in angle or position relative to the earth) using conventional computer graphic transformations of image geometry.
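The "conventional computer graphic transformations of image geometry" mentioned above reduce, in the planar case, to a rigid-body transform from world to vehicle coordinates; a minimal sketch (the names and the 2-D simplification are assumptions):

```python
import math

def world_to_vehicle(point_xy, vehicle_xy, heading_rad):
    """Transform a world-frame point into the vehicle frame so an
    artificial element can be drawn in the correct place as the
    vehicle translates and rotates (planar case; the vehicle x-axis
    points along the heading)."""
    dx = point_xy[0] - vehicle_xy[0]
    dy = point_xy[1] - vehicle_xy[1]
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    # rotate the offset by -heading into the vehicle frame
    return (c * dx + s * dy, -s * dx + c * dy)
```

A full implementation would use 3-D poses (six degrees of freedom, matching the measurement units above), but the structure is the same.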
  • Operator 120 actuates vehicle control 110 to control mobile vehicle 100 , triggering cues to the operator's motion sense organs.
  • Some embodiments may employ additional features to ensure the safety of the operator. For example, air bags and lap belts may be used to secure the operator in place during operation.
  • Either vehicle control 110 or the motion of mobile vehicle 100 may be constrained by parameter-constraining apparatus 160.
  • the parameter-constraining apparatus may comprise a computer system designed to assume control of the vehicle under certain hazardous conditions, a governor mechanism designed to limit vehicle velocity, or a mechanism limiting turn radius, angle of descent and/or other motion parameters. This apparatus may restrain motion either totally or in a manner dependent on vehicle operating conditions. The constraints may limit actuation of vehicle controls, but preferably limit the response of the vehicle to the controls.
  • Scene generator 130 may also take input from light brightness measurement unit 190 and video camera 200.
  • A light brightness measurement unit may provide data enabling the scene generator to maintain consistent brightness between the natural environment and any artificial elements that are superimposed. Therefore, this unit may be mounted or otherwise affixed to the vehicle so as to enable measuring the light brightness of the environment view as seen by the operator, as will be appreciated by those skilled in the art.
  • One or more video cameras may provide one or more video signals depicting the natural environment, for use when the natural environment is not otherwise visible to the operator. Therefore the video camera(s) may also be mounted or otherwise positioned on the vehicle's exterior or on the operator's head so as to capture the visible elements of the natural environment from a perspective collinear with the operator's field of vision; methods for appropriate capture of the natural environment using video camera apparatus will also be well-known by those skilled in the art. While the camera(s) may provide a video image directly to scene display 140 , it is preferable that camera output be provided, as shown, to scene generator 130 , where it may be used to reproduce either the actual—or a modified version of—the natural environment.
  • FIG. 2 depicts the interior of mobile vehicle 100 which, for the illustrative embodiment, is automobile 200 with controls 210 including a steering wheel, an accelerator, a brake, and other suitable controls such as a gear shift, clutch, de-fogger, etc. (controls not shown).
  • Scene generator 130 may be a programmed general-purpose computer stored within automobile 200 .
  • A half-silvered mirror 220, integrated with or separate from the vehicle's windshield, or attached to the head-mounted display, receives either projected images from a projector (not shown) situated within automobile 200 (in the case of a screen display), or a signal from the scene generator (in the case of the head-mounted display).
  • Obstacles 240 are placed in the environment view such that they appear superimposed on the natural environment also viewable through the half-silvered mirror 220.
  • Some embodiments may also include a secondary vehicle control 230 to promote the safe operation of the automobile.
  • Secondary vehicle control 230 is operated by a secondary vehicle operator, who monitors the primary operator's actions and corrects or overrides vehicle control actuation that would result in danger or injury.
  • The secondary operator may experience the same environment view as the operator, may experience only the natural environment, may experience both environments (for example, on a split-screen view), or may experience some other view appropriate to maximize safe operation of the vehicle.
  • FIG. 3 is a flow diagram of a method for simulating vehicle operation utilizing the apparatus of FIG. 1.
  • First, an environment view is created, which may consist of artificial elements designed to wholly comprise the environment view, or artificial elements intended to be superimposed on natural elements to comprise the environment view.
  • The scene generator transmits these elements to the scene display.
  • The scene display presents the environment view to the operator.
  • The viewing surface may encompass the field of vision regardless of the operator's head movement—i.e., the viewing surface will allow the operator to see a projected image in all relevant directions for the particular vehicle.
  • In step 330, the operator actuates vehicle control in accordance with the environment view.
  • The actuation of vehicle control will include at least one operator act—for example, applying rotational force to a steering wheel, controlling force on an accelerator, applying force to a brake pedal, applying force on one edge of a snowboard, and/or applying force on the control stick of an airplane.
  • In step 340, the vehicle responds to actuation of the vehicle control.
  • Parameter-constraining apparatus may be employed to restrict vehicle movement, to enhance operator safety or for other reasons.
  • This apparatus may act to restrain control actuation by, for example, preventing the operator from applying more than a predetermined downward force on the accelerator, or from applying more than a predetermined rotational force on the steering wheel.
  • This apparatus may alternatively (or in addition) restrict vehicle movement resulting from operation of the control by, for example, preventing the vehicle from exceeding a predetermined speed or executing an overly sharp turn.
  • The scene generator may react to the controls as operated, or to the constrained control operation or vehicle movement.
  • The actuation of vehicle control in step 330 and/or vehicle movement in step 340 will provide input to the regeneration of the environment view in step 310.
  • Sensors on one or more vehicle controls may provide input to a mathematical model of vehicle activity, which in turn provides input to the scene generator.
  • A measurement unit mounted on the vehicle may provide input to the scene generator.
  • The scene generator processes this input to update at least some elements within the environment view, and the scene display presents the environment view to the operator. The frequency of this update will vary based on the processing power of the computer within the scene generator, but may take place thousands of times per second.
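The regeneration cycle described in these steps can be sketched as a loop that reads control and measurement inputs, integrates a vehicle state, and re-renders the view; everything here (names, the dead-reckoning pose model) is an assumed simplification, not the patent's implementation:

```python
import math

def run_simulation(steps: int, dt: float, read_speed, read_yaw_rate, render):
    """Minimal regeneration loop: each tick, read the measured speed and
    yaw rate, dead-reckon the vehicle pose, and re-render the
    environment view from that pose."""
    x = y = heading = 0.0
    for _ in range(steps):
        v = read_speed()                      # measurement unit input
        heading += read_yaw_rate() * dt       # integrate rotation
        x += v * math.cos(heading) * dt       # integrate translation
        y += v * math.sin(heading) * dt
        render(x, y, heading)                 # scene display update
    return x, y, heading
```

In practice the loop rate would be tied to the display refresh and the scene generator's processing power, as the text notes.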
  • The scene generator may create elements of an environment view that do not coincide with the actual behavior of the vehicle.
  • A mechanism may supplement, detract from, or otherwise alter the force applied by the operator to actuate vehicle control and/or the vehicle response to such actuation, in order to simulate vehicle control actuation under the conditions presented in the environment view. For example, if the environment view is presented to simulate an automobile moving at 70 miles per hour, but the vehicle is actually moving at 35 miles per hour, a mechanism may translate a rotational force the operator applies to the steering wheel to a much sharper actual turn of the front axle, consistent with a velocity of 70 miles per hour.
  • A mechanism may translate the downward force applied to the brake pedal to a much weaker force, or otherwise alter the force actually applied to the brake pads, to simulate deceleration in slippery conditions.
  • Those skilled in the art will be able to offer several methods through which this may be accomplished. Regardless of the method employed, data on the operator's actuation of vehicle control will be fed to the scene generator for continual regeneration of the environment view.
  • FIGS. 4A-4D depict alternative components suitable for use in implementing the apparatus depicted in FIG. 1.
  • Components within FIGS. 4A-4D are numbered according to the corresponding component from FIG. 1 and given alphabetic suffixes corresponding to the specific figure.
  • Particular components shown in FIG. 1 may comprise more than one component shown in FIGS. 4A-4D; in these instances, identifiers are assigned in FIGS. 4A-4D so as to indicate a numeric association between the components.
  • For example, scene display 140 in FIG. 1 equates to scene display half-silvered mirror 143B and scene display projector 145B in FIG. 4B.
  • In FIG. 4A, operator 120A wears head-mounted scene display 140A.
  • This head-mounted display receives a signal from scene generator 130A.
  • The display may consist of, for example, a roughly planar surface and three-dimensional elements therein (for simulation of automobile operation, for instance), or a relatively unobstructed view of the open space before the vehicle (for an airplane, for instance).
  • The head-mounted display and scene generator are capable of presenting the vehicle operator with an environment view commensurate with head movement toward the left, right, up, or down, and commensurate with vehicle movement, since the operator remains in a relatively fixed position within the vehicle. At any one time, however, the display presents the operator with an environment view comprising the operator's field of vision given his/her head orientation.
  • The operator's environment view varies as a function of both vehicle movement or position, and of head movement or position.
  • The inertial measurement unit (IMU) 150A ascertains acceleration of both the vehicle and the operator's head, and provides this input to scene generator 130A so as to regenerate the environment view for rendering on the head-mounted scene display.
  • The scene generator maintains a realistic simulation of the operator's field of vision by accepting data on head and vehicle acceleration from the IMU, regenerating the environment view based on this data, and transmitting it to the head-mounted scene display 140A.
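Rendering commensurate with both vehicle and head movement requires composing the two orientations; a minimal planar sketch (yaw only, with assumed names, not the patent's method):

```python
import math

def view_direction(vehicle_yaw_rad: float, head_yaw_rad: float) -> float:
    """World-frame gaze direction for the head-mounted display: the
    composition of vehicle orientation and head orientation, wrapped
    to the interval (-pi, pi]."""
    yaw = vehicle_yaw_rad + head_yaw_rad
    return math.atan2(math.sin(yaw), math.cos(yaw))
```

A full system would compose 3-D rotations (e.g., quaternions) from the integrated IMU data, but the principle of summing vehicle and head orientation is the same.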
  • Those skilled in the art will be able to offer several alternatives for how the transmission of acceleration data from the IMU to the scene generator might be accomplished.
  • Suitable power sources should be provided for the scene generator, IMU, head-mounted display, and video camera, such that the risk of equipment failure and resulting operator danger due to power outage is minimized.
  • The scene generator may be secured within the vehicle, or may be a portable unit that can be worn or otherwise held by the operator while the mobile vehicle is in motion.
  • FIG. 4B depicts another illustrative embodiment of the invention, wherein operator 120B observes the environment view through half-silvered mirror 143B, which is sufficiently transparent to allow the operator to view the natural environment 180B through it, and sufficiently opaque to allow the operator to view artificial images projected by scene display projector 145B.
  • The scene generator 130B transmits a signal consisting of artificial elements to be displayed and their location in the environment view, among other data, to the scene display projector, and the scene display projector projects the image on half-silvered mirror 143B.
  • The composite image/environment view 170B viewed by an operator is a combination of natural elements from the scene ahead and superimposed artificial elements projected by the scene display projector.
  • While the half-silvered mirror is depicted in FIG. 4B as a flat, windshield-like screen, other embodiments might employ a cylindrical half-silvered mirror mounted to the vehicle structure, a cylindrical half-silvered mirror mounted to the operator's head, or other variations.
  • The measurement unit 150B provides input on the vehicle's velocity to the scene generator so that artificial elements within the environment view can be updated appropriately for presentation by the scene display projector.
  • The scene generator accepts this input to continually regenerate the environment view.
  • A light brightness equivalence mechanism 190B measures the intensity of light outside the vehicle and provides this input to the scene generator. The scene generator then adjusts the brightness of images to be superimposed by scene display projector 145B, so that composite image 170B constitutes a realistic rendering of an operating environment.
  • This aspect of the invention may be especially important for vehicle operation during periods of low sunlight, during periods of especially bright daylight, or in instances of high glare.
  • The scene generator, the scene display projector, the measurement unit, and the light brightness equivalence mechanism may be stored within or mounted upon the vehicle.
  • FIG. 4C depicts another embodiment of the invention, which is the same as FIG. 4A except that the scene generator 130 C receives input from video camera 200 C, which is mounted on the operator's head so as to be collinear with the operator's view.
  • This video signal may depict the natural environment, or it may be altered before presentation to the operator in a predetermined fashion.
  • scene generator 130 C alters the signal sent by video camera 200 C to insert artificial elements and their location into the environment view, and in some cases also makes selected variations in the natural environment. Thus, operation of the vehicle at night might be simulated during daylight hours. This altered signal is then input to head-mounted scene display 140 C.
  • a scene generator may be mounted on or within the vehicle, or may be a portable unit that can be worn or otherwise held by the operator while the vehicle is in motion.
  • the inertial measurement unit 150C affixed to the head-mounted display provides input on the acceleration of the vehicle and/or the operator's head.
  • additional measurement of head orientation or position, and/or of operator position within the vehicle may be provided by means of an electromagnetic sensor and/or mechanical linkage sensor with a potentiometer (not shown) affixed to the vehicle. This may prove useful for simulating the operation of a vehicle which requires the operator to move about within the vehicle's interior (e.g., a ship or railroad car).
  • the scene generator will combine data provided by the sensor(s) and other measurements of the vehicle's and operator's position to provide an accurate environment view to the operator.
  • FIG. 4D depicts another embodiment of the invention, which is largely the same as FIG. 4B except that video camera 200D provides input to scene generator 130D in the form of a video image collinear with the operator's view, and operator 120D views an image projected on flat screen 140D.
  • a measurement unit 150D transmits input on vehicle position, provided by means which may include a global positioning system (GPS) unit, laser triangulation within the operating area, or other position measurement techniques, to the scene generator.
  • the scene generator manipulates the signal sent by the video camera, which may depict the natural environment, to insert artificial elements and their location. This altered signal is then fed to scene display projector 140D, which projects environment view 170D on to the flat screen.

Abstract

The invention uses an actual mobile vehicle whose operation is to be simulated, combined with computer-based image generation devices, and velocity, acceleration, and/or position measurement tools to improve the simulated operation of the vehicle, including perception of and response to hazards. Using the actual vehicle whose operation is to be simulated improves operator vestibular cues, visual cues, and motion fidelity, thereby producing a safer, less expensive means to produce the simulation.

Description

    FIELD OF THE INVENTION
  • This invention relates to vehicle simulation, and more particularly to methods and apparatus for simulating the human operation of a moving vehicle under selected operating conditions. [0001]
  • BACKGROUND OF THE INVENTION
  • Since computer technology first made them possible, vehicle simulators have been used for a number of purposes, including research, training, and vehicle engineering. Simulators have become increasingly prevalent and useful for reproducing the experience of operating aircraft, motor vehicles, trains, spacecraft, and other vehicles. Aviation simulators have become particularly prevalent, with nearly every airline now using simulators for training and research—the first time a commercial pilot flies a new aircraft, it is often filled with passengers. The military services use simulators extensively for training personnel in the operation of ships, tanks, aircraft, and other vehicles. [0002]
  • Despite their varied usage, simulators, especially those providing the operator(s) with motion cues, have primarily remained the tools of large organizations with significant financial resources. The use of motor vehicle (i.e., driving) simulators has not yet progressed beyond manufacturers, suppliers, government agencies (including the military), and academic institutions, largely because of their cost. However, driving simulators have a number of valuable safety applications. First, they enable research into driver response behavior. Highways are becoming populated by more vehicles, moving at greater speeds, with a greater portion of drivers composed of older adults with reduced sensory and response capabilities. Cellular telephones, navigation systems, and other devices (often developed by third parties and added to vehicles without suitable integration with manufacturer-supplied devices) place increased demands on the driver's attention, and drugs continually arrive on the market that affect driver alertness. These factors mandate a better understanding of driver limitations, particularly those of older drivers. Researching driver behavior in emergencies by examination of real accidents has limited yield, because every accident is unique to some extent, determining causation is difficult, and controlled experimental research is inherently not possible for real accidents. Driving simulators could provide data on the driver's response to emergency situations without exposure to actual risk. [0003]
  • Second, simulators can provide an improved means for training and evaluating drivers. Most driver training is conducted either in classrooms or in automobiles in normal traffic, which rarely exposes trainees to unexpected hazards. Devices that would allow trainees to experience potential collision situations, visibility hazards, or other unusual driving situations, without actual exposure to risk would provide useful training. [0004]
  • Third, simulators provide manufacturers and suppliers useful data from which to further develop their products. Vehicle manufacturers, suppliers and car/truck fleet owners usually perform developmental tests in actual vehicles, but this is limited to experiences not involving collision or other hazards. The use of simulators to perform these functions is costly, particularly for programming and measuring motion and configuring the simulator to represent the appropriate vehicle, limiting the usefulness of these simulators for most research applications. [0005]
  • Simulators are primarily tasked with recreating the sensory cues their “real-world experience” counterparts offer. Most state of the art simulators do a credible job recreating visual and audible stimuli, but only the most expensive provide credible cues for the vestibular senses (controlled by semicircular canals in the inner ear, which sense rotary acceleration, and otolith organs, which sense translational acceleration) and the muscle and joint sensors of motion. Motor vehicle simulators, in particular, struggle to provide a faithful motion representation without exorbitant cost. Because of the cost of providing such functionality, relatively few driving simulators even attempt to provide motion cues, despite the fact that tests reveal subjects tend to over-steer or experience vertigo because of a mismatch between visual and motion cues. Those that provide motion cues usually do so with an expensive hydraulic simulator base, but very few motion-base driving simulators are in use, and even these lack the ability to accurately convey translational acceleration. Some state of the art simulators promise to rectify this problem, but this capability typically entails a significant cost. Clearly, simulators that represent motion cues faithfully are not cost-effective research or training tools for most applications. [0006]
  • The same financial barriers that prevent more widespread use of driving simulators have also prevented the development of simulators for the operation of wheelchairs, skis, snowboards, and many other vehicles which move, sometimes at high speeds, when in actual operation. Clearly the availability of more cost-effective simulators could enable better research, training, and engineering, resulting in safer and more user-friendly vehicles, both those indicated above and others, and in better, safer operation thereof. [0007]
  • SUMMARY OF THE INVENTION
  • The present invention overcomes the cost and motion fidelity issues of other vehicle operation simulators with a novel combination of existing devices. In order to generate realistic motion cues, the operator is carried by and operates an actual vehicle. Vehicle examples include but are not limited to an automobile, motorcycle, aircraft, wheelchair, bicycle, skis, and a ship. According to one aspect of the invention, the vehicle is operated in a “natural environment” using its normal vehicle control, and in accordance with visual and audible cues provided by a virtual reality device. The natural environment may be an open space (e.g., a large field or a parking lot), a track, an unused or seldom used roadway, snow-covered mountain slope, air space, or other environment appropriate for the mobile vehicle being used. The virtual reality device takes advantage of recent advances in computer processing and image generation speeds to create a realistic vehicle operation environment, including hazards which are not present in the actual/natural environment. Thus the invention provides realistic motion cues at reasonable cost, thereby creating training, research, and product development opportunities not previously possible. [0008]
  • In its most preferred embodiment, the present invention is operated in a large, open area free of obstacles and hazards. Since its intent is to provide a realistic vehicle operation experience, these areas provide the greatest opportunity to simulate all types of uninterrupted operation experiences. For example, if the invention were to simulate the operation of an automobile, it would be difficult to simulate driving on a highway for any useful period if the invention were used in an urban area, or even in a small field. However, for certain uses, it is envisioned that the invention may be operated on certain less-trafficked roads or streets. [0009]
  • The present invention provides both apparatus for a vehicle operation simulator and a method for simulating vehicle operation. The apparatus includes a mobile vehicle having vehicle controls, a scene generator, and a scene display that receives input from the scene generator and presents at least a partial virtual environment view to the operator of the vehicle. A method for use includes the scene generator creating at least one element within an environment view, transmitting an electronic signal comprising the at least one element within the environment view to the scene display, the scene display presenting the environment view to the vehicle operator, and, based on resulting operator actuation of vehicle control or vehicle movement, regenerating the environment view to provide continually updated cues to the operator, including visual and/or audible cues. [0010]
  • The scene display may present the operator an environment view consisting of artificial elements (produced by the scene generator) or a combination of artificial and natural elements. The components of the scene display may, for example, include a head-mounted unit, a projector, and/or a projection screen which is either partially transparent (e.g., half-silvered) or opaque (e.g., a flat screen). Depending on the equipment used, the environment view may consist of a viewable image presented within the head-mounted unit worn by the operator, an image projected onto flat or curved screens shaped like the windshield and/or side and rear windows, or images projected onto a semi-transparent screen so as to superimpose artificial elements on the operator's view of the surrounding natural environment. [0011]
  • The scene generator transmits an electronic signal to the scene display comprising at least one element within the environment view, which includes the location of natural and artificial images within the display. The scene generator continually regenerates the environment view for display to the vehicle operator via the scene display. The scene generator may alter artificial images within the environment view in response to vehicle movement, operator actuation of vehicle controls, and predetermined artificial image movement. Components of the scene generator may include a general-purpose programmed computer, and a means for transmitting a signal to the scene display. [0012]
  • The environment view may be presented to the vehicle operator to suggest behavior—for example, a velocity—different from the actual behavior exhibited by the vehicle. In these embodiments, a mechanism may be employed to allow the vehicle to respond to control actuation as though the vehicle were behaving as shown in the environment view. For example, the environment view might be presented to the operator of an automobile as though it were traveling at 70 miles per hour, when the vehicle actually is traveling at only 35 miles per hour. This mechanism might alter the operator's actuation of the steering wheel, for example, to cause a much sharper turn than under normal operation at 35 miles per hour, or at least to provide a simulated view of such a sharper turn. [0013]
  • The scene generator may take one or more forms of vehicle and operator movement and/or position data as input. These may include acceleration data from an accelerometer or gyroscopic inertial measurement unit, velocity data from a unit which measures either translational or rotational velocity in up to six degrees of freedom, or position data from a positional measurement unit. [0014]
  • A mechanism may be used to maintain equivalent light brightness between a natural environment seen outside the vehicle and an image projected on such natural environment. For example, embodiments in which images are projected on to a semi-transparent screen so as to superimpose artificial elements on a natural environment will need to calibrate the brightness of such images to the brightness of the natural environment in order to present a realistic operating environment. This mechanism may include a component mounted on the exterior of the vehicle to take light brightness measurements of the natural environment. This mechanism provides these measurements to the scene generator, continually incorporating any shifting brightness of the environment into the generation of artificial images so that they remain realistic under changing conditions. [0015]
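The brightness-equivalence step described above can be sketched as a mapping from an ambient-light reading to a projector brightness gain. This is an illustrative sketch only; the sensor units (lux), the range limits, and the log-scale interpolation are assumptions, not details from the specification:

```python
import math

def projector_gain(ambient_lux, min_lux=50.0, max_lux=50000.0):
    """Map an ambient-light reading (in lux) to a projector gain in [0, 1].

    Brighter surroundings demand brighter superimposed images, so the gain
    rises with ambient light. Perceived brightness is roughly logarithmic
    in intensity, so the interpolation is done on a log scale rather than
    linearly.
    """
    # Clamp the reading to the sensor's assumed useful range.
    lux = max(min_lux, min(ambient_lux, max_lux))
    return (math.log(lux) - math.log(min_lux)) / (math.log(max_lux) - math.log(min_lux))
```

In a running system, this gain would be recomputed continually and applied to the artificial elements before projection, so that superimposed images track changing outdoor light.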
  • The vehicle may employ secondary vehicle controls to enhance operator safety, such that the vehicle responds exclusively to the secondary controls, or to both controls when the secondary controls are actuated. This secondary vehicle control might be used in instances where only a secondary operator can see the actual movement of the vehicle in relation to the natural environment, and might, in an automobile for example, include conventional dual controls used in driver training vehicles. [0016]
  • The vehicle may employ parameter-constraining apparatus that act to restrict the movement of the vehicle or the actuation of vehicle control. In one example, on a wheelchair this apparatus might restrict movement to prevent it from exceeding certain speeds. In another example, on an airplane this apparatus might restrain control actuation to prevent a roll so sharp it would disorient the operator. [0017]
  • Further features and advantages of the present invention, as well as the structure and operation of various embodiments of the present invention, are described in detail below with reference to the accompanying drawings. In these drawings, the same or equivalent reference numerals are used to represent the same element in the various figures.[0018]
  • IN THE DRAWINGS
  • FIG. 1 is a block diagram of an illustrative embodiment of the invention. [0019]
  • FIG. 2 is a view from the operator's perspective of an illustrative embodiment. [0020]
  • FIG. 3 is a flowchart illustrating operation of the invention. [0021]
  • FIGS. 4A, 4B, [0022] 4C, and 4D are semi-block diagrams illustrating four different ways in which the invention may be implemented.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, the vehicle operation simulator for an illustrative embodiment includes a [0023] mobile vehicle 100, which may be any vehicle that moves with at least one degree of freedom, for which movement represents an ordinary feature of operation, and which includes at least one component for regulation or control of said movement. Examples include, but are not limited to, an automobile, aircraft, ship, truck, railroad train, motorcycle, wheelchair, bicycle, snowboard, roller skates, and skis. Mobile vehicle 100 may be operated in a natural environment; for example, in an open space appropriate to the mobile vehicle. This open space should be large and preferably free of other vehicles, potential hazards, and pedestrians. However, it is also contemplated that the invention be practiced on unused or seldom-used tracks, streets, air space, snow slopes, roadway, or other environment on which the particular vehicle 100 might normally be operated.
  • A [0024] scene generator 130 is provided which generates an electronic signal and transmits it to scene display 140, which presents an environment view 170 to human operator 120. Typically, the scene generator includes a programmed general-purpose computer to generate images and sounds associated with a virtual environment which may include obstacles and hazards, including, but not limited to, other vehicles, animals, people, or fixed objects. These computer-generated elements typically exhibit behavior and characteristics of their real-world counterparts. For example, a computer-generated image of a person might be animated so as to appear to cross the street in front of the mobile vehicle 100. It should be noted that other equipment might also provide the functionality of the scene generator 130 including, for example, an array of projectable photographic or video images.
  • In some embodiments, the environment view is completely artificial. In one example, the environment view may include a computer-generated artificial background, at least one computer-generated artificial element, and have no elements taken from the natural environment surrounding the mobile vehicle. In other embodiments, the environment view may be comprised of a composite of natural elements and artificial elements. In one example, computer-generated artificial elements might be superimposed on a display screen that also allows the view of a natural environment to pass through. In another example, the scene generator would superimpose computer-generated artificial elements against a backdrop of a color video signal of the actual natural environment, or of a selectively modified natural environment. For example, a simulation conducted during the day may be modified to simulate night driving. The scene generator may also receive input on the state of the natural environment from, for example, vehicle-mounted cameras, and use this input in generating at least one element within an environment view that is related in predetermined ways to the actual environment. [0025]
  • The [0026] scene display 140 may take many forms. In some embodiments, the scene display is a head-mounted display that presents the environment view in the field of vision of the operator, and allows for a realistic field of vision no matter how the operator's head is oriented or where in his field of vision the operator's eyes are focused. In other embodiments, the scene display includes an electronic display and/or a projection unit affixed to the vehicle. Either the head-mounted or fixed display may include a half-silvered mirror, allowing the items projected on to the half-silvered mirror and the natural environment 180 behind it to comprise the environment view. In other embodiments, the scene display includes a projection unit and a flat screen, constructing an environment view 170 consisting entirely of projected elements. The environment view may consist of images projected on a single surface or, where appropriate, multiple surfaces. For example, the simulation of the operation of a helicopter might require the use of multiple display surfaces to present the operator with simulated views above the aircraft and on either side, as well as in front.
  • In some embodiments the operator's actuation of [0027] vehicle control 110 is input to a computerized mathematical model 135, which may run on the same computer as the scene generator. This mathematical model may then provide input to scene generator 130, causing the scene generator to alter the environment view presented on the scene display as appropriate to compensate for vehicle orientation and/or position, and the operating environment to be simulated.
  • Data on vehicle activity may also be provided to the scene generator via the [0028] measurement unit 150. This unit may measure the velocity of the vehicle (by measuring, for example, an automobile's wheel rotation and angle), measure its translational or rotational acceleration (for example, with an accelerometer, inertial acceleration measurement unit, or gyroscopic sensors), or measure changes in its position (using, for example, a global positioning system device, or laser triangulation in the operating area). Regardless of the measurement device used, however, this velocity, acceleration, or position data will encompass up to six degrees of freedom including translation and rotation. In one example, the measurement unit might discern a ship's velocity by combining measurements of water flow past a predetermined point on the ship's hull with measurements of the rudder angle over time. In another example, the measurement unit might discern an automobile's acceleration or deceleration relative to the ground and/or gyroscopic changes in its heading over time. (The use of inertial, position, and velocity measurement units will be well-known by those skilled in the art.) Data from either of these measurement units may supplement or replace input from vehicle control 110 to a mathematical model and/or the scene generator. The scene generator may then alter the environment view as appropriate given the mobile vehicle's movement (i.e., changes in angle or position relative to the earth) using conventional computer graphic transformations of image geometry.
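The "conventional computer graphic transformations of image geometry" mentioned above reduce, in the planar case, to a rigid-body change of frame: translate a world-frame point by the vehicle's measured position, then rotate by its heading. A minimal two-dimensional sketch, with illustrative names and the convention that the vehicle's forward axis is +x in its own frame:

```python
import math

def world_to_vehicle(point_xy, vehicle_xy, heading_rad):
    """Transform a world-frame point into the vehicle's frame.

    point_xy   -- (x, y) of a scene element (e.g., an artificial hazard)
    vehicle_xy -- (x, y) of the vehicle from the measurement unit
    heading_rad -- vehicle heading, measured from the world +x axis
    """
    # Translate so the vehicle sits at the origin...
    dx = point_xy[0] - vehicle_xy[0]
    dy = point_xy[1] - vehicle_xy[1]
    # ...then rotate by the negative heading so the vehicle faces +x.
    c, s = math.cos(-heading_rad), math.sin(-heading_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

A full scene generator would extend this to six degrees of freedom and follow it with a perspective projection onto the display surface, but the per-frame update is the same idea applied to every artificial element.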
  • [0029] Operator 120 actuates vehicle control 110 to control mobile vehicle 100, triggering cues to the operator's motion sense organs. Some embodiments may employ additional features to ensure the safety of the operator. For example, air bags and lap belts may be used to secure the operator in place during operation. Either vehicle control 110, or the motion of mobile vehicle 100, may be constrained by parameter-constraining apparatus 160. The parameter-constraining apparatus may comprise a computer system designed to assume control of the vehicle under certain hazardous conditions, a governor mechanism designed to limit vehicle velocity, or a mechanism limiting turn radius, angle of descent and/or other motion parameters. This apparatus may restrain motion either totally or in a manner dependent on vehicle operating conditions. The constraints may limit actuation of vehicle controls, but preferably limit the response of the vehicle to the controls.
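In its simplest software form, the parameter-constraining apparatus described above amounts to clamping command values before they reach the drive hardware. The class and limit names below are hypothetical, chosen only to illustrate the idea:

```python
def constrain(command, limit):
    """Clamp a signed control command to the range [-limit, +limit]."""
    return max(-limit, min(command, limit))

class ParameterConstraints:
    """Illustrative parameter-constraining apparatus: limits the speed and
    steering-angle commands that the vehicle will act on, regardless of how
    far the operator actuates the controls."""

    def __init__(self, max_speed_mps, max_steer_rad):
        self.max_speed_mps = max_speed_mps
        self.max_steer_rad = max_steer_rad

    def apply(self, speed_cmd, steer_cmd):
        # Constrain the vehicle's response rather than the control itself,
        # as the text above says is preferable.
        return (constrain(speed_cmd, self.max_speed_mps),
                constrain(steer_cmd, self.max_steer_rad))
```

A governor or turn-radius limiter built into the vehicle hardware would serve the same purpose; the software clamp is just the most compact way to express the constraint.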
  • Depending on the embodiment, [0030] scene generator 130 may also take input from light brightness measurement unit 190 and video camera 200. A light brightness measurement unit may provide data enabling the scene generator to maintain consistent brightness between the natural environment and any artificial elements that are superimposed. Therefore, this unit may be mounted or otherwise affixed to the vehicle so as to enable measuring the light brightness of the environment view as seen by the operator, as will be appreciated by those skilled in the art.
  • One or more video cameras may provide one or more video signals depicting the natural environment, for use when the natural environment is not otherwise visible to the operator. Therefore the video camera(s) may also be mounted or otherwise positioned on the vehicle's exterior or on the operator's head so as to capture the visible elements of the natural environment from a perspective collinear with the operator's field of vision; methods for appropriate capture of the natural environment using video camera apparatus will also be well-known by those skilled in the art. While the camera(s) may provide a video image directly to [0031] scene display 140, it is preferable that camera output be provided, as shown, to scene generator 130, where it may be used to reproduce either the actual—or a modified version of—the natural environment.
  • FIG. 2 depicts the interior of [0032] mobile vehicle 100 which, for the illustrative embodiment, is automobile 200 with controls 210 including a steering wheel, an accelerator, a brake, and other suitable controls such as a gear shift, clutch, de-fogger, etc. (controls not shown). Scene generator 130 may be a programmed general-purpose computer stored within automobile 200. A half-silvered mirror 220, integrated with or separate from the vehicle's windshield, or attached to the head-mounted display, receives either projected images from a projector (not shown) situated within automobile 200 (in the case of a screen display), or a signal from the scene generator (in the case of the head-mounted display). Either the image projector or the head-mounted unit, combined with the half-silvered mirror 220, forms scene display 140. Obstacles 240 are placed in the environment view such that they appear superimposed on the natural environment also viewable through the half-silvered mirror 220.
  • Some embodiments may also include a secondary vehicle control [0033] 230 to promote the safe operation of the automobile. A secondary vehicle operator, who monitors the operator's actions and corrects or overrides vehicle control actuation that would result in danger or injury, operates secondary vehicle control 230. The secondary operator may experience the same environment view as the operator, may experience only the natural environment, may experience both environments (for example, on a split screen view), or may experience some other view appropriate to maximize safe operation of the vehicle.
  • FIG. 3 is a flow diagram of a method for simulating vehicle operation utilizing the apparatus of FIG. 1. In [0034] step 310 an environment view is created, which may consist of artificial elements designed to wholly comprise the environment view, or artificial elements intended to be superimposed on natural elements to comprise the environment view. The scene generator transmits these elements to the scene display. In step 320, the scene display presents the environment view to the operator. In some embodiments, if scene display is accomplished via projection on a viewing surface, the viewing surface may encompass the field of vision regardless of the operator's head movement—i.e., the viewing surface will allow the operator to see a projected image in all relevant directions for the particular vehicle.
  • In [0035] step 330, the operator actuates vehicle control in accordance with the environment view. The actuation of vehicle control will include at least one operator act—for example, applying rotational force to a steering wheel, controlling force on an accelerator, applying force to a brake pedal, applying force on one edge of a snowboard, and/or applying force on the control stick of an airplane.
  • As shown in [0036] step 340, the vehicle responds to actuation of the vehicle control. In some instances, parameter-restraining apparatus may be employed to restrict vehicle movement, to enhance operator safety or for other reasons. This apparatus may act to restrain control actuation by, for example, preventing the operator from applying more than a predetermined downward force on the accelerator, or from applying more than a predetermined rotational force on the steering wheel. This apparatus may alternatively (or in addition) restrict vehicle movement resulting from operation of the control by, for example, preventing the vehicle from exceeding a predetermined speed or executing an overly sharp turn. The scene generator may react to the controls as operated, or to the constrained control operation or vehicle movement.
  • The actuation of vehicle control in [0037] step 330 and/or vehicle movement in step 340 will provide input to the regeneration of the environment view in step 310. If the environment view responds to control actuation, sensors on one or more vehicle controls may provide input to a mathematical model of vehicle activity, which in turn provides input to the scene generator. If the environment view responds to vehicle movement, a measurement unit mounted on the vehicle may provide input to the scene generator. In either case, the scene generator processes this input to update at least some elements within the environment view, and the scene display presents the environment view to the operator. The frequency of this update will vary based on the processing power of the computer within the scene generator, but may take place thousands of times per second.
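The cycle through steps 310–340 can be sketched as a loop. The callables below are hypothetical placeholders standing in for the vehicle controls, the vehicle model (or measurement unit), the scene generator, and the scene display:

```python
def run_simulation(read_controls, vehicle_model, regenerate_view, present, steps):
    """Drive the FIG. 3 cycle for a fixed number of iterations.

    read_controls   -- callable returning the current control actuation
    vehicle_model   -- callable mapping an actuation to an updated state
    regenerate_view -- callable mapping a state to a new environment view
    present         -- callable that shows a view to the operator
    """
    view = None
    for _ in range(steps):
        actuation = read_controls()          # step 330: operator input
        state = vehicle_model(actuation)     # step 340: vehicle response
        view = regenerate_view(state)        # step 310: regenerate the view
        present(view)                        # step 320: show the operator
    return view
```

In a real implementation the loop runs continuously at the display's refresh rate, and `vehicle_model` may be replaced or supplemented by direct measurement of vehicle movement, as the text above notes.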
  • In some embodiments, the scene generator may create elements of an environment view that do not coincide with the actual behavior of the vehicle. In these embodiments, a mechanism may supplement, detract from, or otherwise alter the force applied by the operator to actuate vehicle control and/or the vehicle response to such actuation, in order to simulate vehicle control actuation under the conditions presented in the environment view. For example, if the environment view is presented to simulate an automobile moving at 70 miles per hour, but the vehicle is actually moving at 35 miles per hour, a mechanism may translate a rotational force the operator applies to the steering wheel to a much sharper actual turn of the front axle, consistent with a velocity of 70 miles per hour. Also, for example, if the environment view is presented to simulate an automobile traveling in the snow, a mechanism may translate the downward force applied to the brake pedal to a much weaker force, or otherwise alter the force, actually applied to the brake pads to simulate deceleration in slippery conditions. Those skilled in the art will be able to offer several methods through which this may be accomplished. Regardless of the method employed, data on the operator's actuation of vehicle control will be fed to the scene generator for continual regeneration of the environment view. [0038]
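One possible form of the speed-mismatch mechanism in the 70/35 miles-per-hour example is a steering-gain adjustment. The sketch below scales the operator's steering input so that the vehicle's yaw rate at its actual speed matches the yaw rate the environment view implies at the simulated speed; matching lateral acceleration instead would use the square of the speed ratio. Which motion cue to match is a design choice the specification leaves open, and the function name is illustrative:

```python
def altered_steer_angle(operator_angle_rad, actual_speed, simulated_speed):
    """Scale the operator's steering angle for a speed mismatch.

    For a simple kinematic model, path curvature is proportional to steering
    angle, and yaw rate is speed times curvature. Driving at half the
    simulated speed therefore needs twice the steering angle to reproduce
    the yaw rate the visual scene implies.
    """
    return operator_angle_rad * (simulated_speed / actual_speed)
```

The snow-braking example in the same paragraph would apply the inverse idea to the brake: attenuate the force actually delivered to the brake pads so that deceleration matches the slippery conditions shown in the environment view.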
  • FIGS. [0039] 4A-4D depict alternative components suitable for use in implementing the apparatus depicted in FIG. 1. Components within FIGS. 4A-4D are numbered according to the corresponding component from FIG. 1 and given alphabetic suffixes corresponding to the specific figure. In some instances, particular components shown in FIG. 1 comprise more than one component shown in FIGS. 4A-4D; in these instances identifiers are assigned in FIGS. 4A-4D so as to indicate a numeric association between the components. For example, scene display 140 in FIG. 1 equates to scene display half-silvered mirror 143B and scene display projector 145B in FIG. 4B.
  • Referring to FIG. 4A, operator [0040] 120A wears head-mounted scene display 140A. This head-mounted display receives a signal from scene generator 130A. Depending on the vehicle whose operation is to be simulated, the display may consist of, for example, a roughly planar surface and three-dimensional elements therein (for simulation of automobile operation, for instance), or a relatively unobstructed view of the open space before the vehicle (for an airplane, for instance). The head-mounted display and scene generator are capable of presenting the vehicle operator with an environment view commensurate with head movement toward the left, right, up, or down, and commensurate with vehicle movement, since the operator remains in a relatively fixed position within the vehicle. At any one time, however, it presents the operator with an environment view comprised of the operator's field of vision given his/her head orientation. Thus, the operator's environment view varies as a function of both vehicle movement or position, and of head movement or position.
  • The inertial measurement unit (IMU) [0041] 150A ascertains acceleration of both the vehicle and the operator's head, and provides this input to scene generator 130A so as to regenerate the environment view for rendering on the head-mounted scene display. The scene generator maintains a realistic simulation of the operator's field of vision by accepting data on head and vehicle acceleration from the IMU, regenerating the environment view based on this data, and transmitting it to the head-mounted scene display 140A. Those skilled in the art will be able to offer several alternatives for how the transmission of acceleration data from the IMU to the scene generator might be accomplished. Those skilled in the art will also be able to offer suitable power sources for the scene generator, IMU, head display, and video camera, such that the risk of equipment failure and resulting operator danger due to power outage is minimized.
  • Depending on the vehicle used, and other factors, the scene generator may be secured within the vehicle, or may be a portable unit that can be worn or otherwise held by the operator while the mobile vehicle is in motion. [0042]
  • FIG. 4B depicts another illustrative embodiment of the invention, wherein operator [0043] 120B observes the environment view through half-silvered mirror 143B, which is sufficiently transparent to allow the operator to view the natural environment 180B through it, and sufficiently opaque to allow the operator to view artificial images projected by scene display projector 145B. The scene generator 130B transmits a signal consisting of artificial elements to be displayed and their location in the environment view, among other data, to the scene display projector, and the scene display projector projects the image on half-silvered mirror 143B. Thus the composite image/environment view 170B viewed by an operator is a combination of natural elements from the scene ahead and superimposed artificial elements projected by the scene display projector.
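  • To superimpose an artificial element at the correct spot on the half-silvered mirror, the scene generator must map the element's position in the environment into screen coordinates. The toy pinhole-projection sketch below is an illustrative stand-in only; the optics of an actual mirror-and-projector arrangement would differ, and all names are hypothetical.

```python
def project_to_screen(point, focal_len=1.0):
    """Project a 3-D point (x right, y up, z forward, in metres) expressed in
    the operator's eye frame onto a screen plane at distance focal_len.
    Returns screen coordinates (u, v), or None if the point is behind the
    operator and therefore not visible."""
    x, y, z = point
    if z <= 0:
        return None
    return (focal_len * x / z, focal_len * y / z)

# An artificial obstacle 10 m ahead and 2 m to the right of the operator:
uv = project_to_screen((2.0, 0.0, 10.0))  # -> (0.2, 0.0)
```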
  • Although the half-silvered mirror is depicted in FIG. 4B as a flat, windshield-like screen, other embodiments might employ a cylindrical half-silvered mirror mounted to the vehicle structure, a cylindrical half-silvered mirror mounted to the operator's head, or other variations. [0044]
  • The measurement unit [0045] 150B provides input on the vehicle's velocity to the scene generator so that artificial elements within the environment view can be updated appropriately for presentation by the scene display projector. The scene generator accepts this velocity input to continually regenerate the environment view.
  • A light brightness equivalence mechanism [0046] 190B measures the intensity of light outside the vehicle and provides this input to the scene generator. The scene generator then adjusts the brightness of images to be superimposed by scene display projector 145B, so that composite image 170B constitutes a realistic rendering of an operating environment. This aspect of the invention may be especially important for vehicle operation during periods of low sunlight, during periods of especially bright daylight, or in instances of high glare.
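  • One simple way such a brightness-equivalence adjustment might be realized is a linear gain on the superimposed image, driven by the measured ambient light level. The sketch below is an assumption-laden stand-in for a real photometric model; the reference level, the linear response, and every name in it are hypothetical.

```python
def match_brightness(pixel_values, ambient_lux, reference_lux=500.0):
    """Scale superimposed-image pixel intensities (0-255) by the ratio of the
    measured outside light level to a reference level, clipping the result to
    the valid range so projected elements neither wash out nor overpower the
    real scene behind the half-silvered mirror."""
    gain = ambient_lux / reference_lux
    return [min(255, max(0, round(p * gain))) for p in pixel_values]

# Dim the overlay at dusk (100 lux) so it blends with the darker real scene:
dimmed = match_brightness([200, 128, 0], 100.0)  # -> [40, 26, 0]
```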
  • The scene generator, the scene display projector, the measurement unit and the light brightness equivalence mechanism may be stored within or mounted upon the vehicle. [0047]
  • FIG. 4C depicts another embodiment of the invention, which is the same as FIG. 4A except that the scene generator [0048] 130C receives input from video camera 200C, which is mounted on the operator's head so as to be collinear with the operator's view. This video signal may depict the natural environment, or it may be altered before presentation to the operator in a predetermined fashion. In an illustrative embodiment, scene generator 130C alters the signal sent by video camera 200C to insert artificial elements and their location into the environment view, and in some cases also makes selected variations in the natural environment. Thus, operation of the vehicle at night might be simulated during daylight hours. This altered signal is then input to head-mounted scene display 140C. A scene generator may be mounted on or within the vehicle, or may be a portable unit that can be worn or otherwise held by the operator while the vehicle is in motion.
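  • The night-during-daylight alteration described above can be illustrated as darkening each camera pixel and then stamping artificial elements over the result. This toy grayscale sketch is a hypothetical illustration, not the patent's implementation; a real system would operate on full-rate color video.

```python
def simulate_night(frame, darkness=0.25, overlays=()):
    """Darken a grayscale frame (a list of rows of 0-255 ints) by a constant
    factor, then stamp artificial elements - (row, col, intensity) triples -
    on top, as a toy stand-in for the scene generator's alterations."""
    out = [[round(p * darkness) for p in row] for row in frame]
    for r, c, value in overlays:
        out[r][c] = value
    return out

# A 2x2 daylight frame, with one bright artificial light inserted at (0, 1):
night = simulate_night([[200, 200], [100, 100]], overlays=[(0, 1, 255)])
```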
  • The inertial measurement unit [0049] 150C affixed to the head-mounted display provides input on the acceleration of the vehicle and/or the operator's head. When the head-mounted display is used, additional measurement of head orientation or position, and/or of operator position within the vehicle, may be provided by means of an electromagnetic sensor and/or mechanical linkage sensor with a potentiometer (not shown) affixed to the vehicle. This may prove useful for simulating the operation of a vehicle which requires the operator to move about within the vehicle's interior (e.g., a ship or railroad car). The scene generator will combine data provided by the sensor(s) and other measurements of the vehicle's and operator's position to provide an accurate environment view to the operator.
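  • Combining the drift-prone IMU estimate with an absolute head-orientation sensor is a classic sensor-fusion problem; a complementary filter is one conventional way to do it. The one-step sketch below is illustrative only (hypothetical names and weighting), not the fusion method specified by the patent.

```python
def fuse_head_yaw(gyro_yaw, sensor_yaw, alpha=0.98):
    """Blend a fast-but-drifting yaw estimate integrated from the IMU with a
    slow-but-absolute yaw from a vehicle-mounted sensor (electromagnetic or
    potentiometer linkage), using a complementary filter. alpha weights the
    IMU estimate; (1 - alpha) pulls the result back toward the absolute
    reading, suppressing accumulated drift."""
    return alpha * gyro_yaw + (1.0 - alpha) * sensor_yaw

# IMU integration has drifted to 31.0 deg; the absolute sensor reads 30.0 deg:
yaw = fuse_head_yaw(31.0, 30.0)
```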
  • FIG. 4D depicts another embodiment of the invention, which is largely the same as FIG. 4B except that video camera [0050] 200D provides input to scene generator 130D in the form of a video image collinear with the operator's view, and operator 120D views an image projected on flat screen 140D. A measurement unit 150D transmits input on vehicle position, provided by means which may include a global positioning system (GPS) unit, laser triangulation within the operating area, or other position measurement techniques, to the scene generator. In a preferred embodiment, the scene generator manipulates the signal sent by the video camera, which may depict the natural environment, to insert artificial elements and their location. This altered signal is then fed to the scene display projector, which projects environment view 170D onto the flat screen.
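  • Of the position-measurement techniques mentioned, triangulation against known landmarks can be illustrated by intersecting the bearing rays from two beacons at known positions. The 2-D sketch below makes simplifying assumptions (planar geometry, non-parallel bearings, hypothetical names) and is not the patent's specified implementation.

```python
import math

def triangulate(b1, theta1, b2, theta2):
    """Locate the vehicle in 2-D from the bearings (radians, measured from
    the x-axis) at which two beacons of known position see it, by
    intersecting the two bearing rays. Assumes the bearings are not
    parallel. Returns the vehicle position (x, y)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve b1 + t*d1 == b2 + s*d2 for t via Cramer's rule on the 2x2 system.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = b2[0] - b1[0], b2[1] - b1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (b1[0] + t * d1[0], b1[1] + t * d1[1])

# Beacons at (0, 0) and (10, 0) see the vehicle at 45 deg and 135 deg:
pos = triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135))
```

With symmetric 45/135-degree bearings the rays meet at (5, 5), midway between the beacons and equally far from both.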
  • While in the discussion above, head and vehicle movement and position have been measured to control the scene generator, in some applications other movements may be monitored, as appropriate, to provide a realistic simulation of vehicle operation. Thus, while the invention has been particularly shown and described with reference to specific embodiments, and variations thereon have been indicated, it should be understood by those skilled in the art that various additional changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the following claims.[0051]

Claims (32)

What is claimed is:
1. A vehicle operation simulator comprising:
a mobile vehicle, operable in a natural environment, having at least one vehicle control;
a scene generator;
a scene display in communication with said scene generator and viewable by a vehicle operator,
an environment view being presented on said scene display which is created at least in part by said scene generator; and
wherein said mobile vehicle carries the operator and is controlled by the operator in accordance with said environment view, said mobile vehicle responding to actuation of said at least one vehicle control and said environment view responding to at least one of operation of said at least one vehicle control, operator movement, and vehicle movement.
2. The vehicle operation simulator of claim 1 including at least one of an inertial acceleration measurement unit, a gyroscopic measurement unit, and a pendulum, responding to motion of said mobile vehicle in up to six degrees of freedom to provide input to said scene generator.
3. The vehicle operation simulator of claim 1 including at least one of an inertial acceleration measurement unit, a gyroscopic measurement unit, and a pendulum, responding to motion of said operator's head in up to six degrees of freedom to provide input to said scene generator.
4. The vehicle operation simulator of claim 1 including a measurement unit responding to a velocity of said mobile vehicle in up to six degrees of freedom to provide input to said scene generator.
5. The vehicle operation simulator of claim 1 including at least one of a global positioning system unit and a laser triangulation unit, responding to changes in position of said mobile vehicle in up to six degrees of freedom to provide input to said scene generator.
6. The vehicle operation simulator of claim 5 including at least one of an electromagnetic arm, a mechanical arm with potentiometer, and magnetic sensors, responding to changes in position of said operator's head in relation to said mobile vehicle in up to six degrees of freedom to provide input to said scene generator.
7. The vehicle operation simulator of claim 1 including a computer-based mathematical model of activity of said vehicle, responding to at least one of said vehicle control and movement of said vehicle, to provide data on at least one of position of said vehicle in up to six degrees of freedom as input to said scene generator.
8. The vehicle operation simulator of claim 1 wherein said environment view is wholly comprised of elements from said scene generator.
9. The vehicle operation simulator of claim 8 wherein said scene display is affixed to at least one of said mobile vehicle and a head-mounted display worn by the vehicle operator.
10. The vehicle operation simulator of claim 1 wherein said environment view is a composite of at least one element from said scene generator, and at least one element from the natural environment.
11. The vehicle operation simulator of claim 10 wherein said at least one element from the natural environment is captured with a video camera and input to said scene generator.
12. The vehicle operation simulator of claim 10 wherein said at least one element from the natural environment is visible to said operator through a partially transparent viewing screen.
13. The vehicle operation simulator of claim 12 wherein the viewing screen is affixed to at least one of the mobile vehicle and a head-mounted display worn by the vehicle operator.
14. The vehicle operation simulator of claim 10 wherein said scene generator includes a mechanism maintaining equivalent light brightness between at least one element from said scene display and the natural environment.
15. The vehicle operation simulator of claim 1 wherein the at least one element presented in said environment view differs in a controlled fashion from the actual behavior of said mobile vehicle.
16. The vehicle operation simulator of claim 15 wherein the vehicle responds to operator actuation of vehicle control in accordance with movement represented in said environment view rather than the movement of said vehicle.
17. The vehicle operation simulator of claim 1 including one or more sensors responding to movement of said operator within said mobile vehicle to provide input to said scene generator and said scene display.
18. The vehicle operation simulator of claim 1 including secondary vehicle control for said mobile vehicle, said secondary vehicle control to be actuated by a second operator.
19. The vehicle operation simulator of claim 18 wherein said mobile vehicle responds exclusively to said secondary vehicle control when said secondary vehicle control is actuated.
20. The vehicle operation simulator of claim 18 wherein said mobile vehicle selectively responds to both said secondary vehicle control and said vehicle control when said secondary vehicle control is actuated.
21. The vehicle operation simulator of claim 1 including parameter-constraining apparatus limiting at least one of the movement of said mobile vehicle and the actuation of said vehicle control.
22. The vehicle operation simulator of claim 1 wherein said scene display includes at least one of a mirror, a flat opaque viewing screen, a curved opaque viewing screen, an electronic display, and a partially transparent half-silvered mirror.
23. A method for simulated operation of a vehicle in a natural environment including the steps:
(a) generating an environment view;
(b) presenting the environment view to an operator carried by a mobile vehicle, said mobile vehicle operating in a natural environment;
(c) the operator actuating controls for said mobile vehicle, movement of said mobile vehicle responding to said actuation; and
(d) altering said environment view in response to at least one of vehicle movement, operator actuation of controls, operator head movement, and operator movement within said vehicle.
24. The method of claim 23 wherein the elements presented in said environment view differ in a controlled fashion from those commensurate with the actual behavior of said mobile vehicle.
25. The method of claim 23 wherein said environment view comprises at least one element rendered by said scene generator and at least one element from the natural environment.
26. The method of claim 25 wherein step (a) includes maintaining equivalent light brightness between at least one element from said scene display and the natural environment.
27. The method of claim 23 wherein said environment view is wholly comprised of elements from said scene generator.
28. The method of claim 23 wherein said vehicle has secondary vehicle control, and including the step of said secondary vehicle control being actuated by a second operator under selected conditions.
29. The method of claim 28 wherein said mobile vehicle responds exclusively to said secondary vehicle control when said secondary vehicle control is actuated.
30. The method of claim 28 wherein said mobile vehicle selectively responds to both said secondary vehicle control and said vehicle control when said secondary vehicle control is actuated.
31. The method of claim 23 wherein said environment view is presented on at least one of a scene display affixed to said mobile vehicle and a head-mounted display worn by the vehicle operator.
32. The method of claim 23 wherein step (c) includes limiting at least one of the movement of said mobile vehicle and actuation of said vehicle control.
US10/001,362 2000-10-23 2001-10-23 Hybrid vehicle operations simulator Abandoned US20020052724A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/001,362 US20020052724A1 (en) 2000-10-23 2001-10-23 Hybrid vehicle operations simulator
US11/454,601 US7246050B2 (en) 2000-10-23 2006-06-16 Vehicle operations simulator with augmented reality

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US24261400P 2000-10-23 2000-10-23
US10/001,362 US20020052724A1 (en) 2000-10-23 2001-10-23 Hybrid vehicle operations simulator

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/454,601 Continuation US7246050B2 (en) 2000-10-23 2006-06-16 Vehicle operations simulator with augmented reality

Publications (1)

Publication Number Publication Date
US20020052724A1 true US20020052724A1 (en) 2002-05-02

Family

ID=26668927

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/001,362 Abandoned US20020052724A1 (en) 2000-10-23 2001-10-23 Hybrid vehicle operations simulator
US11/454,601 Expired - Fee Related US7246050B2 (en) 2000-10-23 2006-06-16 Vehicle operations simulator with augmented reality

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/454,601 Expired - Fee Related US7246050B2 (en) 2000-10-23 2006-06-16 Vehicle operations simulator with augmented reality

Country Status (1)

Country Link
US (2) US20020052724A1 (en)


Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7088310B2 (en) * 2003-04-30 2006-08-08 The Boeing Company Method and system for presenting an image of an external view in a moving vehicle
US7046259B2 (en) * 2003-04-30 2006-05-16 The Boeing Company Method and system for presenting different views to passengers in a moving vehicle
JP4134939B2 (en) * 2004-04-22 2008-08-20 株式会社デンソー Vehicle periphery display control device
US20070160961A1 (en) * 2006-01-11 2007-07-12 Cyrus Lum Transportation simulator
US8123527B2 (en) 2006-10-31 2012-02-28 Hoelljes H Christian Active learning device and method
AU2008339124B2 (en) * 2007-12-19 2014-07-03 Auguste Holdings Limited Vehicle competition implementation system
US8190295B1 (en) 2008-05-14 2012-05-29 Sandia Corporation Apparatus and method for modifying the operation of a robotic vehicle in a real environment, to emulate the operation of the robotic vehicle operating in a mixed reality environment
JP4831374B2 (en) * 2009-03-27 2011-12-07 アイシン・エィ・ダブリュ株式会社 Driving support device, driving support method, and driving support program
DE102009029318A1 (en) * 2009-09-09 2011-03-17 Ford Global Technologies, LLC, Dearborn Method and device for testing a vehicle construction
AU2010221799B2 (en) 2009-09-16 2014-06-12 Sydac Pty Ltd Visual presentation system
DE102010016113A1 (en) * 2010-03-24 2011-09-29 Krauss-Maffei Wegmann Gmbh & Co. Kg Method for training a crew member of a particular military vehicle
JP2012155655A (en) * 2011-01-28 2012-08-16 Sony Corp Information processing device, notification method, and program
EP3591624B1 (en) * 2012-03-16 2020-12-23 Volvo Car Corporation Augmented vision in image sequence generated from a moving vehicle
JP2014174447A (en) * 2013-03-12 2014-09-22 Japan Automobile Research Institute Vehicle dangerous scene reproducer, and method of use thereof
EP2972089A4 (en) 2013-03-15 2016-09-14 Huntington Ingalls Inc System and method for determining and maintaining object location and status
PL124467U1 (en) * 2013-06-18 2017-01-16 Alexandr Alexandrovich Kolotov Protective helmet for motorcyclists and persons involved in extreme kind of activities
WO2015003056A1 (en) * 2013-07-02 2015-01-08 ROFFE, Brian Real time car driving simulator
EP3132379B1 (en) 2014-04-15 2018-11-07 Huntington Ingalls Incorporated System and method for augmented reality display of dynamic environment information
US9626802B2 (en) 2014-05-01 2017-04-18 Microsoft Technology Licensing, Llc Determining coordinate frames in a dynamic environment
DE102014208352A1 (en) 2014-05-05 2015-11-05 Bayerische Motoren Werke Aktiengesellschaft System and method for instructing a subscriber of a driver training
US10915754B2 (en) 2014-06-09 2021-02-09 Huntington Ingalls Incorporated System and method for use of augmented reality in outfitting a dynamic structural space
US10504294B2 (en) 2014-06-09 2019-12-10 Huntington Ingalls Incorporated System and method for augmented reality discrepancy determination and reporting
US10147234B2 (en) * 2014-06-09 2018-12-04 Huntington Ingalls Incorporated System and method for augmented reality display of electrical system information
US20180182261A1 (en) * 2014-07-02 2018-06-28 Iotmotive Ltd. Real Time Car Driving Simulator
US20160048027A1 (en) * 2014-08-18 2016-02-18 Sam Shpigelman Virtual reality experience tied to incidental acceleration
US20160321940A1 (en) * 2015-04-29 2016-11-03 Ivan Banga Driver Education and Training System and Method for Training Drivers in Simulated Emergencies
US10713970B2 (en) 2015-04-29 2020-07-14 Ivan Banga Driver education system and method for training in simulated road emergencies
JP6723720B2 (en) * 2015-10-20 2020-07-15 キヤノン株式会社 Display system, information processing method and program
US9988008B2 (en) 2015-10-26 2018-06-05 Active Knowledge Ltd. Moveable internal shock-absorbing energy dissipation padding in an autonomous vehicle
US10710608B2 (en) 2015-10-26 2020-07-14 Active Knowledge Ltd. Provide specific warnings to vehicle occupants before intense movements
US10717406B2 (en) 2015-10-26 2020-07-21 Active Knowledge Ltd. Autonomous vehicle having an external shock-absorbing energy dissipation padding
US11332061B2 (en) 2015-10-26 2022-05-17 Atnomity Ltd. Unmanned carrier for carrying urban manned vehicles
US10059347B2 (en) 2015-10-26 2018-08-28 Active Knowledge Ltd. Warning a vehicle occupant before an intense movement
EP3220233B1 (en) * 2016-03-18 2020-11-04 Volvo Car Corporation Method and system for enabling interaction in a test environment
US10580386B2 (en) 2017-04-21 2020-03-03 Ford Global Technologies, Llc In-vehicle projected reality motion correction
US10134279B1 (en) 2017-05-05 2018-11-20 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for visualizing potential risks
US10560735B2 (en) 2017-05-31 2020-02-11 Lp-Research Inc. Media augmentation through automotive motion
US10997781B1 (en) 2017-12-27 2021-05-04 Disney Enterprises, Inc. Systems and methods of real-time ambient light simulation based on generated imagery
CN108919785B (en) * 2018-07-25 2020-01-14 安徽江淮汽车集团股份有限公司 Test system and test method


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5547382A (en) * 1990-06-28 1996-08-20 Honda Giken Kogyo Kabushiki Kaisha Riding simulation system for motorcycles
US5320534A (en) * 1990-11-05 1994-06-14 The United States Of America As Represented By The Secretary Of The Air Force Helmet mounted area of interest (HMAoI) for the display for advanced research and training (DART)
US6064749A (en) * 1996-08-02 2000-05-16 Hirota; Gentaro Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
US6195625B1 (en) * 1999-02-26 2001-02-27 Engineering Dynamics Corporation Method for simulating collisions
US20030210228A1 (en) * 2000-02-25 2003-11-13 Ebersole John Franklin Augmented reality situational awareness system and method
US20020196202A1 (en) * 2000-08-09 2002-12-26 Bastian Mark Stanley Method for displaying emergency first responder command, control, and safety information using augmented reality
US6903752B2 (en) * 2001-07-16 2005-06-07 Information Decision Technologies, Llc Method to view unseen atmospheric phenomenon using augmented reality

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3936955A (en) * 1974-12-17 1976-02-10 Driver Training Institute Driver training simulator
US4464117A (en) * 1980-08-27 1984-08-07 Dr. Ing. Reiner Foerst Gmbh Driving simulator apparatus
US4846686A (en) * 1987-02-02 1989-07-11 Doron Precision Systems, Inc. Motor vehicle simulator with multiple images
US4856771A (en) * 1987-10-22 1989-08-15 Nelson, Berg Enterprises Video simulation apparatus
US5018973A (en) * 1988-11-30 1991-05-28 Thomson-Csf Motion simulator for vehicle driver
US5209662A (en) * 1989-06-30 1993-05-11 Honda Giken Kogyo Kabushiki Kaisha Riding simulation system of motorcycle
US5577913A (en) * 1990-08-01 1996-11-26 Atari Games Corporation System and method for driver training with multiple driver competition
US5277584A (en) * 1991-09-06 1994-01-11 Occusym Limited Liability Company Vehicle vibration simulator and method for programming and using same
US5399091A (en) * 1992-04-27 1995-03-21 Tomy Company, Ltd. Drive simulation apparatus
US5368484A (en) * 1992-05-22 1994-11-29 Atari Games Corp. Vehicle simulator with realistic operating feedback
US5618179A (en) * 1992-05-22 1997-04-08 Atari Games Corporation Driver training system and method with performance data feedback
US5660547A (en) * 1993-02-17 1997-08-26 Atari Games Corporation Scenario development system for vehicle simulators
US5707237A (en) * 1993-04-20 1998-01-13 Kabushiki Kaisha Ace Denken Driving simulation system
US5752834A (en) * 1995-11-27 1998-05-19 Ling; Shou Hung Motion/force simulators with six or three degrees of freedom
US5721679A (en) * 1995-12-18 1998-02-24 Ag-Chem Equipment Co., Inc. Heads-up display apparatus for computer-controlled agricultural product application equipment
US6102832A (en) * 1996-08-08 2000-08-15 Tani Shiraito Virtual reality simulation apparatus
US6234011B1 (en) * 1997-07-24 2001-05-22 Hitachi, Ltd. Vehicle testing apparatus and method thereof
US6118414A (en) * 1997-12-02 2000-09-12 Kintz; Gregory J. Virtual reality system and method
US6175343B1 (en) * 1998-02-24 2001-01-16 Anivision, Inc. Method and apparatus for operating the overlay of computer-generated effects onto a live image
US6611297B1 (en) * 1998-04-13 2003-08-26 Matsushita Electric Industrial Co., Ltd. Illumination control method and illumination device
US6196845B1 (en) * 1998-06-29 2001-03-06 Harold R. Streid System and method for stimulating night vision goggles
US6327708B1 (en) * 1998-09-15 2001-12-04 True Image, L.L.C. System of absolute measurement for radiological image luminance control
US6123547A (en) * 1998-12-21 2000-09-26 Teresi Publications, Inc. Stationary drag racing simulation system
US6474159B1 (en) * 2000-04-21 2002-11-05 Intersense, Inc. Motion-tracking
US6681629B2 (en) * 2000-04-21 2004-01-27 Intersense, Inc. Motion-tracking

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020087296A1 (en) * 2001-01-03 2002-07-04 Wynn Owen John Williams Simulator
US7200536B2 (en) * 2001-01-03 2007-04-03 Seos Limited Simulator
US20030130822A1 (en) * 2001-11-28 2003-07-10 Steele Robert C. Multimedia racing experience system
US20040199311A1 (en) * 2003-03-07 2004-10-07 Michael Aguilar Vehicle for simulating impaired driving
US20050275717A1 (en) * 2004-06-10 2005-12-15 Sarnoff Corporation Method and apparatus for testing stereo vision methods using stereo imagery data
US20080206719A1 (en) * 2005-06-14 2008-08-28 Volvo Aero Corporation Method For Training A Person While Operating A Vehicle
US20110111849A1 (en) * 2005-12-06 2011-05-12 Microvision, Inc. Spatially Aware Mobile Projection
US20070282564A1 (en) * 2005-12-06 2007-12-06 Microvision, Inc. Spatially aware mobile projection
US20070176851A1 (en) * 2005-12-06 2007-08-02 Willey Stephen R Projection display with motion compensation
US7788081B1 (en) 2006-06-22 2010-08-31 At&T Intellectual Property I, L.P. Method of communicating data from virtual setting into real-time devices
US8651868B2 (en) 2006-06-22 2014-02-18 At&T Intellectual Property I, L.P. Integrating real time data into virtual settings
US10213696B2 (en) 2006-06-22 2019-02-26 At&T Intellectual Property I, L.P. Adaptation of gaming applications to participants
US9262046B2 (en) 2006-06-22 2016-02-16 At&T Intellectual Property I, Lp Adaptation of gaming applications to participants
US8257084B1 (en) * 2006-06-22 2012-09-04 At&T Intellectual Property I, L.P. Method of integrating real time data into virtual settings
US8366446B2 (en) 2006-06-22 2013-02-05 At&T Intellectual Property I, L.P. Integrating real time data into virtual settings
US8441501B1 (en) 2006-06-22 2013-05-14 At&T Intellectual Property I, L.P. Adaptive access in virtual settings based on established virtual profile
WO2009151335A1 (en) * 2008-06-09 2009-12-17 Ship Manoeuvring Simulator Centre As System for training an operator of a vessel
US20110123960A1 (en) * 2008-06-09 2011-05-26 Ship Manoeuvring Simulator Centre As System for training an operator of a vessel
NO333473B1 (en) * 2008-06-09 2013-06-17 Ship Manoeuvring Simulator Centre System for training a master of a vessel
WO2012113686A1 (en) * 2011-02-22 2012-08-30 Rheinmetall Defence Electronics Gmbh Simulator for training a team, in particular for training a helicopter crew
GB2496742B (en) * 2011-11-11 2013-11-27 Cobham Cts Ltd Hazardous device detection training system
GB2496742A (en) * 2011-11-11 2013-05-22 Cobham Cts Ltd Hazardous device detection training system
US20150024347A1 (en) * 2013-07-18 2015-01-22 Daegu Gyeongbuk Institute Of Science And Technology Driving simulator apparatus and method of driver rehabilitation training using the same
CN113380104A (en) * 2016-04-04 2021-09-10 雷蒙德股份有限公司 System and method for vehicle simulation
AT518667B1 (en) * 2016-06-29 2017-12-15 Peterseil Thomas Driving Safety Training arrangement
AT518667A4 (en) * 2016-06-29 2017-12-15 Peterseil Thomas Driving Safety Training arrangement
EP3613031A4 (en) * 2017-05-02 2021-01-06 The Regents of The University of Michigan Simulated vehicle traffic for autonomous vehicles
US11669653B2 (en) 2017-05-02 2023-06-06 The Regents Of The University Of Michigan Simulated vehicle traffic for autonomous vehicles
US20210272468A1 (en) * 2018-09-19 2021-09-02 Offshore Certification Ltd. A system for simulating a maritime environment
CN111638709A (en) * 2020-03-24 2020-09-08 上海黑眸智能科技有限责任公司 Automatic obstacle avoidance tracking method, system, terminal and medium
US20220406212A1 (en) * 2021-06-19 2022-12-22 Danny Baldwin Virtual Reality Vehicle Operation Simulation
US11854434B2 (en) * 2021-06-19 2023-12-26 Danny Baldwin Virtual reality vehicle operation simulation

Also Published As

Publication number Publication date
US7246050B2 (en) 2007-07-17
US20070136041A1 (en) 2007-06-14

Similar Documents

Publication Publication Date Title
US7246050B2 (en) Vehicle operations simulator with augmented reality
US5184956A (en) Method and device for training in the driving of vehicles
EP3507125B1 (en) Vehicle user interface apparatus and vehicle
McGovern Experience and results in teleoperation of land vehicles
CN105788401B (en) Defensive drive simulation based on real vehicle body experiences training system
US7424414B2 (en) System for combining driving simulators and data acquisition systems and methods of use thereof
US5707237A (en) Driving simulation system
Blana A Survey of Driving Research Simulators Around the World.
US10713970B2 (en) Driver education system and method for training in simulated road emergencies
US20160321940A1 (en) Driver Education and Training System and Method for Training Drivers in Simulated Emergencies
DE10256612B3 (en) Automobile driver training method using training vehicle with virtual imaging of dangerous driving situations
CN110136535A (en) Examination of driver simulation system and method
CN109658519B (en) Vehicle-mounted multi-mode augmented reality system based on real road condition information image processing
CN112487549B (en) System and method for testing reaction behavior of driver after automatic driving steering failure
EP1586861B1 (en) Method and apparatus for displaying information for the driver taking into account other movable objects
US11590902B2 (en) Vehicle display system for displaying surrounding event information
JP2003150038A (en) Apparatus and method for driving simulation
CN211956795U (en) Training system and training device for self rescue by conditioned reflex in emergency
JP5687879B2 (en) Information processing apparatus, automobile, information processing method and program
CN115315614A (en) Display device, display method and vehicle
JP4493575B2 (en) Mobile body simulator, its control method, and control program
Allen et al. Driving simulation—requirements, mechanization and application
Stall et al. The national advanced driving simulator: potential applications to ITS and AHS research
McGovern Experiences in teleoperation of land vehicles
Stoner Developing a high fidelity ground vehicle simulator for human factors studies

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION