US9690104B2 - Augmented reality HUD display method and device for vehicle - Google Patents

Augmented reality HUD display method and device for vehicle

Info

Publication number
US9690104B2
Authority
US
United States
Prior art keywords
augmented reality
eye
hud display
display coordinates
reality hud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/846,781
Other versions
US20160163108A1 (en)
Inventor
Sung Un Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Motor Co
Original Assignee
Hyundai Motor Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020150065842A (external priority; see KR101713740B1)
Application filed by Hyundai Motor Co filed Critical Hyundai Motor Co
Assigned to HYUNDAI MOTOR COMPANY. Assignment of assignors interest (see document for details). Assignors: KIM, SUNG UN
Publication of US20160163108A1
Application granted
Publication of US9690104B2

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/04Display arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the present disclosure relates generally to an augmented reality head-up display (HUD)-related technology for vehicles, and more particularly, to an augmented reality HUD display method and device for a vehicle which can minimize perception errors in augmented reality HUD graphics on a HUD.
  • Head-up displays are often used in automobiles for projecting information to a driver's eyes.
  • a HUD is a front display device that is designed to present vehicle driving information on a front window (i.e., windshield) of a vehicle.
  • a HUD unit produces and displays virtual images to allow the driver to view various types of information, such as speed, fuel level, temperature, warnings, directions, etc., which have been conventionally displayed on a vehicle's instrument cluster.
  • HUDs were originally introduced for providing a pilot with an enhanced field of view in an aircraft.
  • HUDs are beginning to be implemented in vehicles for the purpose of displaying driving information and reducing accidents caused by drivers looking away from the road while driving. For instance, through the use of a head-up display unit, drivers can keep their attention focused ahead (i.e., toward the road), thereby reducing the risk of accidents.
  • Certain HUD units also offer a night vision feature that allows drivers to identify objects ahead in darkness, as well as displaying information deriving from the instrument cluster.
  • a HUD may be a device that presents information without requiring drivers to divert their attention from the road ahead while driving, by displaying images of information about the operation of a vehicle.
  • the HUD is implemented through a screen film inserted in the windshield at the front so as to minimize the driver's eye movement.
  • Such a HUD may be comprised of an image source (e.g., a liquid crystal display (LCD)) for generating images, an optical system for forming an image generated by and projected from the image source, and an interface for the driver's control.
  • the image should be projected from the image source at an optimum distance from the windshield and at an effective focal length.
  • a HUD for vehicles can display information deriving from the instrument panel cluster, such as vehicle speed, mileage, revolutions per minute (RPM), etc. on the front windshield so that the driver is able to get driving information easily while driving. Also, the HUD displays virtual images on the windshield by rendering information on a variety of internal systems of the vehicle into images when the vehicle is brought to a halt or the driver shifts the vehicle from park.
  • the present disclosure has been made in an effort to provide an augmented reality HUD display method and device for a vehicle which can minimize perception errors in augmented reality HUD graphics, perceived by the vehicle driver or user.
  • Embodiments of the present disclosure provide an augmented reality HUD display method for a vehicle that includes: detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking; detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; receiving the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
  • the correcting of the one or more errors may include: detecting a position of a plurality of objects outside of the vehicle; setting a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in the augmented reality HUD display coordinates of the eye while the driver is viewing the first object; and setting a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and the augmented reality HUD display coordinates of the eye while the driver is viewing the second object.
  • the first object may be an external object that is a first distance away from the eye of the driver
  • the second object may be an external object that is a second distance away from the eye of the driver that is shorter than the first distance
  • the second correction parameter may be set to a lower error correction value than the first correction parameter
  • the method may further include detecting the position of the object using a radar sensor or a lidar sensor.
  • the method may also further include detecting the position of the eye using a camera.
  • the correcting of the one or more errors may include: low-pass filtering the one or more errors in the augmented reality HUD display coordinates of the object and the one or more errors in the augmented reality HUD display coordinates of the eye.
  • a cut-off frequency given as a first correction parameter for the low-pass filtering may be lower than a cut-off frequency given as a second correction parameter for the low-pass filtering.
  • HUD display information corresponding to the external object information may include speed information of the object or navigation information of the object.
  • the navigation information may include turn-by-turn (TBT) information.
  • an augmented reality HUD display device for a vehicle includes: an object detection sensor detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking; an eye position detector detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; an augmented reality display coordinates extractor extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; an error correction module correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; and a graphics display unit receiving, from the error correction module, the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
  • the object detection sensor may detect a position of a plurality of objects outside of the vehicle; and the error correction module may set a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in augmented reality HUD display coordinates of the eye while the driver is viewing the first object and a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and augmented reality HUD display coordinates of the eye while the driver is viewing the second object.
  • the first object may be an external object that is a first distance away from the eye of the driver
  • the second object may be an external object that is a second distance away from the eye of the driver that is shorter than the first distance
  • the second correction parameter may be set to a lower error correction value than the first correction parameter
  • the object detection sensor may include a radar sensor or a lidar sensor.
  • the eye position detector may include a camera.
  • the error correction module may include a low-pass filter, and a cut-off frequency given as a first correction parameter for the low-pass filter may be lower than a cut-off frequency given as a second correction parameter for the low-pass filter.
  • HUD display information corresponding to the external object information may include speed information of the object or navigation information of the object.
  • the navigation information may include TBT information.
  • a non-transitory computer readable medium containing program instructions for an augmented reality HUD display method for a vehicle includes: program instructions that detect a position of an object outside of the vehicle at which a driver of the vehicle is looking; program instructions that detect a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; program instructions that extract augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; program instructions that correct one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; program instructions that receive the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and program instructions that display augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
  • an augmented reality HUD display device and method for a vehicle allow the driver of a vehicle to intuitively perceive the real-world driving environment on an augmented reality HUD system (i.e., augmented reality HUD device) for the vehicle, by correcting graphics errors perceived by the driver in a way that varies with the distance to an object the driver is looking at.
  • the present disclosure may also realize a low-cost algorithm for implementing an augmented reality HUD display method for a vehicle by trading off sensor cost against sensor performance, even as sensor technology continues to improve.
  • FIG. 1 and FIG. 2 are views showing examples of an augmented reality HUD display method.
  • FIG. 3 is a view showing an example of the technical construction of an augmented reality HUD.
  • FIG. 4 is a view for explaining an eye position detection error and a driver's angle of view on an augmented reality HUD.
  • FIG. 5 is a view for explaining an error in the measurement of the distance to an object using a vehicle sensor.
  • FIG. 6 is a block diagram for explaining an augmented reality HUD display device for a vehicle according to embodiments of the present disclosure.
  • FIG. 7 is a graph for explaining an example of the error correction module of FIG. 6 .
  • FIG. 8 is a view for explaining an example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • FIG. 9 is a view for explaining another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • FIG. 10 is a view for explaining yet another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • FIG. 11 is a view for explaining a further example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • vehicle or “vehicular” or other similar term as used herein is inclusive of motor vehicles in general such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum).
  • a hybrid vehicle is a vehicle that has two or more sources of power, for example both gasoline-powered and electric-powered vehicles.
  • controller may refer to a hardware device that includes a memory and a processor.
  • the memory is configured to store program instructions, and the processor is specifically programmed to execute the program instructions to perform one or more processes which are described further below.
  • the below methods may be executed by an apparatus comprising the controller in conjunction with one or more other components, as would be appreciated by a person of ordinary skill in the art.
  • controller of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like.
  • the computer readable mediums include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices.
  • the computer readable recording medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
  • FIGS. 1 and 2 are views showing examples of an augmented reality HUD display method.
  • Sensor data from a camera sensor for detecting the eye position, a lidar (light detection and ranging) sensor, and/or radar sensor for detecting objects outside the vehicle has a certain angle of error due to sensor vibration or eye blinks.
  • Such an error involves a perception error that varies with the distance to a target object to be displayed on the augmented reality HUD, which will cause confusion for the vehicle driver or user.
  • the use of a distance-independent perception error reduction algorithm may lead to difficulty in keeping the performance of augmented reality HUD graphics consistent.
  • FIG. 3 is a view showing an example of the technical construction of an augmented reality HUD.
  • the eye position of the driver of the vehicle must be detected, and the eye position is usually detected by a camera installed in the vehicle.
  • the eye-tracking coordinates have some noise because of the camera resolution, eye blinks, etc., and a sensor for sensing external objects also has some coordinate error due to resolution issues.
  • FIG. 4 is a view for explaining an eye position detection error and a driver's angle of view on an augmented reality HUD. Specifically, FIG. 4 explains a display error on an object outside the vehicle the driver is looking at, caused by eye noise.
  • an augmented reality HUD may include an eye position detecting camera 120 for detecting the driver's eye, and a radar (radio detecting and ranging) sensor 205 and/or lidar (light detection and ranging) sensor for detecting (i.e., measuring) the position of an external object.
  • An eye vector can be indicated by a line connecting the center of the driver's eye 125 and the center of the pupil.
  • Eye noise (i.e., eye position detection error) spreads at a certain angle about the line of sight (i.e., the angle of view of the eye position detection system). As shown in FIG. 4, the farther from the eye, the larger the margin of error in orthogonal coordinates in a transverse plane; in other words, the margin of orthogonal error (i.e., orthogonal coordinate error) grows with distance (i.e., the distance from the driver's eye to an object the driver is looking at).
  • an augmented reality HUD graphic display of a far object 105 on the vehicle's windshield glass 115 has a larger margin of error than an augmented reality HUD graphic display of a near object 110 on it, thus leading to a higher level of perceived noise (i.e., perception error) on the far object.
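  • As a rough numerical illustration of this geometry (the angular error value below is an assumed figure for illustration, not one taken from the patent), the lateral position error produced by a fixed angular detection error grows roughly linearly with the distance to the object:

```python
import math

def lateral_error_m(distance_m: float, angular_error_deg: float) -> float:
    """Lateral (transverse-plane) error produced at a given distance by a
    fixed angular error in the eye-tracking or object-detection data."""
    return distance_m * math.tan(math.radians(angular_error_deg))

# Assumed angular error of 0.5 degrees, purely for illustration.
for d in (5.0, 20.0, 80.0):
    print(f"{d:5.1f} m -> {100 * lateral_error_m(d, 0.5):5.1f} cm lateral error")
# 5 m -> ~4.4 cm, 20 m -> ~17.5 cm, 80 m -> ~69.8 cm
```

  • With the same angular noise, the far object therefore shows several times the on-screen jitter of the near object, which is the effect described above for objects 105 and 110.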
  • FIG. 5 is a view for explaining an error in the measurement of the distance to an object using a vehicle sensor.
  • the same principle as explained with reference to FIG. 4 may be applied to a sensor such as a radar sensor or lidar sensor 215 installed at the front of the vehicle 220 .
  • Radio waves or light (or laser light) radiation from the sensor 215 departs from one point on the vehicle 220 and scans the area ahead of the vehicle. Accordingly, sensing noise also radiates like the aforementioned eye noise, at a certain angle (i.e., angle of error from a distance measurement sensor).
  • sensing noise varies with the distance between the vehicle 220 and the far object 205 or near object 210 outside the vehicle 220 .
  • the far object 205 of the same size as the near object 210 is displayed with a different level of graphics noise, causing the driver to perceive noise at different levels for the far object 205 and the near object 210 .
  • FIG. 6 is a block diagram for explaining an augmented reality HUD display device for a vehicle according to embodiments of the present disclosure.
  • an augmented reality HUD display device 300 for a vehicle may include an object detection sensor 305 , an eye position detector 310 , an augmented reality display coordinates extractor 315 , an error correction module 320 , and a graphics display unit 325 .
  • augmented reality may refer to computer graphics technology that synthesizes a virtual object or virtual information within the real world to make it look like a real-world object.
  • augmented reality may refer to virtual reality technology that combines virtual information with the real world as viewed through the eye to produce one image.
  • the augmented reality technology is well known in the art, so a detailed description thereof will be omitted in this specification.
  • a head-up display (HUD) device such as the augmented reality HUD display device 300 for a vehicle may be a device that reflects an image onto a windshield of the vehicle or a combiner (i.e., transparent panel) to provide a vehicle driver with vehicle information such as vehicle speed, mileage, or revolution per minute (RPM), or navigation information.
  • The HUD area (i.e., HUD display area or HUD screen area) may indicate a vehicle information image area that delivers vehicle information, such as vehicle driving information, to the vehicle driver's eye by presenting it on the windshield of the vehicle.
  • The HUD area may indicate a virtual area where a HUD image is displayed.
  • The HUD area may refer to an area within a display screen onto which the driver's eye falls and which presents an image when the driver looks ahead.
  • the augmented reality HUD display device 300 for a vehicle may perform a process of correcting augmented reality graphics coordinates (i.e., augmented reality HUD graphic coordinates), and may perform a method of minimizing graphic errors caused by the vehicle driver's or user's shifting their eyes (i.e., graphic errors caused by eye movements) in the design of an augmented reality HUD graphics interface. More specifically, the augmented reality HUD display device 300 for a vehicle may execute an algorithm that gives more priority to response rate than to accuracy for an object near the vehicle and vice versa for an object, such as a building, far from the vehicle.
  • the augmented reality HUD display device 300 for a vehicle may use error correction parameters that vary with the distance between the driver and an object to be displayed by the augmented reality HUD (i.e., the distance between the driver's eye and an object at which the driver is looking), in order to give consistent perception errors to the driver.
  • In general, it is better to reduce errors in all sensor data, but doing so may degrade other properties.
  • Low-pass filtering, one of the most typical methods of error reduction, can significantly reduce noise but may lower the response speed.
  • Accordingly, error correction parameters are set so as to reduce the margin of error on the far object shown in FIG. 4 or FIG. 5 to a greater degree and the margin of error on the near object to a lesser degree. This is because a low response speed on the far object causes no problem in the performance (i.e., display accuracy) of the augmented reality HUD display device 300, since long-distance movement of the far object is not presented to the driver, whereas the near object has less noise for its size and response speed is more critical for it.
  • the object detection sensor 305 may detect the position of an object outside the vehicle the driver is looking at.
  • the object detection sensor 305 may measure the distance from the vehicle to the external object.
  • the object detection sensor 305 may deliver distance information to the error correction module 320 that uses sensor data error correction parameters so as to use this information as a reference in the error correction of the error correction module 320 .
  • the object detection sensor 305 may include a radar sensor and/or a lidar (Light Detection and Ranging) sensor.
  • The lidar sensor, a type of laser radar sensor, may be a radar system that measures the coordinates of the position of a reflecting object by measuring the time taken for an emitted laser pulse to be reflected and returned, as in the sketch below.
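  • A minimal sketch of the time-of-flight principle mentioned above (illustration only; real lidar signal processing involves considerably more than this):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting object from the measured round-trip time of a
    laser pulse; the pulse travels out and back, hence the factor of 1/2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse returning after about 200 ns corresponds to an object roughly 30 m ahead.
print(tof_distance_m(200e-9))  # ~29.98 m
```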
  • the eye position detector 310 may detect the eye position of the driver viewing external object information or augmented reality HUD display information corresponding to the external object information which is displayed on the windshield of the vehicle.
  • the eye position detector 310 may include a camera.
  • the augmented reality display coordinates extractor (i.e., augmented reality HUD display coordinates extractor) 315 may extract the augmented reality HUD display coordinates of an external object detected by the object detection sensor 305 and the augmented reality HUD display coordinates (or eye-tracking coordinates) of the eye position detected by the eye position detector 310 .
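  • The patent does not spell out how the extractor computes these coordinates. One plausible sketch, under the assumptions that the HUD virtual image can be treated as a plane at a fixed position in front of the driver and that the eye and object positions are available in a common vehicle coordinate frame (both assumptions are mine, for illustration), is to intersect the eye-to-object ray with that plane:

```python
import numpy as np

def hud_display_coords(eye_pos, obj_pos, plane_origin, plane_normal):
    """Assumed extraction step: intersect the ray from the driver's eye to the
    external object with the plane of the HUD virtual image, giving the point
    where the graphic should be drawn so it overlays the object for the driver."""
    eye = np.asarray(eye_pos, dtype=float)
    obj = np.asarray(obj_pos, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    direction = obj - eye
    denom = direction.dot(n)
    if abs(denom) < 1e-9:
        raise ValueError("eye-to-object ray is parallel to the HUD plane")
    t = (np.asarray(plane_origin, dtype=float) - eye).dot(n) / denom
    return eye + t * direction  # 3-D point on the HUD plane

# Illustrative frame (assumed): x forward, y left, z up, all in metres.
eye = (0.0, 0.0, 1.2)
obj = (40.0, 2.0, 1.0)  # object 40 m ahead, slightly to the left
print(hud_display_coords(eye, obj,
                         plane_origin=(2.5, 0.0, 1.2),
                         plane_normal=(1.0, 0.0, 0.0)))  # [2.5, 0.125, 1.1875]
```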
  • the error correction module 320 may correct errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position by using an error correction parameter for the augmented reality HUD display coordinates of the external object and an error correction parameter for the augmented reality HUD display coordinates of the eye position, the error correction parameters varying with distance information between the driver's eye and the external object.
  • the distance information may be delivered from the object detection sensor 305 to the error correction module 320 .
  • The error correction module 320 may also set a first correction parameter for correcting errors in the augmented reality HUD display coordinates of a first object and errors in the augmented reality HUD display coordinates of the eye position of the vehicle driver viewing the first object, and a second correction parameter for correcting errors in the augmented reality HUD display coordinates of a second object and errors in the augmented reality HUD display coordinates of the position of the eye viewing the second object.
  • the first object is the external object that is at a first distance from the driver's eye
  • the second object is the external object that is at a second distance from the driver's eye which is shorter than the first distance
  • the second correction parameter is set to a lower error correction value than the first correction parameter.
  • the augmented reality HUD display device 300 for a vehicle may further include a camera that captures an image of the road ahead of the vehicle that is matched with the external object information or the HUD display information (i.e., virtual image information).
  • the image of the road ahead may be of a scene the driver is seeing through the windshield.
  • the error correction module 320 may include a low-pass filter (LPF).
  • a cut-off frequency given as a first correction parameter for the low-pass filter may be lower than a cut-off frequency given as a second correction parameter for the low-pass filter.
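  • A minimal sketch of how such a cut-off frequency could act as the correction parameter of a discrete first-order low-pass filter (the update rule is a standard exponential smoother; the sample rate and frequencies are assumed, illustrative values, not taken from the patent). A lower cut-off frequency yields a smaller smoothing coefficient, i.e., stronger noise suppression and a slower response:

```python
import math

class CoordinateLowPassFilter:
    """Discrete first-order low-pass filter applied to one display coordinate;
    the cut-off frequency acts as the error correction parameter."""

    def __init__(self, cutoff_hz: float, sample_period_s: float):
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        self.alpha = sample_period_s / (sample_period_s + rc)  # smoothing coefficient
        self.state = None

    def update(self, measurement: float) -> float:
        if self.state is None:
            self.state = measurement  # initialize on the first sample
        else:
            self.state += self.alpha * (measurement - self.state)
        return self.state

# Illustrative values at a 60 Hz update rate: the far (first) object gets the
# lower cut-off, the near (second) object the higher one, as described above.
far_object_filter = CoordinateLowPassFilter(cutoff_hz=1.0, sample_period_s=1 / 60)
near_object_filter = CoordinateLowPassFilter(cutoff_hz=5.0, sample_period_s=1 / 60)
```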
  • the graphics display unit 325 may receive, from the error correction module 320 , the corrected augmented reality HUD display coordinates of the external object and the corrected augmented reality HUD display coordinates of the eye position, and display augmented reality HUD graphics of the external object information on the windshield.
  • HUD display information corresponding to the external object information may include speed information of the external object shown in FIG. 10 or navigation information related to the external object.
  • the navigation information may include turn-by-turn (TBT) information shown in FIG. 11 .
  • the TBT information may include a direction change icon.
  • the augmented reality HUD display device 300 for a vehicle may further include a controller (not shown).
  • the controller may perform the functions of a central processing unit (CPU) or processor and control the overall operation of the object detection sensor 305 , eye position detector 310 , augmented reality display coordinates extractor 315 , error correction module 320 , and graphics display unit 325 .
  • the controller may include a program containing a series of commands for performing an augmented reality HUD display method for a vehicle according to the present disclosure to be described later.
  • the augmented reality HUD display method for a vehicle may be applied to the augmented reality HUD display device 300 for a vehicle shown in FIG. 6 , and may also be referred to as a method of displaying variable errors on an augmented reality HUD for a vehicle.
  • the augmented reality HUD display method for a vehicle may include, for example, a first detection step, a second detection step, an extraction step, a correction step, and a display step.
  • In the first detection step, the position of an object outside the vehicle at which the vehicle driver is looking may be detected by the object detection sensor 305.
  • the sensor for detecting the position of the external object may include a radar sensor or a lidar sensor.
  • In the second detection step, the eye position of the vehicle driver viewing external object information displayed on the windshield of the vehicle may be detected by the eye position detector 310.
  • the sensor for detecting the eye position may include a camera.
  • HUD display information corresponding to the external object information may include speed information of the external object shown in FIG. 10 or navigation information regarding the external object.
  • the navigation information may include TBT information shown in FIG. 11 .
  • In the extraction step, the augmented reality HUD display coordinates of the detected external object and the augmented reality HUD display coordinates (or eye-tracking coordinates) of the detected eye may be extracted by the augmented reality display coordinates extractor 315.
  • In the correction step, errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position may be corrected by the error correction module 320 by using an error correction parameter for the augmented reality HUD display coordinates of the external object and an error correction parameter for the augmented reality HUD display coordinates of the eye position, the error correction parameters varying (i.e., changing) with distance information (i.e., eye distance information) between the driver's eye and the external object.
  • A first correction parameter for correcting errors in the augmented reality HUD display coordinates of a first object and errors in the augmented reality HUD display coordinates of the eye position of the vehicle driver viewing the first object, and a second correction parameter for correcting errors in the augmented reality HUD display coordinates of a second object and errors in the augmented reality HUD display coordinates of the position of the eye viewing the second object, may be set by the error correction module 320.
  • the first object is the external object that is at a first distance from the driver's eye
  • the second object is the external object that is at a second distance from the driver's eye which is shorter than the first distance
  • the second correction parameter is set to a lower error correction value than the first correction parameter.
  • errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position may be low-pass-filtered.
  • a cut-off frequency given as a first correction parameter for the low-pass filtering may be lower than a cut-off frequency given as a second correction parameter for the low-pass filtering.
  • In the display step, the corrected augmented reality HUD display coordinates of the external object and the corrected augmented reality HUD display coordinates of the eye position may be received from the error correction module 320, and augmented reality HUD graphics of the external object information may be displayed on the windshield by the graphics display unit 325.
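  • Put together, one frame of the method might be organized as below. The component interfaces are placeholders assumed purely for illustration; the patent defines the sequence of steps, not this particular API.

```python
def render_hud_frame(object_sensor, eye_detector, coords_extractor,
                     error_corrector, graphics_display):
    """One illustrative pass through the method: detect the object, detect the
    eye, extract display coordinates, correct their errors, and display."""
    obj_pos = object_sensor.detect_object_position()              # first detection step
    eye_pos = eye_detector.detect_eye_position()                  # second detection step
    obj_xy, eye_xy = coords_extractor.extract(obj_pos, eye_pos)   # extraction step
    distance_m = object_sensor.distance_to(obj_pos)               # reference for correction
    obj_xy, eye_xy = error_corrector.correct(obj_xy, eye_xy, distance_m)  # correction step
    graphics_display.draw(obj_xy, eye_xy)                         # display step
```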
  • FIG. 7 is a graph for explaining an example of the error correction module of FIG. 6 .
  • Referring to FIG. 7, a cut-off frequency may be given as a correction parameter.
  • When the object is far from the driver (or vehicle), the cut-off frequency of the LPF may decrease, and when the object is near the driver (or vehicle), the cut-off frequency of the LPF may increase.
  • the accuracy and response speed of sensor data may be adjusted by cut-off frequency adjustment.
  • Errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye may be corrected by low-pass-filtering the range of coordinates of the eye position and the range of coordinates of the position of a detected object (not shown) extracted by the augmented reality display coordinates extractor (i.e., 315 of FIG. 6).
  • The cut-off frequency of the LPF may be used as an error correction parameter, and the cut-off frequency may be adjusted depending on the distance between the driver (or vehicle) and an external object; one possible form of this mapping is sketched below.
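  • The shape of the curve in FIG. 7 is not reproduced here; the linear interpolation below is only one assumed way to let the cut-off frequency fall as the object distance grows, consistent with the qualitative relationship just described (all numeric values are illustrative):

```python
def cutoff_for_distance_hz(distance_m: float,
                           near_m: float = 5.0, far_m: float = 100.0,
                           fc_near_hz: float = 5.0, fc_far_hz: float = 0.5) -> float:
    """Map the eye-to-object distance to a low-pass cut-off frequency: near
    objects get a high cut-off (fast response), far objects a low cut-off
    (strong smoothing). Ranges and frequencies are assumed example values."""
    if distance_m <= near_m:
        return fc_near_hz
    if distance_m >= far_m:
        return fc_far_hz
    ratio = (distance_m - near_m) / (far_m - near_m)
    return fc_near_hz + ratio * (fc_far_hz - fc_near_hz)

print(cutoff_for_distance_hz(10.0))   # ~4.76 Hz for a near object
print(cutoff_for_distance_hz(80.0))   # ~1.45 Hz for a far object
```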
  • the present disclosure can minimize perception errors in augmented reality HUD graphics in a HUD device, perceived by the driver.
  • FIG. 8 is a view for explaining an example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • FIG. 8 may show a graphical representation of perception errors on objects that vary with the distance from the driver's viewpoint in a real situation.
  • a vehicle driver 405 may see a first object 415 at a long distance and a second object 410 at a short distance on the windshield 420 , with variable errors obtained by correcting errors in graphic coordinates.
  • The first object 415 corresponds to first HUD display information, and the second object 410 corresponds to second HUD display information.
  • Large-scale error correction may be performed on the first object 415, as indicated by the larger double-headed arrow of FIG. 8, while small-scale error correction may be performed on the second object 410, as indicated by the smaller double-headed arrow of FIG. 8.
  • In this way, perception errors in graphics may be made equal regardless of the distance between the driver (or vehicle) and the objects, thereby minimizing distance-dependent cursor blinking on the displayed objects 410 and 415.
  • FIG. 9 is a view for explaining another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 . That is, FIG. 9 shows an application of the present disclosure which displays the distance to a vehicle ahead on an augmented reality HUD.
  • Graphics of a near object 510 (i.e., a second object) may be displayed with the display error parameter set to correspond to response speed rather than accuracy, and graphics of a far object 515 (i.e., a first object) may be displayed with the display error parameter set to correspond to accuracy rather than response speed.
  • the vehicle driver 505 is able to see on the windshield 520 a graphic display of the near vehicle 510 and a graphic display of the far vehicle 515 where perception errors in graphics are equal regardless of the distance between the driver (or vehicle) and the objects.
  • FIG. 10 is a view for explaining yet another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 .
  • FIG. 10 shows an application of the present disclosure which displays the speed of a vehicle ahead on an augmented reality HUD.
  • As in FIG. 9, graphics of a near object 610 (i.e., a second object) and graphics of a far object 615 (i.e., a first object) may be displayed with error correction parameters that differ according to their distance from the driver.
  • the vehicle driver 605 is able to see on the windshield 620 a graphic display of the near vehicle 610 and a graphic display of the far vehicle 615 where perception errors in graphics are equal regardless of the distance between the driver (or vehicle) and the objects.
  • FIG. 11 is a view for explaining a further example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6 . That is, FIG. 11 shows an application of the present disclosure which displays TBT information on an augmented reality HUD.
  • TBT information for a short distance (e.g., 50 m) may be displayed with the coordinate error parameter set to correspond to response speed rather than accuracy, and TBT information for a long distance (e.g., 150 m) may be displayed with the coordinate error parameter set to correspond to accuracy rather than response speed.
  • The components, units, blocks, or modules used in the present disclosure may be implemented by software components, such as tasks, classes, subroutines, processes, objects, execution threads, or programs, or by hardware components, such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), or by combinations of the software and hardware components.
  • the components may be included in a computer-readable storage medium, or some of the components may be distributed in a plurality of computers.

Abstract

An augmented reality head-up display (HUD) display method for a vehicle includes: detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking; detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; receiving the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to and the benefit of Korean Patent Application No. 10-2014-0175097 filed in the Korean Intellectual Property Office on Dec. 8, 2014 and Korean Patent Application No. 10-2015-0065842 filed in the Korean Intellectual Property Office on May 12, 2015, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE DISCLOSURE
(a) Technical Field
The present disclosure relates generally to an augmented reality head-up display (HUD)-related technology for vehicles, and more particularly, to an augmented reality HUD display method and device for a vehicle which can minimize perception errors in augmented reality HUD graphics on a HUD.
(b) Description of the Related Art
Head-up displays (HUDs) are often used in automobiles for projecting information to a driver's eyes. A HUD is a front display device that is designed to present vehicle driving information on a front window (i.e., windshield) of a vehicle. In other words, a HUD unit produces and displays virtual images to allow the driver to view various types of information, such as speed, fuel level, temperature, warnings, directions, etc., which have been conventionally displayed on a vehicle's instrument cluster.
HUDs were originally introduced for providing a pilot with an enhanced field of view in an aircraft. Now, HUDs are beginning to be implemented in vehicles for the purpose of displaying driving information and reducing accidents caused by drivers looking away from the road while driving. For instance, through the use of a head-up display unit, drivers can keep their attention focused ahead (i.e., toward the road), thereby reducing the risk of accidents. Certain HUD units also offer a night vision feature that allows drivers to identify objects ahead in darkness, as well as displaying information deriving from the instrument cluster.
Accordingly, a HUD may be a device that presents information without requiring drivers to divert their attention from the road ahead while driving, by displaying images of information about the operation of a vehicle. Often, the HUD is implemented through a screen film inserted in the windshield at the front so as to minimize the driver's eye movement. Such a HUD may be comprised of an image source (e.g., a liquid crystal display (LCD)) for generating images, an optical system for forming an image generated by and projected from the image source, and an interface for the driver's control. The image should be projected from the image source at an optimum distance from the windshield and at an effective focal length.
A HUD for vehicles can display information deriving from the instrument panel cluster, such as vehicle speed, mileage, revolutions per minute (RPM), etc. on the front windshield so that the driver is able to get driving information easily while driving. Also, the HUD displays virtual images on the windshield by rendering information on a variety of internal systems of the vehicle into images when the vehicle is brought to a halt or the driver shifts the vehicle from park.
The above information disclosed in this Background section is only for enhancement of understanding of the background of the disclosure and therefore it may contain information that does not form the related art that is already known in this country to a person of ordinary skill in the art.
SUMMARY OF THE DISCLOSURE
The present disclosure has been made in an effort to provide an augmented reality HUD display method and device for a vehicle which can minimize perception errors in augmented reality HUD graphics, perceived by the vehicle driver or user.
Embodiments of the present disclosure provide an augmented reality HUD display method for a vehicle that includes: detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking; detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; receiving the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
The correcting of the one or more errors may include: detecting a position of a plurality of objects outside of the vehicle; setting a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in the augmented reality HUD display coordinates of the eye while the driver is viewing the first object; and setting a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and the augmented reality HUD display coordinates of the eye while the driver is viewing the second object. The first object may be an external object that is a first distance away from the eye of the driver, the second object may be an external object that is a second distance away from the eye of the driver that is shorter than the first distance, and the second correction parameter may be set to a lower error correction value than the first correction parameter.
The method may further include detecting the position of the object using a radar sensor or a lidar sensor. The method may also further include detecting the position of the eye using a camera.
The correcting of the one or more errors may include: low-pass filtering the one or more errors in the augmented reality HUD display coordinates of the object and the one or more errors in the augmented reality HUD display coordinates of the eye. A cut-off frequency given as a first correction parameter for the low-pass filtering may be lower than a cut-off frequency given as a second correction parameter for the low-pass filtering.
HUD display information corresponding to the external object information may include speed information of the object or navigation information of the object. The navigation information may include turn-by-turn (TBT) information.
Furthermore, according to embodiments of the present disclosure, an augmented reality HUD display device for a vehicle includes: an object detection sensor detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking; an eye position detector detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; an augmented reality display coordinates extractor extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; an error correction module correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; and a graphics display unit receiving, from the error correction module, the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
The object detection sensor may detect a position of a plurality of objects outside of the vehicle; and the error correction module may set a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in augmented reality HUD display coordinates of the eye while the driver is viewing the first object and a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and augmented reality HUD display coordinates of the eye while the driver is viewing the second object. The first object may be an external object that is a first distance away from the eye of the driver, the second object may be an external object that is a second distance away from the eye of the driver that is shorter than the first distance, and the second correction parameter may be set to a lower error correction value than the first correction parameter.
The object detection sensor may include a radar sensor or a lidar sensor. The eye position detector may include a camera.
The error correction module may include a low-pass filter, and a cut-off frequency given as a first correction parameter for the low-pass filter may be lower than a cut-off frequency given as a second correction parameter for the low-pass filter.
HUD display information corresponding to the external object information may include speed information of the object or navigation information of the object. The navigation information may include TBT information.
Furthermore, according to embodiments of the present disclosure, a non-transitory computer readable medium containing program instructions for an augmented reality HUD display method for a vehicle includes: program instructions that detect a position of an object outside of the vehicle at which a driver of the vehicle is looking; program instructions that detect a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle; program instructions that extract augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position; program instructions that correct one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye, the error correction parameters varying from one another; program instructions that receive the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and program instructions that display augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
Accordingly, an augmented reality HUD display device and method for a vehicle allow the driver of a vehicle to intuitively perceive the real-world driving environment on an augmented reality HUD system (i.e., augmented reality HUD device) for the vehicle, by correcting graphics errors perceived by the driver in a way that varies with the distance to an object the driver is looking at. Furthermore, the present disclosure may realize a low-cost algorithm for implementing an augmented reality HUD display method for a vehicle by trading off sensor cost against sensor performance, even as sensor technology continues to improve.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to fully understand the drawings used in the detailed description of the present disclosure, the respective drawings will be briefly described.
FIG. 1 and FIG. 2 are views showing examples of an augmented reality HUD display method.
FIG. 3 is a view showing an example of the technical construction of an augmented reality HUD.
FIG. 4 is a view for explaining an eye position detection error and a driver's angle of view on an augmented reality HUD.
FIG. 5 is a view for explaining an error in the measurement of the distance to an object using a vehicle sensor.
FIG. 6 is a block diagram for explaining an augmented reality HUD display device for a vehicle according to embodiments of the present disclosure.
FIG. 7 is a graph for explaining an example of the error correction module of FIG. 6.
FIG. 8 is a view for explaining an example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6.
FIG. 9 is a view for explaining another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6.
FIG. 10 is a view for explaining yet another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6.
FIG. 11 is a view for explaining a further example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6.
DETAILED DESCRIPTION OF THE EMBODIMENTS
For better understanding of the present disclosure, and to show more clearly how it may be carried into effect, reference will now be made, by way of examples, to the accompanying drawings which show embodiments of the present disclosure.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In describing the embodiments of the present disclosure, a detailed description of pertinent known constructions or functions will be omitted if it is deemed to make the gist of the present disclosure unnecessarily vague. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
The terms used in the specification are used to describe only specific embodiments and are not intended to limit the present disclosure. Singular forms are intended to include plural forms unless the context clearly indicates otherwise. It will be further understood that the terms “include”, “comprise”, or “have” used in this specification specify the presence of stated features, steps, operations, components, parts, or a combination thereof, but do not preclude the presence or addition of one or more other features, numerals, steps, operations, components, parts, or a combination thereof.
Unless indicated otherwise, it is to be understood that all the terms used in the specification including technical and scientific terms have the same meaning as those that are understood by those who are skilled in the art. It must be understood that the terms defined by the dictionary are identical with the meanings within the context of the related art, and they should not be ideally or excessively formally defined unless the context clearly dictates otherwise.
It is understood that the term “vehicle” or “vehicular” or other similar term as used herein is inclusive of motor vehicles in general such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum). As referred to herein, a hybrid vehicle is a vehicle that has two or more sources of power, for example both gasoline-powered and electric-powered vehicles.
Additionally, it is understood that one or more of the below methods, or aspects thereof, may be executed by at least one controller. The term “controller” may refer to a hardware device that includes a memory and a processor. The memory is configured to store program instructions, and the processor is specifically programmed to execute the program instructions to perform one or more processes which are described further below. Moreover, it is understood that the below methods may be executed by an apparatus comprising the controller in conjunction with one or more other components, as would be appreciated by a person of ordinary skill in the art.
Furthermore, the controller of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like. Examples of the computer readable mediums include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices. The computer readable recording medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
Generally, in order to realize an augmented reality HUD as shown in FIGS. 1 and 2, it is necessary to detect the vehicle driver's eye position and the coordinates of a target the driver intends to see. FIG. 1 and FIG. 2 are views showing examples of an augmented reality HUD display method.
Sensor data from a camera sensor for detecting the eye position, a lidar (light detection and ranging) sensor, and/or radar sensor for detecting objects outside the vehicle has a certain angle of error due to sensor vibration or eye blinks. Such an error involves a perception error that varies with the distance to a target object to be displayed on the augmented reality HUD, which will cause confusion for the vehicle driver or user. Particularly, the use of a distance-independent perception error reduction algorithm may lead to difficulty in keeping the performance of augmented reality HUD graphics consistent.
Further, in order to realize an augmented reality HUD in a vehicle as shown in FIG. 3, an image is projected on the windshield glass, and the user may see the projected image, as a virtual image, overlaid onto the real world beyond the windshield. FIG. 3 is a view showing an example of the technical construction of an augmented reality HUD.
To accurately match HUD graphics with an obstacle or indicator (mark) ahead of the vehicle, the eye position of the driver of the vehicle must be detected, and the eye position is usually detected by a camera installed in the vehicle. The eye-tracking coordinates contain some noise because of the camera resolution, eye blinks, etc., and a sensor for sensing external objects also has some coordinate error due to resolution limits.
FIG. 4 is a view for explaining an eye position detection error and a driver's angle of view on an augmented reality HUD. Specifically, FIG. 4 explains a display error on an object outside the vehicle the driver is looking at, caused by eye noise.
As shown in FIG. 4, the technical components of an augmented reality HUD may include an eye position detecting camera 120 for detecting the driver's eye, and a radar (radio detection and ranging) sensor 205 and/or lidar (light detection and ranging) sensor for detecting (i.e., measuring) the position of an external object.
An eye vector can be indicated by a line connecting the center of the driver's eye 125 and the center of the pupil. Thus, the line of sight (i.e., angle of view of an eye position detection system) has a vector which radiates from the center of the eye, as shown in FIG. 4. As such, eye noise (i.e., eye position detection error) radiates (at a certain angle), and as shown in FIG. 4, the farther from the eye, the larger the margin of error in orthogonal coordinates in a transverse plane.
As further shown in FIG. 4, the margin of orthogonal error (i.e., orthogonal coordinate error) in the eye position varies with respect to distance (i.e., the distance from the driver's eye to an object the driver is looking at), even for an object of the same size. Thus, an augmented reality HUD graphic display of a far object 105 on the vehicle's windshield glass 115 has a larger margin of error than an augmented reality HUD graphic display of a near object 110 on it, thus leading to a higher level of perceived noise (i.e., perception error) on the far object.
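The growth of this transverse error with distance can be illustrated with a short calculation; the same relation applies to the front-facing radar or lidar sensor discussed with reference to FIG. 5. The sketch below is illustrative only, and the 0.5-degree angular noise figure is an assumption rather than a value taken from this disclosure.

```python
import math

def transverse_error_m(distance_m: float, angular_error_deg: float) -> float:
    """Lateral (orthogonal-coordinate) error at a given distance for a fixed
    angular detection error radiating from the eye or from a sensor:
        error = distance * tan(angular_error)
    """
    return distance_m * math.tan(math.radians(angular_error_deg))

# Hypothetical 0.5-degree angular noise in the eye-position estimate:
for d in (5.0, 20.0, 50.0):  # metres from the eye to the object
    print(f"{d:5.1f} m -> {100 * transverse_error_m(d, 0.5):.1f} cm lateral error")
```

With this assumed angular noise, the lateral error grows from roughly 4 cm at 5 m to over 40 cm at 50 m, which is why the far object appears noisier to the driver than the near one.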
FIG. 5 is a view for explaining an error in the measurement of the distance to an object using a vehicle sensor.
As shown in FIG. 5, the same principle as explained with reference to FIG. 4 may be applied to a sensor such as a radar sensor or lidar sensor 215 installed at the front of the vehicle 220. Radio waves or light (or laser light) radiation from the sensor 215 departs from one point on the vehicle 220 and scans the area ahead of the vehicle. Accordingly, sensing noise also radiates like the aforementioned eye noise, at a certain angle (i.e., angle of error from a distance measurement sensor).
Therefore, as shown in FIG. 5, sensing noise varies with the distance between the vehicle 220 and the far object 205 or near object 210 outside the vehicle 220. In an augmented reality HUD graphic representation of the variation of sensing noise with distance, the far object 205 of the same size as the near object 210 is displayed with a different level of graphics noise, causing the driver to perceive noise at different levels for the far object 205 and the near object 210.
Referring now to the disclosed embodiments, FIG. 6 is a block diagram for explaining an augmented reality HUD display device for a vehicle according to embodiments of the present disclosure.
As shown in FIG. 6, an augmented reality HUD display device 300 for a vehicle may include an object detection sensor 305, an eye position detector 310, an augmented reality display coordinates extractor 315, an error correction module 320, and a graphics display unit 325.
As is known in the art, augmented reality may refer to computer graphics technology that synthesizes a virtual object or virtual information within the real world to make it look like a real-world object. In other words, augmented reality may refer to virtual reality technology that combines virtual information with the real world as viewed through the eye to produce one image. The augmented reality technology is well known in the art, so a detailed description thereof will be omitted in this specification.
A head-up display (HUD) device such as the augmented reality HUD display device 300 for a vehicle may be a device that reflects an image onto a windshield of the vehicle or a combiner (i.e., a transparent panel) to provide the vehicle driver with vehicle information such as vehicle speed, mileage, or revolutions per minute (RPM), or with navigation information. Since an augmented reality HUD must match a real-world object with the eye position, the driver's eye position may have to be matched with an HUD screen (HUD area). The HUD area (i.e., HUD display area or HUD screen area) may indicate a vehicle information image area that is delivered to the driver's eye by presenting vehicle information, such as vehicle driving information, on the windshield of the vehicle. The HUD area may indicate a virtual area where an HUD image is displayed, that is, the region of the display screen that lies along the driver's line of sight and presents an image when the driver looks ahead.
The augmented reality HUD display device 300 for a vehicle may perform a process of correcting augmented reality graphics coordinates (i.e., augmented reality HUD graphic coordinates), and may perform a method of minimizing graphic errors caused by the vehicle driver's or user's shifting their eyes (i.e., graphic errors caused by eye movements) in the design of an augmented reality HUD graphics interface. More specifically, the augmented reality HUD display device 300 for a vehicle may execute an algorithm that gives more priority to response rate than to accuracy for an object near the vehicle and vice versa for an object, such as a building, far from the vehicle.
Moreover, the augmented reality HUD display device 300 for a vehicle may use error correction parameters that vary with the distance between the driver and an object to be displayed by the augmented reality HUD (i.e., the distance between the driver's eye and an object at which the driver is looking), in order to give consistent perception errors to the driver. Basically, it is better to reduce errors in all sensor data, but this may involve degradation of other properties. For example, low-pass filtering, one of the most typical methods of error reduction, can significantly reduce noise but may lead to a low response speed.
Accordingly, in order to give equal perception errors regardless of the distance between the driver and an object, error correction parameters are set in such a way that reduces the margin of error on the far object shown in FIG. 4 or FIG. 5 to a greater degree and reduces the margin of error on the near object to a lesser degree. This is because, while a low response speed on the far object causes no problem in the performance (i.e., display accuracy) of the augmented reality HUD display device 300 since long-distance movement of the far object is not presented to the driver, the near object has less noise for its size and response speed is more critical for the near object.
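One simple way to realize such distance-dependent parameters is to map the eye-to-object distance onto a low-pass-filter cut-off frequency, as in the sketch below. The distance breakpoints and cut-off values are assumptions chosen for illustration, not values specified by this disclosure.

```python
def cutoff_frequency_hz(distance_m: float,
                        near_m: float = 5.0, far_m: float = 100.0,
                        near_cutoff_hz: float = 5.0, far_cutoff_hz: float = 0.5) -> float:
    """Pick a low-pass-filter cut-off frequency from the eye-to-object distance.

    Near objects keep a high cut-off (light smoothing, fast response);
    far objects get a low cut-off (heavy smoothing, slower response),
    so the error perceived by the driver stays roughly constant.
    """
    d = min(max(distance_m, near_m), far_m)      # clamp to the assumed range
    t = (d - near_m) / (far_m - near_m)          # 0 at near_m, 1 at far_m
    return near_cutoff_hz + t * (far_cutoff_hz - near_cutoff_hz)
```

With these example values, an object 5 m away would be filtered at 5 Hz while an object 100 m away would be filtered at 0.5 Hz, matching the rule that the correction parameter for the near object is set to a lower error correction value than that for the far object.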
The object detection sensor 305 may detect the position of an object outside the vehicle the driver is looking at. The object detection sensor 305 may measure the distance from the vehicle to the external object. Moreover, the object detection sensor 305 may deliver distance information to the error correction module 320 that uses sensor data error correction parameters so as to use this information as a reference in the error correction of the error correction module 320.
The object detection sensor 305 may include a radar sensor and/or a lidar (light detection and ranging) sensor. The lidar sensor, a type of laser radar sensor, may be a radar system that measures the coordinates of a reflecting object's position by measuring the time it takes an emitted laser pulse to be reflected by the object and return.
The eye position detector 310 may detect the eye position of the driver viewing external object information or augmented reality HUD display information corresponding to the external object information which is displayed on the windshield of the vehicle. The eye position detector 310 may include a camera.
The augmented reality display coordinates extractor (i.e., augmented reality HUD display coordinates extractor) 315 may extract the augmented reality HUD display coordinates of an external object detected by the object detection sensor 305 and the augmented reality HUD display coordinates (or eye-tracking coordinates) of the eye position detected by the eye position detector 310.
The error correction module 320 may correct errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position by using an error correction parameter for the augmented reality HUD display coordinates of the external object and an error correction parameter for the augmented reality HUD display coordinates of the eye position, the error correction parameters varying with distance information between the driver's eye and the external object. The distance information may be delivered from the object detection sensor 305 to the error correction module 320.
The error correction module 320 may also set a first correction parameter for correcting errors in the augmented reality HUD display coordinates of a first object and errors in the augmented reality HUD display coordinates of the eye position of the vehicle driver viewing the first object, and a second correction parameter for correcting errors in the augmented reality HUD display coordinates of a second object and errors in the augmented reality HUD display coordinates of the eye position of the driver viewing the second object. The first object is an external object that is at a first distance from the driver's eye, the second object is an external object that is at a second distance from the driver's eye which is shorter than the first distance, and the second correction parameter is set to a lower error correction value than the first correction parameter.
The augmented reality HUD display device 300 for a vehicle may further include a camera that captures an image of the road ahead of the vehicle that is matched with the external object information or the HUD display information (i.e., virtual image information). The image of the road ahead may be of a scene the driver is seeing through the windshield.
Additionally, the error correction module 320 may include a low-pass filter (LPF). A cut-off frequency given as a first correction parameter for the low-pass filter may be lower than a cut-off frequency given as a second correction parameter for the low-pass filter.
The graphics display unit 325 may receive, from the error correction module 320, the corrected augmented reality HUD display coordinates of the external object and the corrected augmented reality HUD display coordinates of the eye position, and display augmented reality HUD graphics of the external object information on the windshield. HUD display information corresponding to the external object information may include speed information of the external object shown in FIG. 10 or navigation information related to the external object. The navigation information may include turn-by-turn (TBT) information shown in FIG. 11. The TBT information may include a direction change icon.
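The interaction of these five blocks can be pictured as a single per-frame update loop, sketched below. The function and method names are assumptions introduced for illustration; the disclosure does not prescribe this interface.

```python
def hud_frame_update(object_sensor, eye_detector, coords_extractor,
                     error_corrector, graphics_display):
    """One illustrative update cycle of the device of FIG. 6 (assumed interface)."""
    # 305: radar/lidar detects the external object and the distance to it.
    object_position, eye_to_object_distance_m = object_sensor.detect()

    # 310: the cabin camera detects the driver's eye position.
    eye_position = eye_detector.detect()

    # 315: both positions are converted to augmented reality HUD display coordinates.
    object_coords = coords_extractor.object_coords(object_position)
    eye_coords = coords_extractor.eye_coords(eye_position)

    # 320: errors are corrected with parameters that vary with the distance.
    object_coords, eye_coords = error_corrector.correct(
        object_coords, eye_coords, eye_to_object_distance_m)

    # 325: speed, navigation, or TBT graphics are drawn at the corrected coordinates.
    graphics_display.draw(object_coords, eye_coords)
```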
Additionally, the augmented reality HUD display device 300 for a vehicle may further include a controller (not shown). The controller may perform the functions of a central processing unit (CPU) or processor and control the overall operation of the object detection sensor 305, eye position detector 310, augmented reality display coordinates extractor 315, error correction module 320, and graphics display unit 325. The controller may include a program containing a series of commands for performing an augmented reality HUD display method for a vehicle according to the present disclosure to be described later.
An augmented reality HUD display method for a vehicle according to embodiments of the present disclosure will now be described below with reference to FIG. 6. The augmented reality HUD display method for a vehicle may be applied to the augmented reality HUD display device 300 for a vehicle shown in FIG. 6, and may also be referred to as a method of displaying variable errors on an augmented reality HUD for a vehicle.
The augmented reality HUD display method for a vehicle may include, for example, a first detection step, a second detection step, an extraction step, a correction step, and a display step.
In the first detection step, the position of an object outside the vehicle the vehicle driver is looking at may be detected by the object detection sensor 305. The sensor for detecting the position of the external object may include a radar sensor or a lidar sensor.
In the second detection step, the eye position of the vehicle driver viewing external object information displayed on the windshield of the vehicle may be detected by the eye position detector 310. The sensor for detecting the eye position may include a camera. HUD display information corresponding to the external object information may include speed information of the external object shown in FIG. 10 or navigation information regarding the external object. The navigation information may include TBT information shown in FIG. 11.
In the extraction step, the augmented reality HUD display coordinates of the detected external object or the augmented reality HUD display coordinates (or eye-tracking coordinates) of the detected eye may be extracted by the augmented reality display coordinates extractor 315.
In the correction step, errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position may be corrected by the error correction module 320 by using an error correction parameter for the augmented reality HUD display coordinates of the external object and an error correction parameter for the augmented reality HUD display coordinates of the eye position, the error correction parameters varying (i.e., changing) with distance information (i.e., eye distance information) between the driver's eye and the external object.
In the correction step, the error correction module 320 may set a first correction parameter for correcting errors in the augmented reality HUD display coordinates of a first object and errors in the augmented reality HUD display coordinates of the eye position of the vehicle driver viewing the first object, and a second correction parameter for correcting errors in the augmented reality HUD display coordinates of a second object and errors in the augmented reality HUD display coordinates of the eye position of the driver viewing the second object. The first object is an external object that is at a first distance from the driver's eye, the second object is an external object that is at a second distance from the driver's eye which is shorter than the first distance, and the second correction parameter is set to a lower error correction value than the first correction parameter.
Additionally, errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye position may be low-pass-filtered. A cut-off frequency given as a first correction parameter for the low-pass filtering may be lower than a cut-off frequency given as a second correction parameter for the low-pass filtering.
In the display step, the corrected augmented reality HUD display coordinates of the external object and the corrected augmented reality HUD display coordinates of the eye position may be received, and augmented reality HUD graphics of the external object information may be displayed on the windshield by the graphics display unit 325.
FIG. 7 is a graph for explaining an example of the error correction module of FIG. 6.
As shown in FIG. 7, if a low-pass filter (LPF) is applied to the error correction module 320 of FIG. 6, a cut-off frequency may be given as a correction parameter. Here, when an object is far from the driver (or vehicle), the cut-off frequency of the LPF may decrease, and when the object is near the driver (or vehicle), the cut-off frequency of the LPF may increase. The accuracy and response speed of sensor data may be adjusted by cut-off frequency adjustment.
More specifically, as shown in FIG. 7, errors in the augmented reality HUD display coordinates of the external object and errors in the augmented reality HUD display coordinates of the eye may be corrected by low-pass-filtering the range of coordinates of the eye position and the range of coordinates of the position of a detected object (not shown) extracted by the augmented reality display coordinates extractor (i.e., 315 of FIG. 6).
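A discrete first-order low-pass filter is one minimal way to apply such a distance-dependent cut-off to each coordinate stream. In the sketch below the smoothing factor is derived from the cut-off frequency and the sampling period; the 30 Hz update rate and the two cut-off values are assumptions for illustration, not parameters given by this disclosure.

```python
import math
from typing import Optional

class CoordinateLowPassFilter:
    """First-order IIR low-pass filter for one display coordinate (x or y).

    alpha = dt / (dt + 1 / (2 * pi * f_c)): a lower cut-off (far object)
    gives a smaller alpha, i.e. heavier smoothing and a slower response.
    """
    def __init__(self) -> None:
        self.state: Optional[float] = None

    def update(self, raw: float, cutoff_hz: float, dt_s: float) -> float:
        alpha = dt_s / (dt_s + 1.0 / (2.0 * math.pi * cutoff_hz))
        self.state = raw if self.state is None else self.state + alpha * (raw - self.state)
        return self.state

# Illustrative use: the same jittery coordinate is smoothed more for a far object.
far_x, near_x = CoordinateLowPassFilter(), CoordinateLowPassFilter()
for raw in (100.0, 103.0, 98.0, 102.0):                  # noisy display x-coordinate
    print(far_x.update(raw, cutoff_hz=0.5, dt_s=1/30),   # far object: accuracy first
          near_x.update(raw, cutoff_hz=5.0, dt_s=1/30))  # near object: response first
```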
As described above, in embodiments of the present disclosure, the cut-off frequency of the LPF may be used as an error correction parameter, and the cut-off frequency may be adjusted depending on the distance between the driver (or vehicle) and an external object. As a result, the present disclosure can minimize perception errors in augmented reality HUD graphics in a HUD device, perceived by the driver.
FIG. 8 is a view for explaining an example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6. FIG. 8 may show a graphical representation of perception errors on objects that vary with the distance from the driver's viewpoint in a real situation.
As shown in FIG. 8, a vehicle driver 405 may see a first object 415 at a long distance and a second object 410 at a short distance on the windshield 420, with variable errors obtained by correcting errors in graphic coordinates. The first object 415 corresponds to first HUD display information, and the second object 410 corresponds to second HUD display information.
Large-scale error correction may be performed on the first object 415, as indicated by the larger double-headed arrow of FIG. 8, and small-scale error correction may be performed on the second object 410, as indicated by the smaller double-headed arrow of FIG. 8. As a result, perception errors in graphics may be made equal regardless of the distance between the driver (or vehicle) and the objects, thereby minimizing distance-dependent blinking of the graphics displayed on the objects 410 and 415.
FIG. 9 is a view for explaining another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6. That is, FIG. 9 shows an application of the present disclosure which displays the distance to a vehicle ahead on an augmented reality HUD.
As shown in FIG. 9, graphics of a near object 510, i.e. a second object, may be displayed, with the display error parameter set to correspond to response speed rather than accuracy, and graphics of a far object 515, i.e. a first object, may be displayed, with the display error parameter set to correspond to accuracy rather than response speed.
Accordingly, the vehicle driver 505 is able to see on the windshield 520 a graphic display of the near vehicle 510 and a graphic display of the far vehicle 515 where perception errors in graphics are equal regardless of the distance between the driver (or vehicle) and the objects.
FIG. 10 is a view for explaining yet another example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6. In other words, FIG. 10 shows an application of the present disclosure which displays the speed of a vehicle ahead on an augmented reality HUD.
As shown in FIG. 10, graphics of a near object 610, i.e., a second object, may be displayed, with the display error parameter set to correspond to response speed rather than accuracy, and graphics of a far object 615, i.e., a first object, may be displayed, with the display error parameter set to correspond to accuracy rather than response speed. Accordingly, the vehicle driver 605 is able to see on the windshield 620 a graphic display of the near vehicle 610 and a graphic display of the far vehicle 615 where perception errors in graphics are equal regardless of the distance between the driver (or vehicle) and the objects.
FIG. 11 is a view for explaining a further example of augmented reality HUD graphics displayed by the augmented reality HUD display device for a vehicle shown in FIG. 6. That is, FIG. 11 shows an application of the present disclosure which displays TBT information on an augmented reality HUD.
As shown in FIG. 11, TBT information for a short distance (e.g., 50 m) may be displayed, with the coordinate error parameter set to correspond to response speed rather than accuracy, and TBT information for a long distance (e.g., 150 m) may be displayed, with the coordinate error parameter set to correspond to accuracy rather than response speed.
The components, units, blocks, or modules used in the present disclosure may be implemented by software components, such as tasks, classes, subroutines, processes, objects, execution threads, or programs, or by hardware components, such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), or by combinations of the software and hardware components. The components may be included in a computer-readable storage medium, or some of the components may be distributed in a plurality of computers.
Accordingly, embodiments have been disclosed herein (i.e., in the drawings and the specification). Although specific terms have been used herein, they have been used merely for the purpose of describing the present disclosure, and have not been used to limit the meanings thereof and the scope of the present disclosure set forth in the claims. Therefore, it will be understood by those having ordinary knowledge in the art that various modifications and other equivalent embodiments can be made. Accordingly, the true technical protection range of this disclosure should be defined by the technical spirit of the attached claims.
DESCRIPTION OF SYMBOLS
    • 305: object detection sensor
    • 310: eye position detector
    • 315: augmented reality display coordinates extractor
    • 320: error correction module
    • 325: graphics display unit

Claims (15)

What is claimed is:
1. An augmented reality head-up display (HUD) display method for a vehicle, the method comprising:
detecting, by a controller, a position of an object outside of the vehicle at which a driver of the vehicle is looking;
detecting, by the controller, a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle;
extracting, by the controller, augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position;
correcting, by the controller, one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye based on distance information between the eye and the object, the error correction parameters varying from one another;
receiving, by the controller, the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and
displaying, by the controller, augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
2. The method of claim 1, wherein the correcting of the one or more errors comprises:
detecting a position of a plurality of objects outside of the vehicle;
setting a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in the augmented reality HUD display coordinates of the eye while the driver is viewing the first object; and
setting a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and the augmented reality HUD display coordinates of the eye while the driver is viewing the second object,
wherein the first object is an external object that is a first distance away from the eye of the driver, the second object is an external object that is a second distance away from the eye of the driver that is shorter than the first distance, and the second correction parameter is set to a lower error correction value than the first correction parameter.
3. The method of claim 1, further comprising:
detecting, by the controller, the position of the object using a radar sensor or a lidar sensor.
4. The method of claim 1, further comprising:
detecting, by the controller, the position of the eye using a camera.
5. The method of claim 1, wherein the correcting of the one or more errors comprises:
low-pass filtering the one or more errors in the augmented reality HUD display coordinates of the object and the one or more errors in the augmented reality HUD display coordinates of the eye,
wherein a cut-off frequency given as a first correction parameter for the low-pass filtering is lower than a cut-off frequency given as a second correction parameter for the low-pass filtering.
6. The method of claim 1, wherein HUD display information corresponding to the external object information includes speed information of the object or navigation information of the object.
7. The method of claim 6, wherein the navigation information includes turn-by-turn (TBT) information.
8. An augmented reality HUD display device for a vehicle, the device comprising:
an object detection sensor detecting a position of an object outside of the vehicle at which a driver of the vehicle is looking;
an eye position detector detecting a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle;
an augmented reality display coordinates extractor extracting augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position;
an error correction module correcting one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye based on distance information between the eye and the object, the error correction parameters varying from one another; and
a graphics display unit receiving, from the error correction module, the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye and displaying augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
9. The device of claim 8, wherein:
the object detection sensor detects a position of a plurality of objects outside of the vehicle; and
the error correction module sets a first correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a first object of the plurality of objects and one or more errors in augmented reality HUD display coordinates of the eye while the driver is viewing the first object and a second correction parameter for correcting one or more errors in augmented reality HUD display coordinates of a second object of the plurality of objects and augmented reality HUD display coordinates of the eye while the driver is viewing the second object,
wherein the first object is an external object that is a first distance away from the eye of the driver, the second object is an external object that is a second distance away from the eye of the driver that is shorter than the first distance, and the second correction parameter is set to a lower error correction value than the first correction parameter.
10. The device of claim 8, wherein the object detection sensor includes a radar sensor or a lidar sensor.
11. The device of claim 8, wherein the eye position detector includes a camera.
12. The device of claim 8, wherein the error correction module includes a low-pass filter, and a cut-off frequency given as a first correction parameter for the low-pass filter is lower than a cut-off frequency given as a second correction parameter for the low-pass filter.
13. The device of claim 8, wherein HUD display information corresponding to the external object information includes speed information of the object or navigation information of the object.
14. The device of claim 13, wherein the navigation information includes TBT information.
15. A non-transitory computer readable medium containing program instructions for an augmented reality HUD display method for a vehicle, the computer readable medium comprising:
program instructions that detect a position of an object outside of the vehicle at which a driver of the vehicle is looking;
program instructions that detect a position of an eye of the driver while the driver is viewing external object information displayed on a windshield of the vehicle;
program instructions that extract augmented reality HUD display coordinates of the object based on the detected object position and augmented reality HUD display coordinates of the eye based on the detected eye position;
program instructions that correct one or more errors in the augmented reality HUD display coordinates of the object and one or more errors in the augmented reality HUD display coordinates of the eye using an error correction parameter for the augmented reality HUD display coordinates of the object and an error correction parameter for the augmented reality HUD display coordinates of the eye based on distance information between the eye and the object, the error correction parameters varying from one another;
program instructions that receive the corrected augmented reality HUD display coordinates of the object and the corrected augmented reality HUD display coordinates of the eye; and
program instructions that display augmented reality HUD graphics of the external object information on the windshield based on the received corrected augmented reality HUD display coordinates.
US14/846,781 2014-12-08 2015-09-06 Augmented reality HUD display method and device for vehicle Active US9690104B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20140175097 2014-12-08
KR10-2014-0175097 2014-12-08
KR1020150065842A KR101713740B1 (en) 2014-12-08 2015-05-12 Method and device for displaying augmented reality HUD for vehicle
KR10-2015-0065842 2015-05-12

Publications (2)

Publication Number Publication Date
US20160163108A1 US20160163108A1 (en) 2016-06-09
US9690104B2 true US9690104B2 (en) 2017-06-27

Family

ID=55974284

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/846,781 Active US9690104B2 (en) 2014-12-08 2015-09-06 Augmented reality HUD display method and device for vehicle

Country Status (2)

Country Link
US (1) US9690104B2 (en)
DE (1) DE102015218162B4 (en)

Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9836895B1 (en) * 2015-06-19 2017-12-05 Waymo Llc Simulating virtual objects
WO2017072841A1 (en) * 2015-10-27 2017-05-04 日立マクセル株式会社 Information display device
JP6456516B2 (en) * 2015-10-30 2019-01-23 三菱電機株式会社 Driving assistance device
US20170169612A1 (en) * 2015-12-15 2017-06-15 N.S. International, LTD Augmented reality alignment system and method
WO2017113757A1 (en) * 2015-12-31 2017-07-06 北京小鸟看看科技有限公司 Method of laying out surrounding interface, methods of switching content and switching list in three-dimensional immersive environment
JP6410987B2 (en) * 2016-02-25 2018-10-24 富士フイルム株式会社 Driving support device, driving support method, and driving support program
EP3441375B1 (en) 2016-04-07 2022-01-05 Agc Inc. Laminated glass
KR102582092B1 (en) * 2016-04-22 2023-09-25 한국전자통신연구원 Apparatus and method for transforming augmented reality information of head-up display for vehicle
US10057511B2 (en) * 2016-05-11 2018-08-21 International Business Machines Corporation Framing enhanced reality overlays using invisible light emitters
US20180017799A1 (en) * 2016-07-13 2018-01-18 Ford Global Technologies, Llc Heads Up Display For Observing Vehicle Perception Activity
JP6717093B2 (en) * 2016-07-15 2020-07-01 Agc株式会社 Laminated glass
KR20180016027A (en) * 2016-08-05 2018-02-14 삼성전자주식회사 A display apparatus for car and method for controlling the display apparatus thereof
CA2976543A1 (en) * 2016-08-23 2018-02-23 8696322 Canada Inc. System and method for augmented reality head up display for vehicles
KR102564479B1 (en) * 2016-11-22 2023-08-07 삼성전자주식회사 Method and apparatus of 3d rendering user' eyes
CN106500716A (en) * 2016-12-13 2017-03-15 英业达科技有限公司 Automobile navigation optical projection system and its method
WO2018126257A1 (en) * 2017-01-02 2018-07-05 Visteon Global Technologies, Inc. Automatic eye box adjustment
CN106740581A (en) * 2017-01-03 2017-05-31 青岛海信移动通信技术股份有限公司 A kind of control method of mobile unit, AR devices and AR systems
US10082869B2 (en) 2017-02-03 2018-09-25 Qualcomm Incorporated Maintaining occupant awareness in vehicles
KR20180101746A (en) * 2017-03-06 2018-09-14 삼성전자주식회사 Method, electronic device and system for providing augmented reality contents
US10242457B1 (en) * 2017-03-20 2019-03-26 Zoox, Inc. Augmented reality passenger experience
JP6861375B2 (en) * 2017-06-30 2021-04-21 パナソニックIpマネジメント株式会社 Display system, information presentation system, display system control method, program, and mobile
US10334199B2 (en) * 2017-07-17 2019-06-25 Microsoft Technology Licensing, Llc Augmented reality based community review for automobile drivers
US10421397B2 (en) * 2017-08-11 2019-09-24 Visteon Global Technologies, Inc. Forward maneuvering assistance using head-up display
US10469819B2 (en) * 2017-08-17 2019-11-05 Shenzhen China Star Optoelectronics Semiconductor Display Technology Co., Ltd Augmented reality display method based on a transparent display device and augmented reality display device
DE102017215024A1 (en) 2017-08-28 2019-02-28 Volkswagen Aktiengesellschaft A method, apparatus and computer readable storage medium having instructions for providing information for a head-up display device for a motor vehicle
US10000153B1 (en) 2017-08-31 2018-06-19 Honda Motor Co., Ltd. System for object indication on a vehicle display and method thereof
JP7067005B2 (en) * 2017-09-26 2022-05-16 スズキ株式会社 Vehicle front structure
TWI679555B (en) * 2017-10-12 2019-12-11 華碩電腦股份有限公司 Augmented reality system and method for providing augmented reality
KR20200096811A (en) 2017-12-07 2020-08-13 시리얼 테크놀로지즈 에스.에이. Head up display
KR20190078664A (en) 2017-12-11 2019-07-05 삼성전자주식회사 Method and apparatus for displaying content
DE112018006464T5 (en) * 2017-12-19 2020-08-27 Sony Corporation INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM, DISPLAY SYSTEM AND MOBILE OBJECT
US10532697B2 (en) 2018-06-14 2020-01-14 International Business Machines Corporation Augmented reality-based roadside content viewing within primary field of view
DE102018219481A1 (en) * 2018-11-15 2020-05-20 Robert Bosch Gmbh Assembly for a LiDAR sensor and LiDAR sensor
US11703590B2 (en) * 2018-11-19 2023-07-18 Suteng Innovation Technology Co., Ltd. Lidar signal receiving circuits, lidar signal gain control methods, and lidars using the same
JP7260425B2 (en) * 2019-07-08 2023-04-18 ファナック株式会社 Display device
EP3809359A1 (en) 2019-10-14 2021-04-21 Ningbo Geely Automobile Research & Development Co. Ltd. Vehicle driving challenge system and corresponding method
EP3866055A1 (en) * 2020-02-12 2021-08-18 Aptiv Technologies Limited System and method for displaying spatial information in the field of view of a driver of a vehicle
US10983681B1 (en) * 2020-04-07 2021-04-20 Gm Cruise Holdings Llc Image identification system
KR20210131129A (en) * 2020-04-23 2021-11-02 현대자동차주식회사 User interface generating apparatus for vehicle guide and ui generating method using the same
KR20220036456A (en) * 2020-09-15 2022-03-23 현대자동차주식회사 Apparatus for displaying information based on augmented reality
GB202019489D0 (en) * 2020-12-10 2021-01-27 Bae Systems Plc Augmented reality window
TWI790640B (en) * 2021-06-11 2023-01-21 宏碁股份有限公司 Augmented reality display device and method
DE102022101893B4 (en) * 2022-01-27 2023-09-28 Joynext Gmbh Method for displaying augmented reality information in vehicles
US11919451B2 (en) * 2022-02-28 2024-03-05 Nissan North America, Inc. Vehicle data display system
WO2022266556A1 (en) * 2022-08-09 2022-12-22 Innopeak Technology, Inc. Methods and systems for motion prediction
CN117681656A (en) * 2022-08-25 2024-03-12 纬创资通股份有限公司 Electronic device and method for configuring head-up display

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7706978B2 (en) * 2005-09-02 2010-04-27 Delphi Technologies, Inc. Method for estimating unknown parameters for a vehicle object detection system
KR20120066472A (en) 2010-12-14 2012-06-22 한국전자통신연구원 Apparatus and method for displaying augmented reality contents using a front object
KR20120067854A (en) 2010-12-16 2012-06-26 한국전자통신연구원 Display system for augmented reality in vehicle, and method for the same
US20130169679A1 (en) * 2011-12-30 2013-07-04 Automotive Research & Test Center Vehicle image display system and correction method thereof
KR20130089139A (en) 2012-02-01 2013-08-09 한국전자통신연구원 Augmented reality head-up display apparatus and method for vehicles
US20130315446A1 (en) 2009-08-26 2013-11-28 Jacob BEN TZVI Projecting location based elements over a heads up display
US20140070934A1 (en) * 2012-09-07 2014-03-13 GM Global Technology Operations LLC Methods and systems for monitoring driver object detection
US20140160012A1 (en) * 2012-12-11 2014-06-12 Automotive Research & Test Center Automatic correction device of vehicle display system and method thereof
US20160216521A1 (en) * 2013-10-22 2016-07-28 Nippon Seiki Co., Ltd. Vehicle information projection system and projection device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150223028A1 (en) 2012-10-03 2015-08-06 Qualcomm Incorporated Broadcast/Multicast Used for M2M/MTC
WO2014095068A1 (en) 2012-12-21 2014-06-26 Harman Becker Automotive Systems Gmbh Infotainment system
JP6040897B2 (en) 2013-09-04 2016-12-07 トヨタ自動車株式会社 Attention display device and attention display method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180090007A1 (en) * 2015-03-16 2018-03-29 Denso Corporation Image generation apparatus
US10748425B2 (en) * 2015-03-16 2020-08-18 Denso Corporation Image generation apparatus
US20180086265A1 (en) * 2016-09-26 2018-03-29 Volvo Car Corporation Method, system and vehicle for use of an object displaying device in a vehicle
US11279371B2 (en) * 2016-09-26 2022-03-22 Volvo Car Corporation Method, system and vehicle for use of an object displaying device in a vehicle
US10795166B2 (en) 2018-03-07 2020-10-06 Pegatron Corporation Head up display system and control method thereof
US11422764B1 (en) 2018-06-03 2022-08-23 Epic Optix, Inc. Multi-platform integrated display
US10746987B2 (en) 2018-07-12 2020-08-18 Toyota Research Institute, Inc. Vehicle systems and methods for redirecting a driver's gaze towards an object of interest
EP3932719A1 (en) 2020-07-03 2022-01-05 Honda Research Institute Europe GmbH Method for assisting a user of an assistance system, assistance system and vehicle comprising such a system
US20220074753A1 (en) * 2020-09-09 2022-03-10 Volkswagen Aktiengesellschaft Method for Representing a Virtual Element
US20230121388A1 (en) * 2021-10-14 2023-04-20 Taslim Arefin Khan Systems and methods for prediction-based driver assistance
US11794766B2 (en) * 2021-10-14 2023-10-24 Huawei Technologies Co., Ltd. Systems and methods for prediction-based driver assistance

Also Published As

Publication number Publication date
US20160163108A1 (en) 2016-06-09
DE102015218162B4 (en) 2023-06-29
DE102015218162A1 (en) 2016-06-09

Similar Documents

Publication Publication Date Title
US9690104B2 (en) Augmented reality HUD display method and device for vehicle
KR101713740B1 (en) Method and device for displaying augmented reality HUD for vehicle
US10510276B1 (en) Apparatus and method for controlling a display of a vehicle
EP2891953B1 (en) Eye vergence detection on a display
CN107848415B (en) Display control device, display device, and display control method
RU2675760C1 (en) Vehicle display device and display method for vehicle
US20170269684A1 (en) Vehicle display device
KR102578517B1 (en) Electronic apparatus and control method thereof
US9809221B2 (en) Apparatus, method, and computer readable medium for displaying vehicle information
JP2019064580A (en) Systems and methods for displaying three-dimensional images on vehicle instrument console
CN105966311B (en) Method for calibrating a camera, device for a vehicle and computer program product
US20140176425A1 (en) System and method for identifying position of head-up display area
US9463743B2 (en) Vehicle information display device and vehicle information display method
JP2018058544A (en) On-vehicle display control device
US10209857B2 (en) Display control apparatus and display system
CN109788243B (en) System unreliability in identifying and visually presenting display enhanced image content
EP3496041A1 (en) Method and apparatus for estimating parameter of virtual screen
CA2999955A1 (en) Vehicular display device
US11325470B2 (en) Method, device and computer-readable storage medium with instructions for controlling a display of an augmented-reality head-up display device for a transportation vehicle
US20220044032A1 (en) Dynamic adjustment of augmented reality image
CN110891841A (en) Method and device for ascertaining the probability of an object being in the field of view of a vehicle driver
US11815679B2 (en) Method, processing device, and display system for information display
KR101637298B1 (en) Head-up display apparatus for vehicle using aumented reality
WO2023089105A1 (en) System and method
WO2023089106A1 (en) System and method for displaying information using augmented reality to a vehicle user

Legal Events

Date Code Title Description
AS Assignment

Owner name: HYUNDAI MOTOR COMPANY, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KIM, SUNG UN;REEL/FRAME:036500/0166

Effective date: 20150603

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4