WO1999054848A1 - 3-dimensional intersection display for vehicle navigation system - Google Patents

3-dimensional intersection display for vehicle navigation system Download PDF

Info

Publication number
WO1999054848A1
WO1999054848A1 PCT/US1999/007911 US9907911W WO9954848A1 WO 1999054848 A1 WO1999054848 A1 WO 1999054848A1 US 9907911 W US9907911 W US 9907911W WO 9954848 A1 WO9954848 A1 WO 9954848A1
Authority
WO
WIPO (PCT)
Prior art keywords
intersection
navigation system
vehicle navigation
roads
complexity
Prior art date
Application number
PCT/US1999/007911
Other languages
French (fr)
Inventor
Jeffrey Alan Millington
Original Assignee
Magellan Dis, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magellan Dis, Inc. filed Critical Magellan Dis, Inc.
Priority to AT99916613T priority Critical patent/ATE310287T1/en
Priority to EP99916613A priority patent/EP1074002B1/en
Priority to AU34897/99A priority patent/AU3489799A/en
Priority to DE69928387T priority patent/DE69928387T2/en
Priority to CA002326683A priority patent/CA2326683A1/en
Publication of WO1999054848A1 publication Critical patent/WO1999054848A1/en
Priority to NO20005221A priority patent/NO20005221D0/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3632Guidance using simplified or iconic instructions, e.g. using arrows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation

Definitions

  • Vehicle navigation systems determine a current position of the vehicle relative to a database of roads and display a recommended route via the database of roads to a destination selected by the user.
  • Some navigation systems have provided algorithms for transforming data in the database into a 3-dimensional perspective view which is more easily understood by many users.
  • the navigation system selects a point at some elevation above the vehicle's current position, or slightly behind the vehicle's current position, from which to calculate the perspective view of the roads.
  • the present invention provides a vehicle navigation system or route guidance system with improved perspective view display.
  • the navigation system displays a perspective view of an intersection indicating the next maneuver to be performed by the driver along the recommended route.
  • the navigation system displays the intersection at an angle relative to "earth." Based upon the complexity of the intersection displayed, the angle is increased to provide a more "overhead" view.
  • the increased angle of the perspective view improves the understanding of the roads in the intersection.
  • the display displays a horizon and sky in the perspective view. Based upon the time of day (and time of year), which can be gathered from the GPS system, the color of the sky changes. During the day, the sky in the display is preferably a shade of blue which gradually shifts to black at night and back to blue. In this manner, the display assists the observer in associating the perspective view on the display with what the observer sees outside in front of the vehicle.
  • Figure 1 is a schematic of the navigation system of the present invention
  • Figures 2A-2E represent 2-dimensional data in the database of roads in the navigation system of Figure 1 for various types of intersections;
  • Figure 3 illustrates the view angles from which a perspective view is calculated in the navigation system of Figure 1;
  • Figure 4 is the display of Figure 1 showing a perspective view of an area of Figure 2A calculated according to a first viewing angle of Figure 3;
  • Figure 5 is the display of Figure 1 showing a perspective view of an area in Figure 2C calculated according to a second viewing angle in Figure 3;
  • Figure 6 is the display of Figure 1 showing a plan view of an area in Figure 2E calculated according to a third viewing angle in Figure 3.
  • the navigation system 20 of the present invention is shown schematically in Figure 1.
  • the navigation system 20 includes a processor or CPU 22 connected to a display 24, such as a high resolution LCD or flat panel display.
  • the CPU 22 is also connected to an input device 26 such as a mouse, keyboard, key pad or remote device.
  • the display 24 can be a touch screen display.
  • the navigation system 20 further includes a storage device 28, such as a hard drive 28 or CD ROM, connected to the CPU 22.
  • the storage device 28 contains a database including a map of all the roads in the area to be traveled by the vehicle 32 and may contain the software for the CPU 22, including the graphical user interface, route guidance, operating system, position-determining software, etc.
  • the navigation system 20 preferably includes position and motion determining devices, such as a GPS receiver 34, a gyroscope 36, an orthogonal three-axis accelerometer 37, a compass 38 and a wheel speed sensor 40, all connected to the CPU 22 (connections not shown for simplicity).
  • the position and motion determining devices determine the position of the vehicle 32 relative to the database of roads. Further, as is known in navigation systems, the user can select a destination relative to the database of roads utilizing the input device 26 and the display 24. The navigation system 20 then calculates and displays a recommended route directing the driver of the vehicle 32 to the desired destination. Preferably, the navigation system 20 displays turn-by-turn instructions on display 24, guiding the driver to the desired destination.
  • the database of roads contains 2-dimensional data indicating locations of intersections, lengths of road segments and angles of intersection, generally represented in Figures 2A-E.
  • the 2-dimensional data includes the location of the intersection, the number of road segments (or "arms") and the angles between the arms.
  • Figure 2A represents 2-dimensional data for a simple intersection 41.
  • the simple intersection 41 comprises a plurality of "arms" 42 or road segments 42 intersecting at the node 44 of the intersection 41.
  • the navigation system 20 will recommend a maneuver 45 onto a "TO" arm 42a from a "FROM" arm 42b, which is oriented toward the bottom of the display 24.
  • the "TO" arm 42a is separated from the nearest adjacent arm 42 by an angle A. In this case, the angle A is 90 degrees.
  • Figure 2B illustrates the 2-dimensional data for a more complicated intersection 46 having seven arms 42 intersecting at a node 44.
  • a "TO" arm 42a is separated from the nearest arm 42 by an angle A, which is not less than a predetermined threshold, preferably 20 degrees.
  • Figure 2C illustrates an intersection 50 having five arms 42.
  • a "TO" arm 42a is separated from the nearest arm 42 by an angle A, which is less than a predetermined threshold, preferably 20 degrees.
  • Figure 2D illustrates a pair of intersections 52, 54, which are both part of a complex maneuver 56.
  • the intersections 52, 54 share arm 42a which has a length x which is less than a predetermined threshold, such as 200 feet. Also a factor in making the complex maneuver 56 complex is the fact that a maneuver must be performed at intersection 52 and a maneuver must be performed at intersection 54.
  • Figure 2E illustrates a rotary 57 having a plurality of arms 42 including a TO arm 42a.
  • a sample recommended maneuver 45 is shown superimposed on the rotary 57. It should be understood that the recommended maneuvers 45 shown are not part of the 2-dimensional data in the database, but are a result of the navigation system's 20 recommended route to a user-selected destination.
  • the 2-dimensional data as represented in Figures 2A-2E, is transformed via scaling, rotation and translation into a 3-D perspective view by the CPU 22, generally utilizing known rendering techniques.
  • the 3-D model is created as a perspective view of the road in front of the vehicle 32.
  • the perspective view is calculated at an elevation, H, above the earth at an angle relative to a center (Xc, Yc) of the displayed intersection.
  • H = elevation above the earth
  • (Xc, Yc) = center of the displayed intersection
  • the perspective view may be calculated from a "camera position" A in Figure 3, at an angle αA, preferably 30 degrees.
  • the perspective view may be calculated from a position B as shown in Figure 3 at an angle αB, preferably 50 degrees.
  • the view may be calculated from a position C as shown in Figure 3 at an angle αC, preferably 90 degrees.
  • the angle increases based upon the complexity of the intersection.
  • the complexity is determined based upon the number of arms 42 and the angle A between the TO arm 42a and the nearest adjacent arm 42.
  • two maneuvers 52, 54 within a predetermined distance may indicate complexity of an intersection.
  • certain types of intersections may indicate complex intersections. For example, a rotary may indicate a very complex intersection, while a U-turn may indicate a medium complex intersection.
  • angles between αA and αC may be utilized to calculate perspective views of intersections of moderate complexity.
  • any intersection displayed is first displayed as calculated from point C and angle αC, i.e. 90 degrees. If the intersection is not very complex, the angle is then decreased and the view is continuously, gradually changed to the appropriate angle α as the vehicle 32 approaches the intersection. In this manner, the user can see the perspective change and more easily understand the intersection perspective view displayed. Alternatively the angle α can be increased from the appropriate angle to 90 degrees as the vehicle approaches the intersection.
  • Figure 4 illustrates the display 24 of Figure 1 showing a display 60 of a 3-dimensional representation 62 of the intersection represented in Figure 2A, displayed at a perspective view calculated according to angle αA shown in Figure 3, which for this simple intersection is 30 degrees.
  • First the intersection 41 is rendered into a polygon having arms 42 separated by the angles specified in the 2-dimensional data.
  • the three dimensional representations of the arms 42 preferably each have an equal predetermined length.
  • the display 60 further includes a maneuver instruction 64, preferably a 3-dimensional representation of an arrow 64 superimposed on the 3-dimensional representation 62 of the intersection.
  • the arrow 64 is also 3-dimensional and shown in the same perspective.
  • a head 65 of the arrow 64 is first rendered on the TO arm 42a, at a fixed distance from the center of the intersection.
  • a tail 66 is then rendered on the FROM arm 42b, at a fixed distance from the center of the intersection.
  • a point of intersection between lead lines in the head 65 and tail 66 is then utilized to create an inner arc and an outer arc from the head 65 to the tail 66.
  • a plurality of polygons between the inner and outer arcs are rendered from the head 65 to the tail 66 to create the body of the arrow 64.
  • the point on the maneuver to be centered on the display 24 is then calculated.
  • the extent or bounds for the entire maneuver is first computed.
  • the extent or bounds for the turn indicator (arrow) 64 is computed and also recorded as a minimum X, Y and maximum X, Y.
  • the center (Xc, Yc) for the entire maneuver (biased with the turn indicator) is computed as follows:
  • Xc = AVG(ArrowMinimum.X, ArrowMaximum.X)
  • Yc = AVG(ManeuverMinimum.Y, ManeuverMaximum.Y)
  • the entire 3D polygon(s) making up the 3D maneuver are then translated so that the new calculated center is positioned as the new origin (0,0).
  • the camera's initial position is at 90 degrees (point C in Figure 3 at angle ⁇ c ).
  • the camera position is specified in X, Y, Z coordinates.
  • the X and Y coordinates are set to 0, 0.
  • the Z coordinate (or altitude) is computed as follows:
  • W = one half of the width of the entire maneuver in the dominant axis
  • H = height of the camera overlooking the maneuver
  • the span of the entire maneuver in each direction is compared to determine which is longer (taking the aspect ratio of the viewport into consideration).
  • Aspect Ratio = 305 pixels in the X direction / 230 pixels in the Y direction. If (ManeuverSpan in the Y axis * Aspect Ratio) > ManeuverSpan in the X axis, the Y axis is dominant.
  • the initial coordinates for the camera position are specified as (0,0,H).
  • the range of camera positions are based on the complexity of the maneuver. The following factors are used to determine the complexity of the maneuver.
  • the camera position is initially at 90° and changes to the minimum camera angle as the vehicle 32 approaches the intersection.
  • the angle can also be selectively adjusted by the user between the minimum permitted camera angle (CameraAngleMinimum) and the maximum permitted camera angle (CameraAngleMaximum).
  • the maximum camera angle is always 90° and the minimum camera angle depends upon the complexity of the intersection.
  • a fixed scaling is applied to keep the scene in view.
  • the entire scene is adjusted in size by scaling by approximately 105% when decrementing the viewing angle and by approximately 95% when incrementing the viewing angle.
  • the number of increments that the scene can be viewed from ranges from 0 to 8 increments, again depending on the complexity of the maneuver.
  • the above numbers for Minimum and Maximum Camera Angles, and Number of increments shown are for exemplary purposes. It should be recognized that the navigation system 20 may have more or less as needed.
  • the display 60 also includes a horizon line 69 below which is displayed the intersection 62 and maneuver instruction 64. Above the horizon line 69 is a representation of the sky 70.
  • the sky 70 is preferably changed in color based upon the time of day, season of the year and geographic location of the vehicle 32.
  • the CPU 22 of the navigation system 20 has information regarding the geographic location of the vehicle 32, date and current time of day.
  • the GPS receiver 34 receives time information, including date, from the GPS system.
  • the sky 70 is changed from blue to black based upon the time of day, including the expected sunrise and sunset times for the particular season of the year and the current geographic location of the vehicle 32.
  • the sky 70 gradually and continuously changes from blue during the day to black at night. This assists the user in perceiving and understanding the display 60, including the intersection 62 and the perspective view.
  • the display 60 further includes a text instruction field 72 which displays text of a maneuver instruction, such as "Right turn on Maple” or other turn instructions appropriate to the next maneuver.
  • the text in the field 72 corresponds to the maneuver instruction 64.
  • a heading indicator 74 indicating the absolute direction of the desired destination, is also shown in 3-dimensional perspective view in display 60.
  • the heading indicator 74 includes an arrow 75, also 3-dimensional and shown in perspective view. The angle at which the perspective of the heading
  • the display 60 further includes a distance-to-maneuver field 76 which indicates the distance between the current position of the vehicle 32 and the next maneuver, as indicated by the maneuver instruction 64.
  • the user can selectively adjust the angle of the display between αA and αC utilizing the user input device 26.
  • a distance-to-destination field 77 indicates the total distance in the calculated route from the current location to the desired destination.
  • a current heading indicator 78 indicates the current geographical heading of the vehicle 32.
  • display 80 is shown on the display 24 when the vehicle approaches a medium complex intersection, such as that represented in two dimensional data in Figure 2C.
  • the two dimensional data for the intersection of Figure 2C is transformed into a 3-dimensional model and rotated into a perspective view according to the angle αB of Figure 3, preferably 50 degrees.
  • the view of the intersection 50 of Figure 2C is first calculated at angle αC of Figure 3 and gradually decreased to angle αB as the vehicle 32 approaches the intersection. This increases the user's understanding of the intersection and the perspective view.
  • the perspective angle is decreased to αB, as displayed in Figure 5.
  • the arms 42 are shown having thickness and in perspective, although the perspective angle is higher and the view is more overhead.
  • the maneuver instruction 64 is also shown in three dimensions and in the same perspective view, calculated according to angle αB.
  • the heading indicator 74 is also shown in three dimensions and shown in a perspective view calculated according to angle αB. Again, this assists the user in understanding the perspective at which the intersection 82 is displayed. The user can selectively adjust the angle of the display between αB and αC utilizing the user input device 26.
  • the text instruction field 72 becomes a bar graph 82 indicating more precisely the distance to the upcoming maneuver.
  • the bar graph 82 gradually and continuously decreases as the vehicle 32 approaches the maneuver. Portions of the bar graph 82 which overlap text in the text instruction field 72 become reverse video, as shown.
  • display 86 is shown on the display 24 when the vehicle approaches a very complex intersection, such as that represented in two dimensional data in Figure 2E.
  • the two dimensional data for the intersection of Figure 2E is transformed into a 3-dimensional model according to the angle αC of Figure 3, preferably 90 degrees.
  • αC = the angle of the perspective view of a complex intersection
  • the road segments or arms 42 are more readily distinguishable and the maneuver instruction 64 is easier to understand.

Abstract

A navigation system includes a display (24) which provides a 3-D perspective view. The angle of view (a, b, c) in the perspective view is increased based upon the complexity of the intersection being displayed. Intersections of increased complexity are displayed at an increased viewing angle (a, b, c) to facilitate understanding. A sky above a horizon on the display (24) changes color based upon the time of day.

Description

3-DIMENSIONAL INTERSECTION DISPLAY FOR VEHICLE NAVIGATION SYSTEM
BACKGROUND OF THE INVENTION
Vehicle navigation systems determine a current position of the vehicle relative to a database of roads and display a recommended route via the database of roads to a destination selected by the user. Some navigation systems have provided algorithms for transforming data in the database into a 3-dimensional perspective view which is more easily understood by many users. In one such system, the navigation system selects a point at some elevation above the vehicle's current position, or slightly behind the vehicle's current position, from which to calculate the perspective view of the roads.
In some situations, it can be difficult to discern a specific road from the display. For example, at an intersection of two roads at an acute angle, it may be difficult to discern the two roads in a perspective view. Also, where there are two or more intersections in close proximity, it may be difficult to discern the location of one road versus another in a perspective view. Further, complex intersections, such as rotaries, with multiple intersections of roads in close proximity may not be easy to understand from the perspective view.
SUMMARY OF THE INVENTION
The present invention provides a vehicle navigation system or route guidance system with improved perspective view display. Generally, the navigation system displays a perspective view of an intersection indicating the next maneuver to be performed by the driver along the recommended route. The navigation system displays the intersection at an angle relative to "earth." Based upon the complexity of the intersection displayed, the angle is increased to provide a more "overhead" view. The increased angle of the perspective view improves the understanding of the roads in the intersection.
For example, if there are many roads intersecting in the area to be displayed, increasing the viewing angle will provide more space on the screen between the roads, thereby increasing the understanding of the intersection. Further, the maneuver which is being recommended by the navigation system (such as by indicating an arrow on one of the intersecting roads) is more readily perceived.
The display displays a horizon and sky in the perspective view. Based upon the time of day (and time of year), which can be gathered from the GPS system, the color of the sky changes. During the day, the sky in the display is preferably a shade of blue which gradually shifts to black at night and back to blue. In this manner, the display assists the observer in associating the perspective view on the display with what the observer sees outside in front of the vehicle.
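The day-to-night shift described above can be sketched as a simple color blend. The function names, the linear blend, and the fixed twilight window below are illustrative assumptions; a real system would substitute the sunrise and sunset times derived from the GPS date, time and position.

```python
def blend(day, night, t):
    """Linearly interpolate each RGB channel; t=0 is full day, t=1 is full night."""
    return tuple(round(d + (n - d) * t) for d, n in zip(day, night))

def sky_color(hour, sunrise=6.0, sunset=18.0, twilight=1.5):
    """Return an RGB sky color for the display given the local hour (0-24).

    Daytime is sky blue, night is black, with a gradual shift over an
    assumed `twilight` window around sunrise and sunset.
    """
    DAY = (135, 206, 235)   # sky blue
    NIGHT = (0, 0, 0)       # black
    if sunrise + twilight <= hour <= sunset - twilight:
        return DAY
    if hour <= sunrise - twilight or hour >= sunset + twilight:
        return NIGHT
    if hour < sunrise + twilight:   # dawn: night shifting toward day
        t = 1.0 - (hour - (sunrise - twilight)) / (2 * twilight)
    else:                           # dusk: day shifting toward night
        t = (hour - (sunset - twilight)) / (2 * twilight)
    return blend(DAY, NIGHT, t)
```

Calling `sky_color` once per display refresh gives the continuous blue-to-black transition the text describes, rather than an abrupt switch at sunset.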
BRIEF DESCRIPTION OF THE DRAWINGS
The above, as well as other advantages of the present invention, will become readily apparent to those skilled in the art from the following detailed description of a preferred embodiment when considered in the light of the accompanying drawings in which:
Figure 1 is a schematic of the navigation system of the present invention;
Figures 2A-2E represent 2-dimensional data in the database of roads in the navigation system of Figure 1 for various types of intersections;
Figure 3 illustrates the view angles from which a perspective view is calculated in the navigation system of Figure 1;
Figure 4 is the display of Figure 1 showing a perspective view of an area of Figure 2A calculated according to a first viewing angle of Figure 3;
Figure 5 is the display of Figure 1 showing a perspective view of an area in Figure 2C calculated according to a second viewing angle in Figure 3;
Figure 6 is the display of Figure 1 showing a plan view of an area in Figure 2E calculated according to a third viewing angle in Figure 3.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
The navigation system 20 of the present invention is shown schematically in Figure 1. The navigation system 20 includes a processor or CPU 22 connected to a display 24, such as a high resolution LCD or flat panel display. The CPU 22 is also connected to an input device 26 such as a mouse, keyboard, key pad or remote device. Alternatively, the display 24 can be a touch screen display. The navigation system 20 further includes a storage device 28, such as a hard drive 28 or CD ROM, connected to the CPU 22. The storage device 28 contains a database including a map of all the roads in the area to be traveled by the vehicle 32 and may contain the software for the CPU 22, including the graphical user interface, route guidance, operating system, position-determining software, etc.
The navigation system 20 preferably includes position and motion determining devices, such as a GPS receiver 34, a gyroscope 36, an orthogonal three-axis accelerometer 37, a compass 38 and a wheel speed sensor 40, all connected to the CPU 22 (connections not shown for simplicity).
These and other position and motion determining devices are known and are commercially available.
As is well known, the position and motion determining devices determine the position of the vehicle 32 relative to the database of roads. Further, as is known in navigation systems, the user can select a destination relative to the database of roads utilizing the input device 26 and the display 24. The navigation system 20 then calculates and displays a recommended route directing the driver of the vehicle 32 to the desired destination. Preferably, the navigation system 20 displays turn-by-turn instructions on display 24, guiding the driver to the desired destination.
Generally, the database of roads contains 2-dimensional data indicating locations of intersections, lengths of road segments and angles of intersection, generally represented in Figures 2A-E. The 2-dimensional data includes the location of the intersection, the number of road segments (or "arms") and the angles between the arms.
Figure 2A represents 2-dimensional data for a simple intersection 41. The simple intersection 41 comprises a plurality of "arms" 42 or road segments 42 intersecting at the node 44 of the intersection 41. Based upon the route calculated by the navigation system 20, the navigation system 20 will recommend a maneuver 45 onto a "TO" arm 42a from a "FROM" arm 42b, which is oriented toward the bottom of the display 24. The "TO" arm 42a is separated from the nearest adjacent arm 42 by an angle A. In this case, the angle A is 90 degrees.
Figure 2B illustrates the 2-dimensional data for a more complicated intersection 46 having seven arms 42 intersecting at a node 44. A "TO" arm 42a is separated from the nearest arm 42 by an angle A, which is not less than a predetermined threshold, preferably 20 degrees.
Figure 2C illustrates an intersection 50 having five arms 42. A "TO" arm 42a is separated from the nearest arm 42 by an angle A, which is less than a predetermined threshold, preferably 20 degrees.
Figure 2D illustrates a pair of intersections 52, 54, which are both part of a complex maneuver 56. The intersections 52, 54 share arm 42a which has a length x which is less than a predetermined threshold, such as 200 feet. Also a factor in making the complex maneuver 56 complex is the fact that a maneuver must be performed at intersection 52 and a maneuver must be performed at intersection 54.
Figure 2E illustrates a rotary 57 having a plurality of arms 42 including a TO arm 42a. A sample recommended maneuver 45 is shown superimposed on the rotary 57. It should be understood that the recommended maneuvers 45 shown are not part of the 2-dimensional data in the database, but are a
result of the navigation system's 20 recommended route to a user-selected destination.
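A minimal sketch of the 2-dimensional intersection data described above, as a node location plus a list of arm angles. The field names and the angle convention are illustrative assumptions, not the patent's actual database format.

```python
from dataclasses import dataclass

@dataclass
class Intersection:
    node: tuple        # (x, y) location of the node
    arm_angles: list   # heading of each arm in degrees, measured at the node
    to_arm: int        # index of the "TO" arm for the recommended maneuver

    def min_separation_from_to_arm(self):
        """Smallest angle between the TO arm and any other arm --
        one of the complexity inputs described in the text."""
        to_angle = self.arm_angles[self.to_arm]
        others = [a for i, a in enumerate(self.arm_angles) if i != self.to_arm]
        return min(min(abs(a - to_angle), 360 - abs(a - to_angle)) for a in others)

# The simple intersection of Figure 2A: four arms at right angles,
# maneuver onto the arm at 90 degrees.
simple = Intersection(node=(0.0, 0.0), arm_angles=[0, 90, 180, 270], to_arm=1)
```

For the simple intersection the minimum separation is 90 degrees, well above the 20-degree threshold; an intersection like Figure 2C would yield a value under that threshold.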
Generally, the 2-dimensional data, as represented in Figures 2A-2E, is transformed via scaling, rotation and translation into a 3-D perspective view by the CPU 22, generally utilizing known rendering techniques. Referring to Figure 3, the 3-D model is created as a perspective view of the road in front of the vehicle 32. The perspective view is calculated at an elevation, H, above the earth at an angle relative to a center (Xc, Yc) of the displayed intersection. For a simple intersection 41, such as is shown in Figure 2A, the perspective view may be calculated from a "camera position" A in Figure 3, at an angle αA, preferably 30 degrees. For a moderately complex intersection 50, such as that shown in Figure 2C, the perspective view may be calculated from a position B as shown in Figure 3 at an angle αB, preferably 50 degrees. For a very complex intersection 57, such as that shown in Figure 2E, the view may be calculated from a position C as shown in Figure 3 at an angle αC, preferably 90 degrees. The angle increases based upon the complexity of the intersection. The complexity is determined based upon the number of arms 42 and the angle A between the TO arm 42a and the nearest adjacent arm 42. Further, two maneuvers 52, 54 within a predetermined distance may indicate complexity of an intersection. Further, certain types of intersections may indicate complex intersections. For example, a rotary may indicate a very complex intersection, while a U-turn may indicate a medium complex intersection.
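The camera placement implied above, an elevated viewpoint looking down at the intersection center at angle αA, αB or αC, can be sketched with basic trigonometry. The placement behind the center along −Y and the single `distance` parameter are assumptions for illustration, not the patent's exact geometry.

```python
import math

def camera_position(center, view_angle_deg, distance):
    """Place the 'camera' at `distance` from the intersection center,
    elevated so its line of sight meets the center at `view_angle_deg`
    above the ground plane (90 degrees = straight overhead)."""
    xc, yc = center
    a = math.radians(view_angle_deg)
    # offset behind the intersection along -Y, elevated along Z
    return (xc, yc - distance * math.cos(a), distance * math.sin(a))
```

At αC = 90 degrees the horizontal offset vanishes and the camera sits directly overhead, which is why the very complex intersections render as a plan view.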
It should be apparent that other angles between αA and αC may be utilized to calculate perspective views of intersections of moderate complexity.
Preferably, any intersection displayed is first displayed as calculated from point C and angle αC, i.e. 90 degrees. If the intersection is not very complex, the angle is then decreased and the view is continuously, gradually changed to the appropriate angle α as the vehicle 32 approaches the intersection. In this manner, the user can see the perspective change and more easily understand the intersection perspective view displayed. Alternatively the angle α can be increased from the appropriate angle to 90 degrees as the vehicle approaches the intersection.
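The gradual transition just described can be sketched as a schedule on distance-to-maneuver. The 1000-unit start distance and the linear easing are assumptions; the patent only specifies that the change is continuous and gradual.

```python
def current_view_angle(distance_to_maneuver, final_angle, start_distance=1000.0):
    """Return the camera angle in degrees: 90 (overhead) at `start_distance`
    or farther, easing linearly down to `final_angle` at the intersection."""
    if distance_to_maneuver >= start_distance:
        return 90.0
    if distance_to_maneuver <= 0:
        return float(final_angle)
    frac = distance_to_maneuver / start_distance
    return final_angle + (90.0 - final_angle) * frac
```

Recomputing this every position update and re-rendering at the returned angle produces the smooth overhead-to-perspective sweep the text describes.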
For illustration, the transformation of the 2-dimensional data for the intersection 41 of Figure 2A into the three dimensional perspective view of Figure 4 will be described. Figure 4 illustrates the display 24 of Figure 1 showing a display 60 of a 3-dimensional representation 62 of the intersection represented in Figure 2A, displayed at a perspective view calculated according to angle αA shown in Figure 3, which for this simple intersection is 30 degrees. First the intersection 41 is rendered into a polygon having arms 42 separated by the angles specified in the 2-dimensional data.
Additional perpendicular polygons are then added to create a three dimensional appearance. The three dimensional representations of the arms 42 preferably each have an equal predetermined length.
The display 60 further includes a maneuver instruction 64, preferably a 3-dimensional representation of an arrow 64 superimposed on the
3-dimensional representation 62 of the intersection. The arrow 64 is also 3-dimensional and shown in the same perspective. A head 65 of the arrow 64 is first rendered on the TO arm 42a, at a fixed distance from the center of the intersection. A tail 66 is then rendered on the FROM arm 42b, at a fixed distance from the center of the intersection. A point of intersection between lead lines in the head 65 and tail 66 is then utilized to create an inner arc and an outer arc from the head 65 to the tail 66. A plurality of polygons between the inner and outer arcs are rendered from the head 65 to the tail 66 to create the body of the arrow 64. The point on the maneuver to be centered on the display 24 is then calculated. The extent or bounds for the entire maneuver is first computed. This is recorded as a minimum X, Y and a maximum X, Y. The extent or bounds for the turn indicator (arrow) 64 is computed and also recorded as a minimum X, Y and maximum X, Y. The center (Xc, Yc) for the entire maneuver (biased with the turn indicator) is computed as follows:
Xc = AVG(ArrowMinimum.X, ArrowMaximum.X)
Yc = AVG(ManeuverMinimum.Y, ManeuverMaximum.Y)
The entire 3D polygon(s) making up the 3D maneuver are then translated so that the new calculated center is positioned as the new origin (0,0). The camera's initial position is at 90 degrees (point C in Figure 3 at angle αC).
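The centering step above can be sketched directly: the display center is biased by the turn-indicator (arrow) bounds in X and the full-maneuver bounds in Y, then every vertex is translated so that center becomes the origin. Representing geometry as lists of (x, y) points is an illustrative assumption.

```python
def bounds(points):
    """Return ((min_x, min_y), (max_x, max_y)) for a list of (x, y) points."""
    xs, ys = zip(*points)
    return (min(xs), min(ys)), (max(xs), max(ys))

def center_maneuver(maneuver_pts, arrow_pts):
    """Translate the maneuver so the biased center becomes the origin (0, 0)."""
    m_min, m_max = bounds(maneuver_pts)
    a_min, a_max = bounds(arrow_pts)
    xc = (a_min[0] + a_max[0]) / 2   # Xc = AVG(ArrowMinimum.X, ArrowMaximum.X)
    yc = (m_min[1] + m_max[1]) / 2   # Yc = AVG(ManeuverMinimum.Y, ManeuverMaximum.Y)
    return [(x - xc, y - yc) for x, y in maneuver_pts]
```

Biasing X toward the arrow keeps the turn indicator horizontally centered even when the road polygons extend far to one side.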
The camera position is specified in X, Y, Z coordinates. The X and Y coordinates are set to 0. The Z coordinate (or altitude) is computed as follows:
Definitions:
W = one half of the width of the entire maneuver in the dominant axis
H = height of the camera overlooking the maneuver
FOV = Field of View (used when transforming 3D coordinates into screen coordinates)
theta = FOV/2
The span of the entire maneuver in each direction is compared to determine which is longer (taking the aspect ratio of the viewport into consideration).
Aspect Ratio = 305 pixels in the X direction / 230 pixels in the Y direction

If (ManeuverSpan in the Y axis * Aspect Ratio) > ManeuverSpan in the X axis
    ManeuverSpan in the Y axis is dominant
Else
    ManeuverSpan in the X axis is dominant

If Maneuver in the Y axis contributes more (dominant)
    W = (ManeuverMaximum.Y - ManeuverMinimum.Y)/2
If Maneuver in the X axis contributes more (dominant)
    W = (ManeuverMaximum.X - ManeuverMinimum.X)/2

H = W / tan(theta)
Z = H
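The altitude computation above can be sketched in a few lines; the viewport dimensions and dominance test are taken from the text, while the function name and argument layout are assumptions:

```python
import math

def camera_altitude(maneuver_min, maneuver_max, fov_deg, viewport=(305, 230)):
    """Z = H = W / tan(FOV/2), where W is half the maneuver span along the
    dominant axis, chosen with the viewport aspect ratio taken into account."""
    aspect = viewport[0] / viewport[1]          # 305 pixels X / 230 pixels Y
    span_x = maneuver_max[0] - maneuver_min[0]
    span_y = maneuver_max[1] - maneuver_min[1]
    if span_y * aspect > span_x:                # Y span dominates
        w = span_y / 2.0
    else:                                       # X span dominates
        w = span_x / 2.0
    theta = math.radians(fov_deg) / 2.0         # theta = FOV/2, in radians
    return w / math.tan(theta)
```

With a 90° field of view, tan(theta) = 1, so the camera height simply equals half the dominant span.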
The initial coordinates for the camera position are specified as (0,0,H). The range of camera positions is based on the complexity of the maneuver. The following factors are used to determine the complexity of the maneuver.
If number of arms in maneuver > MAXIMUM_NUM_ARMS_THRESHOLD
    Complexity = MANEUVER_MEDIUM_COMPLEX
If number of maneuvers > 1
    Complexity = MANEUVER_VERY_COMPLEX
If maneuver type is roundabout
    Complexity = MANEUVER_VERY_COMPLEX
If angle between 'TO' arm and any adjacent arm is < MINIMUM_ARM_ANGLE_THRESHOLD
    Complexity = MANEUVER_MEDIUM_COMPLEX
All other types
    Complexity = MANEUVER_SIMPLE
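The factors above can be sketched as a single classifier. The threshold values below are illustrative assumptions, since the text names the constants but does not fix their values:

```python
# Illustrative thresholds -- the text names these constants but not their values.
MAXIMUM_NUM_ARMS_THRESHOLD = 4
MINIMUM_ARM_ANGLE_THRESHOLD = 30.0   # degrees

MANEUVER_SIMPLE = "SIMPLE"
MANEUVER_MEDIUM_COMPLEX = "MEDIUM_COMPLEX"
MANEUVER_VERY_COMPLEX = "VERY_COMPLEX"

def maneuver_complexity(num_arms, num_maneuvers, is_roundabout, to_arm_min_angle):
    """Classify a maneuver; the very-complex tests take precedence."""
    if num_maneuvers > 1 or is_roundabout:
        return MANEUVER_VERY_COMPLEX
    if num_arms > MAXIMUM_NUM_ARMS_THRESHOLD:
        return MANEUVER_MEDIUM_COMPLEX
    if to_arm_min_angle < MINIMUM_ARM_ANGLE_THRESHOLD:
        return MANEUVER_MEDIUM_COMPLEX
    return MANEUVER_SIMPLE
```

Ordering the tests from most to least complex ensures the strongest matching classification wins.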
The camera position is initially at 90° and changes to the minimum camera angle as the vehicle 32 approaches the intersection. The angle can also be selectively adjusted by the user between the minimum permitted camera angle (CameraAngleMinimum) and the maximum permitted camera angle (CameraAngleMaximum). Preferably the maximum camera angle is always
90° and the minimum camera angle depends upon the complexity of the intersection. The minimum and maximum camera angles are defined as follows:
If Complexity = MANEUVER_SIMPLE
    CameraAngleMinimum = 30°
    CameraAngleMaximum = 90°
If Complexity = MANEUVER_MEDIUM_COMPLEX
    CameraAngleMinimum = 50°
    CameraAngleMaximum = 90°
If Complexity = MANEUVER_VERY_COMPLEX
    CameraAngleMinimum = 90°
    CameraAngleMaximum = 90°    // no change
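These per-complexity limits reduce to a simple lookup table (a sketch; the dictionary layout and tuple ordering are assumptions):

```python
# (CameraAngleMinimum, CameraAngleMaximum) in degrees, per complexity class.
CAMERA_ANGLE_LIMITS = {
    "SIMPLE":         (30.0, 90.0),
    "MEDIUM_COMPLEX": (50.0, 90.0),
    "VERY_COMPLEX":   (90.0, 90.0),   # fixed overhead view -- no adjustment
}

def camera_angle_range(complexity):
    """Return the permitted camera angle range for a complexity class."""
    return CAMERA_ANGLE_LIMITS[complexity]
```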
When adjusting the camera through system or user control, a fixed scaling is applied to keep the scene in view. For example, the entire scene is scaled by approximately 105% when decrementing the viewing angle and by approximately 95% when incrementing the viewing angle. Preferably, the number of increments through which the scene can be viewed ranges from 0 to 8, again depending on the complexity of the maneuver. The Minimum and Maximum Camera Angles and numbers of increments shown above are for exemplary purposes; it should be recognized that the navigation system 20 may use more or fewer as needed.
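A single adjustment increment might look like the following sketch. The ~105%/~95% factors come from the text above; the per-increment angle step is an assumed value:

```python
def adjust_view(scale, angle_deg, decrement, min_angle, max_angle,
                angle_step=7.5, grow=1.05, shrink=0.95):
    """Apply one increment of system/user camera adjustment: lowering the
    viewing angle enlarges the scene (~105%), raising it shrinks it (~95%)."""
    if decrement:
        angle_deg = max(min_angle, angle_deg - angle_step)
        scale *= grow
    else:
        angle_deg = min(max_angle, angle_deg + angle_step)
        scale *= shrink
    return scale, angle_deg
```

Clamping to the per-complexity minimum and maximum angles keeps a very complex intersection locked at the overhead view.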
The display 60 also includes a horizon line 69 below which is displayed the intersection 62 and maneuver instruction 64. Above the horizon line 69 is a representation of the sky 70. The sky 70 is preferably changed in color based upon the time of day, season of the year and geographic location of the vehicle 32. The CPU 22 of the navigation system 20 has information regarding the geographic location of the vehicle 32, date and current time of day. The GPS receiver 34 receives time information, including date, from the GPS system. The sky 70 is changed from blue to black based upon the time of day, including the expected sunrise and sunset times for the particular season of the year and the current geographic location of the vehicle 32.
Preferably, the sky 70 gradually and continuously changes from blue during the day to black at night. This assists the user in perceiving and understanding the display 60, including the intersection 62 and the perspective view.
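A minimal sketch of this day/night blend, assuming fixed sunrise and sunset hours and a one-hour fade window (the actual system derives these from the GPS date, season, and vehicle location):

```python
def sky_color(hour, sunrise=6.0, sunset=18.0, fade=1.0):
    """Blend the sky RGB from black (night) to blue (day) around the
    expected sunrise and sunset times."""
    if hour < sunrise or hour > sunset:
        t = 0.0                                # full night
    elif hour < sunrise + fade:
        t = (hour - sunrise) / fade            # dawn fade-in
    elif hour > sunset - fade:
        t = (sunset - hour) / fade             # dusk fade-out
    else:
        t = 1.0                                # full day
    return (0, 0, round(255 * t))              # blue channel only
```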
The display 60 further includes a text instruction field 72 which displays text of a maneuver instruction, such as "Right turn on Maple" or other turn instructions appropriate to the next maneuver. The text in the field 72 corresponds to the maneuver instruction 64.
A heading indicator 74, indicating the absolute direction of the desired destination, is also shown in 3-dimensional perspective view in display 60. The heading indicator 74 includes an arrow 75, also 3-dimensional and shown in perspective view. The angle at which the perspective of the heading
indicator 74 is calculated is the same as the angle at which the intersection 62 is displayed. This further reinforces an understanding of the perspective intersection view.
The display 60 further includes a distance-to-maneuver field 76 which indicates the distance between the current position of the vehicle 32 and the next maneuver, as indicated by the maneuver instruction 64. The user can selectively adjust the angle of the display between αA and αC utilizing the user input device 26. A distance-to-destination field 77 indicates the total distance in the calculated route from the current location to the desired destination. A current heading indicator 78 indicates the current geographical heading of the vehicle 32.
Referring to Figure 5, display 80 is shown on the display 24 when the vehicle approaches a medium complex intersection, such as that represented in two dimensional data in Figure 2C. The two dimensional data for the intersection of Figure 2C is transformed into a 3-dimensional model and rotated into a perspective view according to angle αB of Figure 3, preferably 50 degrees. Preferably, the view of the intersection 50 of Figure 2C is first calculated at angle αC of Figure 3 and gradually decreased to angle αB as the vehicle 32 approaches the intersection. This increases the user's understanding of the intersection and the perspective view. By the time the vehicle 32 approaches the next maneuver, the perspective angle has decreased to αB, as displayed in Figure 5. Again the arms 42 are shown having thickness and in perspective, although the perspective angle is higher and the view is more overhead. The maneuver instruction 64 is also shown in three dimensions and in the same perspective view, calculated according to angle αB.
The heading indicator 74 is also shown in three dimensions and shown in a perspective view calculated according to angle αB. Again, this assists the user in understanding the perspective at which the intersection 82 is displayed. The user can selectively adjust the angle of the display between αB and αC utilizing the user input device 26.
As is also shown in Figure 5, when the distance to the maneuver 76 reaches 0.1 miles, the text instruction field 72 becomes a bar graph 82 indicating more precisely the distance to the upcoming maneuver. The bar graph 82 gradually and continuously decreases as the vehicle 32 approaches the maneuver. Portions of the bar graph 82 which overlap text in the text instruction field 72 become reverse video, as shown.
Referring to Figure 6, display 86 is shown on the display 24 when the vehicle approaches a very complex intersection, such as that represented in two dimensional data in Figure 2E. The two dimensional data for the intersection of Figure 2E is transformed into a 3-dimensional model according to angle αC of Figure 3, preferably 90 degrees. By increasing the viewing angle of the perspective view of a complex intersection 48, the road segments or arms 42 are more readily distinguishable and the maneuver instruction 64 is easier to understand.

In accordance with the provisions of the patent statutes and jurisprudence, exemplary configurations described above are considered to represent a preferred embodiment of the invention. However, it should be noted that the invention can be practiced otherwise than as specifically illustrated and described without departing from its spirit or scope.

Claims

WHAT IS CLAIMED IS:
1. A vehicle navigation system comprising:
a database of roads to be travelled by a vehicle;
a processor determining a complexity of an intersection in said database;
a display displaying said intersection at a perspective view calculated at a viewing angle, said viewing angle based upon said complexity of said intersection.
2. The vehicle navigation system of Claim 1 further including:
a system for determining the position of the vehicle relative to said database of roads;
a user input device for selecting a desired destination for the vehicle relative to said database of roads;
a system for determining a route from said database of roads to said desired destination, said route including said intersection.
3. The vehicle navigation system of Claim 1 wherein said display displays said intersection, a horizon and a sky.
4. The vehicle navigation system of Claim 3 wherein said display changes a color of said sky based upon a time of day.
5. The vehicle navigation system of Claim 4 wherein said time of day is received from a GPS receiver.
6. The vehicle navigation system of Claim 1 wherein said processor determines said complexity of said intersection based upon a number of roads in said intersection.
7. The vehicle navigation system of Claim 1 wherein said processor determines said complexity of said intersection based upon a distance between roads in said intersection.
8. The vehicle navigation system of Claim 1 wherein said processor determines said complexity of said intersection based upon an angular separation between adjacent roads in said intersection.
9. The vehicle navigation system of Claim 8 wherein said angular separation is between a TO road, which is recommended, and a nearest adjacent road in said intersection.
10. The vehicle navigation system of Claim 1 wherein said processor determines said complexity of said intersection based upon a type of said intersection.
11. The vehicle navigation system of Claim 1 wherein said processor increases the viewing angle of said intersection with the complexity of the intersection.
12. The vehicle navigation system of Claim 1 wherein said display displays a three dimensional heading indicator icon at said viewing angle.
13. The vehicle navigation system of Claim 1 wherein said viewing angle changes as the vehicle approaches the intersection.
14. The vehicle navigation system of Claim 1 wherein said viewing angle is user-adjustable between a maximum viewing angle and a minimum viewing angle, said minimum viewing angle determined based upon said complexity of said intersection.
15. A method for navigating a vehicle including the steps of:
a) determining a route from a database of roads between a position of a vehicle to a desired destination, said route including an intersection;
b) determining a complexity of said intersection;
c) determining a viewing angle based upon said complexity of said intersection; and
d) displaying said intersection as a perspective view at said viewing angle.
16. The method of Claim 15 further including the step of constructing a three- dimensional model of said intersection from two-dimensional data in said database.
17. The method of Claim 15 further including the steps of:
e) determining a time of day;
f) displaying a sky adjacent said intersection in said step d); and
g) adjusting a color of said sky based upon said time of day.
18. The method of Claim 17 wherein said color of said sky is adjusted between blue and black.
19. The method of Claim 15 wherein said step b) is based upon a number of roads in said intersection.
20. The method of Claim 15 wherein said step b) is based upon a spacing of roads in said intersection.
21. The method of Claim 15 wherein said step b) is based upon an angular separation between adjacent roads in said intersection.
22. A display for a vehicle navigation system comprising an intersection at a perspective angle, said display further including a sky adjacent said intersection, a color of said sky changing based upon a time of day.
23. The display of Claim 22 wherein said time of day is received from a GPS receiver.
PCT/US1999/007911 1998-04-17 1999-04-09 3-dimensional intersection display for vehicle navigation system WO1999054848A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
AT99916613T ATE310287T1 (en) 1998-04-17 1999-04-09 THREE-DIMENSIONAL CROSSING DISPLAY FOR A VEHICLE NAVIGATION DEVICE
EP99916613A EP1074002B1 (en) 1998-04-17 1999-04-09 3-dimensional intersection display for vehicle navigation system
AU34897/99A AU3489799A (en) 1998-04-17 1999-04-09 3-dimensional intersection display for vehicle navigation system
DE69928387T DE69928387T2 (en) 1998-04-17 1999-04-09 THREE-DIMENSIONAL ROAD GUIDANCE INDICATOR FOR A VEHICLE NAVIGATION SYSTEM
CA002326683A CA2326683A1 (en) 1998-04-17 1999-04-09 3-dimensional intersection display for vehicle navigation system
NO20005221A NO20005221D0 (en) 1998-04-17 2000-10-17 3-dimensional intersection display for vehicle navigation system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/062,518 US6611753B1 (en) 1998-04-17 1998-04-17 3-dimensional intersection display for vehicle navigation system
US09/062,518 1998-04-17

Publications (1)

Publication Number Publication Date
WO1999054848A1 true WO1999054848A1 (en) 1999-10-28

Family

ID=22043002

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/007911 WO1999054848A1 (en) 1998-04-17 1999-04-09 3-dimensional intersection display for vehicle navigation system

Country Status (8)

Country Link
US (1) US6611753B1 (en)
EP (1) EP1074002B1 (en)
AT (1) ATE310287T1 (en)
AU (1) AU3489799A (en)
CA (1) CA2326683A1 (en)
DE (1) DE69928387T2 (en)
NO (1) NO20005221D0 (en)
WO (1) WO1999054848A1 (en)


Families Citing this family (90)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9075136B1 (en) 1998-03-04 2015-07-07 Gtj Ventures, Llc Vehicle operator and/or occupant information apparatus and method
DE19920709A1 (en) * 1999-05-05 2000-11-16 Siemens Ag Method for obtaining a three-dimensional map display and navigation system
US7865306B2 (en) * 2000-09-28 2011-01-04 Michael Mays Devices, methods, and systems for managing route-related information
US6351710B1 (en) * 2000-09-28 2002-02-26 Michael F. Mays Method and system for visual addressing
JP2003109032A (en) * 2001-09-26 2003-04-11 Pioneer Electronic Corp Image producing device and computer program
US7466992B1 (en) 2001-10-18 2008-12-16 Iwao Fujisaki Communication device
US7127271B1 (en) 2001-10-18 2006-10-24 Iwao Fujisaki Communication device
US7107081B1 (en) 2001-10-18 2006-09-12 Iwao Fujisaki Communication device
DE10153528A1 (en) * 2001-10-30 2003-05-15 Bosch Gmbh Robert Process for providing guidance to a user
US6668227B2 (en) * 2002-04-10 2003-12-23 Matsushita Electric Industrial Co., Ltd. Navigation apparatus
JP3992227B2 (en) * 2002-04-26 2007-10-17 パイオニア株式会社 3D information display device
US6771189B2 (en) * 2002-07-17 2004-08-03 Alpine Electronics, Inc. Display method and apparatus for navigation system
US8797402B2 (en) * 2002-11-19 2014-08-05 Hewlett-Packard Development Company, L.P. Methods and apparatus for imaging and displaying a navigable path
US8229512B1 (en) 2003-02-08 2012-07-24 Iwao Fujisaki Communication device
KR100648342B1 (en) * 2003-02-10 2006-11-23 엘지전자 주식회사 navigation system and the operating method
US7415243B2 (en) 2003-03-27 2008-08-19 Honda Giken Kogyo Kabushiki Kaisha System, method and computer program product for receiving data from a satellite radio network
US8241128B1 (en) 2003-04-03 2012-08-14 Iwao Fujisaki Communication device
US7526718B2 (en) * 2003-04-30 2009-04-28 Hewlett-Packard Development Company, L.P. Apparatus and method for recording “path-enhanced” multimedia
KR100703444B1 (en) * 2003-06-03 2007-04-03 삼성전자주식회사 Device and method for downloading and displaying a images of global position information in navigation system
US9341485B1 (en) 2003-06-19 2016-05-17 Here Global B.V. Method and apparatus for representing road intersections
US8090402B1 (en) 2003-09-26 2012-01-03 Iwao Fujisaki Communication device
US7917167B1 (en) 2003-11-22 2011-03-29 Iwao Fujisaki Communication device
US7849149B2 (en) 2004-04-06 2010-12-07 Honda Motor Co., Ltd. Method and system for controlling the exchange of vehicle related messages
US7818380B2 (en) 2003-12-15 2010-10-19 Honda Motor Co., Ltd. Method and system for broadcasting safety messages to a vehicle
US8041779B2 (en) 2003-12-15 2011-10-18 Honda Motor Co., Ltd. Method and system for facilitating the exchange of information between a vehicle and a remote location
US20050137794A1 (en) * 2003-12-18 2005-06-23 Dehua Cui Intersection route navigation system for a motor vehicle
US8041348B1 (en) 2004-03-23 2011-10-18 Iwao Fujisaki Communication device
US7222018B2 (en) 2004-04-06 2007-05-22 Honda Motor Co., Ltd. Bandwidth and memory conserving methods for a vehicle navigation system
US7319931B2 (en) 2004-04-06 2008-01-15 Honda Motor Co., Ltd. Methods for filtering and providing traffic information
US7289904B2 (en) 2004-04-06 2007-10-30 Honda Motor Co., Ltd. Vehicle navigation system and methods for incorporating user preferences into same
US7366606B2 (en) 2004-04-06 2008-04-29 Honda Motor Co., Ltd. Method for refining traffic flow data
US7623045B2 (en) * 2004-04-21 2009-11-24 Mitsubishi Electric Corporation Facility display unit
US7518530B2 (en) 2004-07-19 2009-04-14 Honda Motor Co., Ltd. Method and system for broadcasting audio and visual display messages to a vehicle
US7379063B2 (en) * 2004-07-29 2008-05-27 Raytheon Company Mapping application for rendering pixel imagery
US7643788B2 (en) 2004-09-22 2010-01-05 Honda Motor Co., Ltd. Method and system for broadcasting data messages to a vehicle
US7376510B1 (en) * 2004-11-05 2008-05-20 Navteq North America, Llc Map display for a navigation system
US7908080B2 (en) 2004-12-31 2011-03-15 Google Inc. Transportation routing
EP1835259A4 (en) * 2005-01-07 2011-07-27 Navitime Japan Co Ltd Navigation system and portable terminal
EP1681537A1 (en) * 2005-01-18 2006-07-19 Harman Becker Automotive Systems (Becker Division) GmbH Navigation system with animated junction view
US8108142B2 (en) * 2005-01-26 2012-01-31 Volkswagen Ag 3D navigation system for motor vehicles
US7647565B2 (en) * 2005-02-16 2010-01-12 International Business Machines Coporation Method, apparatus, and computer program product for an enhanced mouse pointer
KR101047719B1 (en) * 2005-02-16 2011-07-08 엘지전자 주식회사 Method and device for driving route guidance of moving object in navigation system
DE102005018082A1 (en) * 2005-04-19 2006-10-26 Robert Bosch Gmbh Method for the three-dimensional representation of a digital road map
US7624358B2 (en) * 2005-04-25 2009-11-24 International Business Machines Corporation Mouse radar for enhanced navigation of a topology
US9726513B2 (en) 2005-06-21 2017-08-08 Nytell Software LLC Navigation system and method
US7711478B2 (en) * 2005-06-21 2010-05-04 Mappick Technologies, Llc Navigation system and method
US8670925B2 (en) * 2005-06-21 2014-03-11 Calabrese Holdings L.L.C. Navigation system and method
DE112006001864T5 (en) * 2005-07-14 2008-06-05 GM Global Technology Operations, Inc., Detroit System for monitoring the vehicle environment from a remote perspective
US7949330B2 (en) 2005-08-25 2011-05-24 Honda Motor Co., Ltd. System and method for providing weather warnings and alerts
JP4506642B2 (en) * 2005-10-31 2010-07-21 株式会社デンソー Route guidance device
US8046162B2 (en) 2005-11-04 2011-10-25 Honda Motor Co., Ltd. Data broadcast method for traffic information
KR100852615B1 (en) 2006-04-27 2008-08-18 팅크웨어(주) System and method for expressing map according to change season and topography
US20080062173A1 (en) * 2006-09-13 2008-03-13 Eric Tashiro Method and apparatus for selecting absolute location on three-dimensional image on navigation display
KR100836677B1 (en) * 2006-09-19 2008-06-10 주식회사 레인콤 Navigation System Equipped with Auxiliary Display
JP4869106B2 (en) * 2007-02-28 2012-02-08 アルパイン株式会社 Navigation apparatus, intersection enlarged view display method, and map information creation method
US10281283B2 (en) * 2007-04-09 2019-05-07 Ian Cummings Apparatus and methods for reducing data transmission in wireless client-server navigation systems
US7890089B1 (en) 2007-05-03 2011-02-15 Iwao Fujisaki Communication device
DE102007023973A1 (en) 2007-05-23 2009-01-29 Audi Ag Method for representation of road course on optical output unit of navigation system for vehicle based digital map data, involves determining number of different levels of cross-over point based on z-level information
US7668653B2 (en) 2007-05-31 2010-02-23 Honda Motor Co., Ltd. System and method for selectively filtering and providing event program information
US20090254274A1 (en) * 2007-07-27 2009-10-08 Kulik Victor Navigation system for providing celestial and terrestrial information
DE102007036627A1 (en) * 2007-08-02 2009-02-05 Navigon Ag Method for operating a navigation system
US8676273B1 (en) 2007-08-24 2014-03-18 Iwao Fujisaki Communication device
US8554475B2 (en) 2007-10-01 2013-10-08 Mitac International Corporation Static and dynamic contours
US8099308B2 (en) 2007-10-02 2012-01-17 Honda Motor Co., Ltd. Method and system for vehicle service appointments based on diagnostic trouble codes
US8639214B1 (en) 2007-10-26 2014-01-28 Iwao Fujisaki Communication device
JP4994256B2 (en) * 2008-01-28 2012-08-08 株式会社ジオ技術研究所 Data structure of route guidance database
US8543157B1 (en) 2008-05-09 2013-09-24 Iwao Fujisaki Communication device which notifies its pin-point location or geographic area in accordance with user selection
US8340726B1 (en) 2008-06-30 2012-12-25 Iwao Fujisaki Communication device
US8452307B1 (en) 2008-07-02 2013-05-28 Iwao Fujisaki Communication device
BRPI0822714A2 (en) * 2008-07-30 2015-07-07 Tele Atlas Bv Computer-implemented method and system for generating a junction preview image
JP4747196B2 (en) * 2008-11-26 2011-08-17 東芝テック株式会社 Merchandise sales data processing apparatus, control program and control method thereof
US8990004B2 (en) * 2008-12-17 2015-03-24 Telenav, Inc. Navigation system with query mechanism and method of operation thereof
US8587617B2 (en) * 2009-02-04 2013-11-19 Raytheon Company Apparatus and method for map zooming
US8532924B2 (en) * 2009-09-02 2013-09-10 Alpine Electronics, Inc. Method and apparatus for displaying three-dimensional terrain and route guidance
CN102110364B (en) * 2009-12-28 2013-12-11 日电(中国)有限公司 Traffic information processing method and traffic information processing device based on intersections and sections
US8417448B1 (en) 2010-04-14 2013-04-09 Jason Adam Denise Electronic direction technology
TW201237451A (en) * 2011-03-04 2012-09-16 Hon Hai Prec Ind Co Ltd Positioning and navigating device
US20120286975A1 (en) * 2011-05-11 2012-11-15 Robert William Thomson System and method for improving viewability of primary flight display
DE102011116771A1 (en) * 2011-10-22 2013-04-25 Valeo Schalter Und Sensoren Gmbh Method for displaying image information on a display unit of a vehicle and driver assistance device for carrying out such a method
WO2013089480A1 (en) * 2011-12-16 2013-06-20 팅크웨어(주) Device and method for displaying a map according to the guiding of a navigation system
US8694246B2 (en) * 2012-05-15 2014-04-08 Qualcomm Incorporated Methods and systems for displaying enhanced turn-by-turn guidance on a personal navigation device
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
US10156455B2 (en) 2012-06-05 2018-12-18 Apple Inc. Context-aware voice guidance
CN104776855B (en) * 2015-03-17 2018-03-13 腾讯科技(深圳)有限公司 The air navigation aid and device of a kind of intersection
EP3358305B1 (en) * 2015-09-30 2021-07-07 Nissan Motor Co., Ltd. Vehicular display device
US10234294B2 (en) * 2016-04-01 2019-03-19 Here Global B.V. Road geometry matching with componentized junction models
US10883848B2 (en) 2018-09-20 2021-01-05 Here Global B.V. Methods and systems for providing an improved maneuver countdown bar
JP6964062B2 (en) * 2018-11-26 2021-11-10 本田技研工業株式会社 Vehicle control devices, vehicle control methods, and programs
WO2021183128A1 (en) * 2020-03-12 2021-09-16 Google Llc Alternative navigation directions pre-generated when a user is likely to make a mistake in navigation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2674652A1 (en) * 1986-03-04 1992-10-02 Thomson Csf Process and device for synthesising three-dimensional moving map images
EP0738876A2 (en) * 1995-04-20 1996-10-23 Hitachi, Ltd. Map display apparatus
JPH09171348A (en) * 1995-12-19 1997-06-30 Honda Motor Co Ltd On-vehicle navigation device
JPH09318380A (en) * 1996-05-29 1997-12-12 Fujitsu Ten Ltd Intersection guiding apparatus

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61216098A (en) 1985-03-20 1986-09-25 日産自動車株式会社 Course guidance unit for vehicle
FR2610752B1 (en) 1987-02-10 1989-07-21 Sagem METHOD FOR REPRESENTING THE PERSPECTIVE IMAGE OF A FIELD AND SYSTEM FOR IMPLEMENTING SAME
US4937570A (en) 1987-02-26 1990-06-26 Mitsubishi Denki Kabushiki Kaisha Route guidance display device
JPH01219610A (en) 1988-02-29 1989-09-01 Nissan Motor Co Ltd Running azimuth detector for vehicle
JP3511561B2 (en) * 1996-09-11 2004-03-29 パイオニア株式会社 Map information display device, navigation device, and recording medium recording navigation program
US5323321A (en) 1990-06-25 1994-06-21 Motorola, Inc. Land vehicle navigation apparatus
JP2801798B2 (en) * 1991-07-10 1998-09-21 パイオニア株式会社 Navigation system
US5557522A (en) 1993-09-10 1996-09-17 Nissan Motor Co., Ltd. Apparatus and method for guiding vehicle occupant to travel from present position of vehicle to set destination through display unit
DE69425234T2 (en) * 1993-12-27 2000-11-30 Nissan Motor Vehicle route guidance device and method using a display unit
US5793310A (en) * 1994-02-04 1998-08-11 Nissan Motor Co., Ltd. Portable or vehicular navigating apparatus and method capable of displaying bird's eye view
US5473447A (en) 1994-02-14 1995-12-05 Polaroid Corporation Heads-up and heads-down displays employing holographic stereograms
EP0678731B1 (en) * 1994-04-15 1999-06-30 Nissan Motor Co., Ltd. Vehicle navigation system
US5757289A (en) * 1994-09-14 1998-05-26 Aisin Aw Co., Ltd. Vehicular navigation system
DE69529871T2 (en) * 1994-09-20 2003-11-20 Aisin Aw Co Car navigation system
EP0884711B1 (en) * 1994-11-11 2004-04-28 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US5742924A (en) * 1994-12-02 1998-04-21 Nissan Motor Co., Ltd. Apparatus and method for navigating mobile body using road map displayed in form of bird's eye view
JP3371605B2 (en) * 1995-04-19 2003-01-27 日産自動車株式会社 Bird's-eye view display navigation system with atmospheric effect display function
JP3483672B2 (en) * 1995-09-06 2004-01-06 三菱電機株式会社 Navigation device
JP3353581B2 (en) * 1995-12-26 2002-12-03 日産自動車株式会社 Bird's-eye view display navigation device
JP4084857B2 (en) * 1996-08-30 2008-04-30 本田技研工業株式会社 Aspect ratio setting method of image sensor in automobile front monitoring system
US5951621A (en) * 1997-10-30 1999-09-14 Lear Automotive Dearborn, Inc. Proximity indicator display


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 097, no. 010 31 October 1997 (1997-10-31) *
PATENT ABSTRACTS OF JAPAN vol. 098, no. 004 31 March 1998 (1998-03-31) *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1122626A1 (en) * 2000-02-02 2001-08-08 Matsushita Electric Industrial Co., Ltd. Intersection display method, and map display unit and recording medium for realizing the method
US6424911B2 (en) 2000-02-02 2002-07-23 Matsushita Electric Industrial Co., Ltd. Intersection display method, and map display unit and recording medium for realizing the method
US7039521B2 (en) 2001-08-07 2006-05-02 Siemens Aktiengesellschaft Method and device for displaying driving instructions, especially in car navigation systems
WO2003017226A3 (en) * 2001-08-07 2003-06-12 Siemens Ag Method and device for displaying driving instructions, especially in car navigation systems
WO2003017226A2 (en) * 2001-08-07 2003-02-27 Siemens Aktiengesellschaft Method and device for displaying driving instructions, especially in car navigation systems
EP1411325A2 (en) * 2002-10-16 2004-04-21 LG Electronics, Inc. Method and apparatus for intersection guiding in navigation system
EP1411325A3 (en) * 2002-10-16 2004-10-20 LG Electronics, Inc. Method and apparatus for intersection guiding in navigation system
US8150216B2 (en) 2004-05-05 2012-04-03 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US8903199B2 (en) 2004-05-05 2014-12-02 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US8908997B2 (en) 2004-05-05 2014-12-09 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US8908996B2 (en) 2004-05-05 2014-12-09 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US9424277B2 (en) 2004-05-05 2016-08-23 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
EP1596163A3 (en) * 2004-05-12 2011-10-26 Alpine Electronics, Inc. Navigation apparatus and method for displaying map using the same
CN101432786B (en) * 2006-04-27 2010-12-01 星克跃尔株式会社 Method for displaying background sky in navigation system and apparatus thereof
FR2902381A1 (en) * 2006-06-20 2007-12-21 Peugeot Citroen Automobiles Sa Motor vehicle driving assisting method, involves merging image captured by image formation device of night vision system and synthesis image, and displaying merged image on internal display of night vision system

Also Published As

Publication number Publication date
US6611753B1 (en) 2003-08-26
NO20005221L (en) 2000-10-17
CA2326683A1 (en) 1999-10-28
NO20005221D0 (en) 2000-10-17
ATE310287T1 (en) 2005-12-15
DE69928387D1 (en) 2005-12-22
EP1074002B1 (en) 2005-11-16
DE69928387T2 (en) 2006-08-03
AU3489799A (en) 1999-11-08
EP1074002A1 (en) 2001-02-07

Similar Documents

Publication Publication Date Title
US6611753B1 (en) 3-dimensional intersection display for vehicle navigation system
US8095307B2 (en) Method for controlling the display of a geographical map in a vehicle and display apparatus for that purpose
EP0884711B1 (en) Map display apparatus for motor vehicle
EP1581782B1 (en) System and method for advanced 3d visualization for mobile navigation units
EP2023091B1 (en) Navigation device and navigation program
ES2293232T3 (en) NAVIGATION DEVICE AND METHOD TO VISUALIZE THE SIMULATED NAVIGATION DATA.
US20080167811A1 (en) Navigation device and method for displaying navigation information
US20090143980A1 (en) Navigation Device and Method of Scrolling Map Data Displayed On a Navigation Device
JP6121650B2 (en) Labeling map elements in a digital map
US20080091732A1 (en) Hierarchical system and method for on-demand loading of data in a navigation system
EP2503292B1 (en) Landmark icons in digital maps
US20060224311A1 (en) Navigation system
US20090082960A1 (en) Navigation system with enhanced display functions
RU2271516C2 (en) 2006-03-10 Method and arrangement for controlling a road crossing in a navigation system
US8988425B2 (en) Image display control system, image display control method, and image display control program
US20040125114A1 (en) Multiresolution image synthesis for navigation
EP1160544A2 (en) Map display device, map display method, and computer program for use in map display device
JP2007025362A (en) Image processor, distant view image display method and distant view image display program
KR20080019690A (en) Navigation device with camera-info
JP2006098225A (en) Navigation apparatus
JP2021181914A (en) Map display system and map display program
Bachman Moving Map and Situational Awareness Capabilities of the DAGR
NZ564320A (en) Navigation device and method of scrolling map data displayed on a navigation device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 34897/99

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 1999916613

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2326683

Country of ref document: CA

Ref country code: CA

Ref document number: 2326683

Kind code of ref document: A

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: KR

WWE Wipo information: entry into national phase

Ref document number: 507812

Country of ref document: NZ

WWP Wipo information: published in national office

Ref document number: 1999916613

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWG Wipo information: grant in national office

Ref document number: 1999916613

Country of ref document: EP