US20100286901A1 - Navigation device and method relating to an audible recognition mode - Google Patents

Navigation device and method relating to an audible recognition mode

Info

Publication number
US20100286901A1
Authority
US
United States
Prior art keywords
navigation device
audible
input
information
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/907,232
Inventor
Pieter Geelen
Mareije Roosen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TomTom International BV
Original Assignee
TomTom International BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TomTom International BV filed Critical TomTom International BV
Priority to US11/907,232
Assigned to TOMTOM INTERNATIONAL B.V. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GEELEN, PIETER; ROOSEN, MAREIJE
Publication of US20100286901A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3691 Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level

Definitions

  • the present application generally relates to navigation methods and devices.
  • Navigation devices were traditionally utilized mainly in vehicles, such as cars, motorcycles, trucks, boats, etc. Alternatively, if such navigation devices were portable, they were further transferable between vehicles and/or useable outside the vehicle, for foot travel for example.
  • These devices are typically tailored to produce a route of travel based upon an initial position of the navigation device and a selected/input travel destination (end position), noting that the initial position could be entered into the device, but is traditionally calculated via GPS positioning from a GPS receiver within the navigation device.
  • instructions are output along the route to a user of the navigation device, and these instructions may be at least one of audible and visual.
  • the inventors discovered that users of navigation devices may have some difficulty in operating and viewing touch panel screens. Thus, the inventors discovered that users desire at least limited hands-free access, especially when using the navigation device in a vehicle. As such, the inventors developed methods which allow hands-free or at least partially hands-free access by utilizing an audible recognition mode.
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; audibly outputting at least one determined choice relating to address information of a travel destination; and acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • a navigation device includes a processor to receive an indication of enablement of an audible recognition mode in a navigation device and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device to audibly output at least one determined choice relating to address information of a travel destination, the processor being further useable to acknowledge selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; and displaying on an integrated input and display device, subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • a navigation device includes a processor to receive an indication of enablement of an audible recognition mode in a navigation device; and an integrated input and display device to display, subsequent to the processor receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; receiving additional information from a source other than a user of the navigation device; formulating a question, answerable by a yes or no answer from the user, based upon the received additional information; and outputting the formulated question to the user.
  • a navigation device includes a processor to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device to output the formulated question to the user.
  • FIG. 1 illustrates an example view of a Global Positioning System (GPS);
  • FIG. 2 illustrates an example block diagram of electronic components of a navigation device of an embodiment of the present application
  • FIG. 3 illustrates an example block diagram of a server, navigation device and connection therebetween of an embodiment of the present application
  • FIGS. 4A and 4B are perspective views of an implementation of an embodiment of the navigation device
  • FIG. 5 illustrates a flow chart of an embodiment of a method of the present application
  • FIGS. 6A-D are examples of audible recognition mode icons for display in an embodiment of the present application.
  • FIG. 7 illustrates an example chart of an embodiment of the present application
  • FIG. 8 illustrates a flow chart of an embodiment of a method of the present application.
  • FIG. 9 illustrates a flow chart of an embodiment of a method of the present application.
  • FIG. 1 illustrates an example view of Global Positioning System (GPS), usable by navigation devices, including the navigation device of embodiments of the present application.
  • Such systems are known and are used for a variety of purposes.
  • GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users.
  • the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • the GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, but can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • the GPS system is denoted generally by reference numeral 100 .
  • a plurality of satellites 120 are in orbit about the earth 124 .
  • the orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous.
  • a GPS receiver 140 usable in embodiments of navigation devices of the present application, is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120 .
  • the spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock.
  • Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120.
  • the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120 , permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
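  • The two-dimensional triangulation described above can be illustrated with a short sketch. The following is a minimal, hypothetical example (not from the present application): given three known satellite positions and measured distances, subtracting pairs of circle equations yields two linear equations whose solution is the receiver position. Real GPS additionally solves for altitude and receiver clock bias using a fourth satellite.

```python
# Minimal 2-D trilateration sketch (illustrative only; real GPS solves a
# 3-D problem plus clock bias from at least four satellites).
def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three (position, distance) pairs."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting pairs of circle equations gives two linear equations.
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x2), 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d
    return (c * e - b * f) / det, (a * f - c * d) / det

# Receiver at (1, 1): distances to three known positions confirm the fix.
print(trilaterate_2d((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5))
```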
  • FIG. 2 illustrates an example block diagram of electronic components of a navigation device 200 of an embodiment of the present application, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • the navigation device 200 is located within a housing (not shown).
  • the housing includes a processor 210 connected to an input device 220 and a display screen 240 .
  • the input device 220 can include a keyboard device, voice input device, touch panel and/or any other known input device utilized to input information; and the display screen 240 can include any type of display screen such as an LCD display, for example.
  • the input device 220 and display screen 240 are integrated into an integrated input and display device, including a touchpad or touchscreen input wherein a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
  • the navigation device 200 can also include output devices 241, including but not limited to an audible output device.
  • the output device 241 can produce audible information for a user of the navigation device 200.
  • the input device 220 can also include a microphone and software for receiving input voice commands as well.
  • the processor 210 is operatively connected to and set to receive input information from the input device 220 via a connection 225, and operatively connected to at least one of the display screen 240 and the output device 241, via output connections 245, to output information thereto. Further, the processor 210 is operatively connected to memory 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200.
  • the external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example.
  • the connection to the I/O device 280 can further be a wired or wireless connection to any other external device, such as a car stereo unit for hands-free operation and/or for voice-activated operation for example, for connection to an earpiece or headphones, and/or for connection to a mobile phone for example, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network for example, and/or to establish a connection to a server via the internet or some other network for example.
  • the navigation device 200 may establish a “mobile” network connection with the server 302 via a mobile device 400 (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (such as a digital connection via known Bluetooth technology, for example). Thereafter, through its network service provider, the mobile device 400 can establish a network connection (through the internet, for example) with the server 302. As such, a “mobile” network connection is established between the navigation device 200 (which can be, and oftentimes is, mobile as it travels alone and/or in a vehicle) and the server 302 to provide a “real-time” or at least very “up to date” gateway for information.
  • the establishing of the network connection between the mobile device 400 (via a service provider) and another device such as the server 302, using the internet 410 for example, can be done in a known manner. This can include use of the TCP/IP layered protocol, for example.
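  • As a sketch of this connection step, the snippet below opens a plain TCP/IP connection from the device side; the host name and port are placeholders, as the application does not specify an endpoint or application protocol.

```python
# Hypothetical sketch: establishing a TCP/IP connection to a server.
import socket

def connect_to_server(host="server.example.com", port=80, timeout=10):
    """Open a TCP connection over the layered TCP/IP protocol stack."""
    return socket.create_connection((host, port), timeout=timeout)
```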
  • the mobile device 400 can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
  • an internet connection may be utilized, achieved via a data connection through a mobile phone or mobile phone technology within the navigation device 200, for example.
  • an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (a GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method to connect to the internet).
  • the navigation device 200 can further complete a data connection with the mobile device 400, and eventually with the internet 410 and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the Data Protocol Standard for the GSM standard, for example.
  • the navigation device 200 may include its own mobile phone technology within the navigation device 200 itself (including an antenna for example, wherein the internal antenna of the navigation device 200 can further alternatively be used).
  • the mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card, complete with necessary mobile phone technology and/or an antenna for example.
  • mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302 , via the internet 410 for example, in a manner similar to that of any mobile device 400 .
  • so that the Bluetooth-enabled device may correctly work with the ever-changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer-specific settings may be stored on the navigation device 200, for example.
  • the data stored for this information can be updated in a manner discussed in any of the embodiments, previous and subsequent.
  • FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255 , wherein the antenna/receiver 250 can be a GPS antenna/receiver for example.
  • the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but the antenna and receiver may be separately located components, and the antenna may be a GPS patch antenna or helical antenna, for example.
  • the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner.
  • different configurations of the components shown in FIG. 2 are considered within the scope of the present application.
  • the components shown in FIG. 2 may be in communication with one another via wired and/or wireless connections and the like.
  • the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200 .
  • the portable or handheld navigation device 200 of FIG. 2 can be connected or “docked” in a known manner to a motorized vehicle such as a car or boat for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.
  • FIG. 3 illustrates an example block diagram of a server 302 and a navigation device 200 of an embodiment of the present application, communicating via a generic communications channel 318.
  • the server 302 and a navigation device 200 of the present application can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).
  • the server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314 , to a mass data storage device 312 .
  • the processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and receive information to and from the navigation device 200 via communications channel 318.
  • the signals sent and received may include data, communication, and/or other propagated signals.
  • the transmitter 308 and receiver 310 may be selected or designed according to the communications requirements and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of the transmitter 308 and receiver 310 may be combined into a single transceiver.
  • Server 302 is further connected to (or includes) a mass storage device 312 , noting that the mass storage device 312 may be coupled to the server 302 via communication link 314 .
  • the mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302 .
  • the navigation device 200 is adapted to communicate with the server 302 through communications channel 318 , and includes processor, memory, etc. as previously described with regard to FIG. 2 , as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318 , noting that these devices can further be used to communicate with devices other than server 302 .
  • the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
  • Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200 .
  • One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200 .
  • another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200 .
  • the communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302 .
  • both the server 302 and navigation device 200 include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
  • the communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technologies. For example, according to at least one embodiment, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fiber optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, according to various embodiments, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • the communication channel 318 includes telephone and computer networks. Furthermore, in at least one embodiment, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, according to at least one embodiment, the communication channel 318 can accommodate satellite communication.
  • the communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology.
  • the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc.
  • Both digital and analogue signals can be transmitted through the communication channel 318 .
  • these signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
  • the mass data storage 312 includes sufficient memory for the desired navigation applications.
  • Examples of the mass data storage 312 may include magnetic data storage media such as hard drives for example, optical storage media such as CD-ROMs for example, charged data storage media such as flash memory for example, molecular memory, etc.
  • the server 302 includes a remote server accessible by the navigation device 200 via a wireless channel.
  • the server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • the server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200 .
  • a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200 .
  • a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
  • the navigation device 200 may be provided with information from the server 302 via information downloads, which may be updated periodically upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and the navigation device 200, via a wireless mobile connection device and TCP/IP connection for example.
  • the processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, processor 210 of navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to a server 302 .
  • the mass storage device 312 connected to the server 302 can include far more cartographic and route data than can be maintained on the navigation device 200 itself, including maps, etc.
  • the server 302 may, for example, handle the majority of the processing for navigation devices 200 which travel along the route, using a set of processing algorithms. Further, this processing can operate on the cartographic and route data stored in the mass storage device 312 together with signals (e.g. GPS signals) originally received by the navigation device 200.
  • a navigation device 200 of an embodiment of the present application includes a processor 210 , an input device 220 , and a display screen 240 .
  • the input device 220 and display screen 240 are integrated into an integrated input and display device to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example.
  • Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art.
  • the navigation device 200 can also include any additional input device 220 and/or any additional output device 241 , such as audio input/output devices for example.
  • FIGS. 4A and 4B are perspective views of an actual implementation of an embodiment of the navigation device 200 .
  • the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen for example) and the other components of FIG. 2 (including but not limited to the internal GPS receiver 250, the microprocessor 210, a power supply, memory 230, etc.).
  • the navigation device 200 may sit on an arm 292 , which itself may be secured to a vehicle dashboard/window/etc. using a large suction cup 294 .
  • This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
  • the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292 for example (this is only one example, as other known alternatives for connection to a docking station are within the scope of the present application).
  • the navigation device 200 may then be rotatable on the arm 292 , as shown by the arrow of FIG. 4B .
  • to release the navigation device 200 from the docking station, a button on the navigation device 200 may be pressed, for example (this is only one example, as other known alternatives for disconnection from a docking station are within the scope of the present application).
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200 ; determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; audibly outputting at least one determined choice relating to address information of a travel destination; and acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device 241 to audibly output at least one determined choice relating to address information of a travel destination, the processor 210 being further useable to acknowledge selection of the audibly output at least one choice upon receiving an affirmative audible input.
  • FIG. 5 illustrates a flowchart of an example embodiment of the present application.
  • it is first determined in step S2 whether or not an audible recognition mode has been enabled in the navigation device.
  • an icon can be displayed on an integrated input and display device 290 of the navigation device 200 .
  • Such an icon can be displayed in an initial or subsequent menu for selection prior to input/selection of a destination for establishing a route of travel, and/or can be displayed along with map information, for example, during use of the navigation device in a navigation mode.
  • This icon can include a pictorial illustration, such as the lips shown in FIG. 6A for example, and/or can include wording indicative of an audible recognition mode such as audible speech recognition (ASR).
  • An audible recognition mode can include the processor 210 working in conjunction with an ASR engine or module.
  • An ASR engine or module is a software engine that, once an audible recognition mode is enabled as explained above, can be loaded with grammatical rules in a language of the country of the user of the navigation device 200 (or a language selected by the user), for example.
  • a user of the navigation device 200 will typically enter/select a country in which the user is located, and the language of that country can then be selected, input or matched by the processor 210 . Thereafter, the ASR engine can then be loaded with grammatical rules from memory 230 , upon an audible recognition mode being enabled.
  • the ASR engine can then use the language corresponding to the chosen map to recognize geographical names (city and street names, for example) and the current user selected/enabled language to recognize common speech.
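  • As a rough sketch of this loading step (the data layout and function name below are assumptions, not the application's implementation), grammar for geographical names can follow the map language while grammar for common speech follows the user-selected language:

```python
# Hypothetical sketch of loading ASR grammar when the mode is enabled.
def enable_audible_recognition(map_language, ui_language, memory):
    """Select grammar sets for geographical names and common speech."""
    geo_grammar = memory["grammar"][map_language]    # city/street names
    speech_grammar = memory["grammar"][ui_language]  # yes/no/done/back, digits
    return {"geo": geo_grammar, "speech": speech_grammar}
```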
  • the system may be set up to enable recognition of complex speech from the user, or may be limited to only simple replies of yes, no, done, back, and/or numerical entries such as 1, 2, 3, etc.
  • the ASR engine or module is one which enables a speech interface between the user and the navigation device 200 .
  • Such a module is typically not usable in a portable navigation device 200 such as that shown in FIGS. 2-4B of the present application, but embodiments of the present application improve or even optimize memory management between the processor 210 and memory devices 230 for example, as well as data structures, to allow the ASR module to handle and recognize input information.
  • all or most available memory in the memory device(s) 230 of the navigation device 200 is allocated to the ASR module during speech recognition, namely upon the audible recognition mode being enabled in step S2 of FIG. 5, while other processes of the processor 210 are put on hold.
  • certain processes devoted to display of navigation information and output of navigation instructions must continue, thus sometimes slowing down operation of the ASR module.
  • the ASR module is primarily utilized in selecting address information of a travel destination based upon received audible input, and thus typically operates at a time when the navigation device 200 is not in use in a navigation mode.
  • another embodiment of the present application involves formulating simple questions, answerable by a yes/no answer (for example) from the user, to thereby enable processing capacity to be allocated to the navigation mode, with only a small amount of processing capacity needed in the ASR module to recognize such yes/no answers from the user of the navigation device 200 .
  • the operation shown in FIG. 5 typically occurs before the start of travel of the vehicle in which the navigation device 200 is located, namely before a travel destination is input into the navigation device 200 and before a travel route is determined.
  • in step S2, if the audible recognition mode is not enabled, the system cycles back to repeat step S2.
  • once the audible recognition mode is enabled, by the processor 210 receiving an indication of selection of the “talk to me” icon shown in FIG. 6A for example, language and grammar information is loaded into the ASR module of the navigation device 200 from memory 230 and the navigation device 200 merely awaits an audible input in step S4. If no audible input is received, the system merely cycles back to repeat step S4 until an audible input is received.
  • the ASR module is typically utilized to recognize speech information from different users. Such information is typically unpredictable, and therefore cannot be stored in memory 230 .
  • the ASR module or engine operates in conjunction with the processor 210 to convert received speech information to a sequence of phonemes in a known manner, and then works with processor 210 to match existing grammar of stored cities, street names, etc., to the converted sequence of phonemes.
  • in step S6, if an audible input is received, the processor 210 works with the ASR module to convert the input speech to phonemes and to compare the sequence of phonemes to stored information in memory 230 to determine at least one choice relating to address information of the travel destination based upon the received audible input.
  • the at least one choice relating to address information of a travel destination can include a city name. Accordingly, a user may audibly output a name of a city as part of the address information of the travel destination, wherein the initial input of the city could be prompted by the navigation device 200 displaying a request, such as “In which city?” for example, to enter travel destination information.
  • upon receipt of this audible information, the processor 210 and ASR module process the phonemes as described above and compare this information to stored cities in memory 230 to determine at least one choice relating to the input audible sound, if possible. If nothing was recognized, the navigation device 200 may return to a screen to prompt input of the city or other address information, and may or may not flash or otherwise display a message “input not recognized”, for example. As will be explained in another embodiment of the present application, a sound indicator can also be displayed to a user indicating whether or not the volume of the audible input is within an acceptable range, louder than an acceptable range, or softer than an acceptable range, for example.
  • if at least one address information choice (such as a city, for example) was determinable in step S6, the process proceeds to step S8, wherein at least one determined choice relating to address information of a travel destination is audibly output. For example, instead of the system merely guessing that an audible input was received correctly, the processor 210 instead directs audible output of at least one determined choice relating to address information of a travel destination in step S8. Thereafter, in step S10, the processor 210 waits to see if an affirmative audible input is received. If so, the processor 210 and ASR module can then acknowledge that a correct determination occurred, and can thus acknowledge selection of the audibly output at least one determined choice upon receiving and recognizing an affirmative audible input, such as a “yes” for example.
  • At least one determined choice relating to address information is first audibly output, and selection of the at least one determined choice is not acknowledged until an affirmative audible input is received.
  • in step S6, upon receipt of an audible input, at least one address information choice for the travel destination is determined, such as a city name for example.
  • a plurality of “N-best” choices are recognized by the processor 210 .
  • the processor 210 in conjunction with the ASR module, tries to best determine, from the phonemes of the audible input, a name of a city (in this first instance of input of address information for example).
  • the processor 210 scans or reviews all the various cities stored in memory 230 for a match.
  • the processor 210 then ranks the best possible matches such that the best possible match will be audibly output to the user of the navigation device 200 as the at least one determined choice relating to address information of the travel destination.
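  • A minimal sketch of this ranking step is shown below. The phoneme store and the similarity measure are illustrative assumptions; a real ASR engine scores acoustic and language models rather than comparing phoneme strings directly.

```python
# Hypothetical N-best ranking of stored city names against an input
# phoneme sequence (plain sequence similarity stands in for ASR scoring).
from difflib import SequenceMatcher

CITY_PHONEMES = {  # assumed store of phoneme sequences in memory 230
    "Salt Lake City": "S AO L T L EY K S IH T IY",
    "Salem":          "S EY L AH M",
    "San Antonio":    "S AE N AE N T OW N IY OW",
    "Springfield":    "S P R IH NG F IY L D",
    "Staunton":       "S T AO N T AH N",
}

def n_best(input_phonemes, n=6):
    """Rank stored cities by similarity to the input phoneme sequence."""
    scored = sorted(
        ((SequenceMatcher(None, input_phonemes.split(), ref.split()).ratio(), city)
         for city, ref in CITY_PHONEMES.items()),
        reverse=True,
    )
    return [city for _, city in scored[:n]]

choices = n_best("S AO L T L EY K S IH T IY")
print(choices[0])  # best match is audibly output; the rest are displayed
```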
  • the processor 210 can also direct the navigation device 200 to display an “N-best” list of choices, such as the N-best matches of city names determined by the processor 210 for example, on the integrated input and display device 290 .
  • the best possible match based upon the audible input received from the user may be audibly output and may further be displayed visually at the top of the “N-best” list (as the number one choice in a displayed list).
  • next-best choices can be visually displayed to the user in step S14 as numbered choices, such as choices two through six for example.
  • a visually output choice may be selected in step S16, via display and subsequent input through the integrated input and display device 290, for example. If selected, selection can be acknowledged in step S20 of FIG. 5, by the processor 210 for example.
  • the processor 210 and ASR module may not only be used to determine one single choice, but can be used to determine a plurality of choices relating to the address information of the travel destination.
  • Each of the plurality of choices may be visually output and only one choice may be audibly output, for example.
  • the plurality of choices may be visually output for selection on the integrated input and display device 290 of the navigation device 200 .
  • Each of these choices such as a list of cities sounding most like the audible input for example, can be determined and displayed and are selectable by at least one of a touch panel and audible output. Further, the audibly output at least one choice is further selectable via receipt of an indication of touch panel input.
  • each of the plurality of determined choices may be selectable via receipt of an indication of a touch panel input, and/or by audible input of a number corresponding to a displayed choice (for example, a user saying “two” to select the second displayed choice).
  • the processor 210 and ASR module can determine an “N-best” list of cities to be audibly and visually output.
  • the first city in the displayed list may be “Salt Lake City”, and may be both audibly output and visually output on an integrated input and display device 290 of the navigation device for example.
  • other “N-best” cities can be determined by the processor 210 and the ASR module, including, for example, five other cities such as Salem, California, San Antonio, Springfield, and Staunton.
  • the “N-best” list includes a set number of choices, such as six choices for example.
  • These six choices can then be displayed to the user for audible or touch panel input/selection. Accordingly, if all six choices are displayed in order on the touch panel of the integrated input and display device 290 , the user may merely touch and thereby select one of the six choices.
  • the user can acknowledge selection of the audibly output choice by issuing an affirmative audible input.
  • the user can select any one of the other five displayed choices (or even the first choice, for example) by merely stating the number corresponding to the particular choice, such as “6” representing the sixth choice of “Staunton.”
  • by utilizing an affirmative audible input, and/or an audible input of only one of six numerical values, the processor 210 increases the likelihood of confirming a user's selection and thereby can adequately acknowledge selection of a particular choice by the user.
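  • A small sketch of this confirmation step, under the assumption that the ASR module has already reduced the reply to a single recognized word (“yes” or a number word), might look as follows:

```python
# Hypothetical interpretation of a recognized reply against the N-best list.
NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5, "six": 6}

def interpret_reply(reply, choices):
    """Map 'yes' to the audibly output top choice, or a number to a choice."""
    word = reply.strip().lower()
    if word == "yes":
        return choices[0]              # acknowledge the audibly output choice
    index = NUMBER_WORDS.get(word, 0) - 1
    if 0 <= index < len(choices):
        return choices[index]          # e.g. "six" selects the sixth choice
    return None                        # not recognized; re-prompt the user
```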
  • a user can issue another audible output, for input/receipt by the processor 210 and ASR module, corresponding to a street name for example.
  • the processor 210 and ASR module can determine at least one street name subsequent to selection of city name and subsequent to receiving another audible output.
  • the processor 210 and ASR module may determine an “N-best” list of street names, for subsequent audible and/or visual output to the user of the navigation device 200 , for subsequent selection thereof. Selection can be done in the same manner as discussed previously with regard to city names.
  • a user can audibly output a number corresponding to the last element of a travel destination address, for input/receipt by the processor 210 and ASR module, which can be recognized and which can be used to determine an “N-best” list in the same manner as previously stated with regard to the city and street names.
  • the user may merely enter the numerical element (number) of the address of the travel destination.
  • an entire address of a travel destination can be input and can thereafter be used by the processor 210 , to determine a travel route (in conjunction with a GPS signal indicating current location of the navigation device 200 and stored map information in memory 230 , for example).
  • each of the plurality of countries, states, cities, or street names may be visually output and only one audibly output for subsequent selection thereof, either by touch panel input or audible input in a manner similar to that previously described.
  • FIG. 6A provides an illustration of a non-limiting example of a selectable icon for enablement of an audible recognition mode.
  • the icon display may be varied to indicate to the user that the audible recognition mode has been enabled and that the system is merely awaiting receipt of the audible input, as indicated in step S4 of FIG. 5 for example.
  • the display may include varying the displayed icon in some way, such as changing the color of the virtual button shown in FIG. 6A for example, or otherwise changing the appearance of this virtual button/icon. This is shown in FIG. 6B, noting that the button may be a different color, such as green, when waiting for an audible input.
  • the virtual button/icon may be altered again while the system is determining address information choices for a travel destination in step S 6 for example, in a manner such as that shown in FIG. 6C for example.
  • the icon may again be altered as shown in FIG. 6D for example. This can provide feedback to the user regarding the use of the audible recognition mode.
  • the determining of at least one choice relating to address information of the travel destination based upon a received audible input in step S6 of FIG. 5 can relate to input of a country/state/city/street address of a travel destination in a normal fashion for example, and/or can relate to determination of a travel destination based upon a recent destination, a Point of Interest, a favorite, etc., as shown in FIG. 7 for example. Accordingly, upon receiving an indication of enablement of an audible recognition mode in step S2, a message such as “Where would you like to go?” can be displayed to the user on the integrated input and display device 290 of the navigation device 200, for example.
  • the initial audible input received in step S 4 could be that of a word relating to a category of information, such as “home” 710 , “favorite” 720 , “address” 730 , “recent destination” 740 or “Point of Interest (POI)” 750 .
  • the processor 210 and ASR module can be programmed to recognize one of the aforementioned categories 710 , 720 , 730 , 740 , or 750 , such that the determined at least one choice relating to address information of a travel destination may include traditional information such as cities, state, street names, etc., or may include other types of information such as Points of Interest, favorites, etc.
  • each of these processes may determine an output of choices relating to address information of a travel destination, noting that a most likely choice may be audibly output and selection thereof acknowledged by affirmative audible input (or touch panel input), with other “N-best” choices being visually output with selection thereof being acknowledged by at least one of audible and visual input.
  • in at least one embodiment, the recognition process for geographical names may work according to a set of grammatical rules of the type described above.
  • a navigation device 200 including a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device 241 to audibly output at least one determined choice relating to address information of a travel destination, the processor 210 being further useable to acknowledge selection of the audibly output at least one choice upon receiving an affirmative audible input.
  • Such a navigation device 200 may further include an integrated input and display device 290 as the output device 241, to enable display of icons and/or selections and subsequent selection thereof, and/or can further include an audible output device such as a speaker, for example.
  • an input device 220 can include a microphone.
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200 ; and displaying on an integrated input and display device 290 , subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 ; and an integrated input and display device 290 to display, subsequent to the processor 210 receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • an embodiment of the present application can be used to indicate to a user whether or not audible input, such as that of step S 4 of FIG. 5 for example, is within an acceptable range.
  • in FIG. 8, it is initially determined by the processor 210 in step S20, for example in conjunction with the ASR module, whether or not an audible recognition mode has been enabled. If so, one of three different displays can be displayed in steps S24, S28, and S32, depending on whether or not the volume of the audible input is determined to be within an acceptable range.
  • the processor 210 and ASR module can attempt to ascertain the input information. The processor 210 and ASR module have a better chance of determining a correct input if the volume is within an acceptable range.
  • it is determined in step S22 whether or not the volume of the audible input is within an acceptable range. This can be done by the processor 210 comparing the volume of the received information with an acceptable range stored in memory, for example, with a threshold upper limit and threshold lower limit. If the volume of the received audible input is within the upper and lower thresholds in step S22, the processor then determines that the volume of the audible input is within an acceptable range. In response thereto, the process moves to step S24, wherein the processor 210 directs display of an indication that the volume is within an acceptable range. For example, this display may include changing the color of the “talk to me” icon shown in FIG. 6A to an icon such as that shown in FIG. 6B, in a green color indicative of acceptance for example. Alternatively, another indicator may be displayed, again noting that the indicator may be displayed in a color indicative of acceptance, such as a green color for example.
  • if the volume is not within the acceptable range in step S22, the processor 210 then moves to either step S26 or step S30 to determine if the volume was louder than an acceptable range or softer than an acceptable range. It should be noted that the order of steps S26 and S30 is not important, as such determinations can be made in any order. If it is determined that the volume is louder than an acceptable range in step S26, namely greater than the upper threshold of the acceptable range, an indication may be displayed in step S28, indicating that the volume is louder than an acceptable range. For example, the icon of FIG. 6B may be displayed in red, for example (a color indicative of incorrectness and something being too high), indicating that the audible input was too loud, and/or a red indicator may be displayed to the user, again indicating that the volume is too loud.
  • if the volume is not louder than the acceptable range, the processor 210 moves to step S30, wherein it determines whether or not the volume is softer than an acceptable range. If so, an indication may be displayed in step S32, indicating that the volume is softer than an acceptable range. For example, this may involve displaying the icon of FIG. 6B in a yellow color, for example, indicating to the user that the audible input is not loud enough. Alternatively, a yellow indicator may be displayed on the integrated information and display device 290, for example.
  • a method of the present application can include receiving an indication of enablement of an audible recognition mode in a navigation device 200 and displaying, on an integrated input and display device 290 and subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • the display can include a display of color information to display the indications for example, wherein a yellow color may be used to indicate that the received audible input is softer than the acceptable range, a red color may be used to indicate that the received audible input is louder than the acceptable range, and a green color may be used to indicate that the received audible input is within an acceptable range.
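  • The threshold comparison of FIG. 8 can be sketched as below; the RMS level computation and the numeric thresholds are assumptions for illustration, not values given in the application.

```python
# Hypothetical volume check for steps S22-S32 of FIG. 8.
import math

LOWER_THRESHOLD = 0.05  # assumed lower limit of the acceptable range
UPPER_THRESHOLD = 0.80  # assumed upper limit of the acceptable range

def volume_indicator(samples):
    """Return an indicator color for normalized audio samples in [-1, 1]."""
    level = math.sqrt(sum(s * s for s in samples) / len(samples))  # RMS
    if level > UPPER_THRESHOLD:
        return "red"     # louder than the acceptable range (step S28)
    if level < LOWER_THRESHOLD:
        return "yellow"  # softer than the acceptable range (step S32)
    return "green"       # within the acceptable range (step S24)
```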
  • Address information regarding a travel destination of a user may be received in conjunction with the process shown in FIG. 8 for example, wherein the display may then indicate if the received information is within an acceptable range.
  • the address information can include at least one of a city and street name information.
  • the process may include at least one of recognizing the address information, displaying an indication of no recognition and displaying, on the integrated input and display device 290 , a list of choices to the user for selection.
  • the processes as shown in FIGS. 5 and 8 can be integrated.
  • a navigation device 200 including a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 ; and an integrated input and display device 290 to display, subsequent to the processor 210 receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • a navigation device 200 may further include an audible output device such as a speaker, for example.
  • an input device 220 can include a microphone.
  • FIG. 9 is directed to another embodiment of the present application.
  • a navigation device 200 is not being used in a navigation mode.
  • while the process set forth in FIG. 5 can be used with the navigation device 200 in a navigation mode, this is typically not the case, as the vehicle in which the navigation device 200 is located, for example, is usually stationary upon a user inputting a travel destination from which a route of travel can be determined.
  • a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200 ; receiving additional information from a source other than a user of the navigation device 200 ; formulating a question, answerable by a yes or no answer from the user, based upon the received additional information; and outputting the formulated question to the user.
  • a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device 200, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device 241 to output the formulated question to the user.
  • FIG. 9 of the present application includes a process involving enablement of an audible recognition mode which is more likely to be usable while the vehicle in which the navigation device 200 is located is moving; e.g., where the navigation device 200 is operated in a navigation mode.
  • in step S50, it is initially determined whether or not an audible recognition mode is enabled. This can be done, for example, in a manner similar to that previously described, including recognition of selection of the icon shown in FIG. 6A for example.
  • the processor 210 of the navigation device 200 may not only monitor receipt of audible information from a user, but can also monitor receipt of additional information from a source other than a user of the navigation device 200 .
  • In step S52, it is determined by the processor 210 whether or not additional information from a source other than a user is received.
  • This information can include, but is not limited to, an incoming call or message (such as a telephone call or SMS message received by the navigation device 200 itself and/or via a paired mobile phone, for example), received traffic information, etc. If no such information is received, the process merely cycles back and continues to monitor for it.
  • If additional information from a source other than the user of the navigation device 200 is received in step S52, the process moves to step S54, wherein a question is formulated by the processor 210, answerable by a yes or no answer from the user, based upon the received additional information.
  • the processor 210 can monitor other systems in the navigation device 200 (including paired mobile phones, for example) to determine whether or not, for example, an SMS message is received.
  • the processor 210 may work with the ASR module and/or, more likely, a TTS (Text To Speech) module to formulate a question answerable by a yes/no answer from the user, such as, for example, “A new message was received; shall I read it aloud?” Thereafter, the formulated question may be output in step S56, noting that the output is preferably an audible output (but may also be accompanied by a visual output, for example).
  • the navigation device 200 may determine receipt of a traffic update indicating a traffic delay of a particular period of time along the route (calculable by the processor 210 in a known manner, for example), wherein the processor 210 and TTS module can then instruct the output of, for example, “Traffic delay on your route now ‘x’ minutes. Do you want to replan the route to minimize delays?”
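  • As a hedged sketch of the question formulation just described (the event names and templates below are illustrative assumptions, not code of the present application), the behavior of the processor 210 and TTS module might be modeled as follows:

        def formulate_question(event_type, payload):
            # Formulate a question answerable by a yes or no answer,
            # based upon the received additional information.
            if event_type == "sms":
                return "A new message was received; shall I read it aloud?"
            if event_type == "traffic":
                # Insert the calculated delay into a stored question template.
                minutes = payload["delay_minutes"]
                return ("Traffic delay on your route now %d minutes. "
                        "Do you want to replan the route to minimize delays?"
                        % minutes)
            raise ValueError("unhandled additional-information type")

        print(formulate_question("traffic", {"delay_minutes": 12}))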
  • the ASR module is typically utilized to recognize speech information from different users. Such information is typically unpredictable, and therefore cannot normally be stored in memory 230 .
  • the ASR module or engine operates in conjunction with the processor 210 to convert received speech information to a sequence of phonemes in a dynamic manner, and works with processor 210 to match existing grammar of stored cities, street names, etc., to the converted sequence of phonemes as described above. As such, the ASR module dynamically causes the processor 210 to utilize large chunks of memory 230 .
  • the TTS module forms questions which can be predefined or prerecorded in memory 230 for example.
  • the TTS module can output any kind of audio information, provided that it is in the language to which the voice corresponds. Some parts of the phrases that are considered to be used most often can be prerecorded, stored, and later used by the TTS module as well, to improve the quality of the output.
  • the TTS module typically works best in conjunction with processor 210 for outputting of preformulated questions, slightly modifiable if necessary, upon a processor 210 determining that additional information such as an SMS message, traffic update, etc., has been received by the navigation device 200 .
  • additional information can include traffic information, an incoming telephone call, an incoming SMS message, etc.
  • the formulating of the question can include inserting information, based upon the received information, into a stored question, such as inserting a traffic delay into the aforementioned traffic delay question for example.
  • the formulating can include inserting information regarding a calculated traffic delay, based upon a received traffic information, into a stored question.
  • the formulated question can be output, noting that the output may include at least one of an audible and visual output.
  • the formulated question output in step S56 is typically formulated to receive a yes or no answer from the user, thereby enabling the processor 210 to operate in conjunction with the ASR module during driving conditions, when the navigation device 200 is operating in a navigation mode.
  • In such a navigation mode, the navigation device 200 is already utilizing much of the existing memory 230, and it is therefore preferable that the ASR module not utilize so much of memory 230.
  • the processor 210 and ASR module can easily recognize the short yes or no answer of the user.
  • a subsequent action may be performed by the navigation device 200 upon receipt of a yes answer from the user, such as calculating a new route of travel based upon receipt of a yes answer from the user regarding a calculated traffic delay for example.
  • the SMS message can be converted to speech by utilizing the TTS module, for example, such that an incoming text message can be audibly output to the user upon receipt of a yes answer from the user.
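  • A minimal sketch of the subsequent-action handling described above, under assumed names (the action callables are placeholders standing in for route recalculation and TTS read-out; they are not the application's implementation):

        def handle_answer(answer, event_type, actions):
            # A recognized "yes" triggers the action paired with the event;
            # any other reply leaves the current state unchanged.
            if answer.strip().lower() == "yes":
                actions[event_type]()

        handle_answer("yes", "traffic", {
            "traffic": lambda: print("recalculating route..."),
            "sms": lambda: print("reading message aloud..."),
        })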
  • At least one embodiment of the present application is directed to a navigation device 200, including a processor 210 to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device 200, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device 241 to output the formulated question to the user.
  • Such a navigation device 200 may further include an integrated input and display device 290 as the output device 241, to enable display of icons and/or selections, and subsequent selection thereof, and/or can further include an audible output device such as a speaker, for example.
  • an input device 220 can include a microphone.
  • the methods of at least one embodiment expressed above may be implemented as a computer data signal embodied in a carrier wave or propagated signal that represents a sequence of instructions which, when executed by a processor (such as processor 304 of server 302, and/or processor 210 of navigation device 200, for example), causes the processor to perform a respective method.
  • at least one method provided above may be implemented as a set of instructions contained on a computer readable or computer accessible medium, such as one of the memory devices previously described, for example, to perform the respective method when executed by a processor or other computer device.
  • the medium may be a magnetic medium, electronic medium, optical medium, etc.
  • any of the aforementioned methods may be embodied in the form of a program.
  • the program may be stored on a computer readable medium and is adapted to perform any one of the aforementioned methods when run on a computer device (a device including a processor).
  • the storage medium or computer readable medium is adapted to store information and is adapted to interact with a data processing facility or computer device to perform the method of any of the above mentioned embodiments.
  • the storage medium may be a built-in medium installed inside a computer device main body or a removable medium arranged so that it can be separated from the computer device main body.
  • Examples of the built-in medium include, but are not limited to, rewriteable non-volatile memories, such as ROMs and flash memories, and hard disks.
  • Examples of the removable medium include, but are not limited to, optical storage media such as CD-ROMs and DVDs; magneto-optical storage media, such as MOs; magnetic storage media, including but not limited to floppy disks (trademark), cassette tapes, and removable hard disks; media with a built-in rewriteable non-volatile memory, including but not limited to memory cards; and media with a built-in ROM, including but not limited to ROM cassettes; etc.
  • various information regarding stored images, for example property information, may be stored in any other form, or it may be provided in other ways.
  • the electronic components of the navigation device 200 and/or the components of the server 302 can be embodied as computer hardware circuitry or as a computer readable program, or as a combination of both.
  • the system and method of embodiments of the present application include software operative on the processor to perform at least one of the methods according to the teachings of the present application.
  • One of ordinary skill in the art will understand, upon reading and comprehending this disclosure, the manner in which a software program can be launched from a computer readable medium in a computer based system to execute the functions found in the software program.
  • One of ordinary skill in the art will further understand the various programming languages which may be employed to create a software program designed to implement and perform at least one of the methods of the present application.
  • the programs can be structured in an object-orientation using an object-oriented language including but not limited to JAVA, Smalltalk, C++, etc., and the programs can be structured in a procedural-orientation using a procedural language including but not limited to COBOL, C, etc.
  • the software components can communicate in any number of ways that are well known to those of ordinary skill in the art, including but not limited to use of application program interfaces (APIs) and interprocess communication techniques, including but not limited to remote procedure call (RPC), common object request broker architecture (CORBA), Component Object Model (COM), Distributed Component Object Model (DCOM), Distributed System Object Model (DSOM), and Remote Method Invocation (RMI).
  • any one of the above-described and other example features of the present invention may be embodied in the form of an apparatus, method, system, computer program and computer program product.
  • the aforementioned methods may be embodied in the form of a system or device, including, but not limited to, any of the structure for performing the methodology illustrated in the drawings.

Abstract

A method and device are disclosed for navigation. In at least one embodiment, the method includes receiving an indication of enablement of an audible recognition mode in a navigation device; determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; audibly outputting at least one determined choice relating to address information of a travel destination; and acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input. In at least one embodiment, the navigation device includes a processor to receive an indication of enablement of an audible recognition mode in a navigation device and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device to audibly output at least one determined choice relating to address information of a travel destination, the processor being further useable to acknowledge selection of the audibly output at least one determined choice upon receiving an affirmative audible input.

Description

    CO-PENDING APPLICATIONS
  • The following applications are being filed concurrently with the present application. The entire contents of each of the following applications is hereby incorporated herein by reference: A NAVIGATION DEVICE AND METHOD FOR EARLY INSTRUCTION OUTPUT (Attorney docket number 06P207US01) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR ESTABLISHING AND USING PROFILES (Attorney docket number 06P207US02) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR ENHANCED MAP DISPLAY (Attorney docket number 06P207US03) filed on even date herewith; NAVIGATION DEVICE AND METHOD FOR PROVIDING POINTS OF INTEREST (Attorney docket number 06P207US05) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR FUEL PRICING DISPLAY (Attorney docket number 06P057US06) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR INFORMATIONAL SCREEN DISPLAY (Attorney docket number 06P207US06) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR DEALING WITH LIMITED ACCESS ROADS (Attorney docket number 06P057US07) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR TRAVEL WARNINGS (Attorney docket number 06P057US07) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR DRIVING BREAK WARNING (Attorney docket number 06P057US07) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR ISSUING WARNINGS (Attorney docket number 06P207US07) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR DISPLAY OF POSITION IN TEXT READIBLE FORM (Attorney docket number 06P207US08) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR EMERGENCY SERVICE ACCESS (Attorney docket number 06P057US08) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR PROVIDING REGIONAL TRAVEL INFORMATION IN A NAVIGATION DEVICE (Attorney docket number 06P207US09) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR USING SPECIAL CHARACTERS IN A NAVIGATION DEVICE (Attorney docket number 06P207US09) filed on even date herewith; A NAVIGATION DEVICE AND METHOD USING A PERSONAL AREA NETWORK (Attorney docket number 06P207US10) filed on even date herewith; A NAVIGATION DEVICE AND METHOD USING A LOCATION MESSAGE (Attorney docket number 06P207US10) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR CONSERVING POWER (Attorney docket number 06P207US11) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR USING A TRAFFIC MESSAGE CHANNEL (Attorney docket number 06P207US13) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR USING A TRAFFIC MESSAGE CHANNEL RESOURCE (Attorney docket number 06P207US13) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR QUICK OPTION ACCESS (Attorney docket number 06P207US15) filed on even date herewith; A NAVIGATION DEVICE AND METHOD FOR DISPLAYING A RICH CONTENT DOCUMENT (Attorney docket number 06P207US27) filed on even date herewith.
  • PRIORITY STATEMENT
  • The present application hereby claims priority under 35 U.S.C. §119(e) on each of U.S. Provisional Patent Application No. 60/879,523 filed Jan. 10, 2007, 60/879,549 filed Jan. 10, 2007, 60/879,553 filed Jan. 10, 2007, 60/879,577 filed Jan. 10, 2007, and 60/879,599 filed Jan. 10, 2007, the entire contents of each of which is hereby incorporated herein by reference.
  • FIELD
  • The present application generally relates to navigation methods and devices.
  • BACKGROUND
  • Navigation devices were traditionally utilized mainly in the areas of vehicle use, such as on cars, motorcycles, trucks, boats, etc. Alternatively, if such navigation devices were portable, they were further transferable between vehicles and/or useable outside the vehicle, for foot travel for example.
  • These devices are typically tailored to produce a route of travel based upon an initial position of the navigation device and a selected/input travel destination (end position), noting that the initial position could be entered into the device, but is traditionally calculated via GPS Positioning from a GPS receiver within the navigation device. To aid in navigation of the route, instructions are output along the route to a user of the navigation device. These instructions may be at least one of audible and visual.
  • SUMMARY
  • The inventors discovered that users of navigation devices may have some difficulty in operating and viewing touch panel screens. Thus, the inventors discovered that users desire at least limited hands-free access, especially when using the navigation device in a vehicle. As such, the inventors developed methods which allow hands-free, or at least partially hands-free, access by utilizing an audible recognition mode.
  • In at least one embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; audibly outputting at least one determined choice relating to address information of a travel destination; and acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • In at least one embodiment of the present application, a navigation device includes a processor to receive an indication of enablement of an audible recognition mode in a navigation device and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device to audibly output at least one determined choice relating to address information of a travel destination, the processor being further useable to acknowledge selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • In at least one other embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; and displaying on an integrated input and display device, subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, or softer than the acceptable range.
  • In at least one other embodiment of the present application, a navigation device includes a processor to receive an indication of enablement of an audible recognition mode in a navigation device; and an integrated input and display device to display, subsequent to the processor receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, or softer than the acceptable range.
  • In at least one other embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device; receiving additional information from a source other than a user of the navigation device; formulating a question, answerable by a yes or no answer from the user, based upon the received additional information; and outputting the formulated question to the user.
  • In at least one other embodiment of the present application, a navigation device includes a processor to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device to output the formulated question to the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present application will be described in more detail below by using example embodiments, which will be explained with the aid of the drawings, in which:
  • FIG. 1 illustrates an example view of a Global Positioning System (GPS);
  • FIG. 2 illustrates an example block diagram of electronic components of a navigation device of an embodiment of the present application;
  • FIG. 3 illustrates an example block diagram of a server, navigation device and connection therebetween of an embodiment of the present application;
  • FIGS. 4A and 4B are perspective views of an implementation of an embodiment of the navigation device;
  • FIG. 5 illustrates a flow chart of an embodiment of a method of the present application;
  • FIGS. 6A-D are examples of audible recognition mode icons for display in an embodiment of the present application;
  • FIG. 7 illustrates an example chart of an embodiment of the present application;
  • FIG. 8 illustrates a flow chart of an embodiment of a method of the present application; and
  • FIG. 9 illustrates a flow chart of an embodiment of a method of the present application.
  • DETAILED DESCRIPTION OF THE EXAMPLE EMBODIMENTS
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes” and/or “including”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • In describing example embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner.
  • Referencing the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, example embodiments of the present patent application are hereafter described. Like numbers refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
  • FIG. 1 illustrates an example view of Global Positioning System (GPS), usable by navigation devices, including the navigation device of embodiments of the present application. Such systems are known and are used for a variety of purposes. In general, GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users.
  • Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, but can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • As shown in FIG. 1, the GPS system is denoted generally by reference numeral 100. A plurality of satellites 120 are in orbit about the earth 124. The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous. A GPS receiver 140, usable in embodiments of navigation devices of the present application, is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
  • The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock. Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120. It is appreciated by those skilled in the relevant art that the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
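  • As an illustrative sketch of the geometric triangulation outlined above (not part of the present application; the positions and ranges below are made-up numbers), a two-dimensional position can be recovered from three known transmitter positions by subtracting the circle equations pairwise:

        import math

        def trilaterate_2d(p1, r1, p2, r2, p3, r3):
            (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
            # Subtracting the circle equations pairwise yields two
            # linear equations in the unknown receiver position (x, y).
            a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
            c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
            a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
            c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
            det = a1 * b2 - a2 * b1
            return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

        # Receiver at (1, 1); ranges measured from transmitters at
        # (0, 0), (4, 0) and (0, 4).
        x, y = trilaterate_2d((0, 0), math.sqrt(2),
                              (4, 0), math.sqrt(10),
                              (0, 4), math.sqrt(10))
        print(round(x, 6), round(y, 6))  # prints: 1.0 1.0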
  • FIG. 2 illustrates an example block diagram of electronic components of a navigation device 200 of an embodiment of the present application, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • The navigation device 200 is located within a housing (not shown). The housing includes a processor 210 connected to an input device 220 and a display screen 240. The input device 220 can include a keyboard device, voice input device, touch panel and/or any other known input device utilized to input information; and the display screen 240 can include any type of display screen such as an LCD display, for example. In at least one embodiment of the present application, the input device 220 and display screen 240 are integrated into an integrated input and display device, including a touchpad or touchscreen input wherein a user need only touch a portion of the display screen 240 to select one of a plurality of display choices or to activate one of a plurality of virtual buttons.
  • In addition, other types of output devices 241 can also be included, including but not limited to an audible output device. As the output device 241 can produce audible information for a user of the navigation device 200, it is equally understood that the input device 220 can also include a microphone and software for receiving input voice commands as well.
  • In the navigation device 200, processor 210 is operatively connected to and set to receive input information from input device 220 via a connection 225, and operatively connected to at least one of display screen 240 and output device 241, via output connections 245, to output information thereto. Further, the processor 210 is operatively connected to memory 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200. The external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example. The connection to I/O device 280 can further be a wired or wireless connection to any other external device, such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an earpiece or headphones, and/or for connection to a mobile phone for example, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network for example, and/or to establish a connection to a server via the internet or some other network for example.
  • The navigation device 200, in at least one embodiment, may establish a “mobile” network connection with the server 302 via a mobile device 400 (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (such as a digital connection via known Bluetooth technology for example). Thereafter, through its network service provider, the mobile device 400 can establish a network connection (through the internet for example) with a server 302. As such, a “mobile” network connection is established between the navigation device 200 (which can be, and often times is mobile as it travels alone and/or in a vehicle) and the server 302 to provide a “real-time” or at least very “up to date” gateway for information.
  • The establishing of the network connection between the mobile device 400 (via a service provider) and another device such as the server 302, using the internet 410 for example, can be done in a known manner. This can include use of TCP/IP layered protocol for example. The mobile device 400 can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
  • As such, an internet connection may be utilized which is achieved via a data connection, via a mobile phone or mobile phone technology within the navigation device 200, for example. For this connection, an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (a GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method to connect to the internet).
  • The navigation device 200 can further complete a data connection with the mobile device 400, and eventually with the internet 410 and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the GSRM, the Data Protocol Standard for the GSM standard, for example.
  • The navigation device 200 may include its own mobile phone technology within the navigation device 200 itself (including an antenna for example, wherein the internal antenna of the navigation device 200 can further alternatively be used). The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card, complete with necessary mobile phone technology and/or an antenna for example. As such, mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet 410 for example, in a manner similar to that of any mobile device 400.
  • For GPRS phone settings, so that a Bluetooth enabled navigation device 200 may be used to correctly work with the ever changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer specific settings may be stored on the navigation device 200, for example. The data stored for this information can be updated in a manner discussed in any of the embodiments, previous and subsequent.
  • FIG. 2 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 can be a GPS antenna/receiver for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna for example.
  • Further, it will be understood by one of ordinary skill in the art that the electronic components shown in FIG. 2 are powered by power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in FIG. 2 are considered within the scope of the present application. For example, in one embodiment, the components shown in FIG. 2 may be in communication with one another via wired and/or wireless connections and the like. Thus, the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
  • In addition, the portable or handheld navigation device 200 of FIG. 2 can be connected or “docked” in a known manner to a motorized vehicle such as a car or boat for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.
  • FIG. 3 illustrates an example block diagram of a server 302 and a navigation device 200 of the present application, via a generic communications channel 318, of an embodiment of the present application. The server 302 and a navigation device 200 of the present application can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).
  • The server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312. The processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and receive information to and from the navigation device 200 via communications channel 318. The signals sent and received may include data, communication, and/or other propagated signals. The transmitter 308 and receiver 310 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
  • Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314. The mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.
  • The navigation device 200 is adapted to communicate with the server 302 through communications channel 318, and includes processor, memory, etc. as previously described with regard to FIG. 2, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
  • Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200. One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200. According to at least one embodiment of the present application, another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
  • The communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302. According to at least one embodiment of the present application, both the server 302 and navigation device 200 include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
  • The communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technologies. For example, according to at least one embodiment, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fiber optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, according to at least one embodiment, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • In at least one embodiment of the present application, for example, the communication channel 318 includes telephone and computer networks. Furthermore, in at least one embodiment, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, according to at least one embodiment, the communication channel 318 can accommodate satellite communication.
  • The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology. For example, the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc. Both digital and analogue signals can be transmitted through the communication channel 318. According to at least one embodiment, these signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
  • The mass data storage 312 includes sufficient memory for the desired navigation applications. Examples of the mass data storage 312 may include magnetic data storage media such as hard drives for example, optical storage media such as CD-ROMs for example, charged data storage media such as flash memory for example, molecular memory, etc.
  • According to at least one embodiment of the present application, the server 302 includes a remote server accessible by the navigation device 200 via a wireless channel. According to at least one other embodiment of the application, the server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • According to at least one embodiment of the present application, the server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200. Alternatively, a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200. Alternatively, a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
  • The navigation device 200 may be provided with information from the server 302 via information downloads which may be periodically updated upon a user connecting navigation device 200 to the server 302 and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and navigation device 200 via a wireless mobile connection device and TCP/IP connection for example. For many dynamic calculations, the processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, processor 210 of navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to a server 302.
  • The mass storage device 312 connected to the server 302 can include volumes more cartographic and route data than that which is able to be maintained on the navigation device 200 itself, including maps, etc. The server 302 may handle, for example, the majority of the processing needs of a navigation device 200 traveling along a route, using a set of processing algorithms. Further, processing involving the cartographic and route data stored in memory 312 can operate on signals (e.g. GPS signals) originally received by the navigation device 200.
  • As indicated above in FIG. 2 of the application, a navigation device 200 of an embodiment of the present application includes a processor 210, an input device 220, and a display screen 240. In at least one embodiment, the input device 220 and display screen 240 are integrated into an integrated input and display device to enable both input of information (via direct input, menu selection, etc.) and display of information through a touch panel screen, for example. Such a screen may be a touch input LCD screen, for example, as is well known to those of ordinary skill in the art. Further, the navigation device 200 can also include any additional input device 220 and/or any additional output device 241, such as audio input/output devices for example.
  • FIGS. 4A and 4B are perspective views of an actual implementation of an embodiment of the navigation device 200. As shown in FIG. 4A, the navigation device 200 may be a unit that includes an integrated input and display device 290 (a touch panel screen for example) and the other components of FIG. 2 (including but not limited to internal GPS receiver 250, microprocessor 210, a power supply, memory systems 230, etc.).
  • The navigation device 200 may sit on an arm 292, which itself may be secured to a vehicle dashboard/window/etc. using a large suction cup 294. This arm 292 is one example of a docking station to which the navigation device 200 can be docked.
  • As shown in FIG. 4B, the navigation device 200 can be docked or otherwise connected to an arm 292 of the docking station by snap connecting the navigation device 200 to the arm 292 for example (this is only one example, as other known alternatives for connection to a docking station are within the scope of the present application). The navigation device 200 may then be rotatable on the arm 292, as shown by the arrow of FIG. 4B. To release the connection between the navigation device 200 and the docking station, a button on the navigation device 200 may be pressed, for example (this is only one example, as other known alternatives for disconnection from a docking station are within the scope of the present application).
  • In at least one embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200; determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; audibly outputting at least one determined choice relating to address information of a travel destination; and acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
  • In at least one embodiment of the present application, a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device 241 to audibly output at least one determined choice relating to address information of a travel destination, the processor 210 being further useable to acknowledge selection of the audibly output at least one choice upon receiving an affirmative audible input.
  • FIG. 5 illustrates a flowchart of an example embodiment of the present application. In the embodiment shown in FIG. 5, it is first determined in step S2 whether or not an audible recognition mode has been enabled in the navigation device. For example, as shown in FIG. 6A, an icon can be displayed on an integrated input and display device 290 of the navigation device 200. Such an icon can be displayed in an initial or subsequent menu for selection prior to input/selection of a destination for establishing a route of travel, and/or can be displayed along with map information, for example, during use of the navigation device in a navigation mode. This icon can include just a pictorial illustration, such as the lips shown in FIG. 6A, and/or can include text indicating that the button corresponds to an audible recognition mode, such as audible speech recognition (ASR). Upon a processor 210 of the navigation device 200 receiving an indication of selection of such an icon as shown in FIG. 6A, an audible recognition mode may be enabled by the processor 210.
  • An audible recognition mode can include the processor 210 working in conjunction with an ASR engine or module. Such an ASR engine or module is a software engine that, once an audible recognition mode is enabled as explained above, can be loaded with grammatical rules in a language of the country of the user of the navigation device 200 (or a language selected by the user, for example). Thus, a user of the navigation device 200 will typically enter/select a country in which the user is located, and the language of that country can then be selected, input or matched by the processor 210. Thereafter, the ASR engine can then be loaded with grammatical rules from memory 230, upon an audible recognition mode being enabled. The ASR engine can then use the language corresponding to the chosen map to recognize geographical names (city and street names, for example) and the current user selected/enabled language to recognize common speech. For example, the system may be set up to enable recognition of complex speech from the user, or may be limited to only simple replies of yes, no, done, back, and/or numerical entries such as 1, 2, 3, etc.
  • The ASR engine or module is one which enables a speech interface between the user and the navigation device 200. Such a module is typically not usable in a portable navigation device 200 such as that shown in FIGS. 2-4B of the present application, but embodiments of the present application improve or even optimize memory management between the processor 210 and memory devices 230, for example, as well as data structures, to allow the ASR module to handle and recognize input information. Essentially, all or most available memory in the memory device(s) 230 of the navigation device 200 is allocated to the ASR module during speech recognition, namely upon the audible recognition mode being enabled in step S2 of FIG. 5, while other processes of the processor 210 are put on hold. Of course, during use of the navigation device 200 in a navigation mode, certain processes devoted to display of navigation information and output of navigation instructions must continue, thus sometimes slowing down operation of the ASR module.
  • In one example embodiment of the present application, the ASR module is primarily utilized in selecting address information of a travel destination based upon received audible input, and thus typically operates at a time when the navigation device 200 is not in use in a navigation mode. Upon the navigation device 200 operating in a navigation mode, another embodiment of the present application involves formulating simple questions, answerable by a yes/no answer (for example) from the user, to thereby enable processing capacity to be allocated to the navigation mode, with only a small amount of processing capacity needed in the ASR module to recognize such yes/no answers from the user of the navigation device 200. Thus, although the process shown in FIG. 5 can operate during use of the navigation device 200 in the navigation mode, upon sufficient memory 230 being included in the navigation device 200 and/or upon the ASR module being used to recognize limited yes/no input information for example, the operation shown in FIG. 5 typically occurs before start of the vehicle in which the navigation device 200 is located, namely before a travel destination is input into the navigation device 200 and before a travel route is determined.
  • Referring back to FIG. 5, in step S2, if the audible recognition mode is not enabled, the system cycles back to repeat step S2. However, if the audible recognition mode is enabled, by the processor 210 receiving an indication of selection of the “talk to me” icon shown in FIG. 6A for example, language and grammar information is loaded into the ASR module of the navigation device 200 from memory 230 and the navigation device 200 merely awaits an audible input in step S4. If no audible input is received, the system merely cycles back to repeat step S4 until an audible input is received.
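  • The following is a minimal sketch of this control flow, not taken from the present application (the device class is an illustrative stub, not the application's API): cycle on step S2 until the mode is enabled, load the grammar, then cycle on step S4 until an audible input arrives.

        import itertools

        class StubDevice:
            # Illustrative stand-in for navigation device 200.
            def __init__(self):
                self._enabled = itertools.chain([False, False],
                                                itertools.repeat(True))
                self._audio = iter([None, None, b"salt lake city"])

            def audible_mode_enabled(self):   # step S2: icon selected yet?
                return next(self._enabled)

            def load_grammar(self):           # load language/grammar into ASR
                pass

            def capture_audio(self):          # step S4: audible input received?
                return next(self._audio, None)

        def run_recognition(dev):
            while not dev.audible_mode_enabled():  # cycle back on step S2
                pass
            dev.load_grammar()
            while True:                            # cycle back on step S4
                audio = dev.capture_audio()
                if audio:
                    return audio                   # proceed to step S6

        print(run_recognition(StubDevice()))       # b'salt lake city'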
  • The ASR module is typically utilized to recognize speech information from different users. Such information is typically unpredictable, and therefore cannot be stored in memory 230. The ASR module or engine operates in conjunction with the processor 210 to convert received speech information to a sequence of phonemes in a known manner, and then works with processor 210 to match existing grammar of stored cities, street names, etc., to the converted sequence of phonemes.
  • In step S6, if an audible input is received, the processor 210 works with the ASR module to convert the input speech to phonemes and to compare the sequence of phonemes to stored information in memory 230 to determine at least one choice relating to address information of the travel destination based upon the received audible input. For example, in at least one embodiment, the at least one choice relating to address information of a travel destination can include a city name. Accordingly, a user may audibly output a name of a city as part of the address information of the travel destination, wherein the initial input of the city could be prompted by the navigation device 200 displaying a request, such as “In which city?” for example, to enter travel destination information. Upon receipt of this audible information, the processor 210 and ASR module process the phonemes as described above and compare this information to stored cities in memory 230 to determine at least one choice relating to the input audible sound, if possible. If nothing was recognized, the navigation device 200 may return to a screen to prompt input of the city or other address information, and may or may not flash or otherwise display a message “input not recognized”, for example. As will be explained in another embodiment of the present application, a sound indicator can also be displayed to a user indicating whether or not the volume of audible input is within an acceptable range, louder than an acceptable range, or softer than an acceptable range, for example.
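  • A hedged sketch of the matching in step S6 follows; difflib stands in for the phoneme-level matcher, which the present application does not specify, and the city list is purely illustrative:

        import difflib

        def rank_matches(recognized, stored_names, n_best=6):
            # Score each stored name against the recognized input and
            # return the N best candidates, best match first.
            return sorted(
                stored_names,
                key=lambda name: difflib.SequenceMatcher(
                    None, recognized, name.lower()).ratio(),
                reverse=True)[:n_best]

        cities = ["Salt Lake City", "Salem", "Sacramento", "San Antonio",
                  "Springfield", "Staunton", "Seattle"]
        print(rank_matches("salt lake sity", cities))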
  • If at least one address information choice (such as a city for example) was determinable in step S6, the process proceeds to step S8, wherein at least one determined choice relating to address information of a travel destination is audibly output. For example, instead of the system merely guessing that an audible input was received correctly, the processor 210 instead directs audible output of at least one determined choice relating to address information of a travel destination in step S8. Thereafter, in step S10, the processor 210 waits to see if an affirmative audible input is received. If so, the processor 210 and ASR module can then acknowledge that a correct determination occurred, and can thus acknowledge selection of the audibly output at least one determined choice upon receiving and recognizing an affirmative audible input, such as a “yes” for example.
  • Accordingly, instead of the processor 210 and ASR module merely guessing that an audible input was correct, at least one determined choice relating to address information is first audibly output, and selection of the at least one determined choice is not acknowledged until an affirmative audible input is received.
  • As stated in step S6, upon receipt of an audible input, at least one address information choice for the travel destination is determined, such as a city name for example. In at least one example embodiment of the present application, however, a plurality of “N-best” choices (not just one choice, noting that N can be any number, such as six for example) are recognized by the processor 210. Essentially, the processor 210, in conjunction with the ASR module, tries to best determine, from the phonemes of the audible input, a name of a city (in this first instance of input of address information for example). The processor 210 scans or reviews all the various cities stored in memory 230 for a match. The processor 210 then ranks the best possible matches such that the best possible match will be audibly output to the user of the navigation device 200 as the at least one determined choice relating to address information of the travel destination.
  • Accordingly, selection of the audibly output at least one determined choice can be acknowledged upon affirmative audible input in step S10. However, as “N-best” cities may be initially determined, the processor 210 can also direct the navigation device 200 to display an “N-best” list of choices, such as the N-best matches of city names determined by the processor 210 for example, on the integrated input and display device 290. The best possible match based upon the audible input received from the user may be audibly output and may further be displayed visually at the top of the “N-best” list (as the number one choice in a displayed list). Thereafter, next best choices can be visually displayed to the user in step S14 as numbered choices, such as choices two through six for example. Thereafter, a visually output choice may be selected in step S16, via display and subsequent input through the integrated input and display device 290, for example. If selected, selection can be acknowledged in step S20 of FIG. 5, by processor 210 for example.
  • Accordingly, the processor 210 and ASR module may not only be used to determine one single choice, but can be used to determine a plurality of choices relating to the address information of the travel destination. Each of the plurality of choices may be visually output and only one choice may be audibly output, for example. The plurality of choices may be visually output for selection on the integrated input and display device 290 of the navigation device 200. Each of these choices, such as a list of cities sounding most like the audible input for example, can be determined and displayed and is selectable by at least one of touch panel and audible input. Further, the audibly output at least one choice is further selectable via receipt of an indication of touch panel input. In addition, each of the plurality of determined choices may be selectable via receipt of an indication of a touch panel input, and/or by audible input of a number corresponding to a displayed choice (for example, a user saying “two” to select the second displayed choice).
  • As one non-limiting example, if the city “Salt Lake City” is audibly output by a user of the navigation device 200, the processor 210 and ASR module can determine an “N-best” list of cities to be audibly and visually output. The first city in the displayed list may be “Salt Lake City”, and may be both audibly output and visually output on an integrated input and display device 290 of the navigation device for example. Further, other “N-best” cities can be determined by processor 210 and the ASR module, including, for example, five other cities such as Salem, Sacramento, San Antonio, Springfield, and Staunton. In one example embodiment of the present application, the “N-best” list includes a set number of choices, such as six choices for example. These six choices (the number one choice and the five other N-best cities) can then be displayed to the user for audible or touch panel input/selection. Accordingly, if all six choices are displayed in order on the touch panel of the integrated input and display device 290, the user may merely touch and thereby select one of the six choices. Alternatively, as the first choice “Salt Lake City” is audibly output to the user, the user can acknowledge selection of the audibly output choice by issuing an affirmative audible input. Alternatively, the user can select any one of the other five displayed choices (or even the first choice for example) by merely stating the number corresponding to the particular choice, such as “6” representing the sixth choice of “Staunton.”
  • By utilizing an affirmative audible input, and/or an audible input of only one of six numerical values, the processor 210 increases the likelihood of correctly recognizing a user's selection and thereby can reliably acknowledge selection of a particular choice by the user.
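  • The selection logic just described can be illustrated with a minimal sketch. The helpers listen(), display_list(), and speak() are hypothetical stand-ins for the microphone input, the integrated input and display device 290, and the audible output; they are assumptions for illustration only, not the actual software of the navigation device 200.

```python
# Illustrative sketch only: offering an N-best list and acknowledging a
# selection by affirmative or numeric audible input. `listen`, `display_list`,
# and `speak` are hypothetical callables, not an actual device API.

AFFIRMATIVES = {"yes", "ok", "correct"}  # assumed affirmative vocabulary

def select_from_n_best(n_best, listen, display_list, speak):
    """Show all N choices, speak only the best one, and return the selection."""
    display_list(n_best)                    # all choices shown on the touch panel
    speak(f"Did you mean {n_best[0]}?")     # only the top match is audibly output
    answer = listen().strip().lower()       # next audible input from the user
    if answer in AFFIRMATIVES:
        return n_best[0]                    # affirmative input selects choice one
    if answer.isdigit() and 1 <= int(answer) <= len(n_best):
        return n_best[int(answer) - 1]      # e.g. an ASR result of "6" -> "Staunton"
    return None                             # not recognized; caller may re-prompt
```

  • With the “Salt Lake City” example above, n_best would begin with “Salt Lake City” followed by the five other determined cities, so an affirmative answer selects the first entry and a spoken number selects any of the others.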
  • Thereafter, once a user selects a city name and such selection is acknowledged in step S12 or S20, the user can issue another audible output, for input/receipt by the processor 210 and ASR module, corresponding to a street name for example. The processor 210 and ASR module can then determine at least one street name, subsequent to selection of the city name and subsequent to receiving this other audible output. Again, the processor 210 and ASR module may determine an “N-best” list of street names, for audible and/or visual output to the user of the navigation device 200 and subsequent selection thereof. Selection can be done in the same manner as discussed previously with regard to city names.
  • Finally, a user can audibly output a number corresponding to the last element of a travel destination address, for input/receipt by the processor 210 and ASR module, which can be recognized and which can be used to determine an “N-best” list in the same manner as previously stated with regard to the city and street names. Alternatively, the user may merely enter the numerical element (number) of the address of the travel destination. As such, an entire address of a travel destination can be input and can thereafter be used by the processor 210, to determine a travel route (in conjunction with a GPS signal indicating current location of the navigation device 200 and stored map information in memory 230, for example).
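  • The level-by-level progression described above (city, then street, then house number) can be pictured as a simple loop over address levels. The following is a hedged sketch under the same assumptions as the earlier example; recognize_n_best and select_choice are hypothetical callables, the second being a selection dialog like select_from_n_best above.

```python
# Illustrative sketch only: the city -> street -> house-number progression.
# `recognize_n_best` and `select_choice` are hypothetical callables; neither
# is part of any actual device software.

ADDRESS_LEVELS = ("city", "street", "house_number")

def enter_address(recognize_n_best, select_choice):
    """Collect one acknowledged choice per address level, then return the address."""
    address = {}
    for level in ADDRESS_LEVELS:
        choice = None
        while choice is None:                  # repeat until a selection is acknowledged
            n_best = recognize_n_best(level)   # ASR constrained to this address level
            choice = select_choice(n_best)     # audible or touch-panel selection
        address[level] = choice
    return address  # complete destination, usable with GPS position and stored maps
```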
  • It should be noted that the process of FIG. 5 can begin with audible input and recognition of a country and/or state for example, instead of a city name. Further, upon determining a plurality or “N-best” list of countries, states, cities, streets, etc., each of the plurality of countries, states, cities, or street names may be visually output and only one audibly output for subsequent selection thereof, either by touch panel input or audible input in a manner similar to that previously described.
  • As previously discussed, FIG. 6A provides an illustration of a non-limiting example of a selectable icon for enablement of an audible recognition mode. It should be noted that upon enablement of this audible recognition mode, the icon display may be varied to indicate to the user that the audible recognition mode has been enabled and that the system is merely awaiting receipt of the audible input, as indicated in step S4 of FIG. 5 for example. This may include varying the displayed icon in some way, such as changing the color of the virtual button shown in FIG. 6A for example, or otherwise changing the appearance of this virtual button/icon. This is shown in FIG. 6B, noting that the button may be a different color, such as green, when waiting for an audible input.
  • Thereafter, the virtual button/icon may be altered again while the system is determining address information choices for a travel destination in step S6 for example, in a manner such as that shown in FIG. 6C for example. Finally, upon audibly outputting at least one determined choice in step S8 of FIG. 5, the icon may again be altered as shown in FIG. 6D for example. This can provide feedback to the user regarding the use of the audible recognition mode.
  • It should be noted that the determining of at least one choice relating to address information of the travel destination based upon a received audible input in step S6 of FIG. 5 can relate to input of a country/state/city/street address of a travel destination in a normal fashion for example, and/or can relate to determination of a travel destination based upon a recent destination, a Point of Interest, a favorite, etc., as shown in FIG. 7 for example. Accordingly, upon receiving an indication of enablement of an audible recognition mode in step S2, a message such as “Where would you like to go?” can be displayed to the user on the integrated input and display device 290 of the navigation device 200 for example. Thereafter, the initial audible input received in step S4 could be a word relating to a category of information, such as “home” 710, “favorite” 720, “address” 730, “recent destination” 740 or “Point of Interest (POI)” 750. The processor 210 and ASR module can be programmed to recognize one of the aforementioned categories 710, 720, 730, 740, or 750, such that the determined at least one choice relating to address information of a travel destination may include traditional information such as city, state, and street names, or may include other types of information such as Points of Interest, favorites, etc. Again, each of these processes may produce choices relating to address information of a travel destination, noting that a most likely choice may be audibly output and selection thereof acknowledged by affirmative audible input (or touch panel input), with other “N-best” choices being visually output and selection thereof acknowledged by at least one of audible and visual input.
  • In one example embodiment, the recognition may work as follows: For example, the recognition process for geographical names (cities, streets and crossings) may work according to the following rules:
    • 1. The process may be initiated by the user (by choosing the voice recognition address entry, for example).
    • 2. The processor 210/ASR module may then enter a listening mode and may indicate this with a special icon display, for example. The color of the icon may change depending on whether the level of the input is within an acceptable range, too low (no input), too loud, or not recognized properly (bad input). This may serve as feedback to the user.
    • 3. If the input was considered acceptable by the processor 210/ASR module, it may then try to match the accepted phoneme sequence against the known sequences for the chosen grammar. Here, it is possible to combine the precompiled grammar (the list of names known already) with the dynamic part of the grammar (the names added by the user); this dynamic part is notable as it relates to MapShare technology. A sketch of this matching step appears after this list.
    • 4. The processor 210/ASR module may then present the results to the user, via display on the integrated input and display device 290, in the form of an N-best list. If the current voice is a TTS voice, for example, the best entry (first in the list) may also be audibly output to the user.
    • 5. The user may then accept or reject the result. In the first case, the processor 210/ASR module proceeds to the next step, which is either the recognition of the next address level (city→street, street→crossing or street→house number) or the planning of the route. In the second case, the user can pronounce the line number corresponding to the correct entry, if the entry is present in the list, or go back to the previous step by saying “Back”, for example.
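  • The matching in rule 3 can be illustrated with a small sketch. Here the combined grammar is assumed to be a mapping of names to phoneme strings, and plain string similarity (Python's difflib) stands in for the acoustic and language-model scoring a real ASR engine would perform; the function and data shapes are assumptions for illustration only.

```python
# Illustrative sketch only: matching an accepted phoneme sequence against a
# grammar that combines precompiled names with user-added (dynamic) names.
# difflib's string similarity stands in for real acoustic scoring.

from difflib import SequenceMatcher

def n_best_matches(phonemes, precompiled, dynamic, n=6):
    """Return the n grammar entries whose phoneme strings best match the input."""
    grammar = {**precompiled, **dynamic}   # dynamic (user-added) names extend the list
    scored = sorted(
        grammar.items(),
        key=lambda item: SequenceMatcher(None, phonemes, item[1]).ratio(),
        reverse=True,                      # highest similarity first
    )
    return [name for name, _ in scored[:n]]   # the N-best list, best match first
```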
  • It should be noted that each of the aforementioned aspects of an embodiment of the present application has been described with regard to the method of the present application. However, at least one embodiment of the present application is directed to a navigation device 200, including a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200 and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and an output device 241 to audibly output at least one determined choice relating to address information of a travel destination, the processor 210 being further useable to acknowledge selection of the audibly output at least one choice upon receiving an affirmative audible input. Such a navigation device 200 may further include an integrated input and display device 290 as the output device 241, to enable display of icons and/or selections and subsequent selection thereof, and/or can further include an audible output device such as a speaker, for example. Further, an input device 220 can include a microphone. Thus, such a navigation device 200 may be used to perform the various aspects of the method described with regard to FIGS. 5-7, as would be understood by one of ordinary skill in the art. Thus, further explanation is omitted for the sake of brevity.
  • In at least one other embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200; and displaying on an integrated input and display device 290, subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • In at least one other embodiment of the present application, a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200; and an integrated input and display device 290 to display, subsequent to the processor 210 receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
  • As previously indicated, an embodiment of the present application can be used to indicate to a user whether or not audible input, such as that of step S4 of FIG. 5 for example, is within an acceptable range. As shown in FIG. 8, it is initially determined by the processor 210, for example in conjunction with the ASR module, whether or not an audible recognition mode was enabled in step S20. If so, one of three different indications can be displayed in steps S24, S28, and S32, depending on whether or not the volume of the audible input is determined to be within an acceptable range. For example, upon receipt of the audible input, the processor 210 and ASR module can attempt to ascertain the input information. The processor 210 and ASR module have a better chance of determining a correct input if the volume is within an acceptable range.
  • Thus, after the audible recognition mode is enabled and after audible input information is received, it is determined in step S22 whether or not the volume of the audible input is within an acceptable range. This can be done by the processor 210 comparing the volume of the received information with an acceptable range stored in memory 230 for example, defined by an upper threshold limit and a lower threshold limit. If the volume of the received audible input is between the upper and lower thresholds in step S22, the processor 210 then determines that the volume of the audible input is within an acceptable range. In response thereto, the process moves to step S24, wherein the processor 210 directs display of an indication that the volume is within an acceptable range. For example, this display may include changing the color of the “talk to me” icon shown in FIG. 6A to an icon such as that shown in FIG. 6B, in a green color indicative of acceptance for example. Alternatively, another indicator may be displayed, again noting that the indicator may be displayed in a color indicative of acceptance, such as green for example.
  • If it is determined that the volume is not within an acceptable range in step S22, the processor 210 then moves to either step S26 or step S30 to determine whether the volume was louder or softer than the acceptable range. It should be noted that the order of steps S26 and S30 is not important, as such determinations can be made in any order. If it is determined that the volume is louder than the acceptable range in step S26, namely greater than the upper threshold of the acceptable range, an indication may be displayed in step S28, indicating that the volume is louder than the acceptable range. For example, the icon of FIG. 6B may be displayed in red (a color indicative of incorrectness and something being too high), indicating that the audible input was too loud, and/or a red indicator may be displayed to the user, again indicating that the volume is too loud.
  • Thereafter, or before step S26, the processor 210 moves to step S30, wherein it determines whether or not the volume is softer than the acceptable range. If so, an indication may be displayed in step S32, indicating that the volume is softer than the acceptable range. For example, this may involve displaying the icon of FIG. 6B in a yellow color, indicating to the user that the audible input is not loud enough. Alternatively, a yellow indicator may be displayed on the integrated input and display device 290, for example.
  • It should be noted that the colors green, red, and yellow are merely examples and other colors can be utilized. Further, other methods of displaying indications of a volume being within an acceptable range, louder than an acceptable range, or softer than an acceptable range may also be used, including but not limited to displaying words indicating that a user should speak softer, louder, etc. Accordingly, as shown in the example embodiment of FIG. 8, a method of the present application can include receiving an indication of enablement of an audible recognition mode in a navigation device 200 and displaying, on an integrated input and display device 290 and subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of received audible input is within an acceptable range, louder than the acceptable range, or softer than the acceptable range. The display can include a display of color information to display the indications for example, wherein a yellow color may be used to indicate that the received audible input is softer than the acceptable range, a red color may be used to indicate that the received audible input is louder than the acceptable range, and a green color may be used to indicate that the received audible input is within the acceptable range.
  • Address information regarding a travel destination of a user may be received in conjunction with the process shown in FIG. 8 for example, wherein the display may then indicate whether the received information is within an acceptable range. Thus, the address information can include at least one of city and street name information. Further, upon the address information being received within an acceptable range, the process may include at least one of recognizing the address information, displaying an indication of no recognition, and displaying, on the integrated input and display device 290, a list of choices to the user for selection. Thus, the processes as shown in FIGS. 5 and 8 can be integrated.
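  • A minimal sketch of the volume check of FIG. 8 follows. The threshold constants are assumed normalized values chosen purely for illustration; an actual device would store its own limits in memory 230, and as noted above the color strings are only one possible form of indication.

```python
# Illustrative sketch only: the volume check of FIG. 8, with assumed
# normalized thresholds standing in for limits stored in memory 230.

LOWER_THRESHOLD = 0.2   # below this: softer than the acceptable range
UPPER_THRESHOLD = 0.8   # above this: louder than the acceptable range

def volume_indicator(volume):
    """Map a measured input volume to the indication color described above."""
    if volume > UPPER_THRESHOLD:
        return "red"     # too loud (step S28)
    if volume < LOWER_THRESHOLD:
        return "yellow"  # too soft (step S32)
    return "green"       # within the acceptable range (step S24)
```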
  • It should be noted that each of the aforementioned aspects of an embodiment of the present application have been described with regard to the method of the present application. However, at least one embodiment of the present application is directed to a navigation device 200, including a processor 210 to receive an indication of enablement of an audible recognition mode in a navigation device 200; and an integrated input and display device 290 to display, subsequent to the processor 210 receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range. Such a navigation device 200 may further include an audible output device such as a speaker, for example. Further, an input device 220 can include a microphone. Thus, such a navigation device 200 may be used to perform the various aspects of the method described with regard to FIGS. 5-8, as would be understood by one of ordinary skill in the art. Thus, further explanation is omitted for the sake of brevity.
  • Finally, FIG. 9 is directed to another embodiment of the present application. Typically, when address information is being entered into a navigation device 200, the navigation device 200 is not being used in a navigation mode. Thus, although the process set forth in FIG. 5 can be used with the navigation device 200 in a navigation mode, this is typically not the case, as the vehicle in which the navigation device 200 is located, for example, is usually stationary when a user inputs a travel destination from which a route of travel can be determined.
  • In at least one other embodiment of the present application, a method includes receiving an indication of enablement of an audible recognition mode in a navigation device 200; receiving additional information from a source other than a user of the navigation device 200; formulating a question, answerable by a yes or no answer from the user, based upon the received additional information; and outputting the formulated question to the user.
  • In at least one other embodiment of the present application, a navigation device 200 includes a processor 210 to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device 200, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device 241 to output the formulated question to the user.
  • FIG. 9 of the present application includes a process involving enablement of an audible recognition mode, which is more likely to be usable while the vehicle in which the navigation device 200 is located is moving, e.g. where the navigation device 200 is operated in a navigation mode.
  • In the process shown in FIG. 9, in step S50, it is initially determined whether or not an audible recognition mode is enabled. This can be done, for example, in a manner similar to that previously described, including recognition of selection of the icon shown in FIG. 6A for example. Once this audible recognition mode is enabled, the processor 210 of the navigation device 200 may not only monitor receipt of audible information from a user, but can also monitor receipt of additional information from a source other than a user of the navigation device 200. Thus, in step S52, it is determined by the processor 210 whether or not additional information from a source other than a user is received. This information can include, but is not limited to, receipt of an incoming call or message (such as a telephone call or SMS message received by the navigation device 200 itself and/or via a paired mobile phone, for example), received traffic information, etc. If not, the process merely cycles back and continues to monitor for such information.
  • However, if additional information from a source other than the user of the navigation device 200 is received in step S52, the process moves to step S54, wherein a question is formulated by the processor 210, answerable by a yes or no answer from the user, based upon the received additional information. For example, the processor 210 can monitor other systems in the navigation device 200 (including paired mobile phones, for example) to determine whether or not, for example, an SMS message is received. If so, the processor 210 may work with the ASR module and/or, more likely, a TTS (Text To Speech) module to formulate a question answerable by a yes/no answer from the user such as, for example, “A new message was received; shall I read it aloud?” Thereafter, the formulated question may be output in step S56, noting that the output is preferably an audible output (but may also be accompanied by a visual output, for example). Somewhat similarly, the navigation device 200 may determine receipt of a traffic update indicating a traffic delay of a particular period of time along the route (calculable by the processor 210 in a known manner, for example), wherein the processor 210 and TTS module can then instruct the output of, for example, “Traffic delay on your route now ‘x’ minutes. Do you want to replan the route to minimize delays?”
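  • The question formulation of step S54 can be sketched as template filling, consistent with the stored/prerecorded phrases discussed below. The template strings mirror the examples in the text; tts_say is a hypothetical stand-in for the TTS module's audible output path, not an actual API of the device.

```python
# Illustrative sketch only: formulating a yes/no question from a stored
# template and inserting received information (e.g. a calculated delay).
# `tts_say` is a hypothetical stand-in for the TTS module's output call.

QUESTION_TEMPLATES = {
    "sms": "A new message was received; shall I read it aloud?",
    "traffic": ("Traffic delay on your route now {minutes} minutes. "
                "Do you want to replan the route to minimize delays?"),
}

def formulate_question(event_type, tts_say, **info):
    """Fill the stored template for this event and output it audibly."""
    question = QUESTION_TEMPLATES[event_type].format(**info)
    tts_say(question)       # may also be mirrored as a visual output
    return question

# Example usage: formulate_question("traffic", speak, minutes=12)
```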
  • The ASR module is typically utilized to recognize speech information from different users. Such information is typically unpredictable, and therefore cannot normally be stored in memory 230. The ASR module or engine operates in conjunction with the processor 210 to convert received speech information to a sequence of phonemes in a dynamic manner, and works with processor 210 to match existing grammar of stored cities, street names, etc., to the converted sequence of phonemes as described above. As such, the ASR module dynamically causes the processor 210 to utilize large chunks of memory 230.
  • To the contrary, when the processor 210 works in conjunction with a TTS module, the TTS module forms questions which can be predefined or prerecorded in memory 230 for example. The TTS module can output any kind of audio information, provided that it is in the language to which the voice corresponds. Some parts of the phrases that are considered to be used most often can be prerecorded, stored, and later used by the TTS module as well, to improve the quality of the output. Thus, while the TTS module can be used to convert simple SMS messages to voice output for the user, the TTS module typically works best in conjunction with processor 210 for outputting of preformulated questions, slightly modifiable if necessary, upon a processor 210 determining that additional information such as an SMS message, traffic update, etc., has been received by the navigation device 200. Such information can include traffic information, an incoming telephone call, an incoming SMS message, etc.
  • Further, the formulating of the question can include inserting information, based upon the received information, into a stored question, such as inserting a traffic delay into the aforementioned traffic delay question for example. Thus, the formulating can include inserting information regarding a calculated traffic delay, based upon received traffic information, into a stored question. Thereafter, in step S56, the formulated question can be output, noting that the output may include at least one of an audible and visual output.
  • The formulated question output in step S56 is typically formulated to receive a yes or no answer from the user, to thereby enable the processor 210 to operate in conjunction with the ASR module during driving conditions when the navigation device 200 is operating in a navigation mode. In such a mode, the navigation device 200 is already utilizing much of memory 230, and it is preferable that the ASR module not consume a large share of memory 230. By utilizing yes/no questions, the processor 210 and ASR module can easily recognize the short yes or no answer of the user. Thereafter, a subsequent action may be performed by the navigation device 200 upon receipt of a yes answer from the user, such as calculating a new route of travel based upon receipt of a yes answer from the user regarding a calculated traffic delay for example. Alternatively, upon the additional information being an SMS message, the SMS message can be converted by utilizing the TTS module for example, and the incoming text message can be output to the user upon receipt of a yes answer from the user.
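  • The constrained yes/no follow-up might look like the following sketch, where listen_constrained, replan_route, and read_message_aloud are hypothetical helpers; restricting the recognition grammar to two words reflects the memory consideration described above.

```python
# Illustrative sketch only: handling the short yes/no answer with a grammar
# restricted to two words, keeping recognition cheap while navigating.
# `listen_constrained`, `replan_route`, and `read_message_aloud` are
# hypothetical helpers, not an actual device API.

def handle_answer(event_type, listen_constrained, replan_route, read_message_aloud):
    """Dispatch the subsequent action if the user answers 'yes'."""
    answer = listen_constrained(grammar={"yes", "no"})  # tiny grammar, low memory use
    if answer == "yes":
        if event_type == "traffic":
            replan_route()           # recalculate around the reported delay
        elif event_type == "sms":
            read_message_aloud()     # TTS conversion of the incoming message
```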
  • It should be noted that each of the aforementioned aspects of an embodiment of the present application has been described with regard to the method of the present application. However, at least one embodiment of the present application is directed to a navigation device 200, including a processor 210 to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device 200, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and an output device 241 to output the formulated question to the user. Such a navigation device 200 may further include an integrated input and display device 290 as the output device 241, to enable display of icons and/or selections and subsequent selection thereof, and/or can further include an audible output device such as a speaker, for example. Further, an input device 220 can include a microphone. Thus, such a navigation device 200 may be used to perform the various aspects of the method described with regard to FIG. 9, as would be understood by one of ordinary skill in the art. Thus, further explanation is omitted for the sake of brevity.
  • The methods of at least one embodiment expressed above may be implemented as a computer data signal embodied in a carrier wave or propagated signal that represents a sequence of instructions which, when executed by a processor (such as processor 304 of server 302, and/or processor 210 of navigation device 200 for example), causes the processor to perform a respective method. In at least one other embodiment, at least one method provided above may be implemented as a set of instructions contained on a computer readable or computer accessible medium, such as one of the memory devices previously described, for example, to perform the respective method when executed by a processor or other computer device. In varying embodiments, the medium may be a magnetic medium, electronic medium, optical medium, etc.
  • Even further, any of the aforementioned methods may be embodied in the form of a program. The program may be stored on a computer readable medium and is adapted to perform any one of the aforementioned methods when run on a computer device (a device including a processor). Thus, the storage medium or computer readable medium is adapted to store information and is adapted to interact with a data processing facility or computer device to perform the method of any of the above mentioned embodiments.
  • The storage medium may be a built-in medium installed inside a computer device main body or a removable medium arranged so that it can be separated from the computer device main body. Examples of the built-in medium include, but are not limited to, rewriteable non-volatile memories, such as ROMs and flash memories, and hard disks. Examples of the removable medium include, but are not limited to, optical storage media such as CD-ROMs and DVDs; magneto-optical storage media, such as MOs; magnetic storage media, including but not limited to floppy disks (trademark), cassette tapes, and removable hard disks; media with a built-in rewriteable non-volatile memory, including but not limited to memory cards; and media with a built-in ROM, including but not limited to ROM cassettes; etc. Furthermore, various information regarding stored images, for example, property information, may be stored in any other form, or it may be provided in other ways.
  • As one of ordinary skill in the art will understand upon reading the disclosure, the electronic components of the navigation device 200 and/or the components of the server 302 can be embodied as computer hardware circuitry or as a computer readable program, or as a combination of both.
  • The system and method of embodiments of the present application include software operative on the processor to perform at least one of the methods according to the teachings of the present application. One of ordinary skill in the art will understand, upon reading and comprehending this disclosure, the manner in which a software program can be launched from a computer readable medium in a computer based system to execute the functions found in the software program. One of ordinary skill in the art will further understand the various programming languages which may be employed to create a software program designed to implement and perform at least one of the methods of the present application.
  • The programs can be structured in an object-orientation using an object-oriented language including but not limited to JAVA, Smalltalk, C++, etc., and the programs can be structured in a procedural-orientation using a procedural language including but not limited to COBOL, C, etc. The software components can communicate in any number of ways that are well known to those of ordinary skill in the art, including but not limited to application program interfaces (APIs), interprocess communication techniques, including but not limited to remote procedure call (RPC), common object request broker architecture (CORBA), Component Object Model (COM), Distributed Component Object Model (DCOM), Distributed System Object Model (DSOM), and Remote Method Invocation (RMI). However, as will be appreciated by one of ordinary skill in the art upon reading the present application disclosure, the teachings of the present application are not limited to a particular programming language or environment.
  • The above systems, devices, and methods have been described by way of example and not by way of limitation with respect to improving accuracy, processor speed, and ease of user interaction, etc. with a navigation device 200.
  • Further, elements and/or features of different example embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure and appended claims.
  • Still further, any one of the above-described and other example features of the present invention may be embodied in the form of an apparatus, method, system, computer program and computer program product. For example, any of the aforementioned methods may be embodied in the form of a system or device, including, but not limited to, any of the structure for performing the methodology illustrated in the drawings.
  • Example embodiments being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the present invention, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.

Claims (51)

1. A method, comprising:
receiving an indication of enablement of an audible recognition mode in a navigation device;
determining, subsequent to receiving an indication of enablement of the audible recognition mode and subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input;
audibly outputting at least one determined choice relating to address information of a travel destination; and
acknowledging selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
2. The method of claim 1, wherein, upon the determining including determining a plurality of choices relating to the address information of the travel destination, each of the plurality of choices is visually output and only one choice is audibly output.
3. The method of claim 2, wherein the plurality of choices are visually output for selection on an integrated input and display device of the navigation device.
4. The method of claim 3, wherein each of the plurality of choices are selectable by at least one of touch panel input and audible input, the audibly output at least one choice being further selectable via receipt of an indication of a touch panel input.
5. The method of claim 4, wherein each of the plurality of choices are selectable by audible input of a number corresponding to a displayed choice.
6. The method of claim 1, wherein the at least one choice relating to address information of a travel destination includes a city name.
7. The method of claim 6, wherein, subsequent to selection of a city name and subsequent to receiving another audible input, determining at least one street name.
8. The method of claim 7, wherein, upon the determining including determining a plurality of street names, each of the plurality of street names is visually output and only one street name is audibly output.
9. A method, comprising:
receiving an indication of enablement of an audible recognition mode in a navigation device; and
displaying on an integrated input and display device, subsequent to receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
10. The method of claim 9, wherein the display includes a display of color information to display the indications.
11. The method of claim 10, wherein a yellow color is used to indicate that the received audible input is softer than the acceptable range, wherein a red color is used to indicate that the received audible input is louder than the acceptable range, and wherein a green color is used to indicate that the received audible input is within an acceptable range.
12. The method of claim 11, wherein subsequent to enablement of the audible recognition mode, address information regarding a travel destination of the user is received, the displaying then indicating if the received information is within an acceptable range.
13. The method of claim 12, wherein the address information includes at least one of city and street name information.
14. The method of claim 12, further comprising, upon address information being received within an acceptable range, at least one of recognizing the address information, displaying an indication of no recognition and displaying, on the integrated input and display device, a list of choices to the user for selection.
15. A method, comprising:
receiving an indication of enablement of an audible recognition mode in a navigation device;
receiving additional information from a source other than a user of the navigation device;
formulating a question, answerable by a yes or no answer from the user, based upon the received additional information; and
outputting the formulated question to the user.
16. The method of claim 15, wherein the information includes traffic information.
17. The method of claim 15, wherein the information includes receipt of at least one of an incoming call and message.
18. The method of claim 15, wherein the formulating includes inserting information, based upon the received information, into a stored question.
19. The method of claim 15, wherein the formulating includes inserting information regarding a calculated traffic delay, based upon the received traffic information, into a stored question.
20. The method of claim 15, wherein the output includes at least one of an audible and visual output.
21. The method of claim 15, wherein the output includes an audible and a visual output.
22. The method of claim 15, further comprising performing a subsequent action upon receipt of a yes answer from the user.
23. The method of claim 15, further comprising calculating a new route of travel upon receipt of a yes answer from the user regarding the calculated traffic delay.
24. The method of claim 17, further comprising outputting an incoming text message upon receipt of a yes answer from the user.
25. A navigation device, comprising:
a processor to receive an indication of enablement of an audible recognition mode in a navigation device and to determine, subsequent to receiving an audible input, at least one choice relating to address information of a travel destination based upon the received audible input; and
an output device to audibly output at least one determined choice relating to address information of a travel destination, the processor being further useable to acknowledge selection of the audibly output at least one determined choice upon receiving an affirmative audible input.
26. The navigation device of claim 25, further comprising:
an integrated input and display device to, upon the determining by the processor including determining a plurality of choices relating to the address information of the travel destination, display each of the plurality of choices;
an audible output device to audibly output only one choice relating to the address information of the travel destination.
27. The navigation device of claim 26, wherein the plurality of choices are output by the integrated input and display device for selection.
28. The navigation device of claim 27, wherein each of the plurality of choices are selectable by at least one of touch panel input via the integrated input and display device, and audible input, the audibly output at least one choice being further selectable via receipt by the processor of an indication of a touch panel input.
29. The navigation device of claim 28, wherein each of the plurality of choices are selectable via receipt by the processor of an audible input of a number corresponding to a displayed choice.
30. The navigation device of claim 25, wherein the at least one choice relating to address information of a travel destination includes a city name.
31. The navigation device of claim 30, wherein, subsequent to selection of a city name and subsequent to receiving another audible input, the processor is further useable to determine at least one street name.
32. The navigation device of claim 31, wherein, upon the determining by the processor including determining a plurality of street names, each of the plurality of street names is visually output and only one street name is audibly output.
33. The navigation device of claim 25, wherein the navigation device is portable.
34. A navigation device, comprising:
a processor to receive an indication of enablement of an audible recognition mode in a navigation device; and
an integrated input and display device to display, subsequent to the processor receiving an indication of enablement of the audible recognition mode, an indication as to whether a volume of a received audible input is within an acceptable range, louder than the acceptable range, and softer than the acceptable range.
35. The navigation device of claim 34, wherein the display includes a display of color information to display the indications.
36. The navigation device of claim 35, wherein a yellow color is used to indicate that the received audible input is softer than the acceptable range, wherein a red color is used to indicate that the received audible input is louder than the acceptable range, and wherein a green color is used to indicate that the received audible input is within the acceptable range.
37. The navigation device of claim 36, wherein subsequent to enablement of the audible recognition mode, address information regarding a travel destination of the user is received, the displaying then indicating if the received information is within an acceptable range.
38. The navigation device of claim 37, wherein the address information includes at least one of city and street name information.
39. The navigation device of claim 37, wherein, upon address information being received within an acceptable range, the processor performs at least one of recognizing the address information, directing display, on the integrated input and display device, of an indication of no recognition, and directing display, on the integrated input and display device, of a list of choices to the user for selection.
40. The navigation device of claim 34, wherein the navigation device is portable.
41. A navigation device, comprising:
a processor to receive an indication of enablement of an audible recognition mode, to receive additional information from a source other than a user of the navigation device, and to formulate a question, answerable by a yes or no answer from the user, based upon the received additional information; and
an output device to output the formulated question to the user.
42. The navigation device of claim 41, wherein the information includes traffic information.
43. The navigation device of claim 41, wherein the information includes receipt of at least one of an incoming call and message.
44. The navigation device of claim 41, wherein the formulating includes inserting information, based upon the received information, into a stored question.
45. The navigation device of claim 41, wherein the formulating includes inserting information regarding a calculated traffic delay, based upon the received traffic information, into a stored question.
46. The navigation device of claim 41, wherein the output device is at least one of an audible and visual output device.
47. The navigation device of claim 41, wherein the output device includes an audible output device and a visual output device.
48. The navigation device of claim 41, wherein the processor is useable to perform a subsequent action upon receipt of a yes answer from the user.
49. The navigation device of claim 41, wherein the processor is useable to calculate a new route of travel upon receipt of a yes answer from the user regarding the calculated traffic delay.
50. The navigation device of claim 43, wherein the processor is useable to direct output of an incoming text message upon receipt of a yes answer from the user.
51. The navigation device of claim 41, wherein the navigation device is portable.
US11/907,232 2007-01-10 2007-10-10 Navigation device and method relating to an audible recognition mode Abandoned US20100286901A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/907,232 US20100286901A1 (en) 2007-01-10 2007-10-10 Navigation device and method relating to an audible recognition mode

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US87955307P 2007-01-10 2007-01-10
US87952307P 2007-01-10 2007-01-10
US87954907P 2007-01-10 2007-01-10
US87957707P 2007-01-10 2007-01-10
US87959907P 2007-01-10 2007-01-10
US11/907,232 US20100286901A1 (en) 2007-01-10 2007-10-10 Navigation device and method relating to an audible recognition mode

Publications (1)

Publication Number Publication Date
US20100286901A1

Family

ID=38924440

Family Applications (9)

Application Number Title Priority Date Filing Date
US11/907,253 Active 2028-12-27 US7974777B2 (en) 2007-01-10 2007-10-10 Navigation device and method for using a traffic message channel
US11/907,240 Abandoned US20080228390A1 (en) 2007-01-10 2007-10-10 Navigation device and method for providing regional travel information in a navigation device
US11/907,229 Abandoned US20080167810A1 (en) 2007-01-10 2007-10-10 Navigation device and method for early instruction output
US11/907,239 Abandoned US20080168346A1 (en) 2007-01-10 2007-10-10 Navigation device and method for using special characters in a navigation device
US11/907,238 Abandoned US20080207116A1 (en) 2007-01-10 2007-10-10 Navigation device and method using a personal area network
US11/907,252 Active 2031-05-16 US8335637B2 (en) 2007-01-10 2007-10-10 Navigation device and method providing a traffic message channel resource
US11/907,232 Abandoned US20100286901A1 (en) 2007-01-10 2007-10-10 Navigation device and method relating to an audible recognition mode
US11/907,251 Abandoned US20080167799A1 (en) 2007-01-10 2007-10-10 Navigation device and method for quick option access
US11/907,233 Abandoned US20080208447A1 (en) 2007-01-10 2007-10-10 Navigation device and method for providing points of interest

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US11/907,253 Active 2028-12-27 US7974777B2 (en) 2007-01-10 2007-10-10 Navigation device and method for using a traffic message channel
US11/907,240 Abandoned US20080228390A1 (en) 2007-01-10 2007-10-10 Navigation device and method for providing regional travel information in a navigation device
US11/907,229 Abandoned US20080167810A1 (en) 2007-01-10 2007-10-10 Navigation device and method for early instruction output
US11/907,239 Abandoned US20080168346A1 (en) 2007-01-10 2007-10-10 Navigation device and method for using special characters in a navigation device
US11/907,238 Abandoned US20080207116A1 (en) 2007-01-10 2007-10-10 Navigation device and method using a personal area network
US11/907,252 Active 2031-05-16 US8335637B2 (en) 2007-01-10 2007-10-10 Navigation device and method providing a traffic message channel resource

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/907,251 Abandoned US20080167799A1 (en) 2007-01-10 2007-10-10 Navigation device and method for quick option access
US11/907,233 Abandoned US20080208447A1 (en) 2007-01-10 2007-10-10 Navigation device and method for providing points of interest

Country Status (5)

Country Link
US (9) US7974777B2 (en)
EP (1) EP2102596B1 (en)
JP (1) JP5230652B2 (en)
AU (1) AU2007343335A1 (en)
WO (1) WO2008083862A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080228385A1 (en) * 2007-01-10 2008-09-18 Pieter Geelen Navigation device and method for informational screen display
US20090248415A1 (en) * 2008-03-31 2009-10-01 Yap, Inc. Use of metadata to post process speech recognition output
US20090322558A1 (en) * 2008-06-30 2009-12-31 General Motors Corporation Automatic Alert Playback Upon Recognition of a Paired Peripheral Device
US20100049696A1 (en) * 2008-08-20 2010-02-25 Magellan Navigation, Inc. Systems and Methods for Smart City Search
US20110054774A1 (en) * 2008-05-29 2011-03-03 Simone Tertoolen Navigation device and method for altering map information related to audible information
US20140032104A1 (en) * 2012-07-30 2014-01-30 Telenav, Inc. Navigation system with range based notification enhancement delivery mechanism and method of operation thereof
US20140156181A1 (en) * 2011-11-10 2014-06-05 Mitsubishi Electric Corporation Navigation device, navigation method, and navigation program
US20140336925A1 (en) * 2013-05-09 2014-11-13 Jeremiah Joseph Akin Displaying map icons based on a determined route of travel
US9583107B2 (en) 2006-04-05 2017-02-28 Amazon Technologies, Inc. Continuous speech transcription performance indication
US9973450B2 (en) 2007-09-17 2018-05-15 Amazon Technologies, Inc. Methods and systems for dynamically updating web service profile information by parsing transcribed message strings
US10192255B2 (en) 2012-02-22 2019-01-29 Ebay Inc. Systems and methods for in-vehicle navigated shopping
US10697792B2 (en) 2012-03-23 2020-06-30 Ebay Inc. Systems and methods for in-vehicle navigated shopping
US10963951B2 (en) 2013-11-14 2021-03-30 Ebay Inc. Shopping trip planner
US20230214175A1 (en) * 2018-06-06 2023-07-06 Vivo Mobile Communication Co., Ltd. Prompting method and mobile terminal

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005054573A1 (en) * 2005-11-16 2007-05-24 Robert Bosch Gmbh Method for operating a navigation device and a correspondingly designed navigation device
JP5230652B2 (en) 2007-01-10 2013-07-10 トムトム インターナショナル ベスローテン フエンノートシャップ Method, computer program and navigation system for indicating traffic delay
US7768395B2 (en) 2007-01-19 2010-08-03 Gold Steven K Brand mapping
US8302033B2 (en) 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US7814435B2 (en) * 2007-11-29 2010-10-12 Alpine Electronics, Inc. Method and apparatus for displaying local brand icons for navigation system
JP5034932B2 (en) * 2007-12-26 2012-09-26 ソニー株式会社 Display device, program, and recording medium
US20090171584A1 (en) * 2007-12-31 2009-07-02 Magellan Navigation, Inc. System and Method for Accessing a Navigation System
US20090177987A1 (en) * 2008-01-04 2009-07-09 Prasantha Jayakody Efficient display of objects of interest to a user through a graphical user interface
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
DE102008006445A1 (en) * 2008-01-28 2009-08-20 Navigon Ag Method for operating a navigation device
US20090216442A1 (en) * 2008-02-26 2009-08-27 Nokia Corporation Method, apparatus and computer program product for map feature detection
US8417450B2 (en) * 2008-03-11 2013-04-09 Microsoft Corporation On-board diagnostics based navigation device for dead reckoning
JP4543342B2 (en) 2008-05-12 2010-09-15 ソニー株式会社 Navigation device and information providing method
CA2720198A1 (en) * 2008-05-29 2009-12-03 Tomtom International B.V. Portable navigation device, portable electronic communications apparatus, and method of generating radio data system information therefor
DE112009000957B4 (en) * 2008-05-30 2013-06-20 Mitsubishi Electric Corp. Navigation device and adaptive-controlled communication system
TW201017122A (en) * 2008-10-31 2010-05-01 Quantum Digital Comm Technology Corp Intelligent navigation device and control method thereof
US8532926B2 (en) * 2009-02-17 2013-09-10 Mitsubishi Electric Corporation Map information processing device
EP2432482B1 (en) 2009-05-20 2015-04-15 Cardio3 Biosciences S.A. Pharmaceutical composition for the treatment of heart diseases.
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
US20110099525A1 (en) * 2009-10-28 2011-04-28 Marek Krysiuk Method and apparatus for generating a data enriched visual component
US20110099507A1 (en) * 2009-10-28 2011-04-28 Google Inc. Displaying a collection of interactive elements that trigger actions directed to an item
KR101612789B1 (en) * 2009-12-01 2016-04-18 엘지전자 주식회사 Navigation method of mobile terminal and apparatus thereof
US8862576B2 (en) 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
CN102346257A (en) * 2010-07-29 2012-02-08 深圳市凯立德欣软件技术有限公司 Navigation equipment and tunnel navigation method thereof
US20120101721A1 (en) * 2010-10-21 2012-04-26 Telenav, Inc. Navigation system with xpath repetition based field alignment mechanism and method of operation thereof
KR101144388B1 (en) * 2010-11-09 2012-05-10 기아자동차주식회사 Traffic information providing system and apparatus and method thereof
US8686864B2 (en) 2011-01-18 2014-04-01 Marwan Hannon Apparatus, system, and method for detecting the presence of an intoxicated driver and controlling the operation of a vehicle
US8718536B2 (en) 2011-01-18 2014-05-06 Marwan Hannon Apparatus, system, and method for detecting the presence and controlling the operation of mobile devices within a vehicle
US9146126B2 (en) * 2011-01-27 2015-09-29 Here Global B.V. Interactive geographic feature
US8909468B2 (en) * 2011-06-07 2014-12-09 General Motors Llc Route portals
CN102325371A (en) * 2011-07-26 2012-01-18 范海绍 Wireless sport team positioning and communication system
US9087348B2 (en) * 2011-08-11 2015-07-21 GM Global Technology Operations LLC Digital content networking
US9635271B2 (en) * 2011-11-17 2017-04-25 GM Global Technology Operations LLC Vision-based scene detection
US9285472B2 (en) * 2011-12-06 2016-03-15 L-3 Communications Avionics Systems, Inc. Multi-link transponder for aircraft and method of providing multi-link transponder capability to an aircraft having an existing transponder
EP2825846A4 (en) * 2012-03-16 2015-12-09 Qoros Automotive Co Ltd Navigation system and method for different mobility modes
GB2505891A (en) * 2012-09-12 2014-03-19 Tomtom Int Bv Navigation device having a USB port function switching means
GB201218681D0 (en) * 2012-10-17 2012-11-28 Tomtom Int Bv Methods and systems of providing information using a navigation apparatus
GB201218680D0 (en) 2012-10-17 2012-11-28 Tomtom Int Bv Methods and systems of providing information using a navigation apparatus
US9046370B2 (en) 2013-03-06 2015-06-02 Qualcomm Incorporated Methods for providing a navigation route based on network availability and device attributes
US9330136B2 (en) * 2013-10-08 2016-05-03 Toyota Jidosha Kabushiki Kaisha System for proving proactive zone information
KR102160975B1 (en) * 2013-10-30 2020-09-29 삼성전자 주식회사 Method and system providing of location based service to a electronic device
ES2762953T3 (en) 2014-05-15 2020-05-26 Samsung Electronics Co Ltd System to provide personalized information and procedure to provide personalized information
WO2015174764A1 (en) * 2014-05-15 2015-11-19 Samsung Electronics Co., Ltd. System for providing personalized information and method of providing the personalized information
CN103973701B (en) * 2014-05-23 2017-04-12 南京美桥信息科技有限公司 Video retrieval system and method based on internet
US9826496B2 (en) * 2014-07-11 2017-11-21 Telenav, Inc. Navigation system with location mechanism and method of operation thereof
EP3209969B1 (en) 2014-10-20 2020-03-04 TomTom Navigation B.V. Alternative routes
JP6037468B2 (en) * 2014-11-14 2016-12-07 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Method for notifying that moving body is approaching specific area, and server computer and server computer program therefor
BR112018000692A2 (en) 2015-07-14 2018-09-18 Driving Man Systems Inc detecting a phone's location using wireless rf and ultrasonic signals
DE102016012500A1 (en) * 2016-10-19 2018-04-19 Texmag Gmbh Vertriebsgesellschaft Method and device for detecting the position of a moving web
US10527449B2 (en) * 2017-04-10 2020-01-07 Microsoft Technology Licensing, Llc Using major route decision points to select traffic cameras for display
CN110244337B (en) * 2019-06-14 2023-10-03 北京世纪东方智汇科技股份有限公司 Method and device for positioning target object in tunnel
CN113395462B (en) * 2021-08-17 2021-12-14 腾讯科技(深圳)有限公司 Navigation video generation method, navigation video acquisition method, navigation video generation device, navigation video acquisition device, server, equipment and medium

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5454062A (en) * 1991-03-27 1995-09-26 Audio Navigation Systems, Inc. Method for recognizing spoken words
US5475599A (en) * 1992-07-23 1995-12-12 Aisin Aw Co., Ltd. Vehicle navigation system
US6073094A (en) * 1998-06-02 2000-06-06 Motorola Voice compression by phoneme recognition and communication of phoneme indexes and voice features
US6269303B1 (en) * 1997-08-08 2001-07-31 Aisin Aw Co., Ltd. Vehicle navigation system and recording medium
US6529826B2 (en) * 2000-11-29 2003-03-04 Sharp Kabushiki Kaisha Navigation apparatus and communication base station, and navigation system and navigation method using same
US20040204845A1 (en) * 2002-06-19 2004-10-14 Winnie Wong Display method and apparatus for navigation system
US20050216185A1 (en) * 2001-02-20 2005-09-29 Matsushita Industrial Electric Co., Ltd. Travel guidance device and travel warning announcement device
US20060100779A1 (en) * 2003-09-02 2006-05-11 Vergin William E Off-board navigational system
US20060253251A1 (en) * 2005-05-09 2006-11-09 Puranik Nishikant N Method for street name destination address entry using voice
US20070061066A1 (en) * 2003-06-26 2007-03-15 Christian Bruelle-Drews Method for assisting navigation and navigation system
US20070100544A1 (en) * 2005-10-31 2007-05-03 Denso Corporation System for traffic circle navigation
US20070129055A1 (en) * 2003-12-23 2007-06-07 Gregory Ehlers System and method for providing information to a user
US20070156331A1 (en) * 2003-12-26 2007-07-05 Tomohiro Terada Navigation device
US20070260456A1 (en) * 2006-05-02 2007-11-08 Xerox Corporation Voice message converter
US20080045236A1 (en) * 2006-08-18 2008-02-21 Georges Nahon Methods and apparatus for gathering and delivering contextual messages in a mobile communication system
US20090326794A1 (en) * 2005-04-21 2009-12-31 Thomas Lungwitz Method for selecting elements in a driver information system or in a navigation system or in a mobile terminal
US20100142926A1 (en) * 2004-09-27 2010-06-10 Coleman David J Method and apparatus for remote voice-over or music production and management
US20100185392A1 (en) * 2004-09-10 2010-07-22 Atx Group, Inc. Systems and Methods for Off-Board Voice-Automated Vehicle Navigation

Family Cites Families (130)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2658538B2 (en) * 1990-09-14 1997-09-30 三菱電機株式会社 RDS receiver
DE4137000C2 (en) * 1991-11-11 1994-06-09 Opel Adam Ag Method for field strength-dependent evaluation of radio information for vehicles
US5448773A (en) * 1992-02-05 1995-09-05 Trimble Navigation Limited Long life portable global position system receiver
US20020104083A1 (en) * 1992-12-09 2002-08-01 Hendricks John S. Internally targeted advertisements using television delivery systems
JP2809042B2 (en) * 1993-04-13 1998-10-08 Matsushita Electric Industrial Co., Ltd. Travel position display device
US5469370A (en) * 1993-10-29 1995-11-21 Time Warner Entertainment Co., L.P. System and method for controlling play of multiple audio tracks of a software carrier
US5497241A (en) * 1993-10-29 1996-03-05 Time Warner Entertainment Co., L.P. System and method for controlling display of motion picture subtitles in a selected language during play of a software carrier
DE4344173A1 (en) * 1993-12-23 1995-06-29 Philips Patentverwaltung Control unit for an RDS-TMC radio receiver
US6680674B1 (en) * 1994-04-13 2004-01-20 Seiko Instruments Inc. Adaptive geographic mapping in vehicle information systems
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
JPH08128838A (en) * 1994-11-01 1996-05-21 Fujitsu Ten Ltd Navigation device
DE4442413A1 (en) 1994-11-29 1996-05-30 Bosch Gmbh Robert Procedure for setting up a mobile radio receiver and radio receiver
DE4445582C2 (en) * 1994-12-20 2002-05-29 Deutsche Automobilgesellsch Method and device for outputting traffic disturbance reports in a vehicle
US5737691A (en) * 1995-07-14 1998-04-07 Motorola, Inc. System and method for allocating frequency channels in a two-way messaging network
TW308665B (en) * 1995-08-09 1997-06-21 Toyota Motor Co Ltd
JPH0961179A (en) * 1995-08-22 1997-03-07 Xanavi Informatics Corporation Route guiding apparatus for vehicle
DE19710863A1 (en) 1997-03-15 1998-09-17 Bosch Gmbh Robert Method and receiver for the geographical selection of digitally coded messages
DE19711540A1 (en) 1997-03-20 1998-10-01 Grundig Ag RDS receiver for the evaluation of traffic information
JP3593844B2 (en) 1997-04-24 2004-11-24 Sony Corporation Information receiving method, navigation device and automobile
US5990890A (en) * 1997-08-25 1999-11-23 Liberate Technologies System for data entry and navigation in a user interface
US6240280B1 (en) 1997-08-26 2001-05-29 Thomson Consumer Electronics Sales Gmbh Selection of traffic capable station by RDS radio while listening to other media
US6154673A (en) * 1997-12-30 2000-11-28 Agilent Technologies, Inc. Multilingual defibrillator
US7257528B1 (en) * 1998-02-13 2007-08-14 Zi Corporation Of Canada, Inc. Method and apparatus for Chinese character text input
JP2000047855A (en) * 1998-05-28 2000-02-18 Sharp Corp Portable electric appliance with telephone function
US7548787B2 (en) * 2005-08-03 2009-06-16 Kamilo Feher Medical diagnostic and communication system
US5899905A (en) * 1998-10-19 1999-05-04 Third Millennium Engineering Llc Expansion locking vertebral body screw, staple, and rod assembly
US6606082B1 (en) * 1998-11-12 2003-08-12 Microsoft Corporation Navigation graphical interface for small screen devices
US7051360B1 (en) * 1998-11-30 2006-05-23 United Video Properties, Inc. Interactive television program guide with selectable languages
US6208934B1 (en) * 1999-01-19 2001-03-27 Navigation Technologies Corp. Method and system for providing walking instructions with route guidance in a navigation program
JP3449291B2 (en) * 1999-05-14 2003-09-22 Denso Corporation Map display device
EP1120631B1 (en) * 1999-07-06 2008-12-24 Mitsubishi Denki Kabushiki Kaisha Navigation device
DE19945431A1 (en) * 1999-09-22 2001-04-05 Siemens Ag Method for arranging route information within a road map and navigation device
US7263664B1 (en) * 2000-11-01 2007-08-28 Ita Software, Inc. Graphical user interface for travel planning system
JP2001245341A (en) * 2000-02-28 2001-09-07 Toshiba Corp Wireless data communication system
US6587782B1 (en) * 2000-03-14 2003-07-01 Navigation Technologies Corp. Method and system for providing reminders about points of interests while traveling
US6812860B1 (en) * 2000-03-22 2004-11-02 Ford Global Technologies, Llc System and method of providing information to an onboard information device in a vehicle
DE10019681A1 (en) 2000-04-20 2001-10-25 Grundig Ag Device and method for automatic selection of transmitters
DE10028659A1 (en) * 2000-06-09 2001-12-13 Nokia Mobile Phones Ltd Electronic appointment planner
JP3979009B2 (en) * 2000-07-07 2007-09-19 Denso Corporation Control information output device and information system
AU2001276992A1 (en) * 2000-07-20 2002-02-05 Aeptec Microsystems, Inc. Method, system, and protocol for location-aware mobile devices
US7155061B2 (en) * 2000-08-22 2006-12-26 Microsoft Corporation Method and system for searching for words and phrases in active and stored ink word documents
US6810323B1 (en) * 2000-09-25 2004-10-26 Motorola, Inc. System and method for storing and using information associated with geographic locations of interest to a mobile user
US7039418B2 (en) * 2000-11-16 2006-05-02 Qualcomm Incorporated Position determination in a wireless communication system with detection and compensation for repeaters
US7512685B2 (en) * 2000-11-30 2009-03-31 3Com Corporation Method and system for implementing wireless data transfers between a selected group of mobile computing devices
DE60029361T2 (en) * 2000-12-04 2007-07-05 Mitsubishi Denki K.K. Wireless motor vehicle communication device with short range
JP2002209246A (en) * 2001-01-11 2002-07-26 Mitsubishi Electric Corp Wireless communication unit
US6687613B2 (en) * 2001-05-31 2004-02-03 Alpine Electronics, Inc. Display method and apparatus of navigation system
EP1401546A4 (en) * 2001-06-15 2006-11-02 Walker Digital Llc Method and apparatus for planning and customizing a gaming experience
JP4033379B2 (en) * 2001-07-11 2008-01-16 Takata Corporation Seat anchor guide anchor
DE10135023A1 (en) * 2001-07-18 2003-02-13 Bosch Gmbh Robert interface
US20060240806A1 (en) * 2001-07-18 2006-10-26 Saban Demirbasa Data security device
US6640185B2 (en) * 2001-07-21 2003-10-28 Alpine Electronics, Inc. Display method and apparatus for navigation system
US6653825B2 (en) * 2001-11-29 2003-11-25 Theodore G. Munniksma Meter lead holder device
US20030151506A1 (en) * 2002-02-11 2003-08-14 Mark Luccketti Method and apparatus for locating missing persons
US20050192025A1 (en) * 2002-04-22 2005-09-01 Kaplan Richard D. Method and apparatus for an interactive tour-guide system
GB0211566D0 (en) * 2002-05-21 2002-06-26 Koninkl Philips Electronics Nv Method and apparatus for providing travel relating information to a user
JP2005533301A (en) 2002-06-13 2005-11-04 Panasonic Automotive Systems Company of America Multi-mode interface
US20030236671A1 (en) * 2002-06-20 2003-12-25 Deere & Company System and method of loadable languages for implement monitoring display
DE10233376A1 (en) * 2002-07-23 2004-02-12 Fendt, Günter Intelligent predictive driver assistance system and/or traffic warning system has ability to predict route to be driven or to attempt to predict route when no route data have been entered
JP4370761B2 (en) * 2002-08-28 2009-11-25 Casio Computer Co., Ltd. Camera device and message output method
US6810328B2 (en) * 2002-11-23 2004-10-26 Alpine Electronics, Inc Navigation method and system for indicating area-specific traffic information
JP2006511901A (en) * 2002-12-18 2006-04-06 Koninklijke Philips Electronics N.V. Handheld PDA video accessories
US8225194B2 (en) * 2003-01-09 2012-07-17 Kaleidescape, Inc. Bookmarks and watchpoints for selection and presentation of media streams
US20070128899A1 (en) * 2003-01-12 2007-06-07 Yaron Mayer System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows
DE10312502A1 (en) 2003-03-14 2004-09-23 DDG Gesellschaft für Verkehrsdaten mbH Process for providing traffic information
JP2004312538A (en) * 2003-04-09 2004-11-04 Mitsubishi Electric Corp Radio equipment connection system
ATE526742T1 (en) 2003-05-08 2011-10-15 Harman Becker Automotive Sys Background tuner of a radio receiver for receiving traffic and travel information and for exploring alternative frequencies
US20040236504A1 (en) * 2003-05-22 2004-11-25 Bickford Brian L. Vehicle navigation point of interest
US20040243307A1 (en) * 2003-06-02 2004-12-02 Pieter Geelen Personal GPS navigation device
US6905091B2 (en) * 2003-07-14 2005-06-14 Supersonic Aerospace International, Llc System and method for controlling the acoustic signature of a device
JPWO2005006609A1 (en) 2003-07-14 2006-08-24 Sony Corporation Information processing apparatus, information processing method, and information processing program
US7050903B1 (en) 2003-09-23 2006-05-23 Navteq North America, Llc Method and system for developing traffic messages
DE10354218A1 (en) * 2003-11-20 2005-06-30 Siemens Ag Method for selecting and preparing traffic information
US20050124357A1 (en) * 2003-12-04 2005-06-09 International Business Machines Corporation Method for transmitting address information to a global positioning system from a personal digital assistant or other similar device via a connector
US7818380B2 (en) 2003-12-15 2010-10-19 Honda Motor Co., Ltd. Method and system for broadcasting safety messages to a vehicle
JP4388359B2 (en) 2003-12-17 2009-12-24 Kenwood Corporation In-vehicle man-machine interface device, method, and program
US7353109B2 (en) * 2004-02-05 2008-04-01 Alpine Electronics, Inc. Display method and apparatus for navigation system for performing cluster search of objects
JP2005233628A (en) * 2004-02-17 2005-09-02 Kenwood Corp Guide route search device, navigation device, and guid route search method
JP4346472B2 (en) * 2004-02-27 2009-10-21 Xanavi Informatics Corporation Traffic information prediction device
JP2005321370A (en) * 2004-04-05 2005-11-17 Sony Corp Navigation system, data processing method and computer program
US7366606B2 (en) * 2004-04-06 2008-04-29 Honda Motor Co., Ltd. Method for refining traffic flow data
US7289904B2 (en) * 2004-04-06 2007-10-30 Honda Motor Co., Ltd. Vehicle navigation system and methods for incorporating user preferences into same
US7389244B2 (en) * 2004-04-16 2008-06-17 Donald Kaplan Method and system for providing travel services
JP2005308543A (en) * 2004-04-21 2005-11-04 Denso Corp Electronic equipment with map display function and program
US20070192711A1 (en) * 2006-02-13 2007-08-16 Research In Motion Limited Method and arrangement for providing a primary actions menu on a handheld communication device
US7672778B1 (en) * 2004-07-20 2010-03-02 Navteq North America, Llc Navigation system with downloaded map data
US7630328B2 (en) 2004-08-18 2009-12-08 At&T Intellectual Property, I,L.P. SIP-based session control
US7643788B2 (en) 2004-09-22 2010-01-05 Honda Motor Co., Ltd. Method and system for broadcasting data messages to a vehicle
JP2006119120A (en) * 2004-09-27 2006-05-11 Denso Corp Car navigation device
US7430473B2 (en) * 2004-10-01 2008-09-30 Bose Corporation Vehicle navigation display
US7706977B2 (en) * 2004-10-26 2010-04-27 Honeywell International Inc. Personal navigation device for use with portable device
US7835859B2 (en) * 2004-10-29 2010-11-16 Aol Inc. Determining a route to a destination based on partially completed route
JP2006165667A (en) * 2004-12-02 2006-06-22 Denso Corp On-vehicle wireless receiver and program
DE102005042694A1 (en) * 2004-12-30 2006-07-20 Volkswagen Ag Navigation system for e.g. land vehicle, has man-machine interface for inputting geographical figure and keyword characterizing point of interest, and search module searching point of interest in geographical area defined by figure
TW200632621A (en) * 2005-03-02 2006-09-16 Mitac Int Corp Connection device and its portable system
DE102005011215B4 (en) 2005-03-09 2006-12-07 Bury Sp.Z.O.O Navigation system
DE102005011627A1 (en) * 2005-03-09 2006-09-28 Bury Sp.Z.O.O Navigation device for use in motor vehicle, has satellite-navigation antenna attached at antenna input of housing, and plastic deformable metallic bar connected electrically with antenna input and forming part of navigation antenna
US7562049B2 (en) 2005-03-29 2009-07-14 Honda Motor Co., Ltd. Payment system and method for data broadcasted from a remote location to vehicles
US20060223494A1 (en) * 2005-03-31 2006-10-05 Mazen Chmaytelli Location-based emergency announcements
JP4396656B2 (en) * 2005-04-21 2010-01-13 Denso Corporation Map display apparatus and vehicle navigation apparatus equipped with the apparatus
TW200638294A (en) * 2005-04-22 2006-11-01 Mitac Int Corp Navigation system and method capable of switching to multinational language operating interface
US7616978B2 (en) * 2005-05-17 2009-11-10 Bury Sp. Z.O.O. Combined navigation and communication device
WO2006123198A1 (en) 2005-05-20 2006-11-23 Nokia Corporation Method and system for context sensitive presenting of traffic announcements
US7516012B2 (en) * 2005-05-27 2009-04-07 Bury Sp. Z.O.O. Navigation system and method for updating software routines and navigation database information
DE102005029594B4 (en) * 2005-06-23 2007-04-05 Bury Sp.Z.O.O Navigation system and method for extracting encrypted transmitted information
US7552009B2 (en) * 2005-07-14 2009-06-23 Honda Motor Co., Ltd. System and method for synchronizing data for use in a navigation system
DE102005038300A1 (en) * 2005-08-12 2007-02-15 Royaltek Company Ltd. Navigation device with GPS and TMC and method thereof
US20070050183A1 (en) * 2005-08-26 2007-03-01 Garmin Ltd. A Cayman Islands Corporation Navigation device with integrated multi-language dictionary and translator
US7366609B2 (en) * 2005-08-29 2008-04-29 Garmin Ltd. Navigation device with control feature limiting access to non-navigation application
KR100735399B1 (en) * 2005-09-23 2007-07-04 Samsung Electronics Co., Ltd. Method and apparatus for handover using interworking with cellular system in digital broadcasting system
GB0523512D0 (en) 2005-11-18 2005-12-28 Applied Generics Ltd Enhancing traffic and navigation information with visual and audio data
GB2434931B (en) 2006-02-01 2009-07-01 Nissan Motor Mfg Traffic information device
US20070202930A1 (en) 2006-02-27 2007-08-30 Lucent Technologies Inc. Method and system for testing embedded echo canceller in wireless network
US7783471B2 (en) 2006-03-08 2010-08-24 David Vismans Communication device for emulating a behavior of a navigation device
US20070266239A1 (en) 2006-03-08 2007-11-15 David Vismans Method for providing a cryptographically signed command
US7881864B2 (en) * 2006-05-31 2011-02-01 Garmin Switzerland Gmbh Method and apparatus for utilizing geographic location information
US20080201658A1 (en) * 2006-06-06 2008-08-21 Ivi Smart Technologies, Inc. Wireless Media Player Device and System, and Method for Operating the Same
US20070293146A1 (en) * 2006-06-14 2007-12-20 C.S. Consultant Co Satellite navigation conversion device
US8750892B2 (en) * 2006-06-21 2014-06-10 Scenera Mobile Technologies, Llc System and method for naming a location based on user-specific information
US8099086B2 (en) * 2006-06-21 2012-01-17 Ektimisi Semiotics Holdings, Llc System and method for providing a descriptor for a location to a recipient
KR20080035089A (en) * 2006-10-18 2008-04-23 Yahoo! Inc. Apparatus and method for providing regional information based on location
US8126730B2 (en) * 2006-10-24 2012-02-28 Medapps, Inc. Systems and methods for storage and forwarding of medical data
US20080122691A1 (en) * 2006-11-27 2008-05-29 Carani Sherry L Tracking system and method with multiple time zone selector, dynamic screens and multiple screen presentations
US20080121690A1 (en) * 2006-11-27 2008-05-29 Carani Sherry L Ubiquitous Tracking System and Method
US20080122656A1 (en) * 2006-11-27 2008-05-29 Carani Sherry L Tracking System and Method with Multiple Language Selector, Dynamic Screens and Multiple Screen Presentations
US20080125965A1 (en) * 2006-11-27 2008-05-29 Carani Sherry L Tracking System and Method with Automatic Map Selector and Geo Fence Defining Features
US20080125964A1 (en) * 2006-11-27 2008-05-29 Carani Sherry L Tracking System and Method with Automatic Map Selector And Geo Fence Defining Features
WO2008083740A1 (en) 2007-01-10 2008-07-17 Tomtom International B.V. Improved search function for portable navigation device
JP5230652B2 (en) 2007-01-10 2013-07-10 TomTom International B.V. Method, computer program and navigation system for indicating traffic delay
US7668653B2 (en) * 2007-05-31 2010-02-23 Honda Motor Co., Ltd. System and method for selectively filtering and providing event program information

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9583107B2 (en) 2006-04-05 2017-02-28 Amazon Technologies, Inc. Continuous speech transcription performance indication
US8160815B2 (en) 2007-01-10 2012-04-17 Tomtom International B.V. Navigation device and method for informational screen display
US20080228385A1 (en) * 2007-01-10 2008-09-18 Pieter Geelen Navigation device and method for informational screen display
US9973450B2 (en) 2007-09-17 2018-05-15 Amazon Technologies, Inc. Methods and systems for dynamically updating web service profile information by parsing transcribed message strings
US8676577B2 (en) * 2008-03-31 2014-03-18 Canyon IP Holdings, LLC Use of metadata to post process speech recognition output
US20090248415A1 (en) * 2008-03-31 2009-10-01 Yap, Inc. Use of metadata to post process speech recognition output
US8635019B2 (en) * 2008-05-29 2014-01-21 Tomtom International B.V. Navigation device and method for altering map information related to audible information
US20110054774A1 (en) * 2008-05-29 2011-03-03 Simone Tertoolen Navigation device and method for altering map information related to audible information
US20090322558A1 (en) * 2008-06-30 2009-12-31 General Motors Corporation Automatic Alert Playback Upon Recognition of a Paired Peripheral Device
US20100049696A1 (en) * 2008-08-20 2010-02-25 Magellan Navigation, Inc. Systems and Methods for Smart City Search
US8249804B2 (en) * 2008-08-20 2012-08-21 Mitac International Corporation Systems and methods for smart city search
US20140156181A1 (en) * 2011-11-10 2014-06-05 Mitsubishi Electric Corporation Navigation device, navigation method, and navigation program
US9341492B2 (en) * 2011-11-10 2016-05-17 Mitsubishi Electric Corporation Navigation device, navigation method, and navigation program
US10991022B2 (en) 2012-02-22 2021-04-27 Ebay Inc. Systems and methods to provide search results based on time to obtain
US10192255B2 (en) 2012-02-22 2019-01-29 Ebay Inc. Systems and methods for in-vehicle navigated shopping
US11054276B2 (en) 2012-03-23 2021-07-06 Ebay Inc. Systems and methods for in-vehicle navigated shopping
US10697792B2 (en) 2012-03-23 2020-06-30 Ebay Inc. Systems and methods for in-vehicle navigated shopping
US20140032104A1 (en) * 2012-07-30 2014-01-30 Telenav, Inc. Navigation system with range based notification enhancement delivery mechanism and method of operation thereof
US8898014B2 (en) * 2012-07-30 2014-11-25 Telenav, Inc. Navigation system with range based notification enhancement delivery mechanism and method of operation thereof
US20140336925A1 (en) * 2013-05-09 2014-11-13 Jeremiah Joseph Akin Displaying map icons based on a determined route of travel
US10963951B2 (en) 2013-11-14 2021-03-30 Ebay Inc. Shopping trip planner
US11593864B2 (en) 2013-11-14 2023-02-28 Ebay Inc. Shopping trip planner
US20230214175A1 (en) * 2018-06-06 2023-07-06 Vivo Mobile Communication Co., Ltd. Prompting method and mobile terminal

Also Published As

Publication number Publication date
US20080167810A1 (en) 2008-07-10
US20080228390A1 (en) 2008-09-18
US7974777B2 (en) 2011-07-05
JP2010515901A (en) 2010-05-13
EP2102596A1 (en) 2009-09-23
US20080168346A1 (en) 2008-07-10
EP2102596B1 (en) 2018-01-03
US20080208447A1 (en) 2008-08-28
US20080221782A1 (en) 2008-09-11
US8335637B2 (en) 2012-12-18
AU2007343335A1 (en) 2008-07-17
WO2008083862A1 (en) 2008-07-17
US20080207116A1 (en) 2008-08-28
US20080167799A1 (en) 2008-07-10
US20080215236A1 (en) 2008-09-04
JP5230652B2 (en) 2013-07-10

Similar Documents

Publication Publication Date Title
US20100286901A1 (en) Navigation device and method relating to an audible recognition mode
AU2007343388A1 (en) A navigation device, a method of and a computer program for operating the navigation device comprising an audible recognition mode
US8160815B2 (en) Navigation device and method for informational screen display
US8473193B2 (en) Method and device for utilizing selectable location marker for relational display of point of interest entries
CN101583848B (en) Method and a navigation device for displaying GPS position data related to map information in text readable form

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION