|Publication number||US20050060088 A1|
|Publication type||Application|
|Application number||US 10/889,156|
|Publication date||Mar 17, 2005|
|Filing date||Jul 12, 2004|
|Priority date||Jul 10, 2003|
|Also published as||US7098788, US7155202, US7339493, US7397346, US20050035854, US20050038860, US20050057357, US20050057361, US20050062637, US20050071879, US20050101250, US20050132047, US20050262212, WO2005008914A1|
|Inventors||Abdelsalam Helal, Steven Moore, Balaji Ramachandran, Yingchun Ran|
|Original assignee||University Of Florida Research Foundation, Inc.|
This application claims the benefit of both U.S. Provisional Application No. 60/486,018, filed in the United States Patent and Trademark Office on Jul. 10, 2003, and U.S. Provisional Application No. 60/490,717, filed in the United States Patent and Trademark Office on Jul. 29, 2003; the entireties of both applications are incorporated herein by reference.
1. Field of the Invention
The present invention relates to the field of portable computing devices and, more particularly, to a portable computing device for pedestrian navigation and for spatial relationship determinations between the device and obstacles proximate to the device.
2. Description of the Related Art
Technological advances in areas of small-scale computing devices have resulted in the simplification of many previously challenging aspects of life. For example, the proliferation of mobile telephone and mobile e-mail devices has simplified interpersonal communication and has provided previously unobtainable freedoms for many business people. In another example, navigation devices, such as in-vehicle navigation systems, have dramatically improved the ease with which people can travel. Further, personal navigation devices, such as hand-held location locators, allow excursionists to roam unfamiliar landscapes without fear of becoming lost.
At present, however, the abilities of personal navigation devices are largely limited to present location detection and rudimentary navigational features. That is, a destination can be entered into a personal navigation device and the device can indicate a bearing that a pedestrian can travel to reach that destination. Conventional pedestrian navigation devices do not typically take environmental constraints, such as available walkways and other such pedestrian paths, into consideration when guiding pedestrians to selected destinations. Further, no known conventional pedestrian navigation device possesses spatial awareness capabilities for detecting static and/or dynamic obstacles and for plotting pedestrian travel routes to avoid these obstacles.
The present invention provides a method, a system, and an apparatus for guiding pedestrians from their current location to a user-selectable destination. More specifically, the present invention can include a pedestrian spatial relation/navigation device (PSD) for aiding pedestrians traveling indoors and/or outdoors. The PSD can be spatially aware of the environment in which the pedestrian is to travel and can use this spatial awareness to aid the traveler. The PSD can contain obstacle sensing components for detecting the presence of obstacles in an environment relative to the PSD so that a pedestrian can avoid these obstacles. Further, the PSD can determine multiple potential pedestrian pathways for reaching a selected destination and can select a recommended travel pathway based upon static and dynamic factors, such as user preferences, temporal constraints, and known pathway obstacles and impediments. The PSD can also include training capabilities that a user can use to program misrecognized obstacles, to store points of interest, and to provide other suitable feedback.
In one embodiment, the PSD can be specifically designed to assist visually impaired individuals. In such an embodiment, a visually impaired pedestrian can use the PSD to navigate to a selected destination and to avoid both static and dynamic obstacles in the pathway of or toward the destination. The PSD can provide audible and/or tactile cues to the visually impaired, via audible circuitry and/or a tactile presentation mechanism like a digital Braille pad.
One aspect of the present invention can include a pedestrian navigation method. The method can include the steps of determining a destination location and determining a present location of a mobile computing device based upon at least one wireless transmission received by the mobile computing device. In one embodiment, mobile telephone communications can be sent and received via the mobile computing device. A pedestrian travel path can be determined from the present location and the destination location. The present location and the pedestrian travel path can be intermittently updated as the mobile computing device is moved. At least one obstacle located near the path can be detected based upon at least one wireless transmission conveyed to the mobile computing device. At least a portion of the detected obstacles can be dynamic obstacles that change position over time. Further, at least a portion of the obstacles can contain a location beacon that the mobile computing device is configured to detect.
Sensory indicators can be emitted to guide a pedestrian to the destination location. The emitted sensory indicators can include a warning indicator for warning the pedestrian about detected obstacles for obstacle avoidance purposes. In one embodiment, the mobile computing device can be designed to assist visually impaired pedestrians. In such an embodiment, the sensory indicators can include at least one tactile indicator, such as a digital Braille pad. The sensory indicators can also include synthetically generated voice cues. In a particular embodiment, user feedback can be received through the mobile computing device. The user feedback can be used to improve guidance provided by the mobile computing device. Additionally, the mobile computing device can be communicatively linked to the Internet. Once linked to the Internet, navigation information can be accessed from a remote data source. This navigation information can be used to guide the pedestrian to the destination.
Another aspect of the present invention can include a PSD that can include a position finder, a spatial relationship sensor, an input mechanism, and an output mechanism. In one embodiment, the PSD can be designed for assisting visually impaired pedestrians. The position finder can be configured to determine a geographic position of the PSD based upon received wireless signals. In one embodiment, the position finder can include a Global Positioning System (GPS). The spatial relationship sensor can provide data used to detect a position of at least one obstacle relative to the PSD. In one embodiment, the spatial relationship sensor can include a short range wireless transceiver, a radio frequency identification system, and/or an ultrasonic transducer.
The input mechanism can specify a destination location. The output mechanism can include sensory indicators for guiding a pedestrian to the destination location and/or for warning a pedestrian about detected obstacles proximate to the PSD. For example, the output mechanism can include audio circuitry configured to provide audible sensory indicators. In another example, the output mechanism can include a tactile presentation mechanism configured to provide tactile sensory indicators.
In one embodiment, the PSD can determine multiple pathways for navigating to the destination location and can select one of the determined pathways based upon user preference, temporal constraints, and/or obstacles proximate to the pathway. In another embodiment, the PSD can include a cellular transceiver to send and receive mobile telephony communications. Further, the PSD can be a thin client configured to be communicatively linked to at least one remotely located server. The PSD can also include a training system configured to alter device operation based upon user feedback.
There are shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown.
The processor 105 can execute a suitable operating system and one or more applications for controlling the various functions of the PSD 100. For example, the processor 105 can execute an operating system which can support the execution of one or more applications intended to run on that platform and which support operation of the various functions and features disclosed herein. For example, as the PSD 100 can include one or more sensors to be described in greater detail herein, the operating system and computing architecture can be designed to support the operation of such sensors.
The memory 125 can be implemented as random access memory (RAM), read-only memory (ROM), Erasable Programmable Read-Only Memory (EPROM), or any other type of physical memory suitable for use within a portable computing device, such as the PSD 100. It should be appreciated that the memory 125, while illustrated as a separate unit, can be incorporated into the processor 105 or another device. In any case, the memory 125 can include programmatic instructions to be executed by the processor 105 as well as any operational data necessary for operation of the PSD 100.
Wireless signals can be received and sent via the antenna 155 which can be suited for longer-range communications such as conventional cellular or personal communication service (PCS) communications. Accordingly, the antenna 155 can be operatively connected to the mobile telephony transceiver 110. Signals detected by antenna 155 can be provided to the mobile telephony transceiver 110 for processing and decoding. For example, the mobile telephony transceiver 110 can include a codec for coding and decoding information received or to be sent via wireless transmission. The transceiver 110 can make the decoded signals and/or information available to other components of the PSD 100 for processing. Outbound information received by the mobile telephony transceiver 110 can be coded and/or formatted for wireless transmission by the codec and then provided to the antenna 155 for transmission.
Thus, it should be appreciated that the PSD 100 can communicate via conventional cellular telephone and/or PCS telephone calls and access wireless networks, for example using the Wireless Application Protocol (WAP) or another suitable wireless communications protocol, such that the PSD 100 can access the Internet, the Web, a Local Area Network (LAN), and/or a Wide Area Network (WAN), as well as any applications and/or services disposed on such networks via a wireless communications link.
The audio circuitry 115 can include a microphone or other transducive element (not shown) for receiving sound and one or more analog-to-digital converters (not shown) for digitizing the received sound. The audio circuitry 115 further can include one or more digital-to-analog converters (not shown) for converting digital information into an analog signal. The audio circuitry 115 can include a speaker or other transducive element (not shown) for generating sound from an analog signal as well as one or more amplifiers (not shown). Notably, although not shown, the PSD 100 can include one or more audio output jacks and/or other digital data interface ports.
It should be appreciated that the audio circuitry 115 can include additional processors, such as digital signal processors (DSP), as may be required for processing audio and performing functions such as audio encoding, audio decoding, noise reduction, and the like. According to one embodiment of the present invention, the audio circuitry can be implemented using one or more discrete components. In another arrangement, the audio circuitry 115 can be implemented using one or more larger integrated circuits configured to perform the various functions disclosed herein. Thus, the PSD 100 can be configured to play various audio formats, from streaming formats to MP3s, or other audio file formats such as .wav or .aiff files.
The audio circuitry 115 can also include and/or be communicatively linked to automatic speech recognition (ASR) and synthetic speech generation components that can be used to perform text-to-speech and speech-to-text conversions. When the audio circuitry 115 includes ASR and/or speech generation components, suitable software and/or firmware can be embedded within the audio circuitry 115. When the audio circuitry 115 is communicatively linked to remotely located ASR and/or speech generation components, communications between the audio circuitry 115 and the remotely located components can occur using the mobile telephony transceiver 110, the short range wireless transceiver 120, the interface port 145, or any other suitable elements.
The PSD 100 also can include a short range wireless transceiver 120 as well as an antenna 160 operatively connected thereto. The short range wireless transceiver 120 can both send and receive data. For example, according to one embodiment of the present invention, the short range wireless transceiver 120 can be implemented as a Bluetooth-enabled wireless transceiver, or as a transceiver configured to communicate according to one of the 802.11 family of short range wireless communications specifications. The short range wireless transceiver 120 and accompanying antenna 160 can be configured to communicate using any of a variety of short range, wireless communications protocols and/or systems. Accordingly, the various examples disclosed herein have been provided for illustration only and should not be construed as a limitation of the present invention.
The PSD 100 can include a position finder 130 and one or more spatial relationship sensors, such as a radio frequency identification (RFID) mechanism 135 and an ultrasonic transducer 140. The spatial relationship sensors of the PSD 100 can provide data used to detect a position of at least one obstacle relative to the device. One of ordinary skill in the art should appreciate that the RFID mechanism 135 and the ultrasonic transducer 140 represent two illustrative spatial relationship sensors and that the PSD 100 is not limited in this regard. For example, the PSD 100 can include other spatial relationship sensors such as a radar sensor, a sonar sensor, an optically based sensor, a pressure sensor, a temperature sensor, and the like.
The position finder 130 can determine a geographic position of the device based upon received wireless signals. For example, the position finder 130 can include global positioning system (GPS) components for computing a position from signals conveyed by GPS satellites. In another example, the position finder 130 can receive wireless signals conveyed from signal broadcasting devices with known positions and can determine a geographical location through triangulation techniques. For instance, the position finder 130 can triangulate a position for the PSD 100 based upon cellular and PCS broadcasts conveyed from mobile telephony towers.
Alternatively, the position finder 130 can triangulate a position within a room or building based upon short range wireless broadcasts emitted from a multitude of emitting devices located at known geographical points. For example, a number of wireless access points adhering to the 802.11 family of standards can receive and broadcast RF signals; each access point can have a known broadcasting radius. The position finder 130 can determine which access points the device is within range of and, from these points, determine the relative location of the PSD 100. In another embodiment, the position finder 130 can triangulate a position for the PSD 100 based upon a multitude of previously established beacons within sensor detection range of the PSD 100.
One illustrative spatial relationship sensor, the RFID mechanism 135, incorporates the use of electromagnetic or electrostatic coupling in the radio frequency (RF) portion of the electromagnetic spectrum to uniquely identify an object, animal, or person. The RFID mechanism 135 does not require direct contact or line-of-sight scanning. An RFID mechanism 135 can include an antenna and transceiver (often combined into one reader) and a transponder (the tag). The antenna uses radio frequency waves to transmit a signal that activates the transponder. When activated, the tag transmits data back to the antenna. The data is used to notify a programmable logic controller, such as the processor 105, that an action should occur. For example, the action can include any programmatic response such as initiating communications to interface and exchange data with another computing system. The PSD 100 can include a low-frequency RFID mechanism 135 of approximately 30 kHz to 500 kHz having a short transmission range of approximately six feet, or a high-frequency RFID mechanism 135 of approximately 850 MHz to 950 MHz or 2.4 GHz to 2.5 GHz having a longer transmission range of approximately 90 feet or more.
Notably, the RFID mechanism 135 of the PSD 100 can include a tag, a transceiver, or both a tag and a transceiver. Additionally, RFID readable tags/transceivers can be attached to objects to permit the PSD 100 to determine the identity and/or location of the associated obstacle via the RFID mechanism 135. Further, an array of RFID mechanisms 135, both internal and external to the PSD 100, can be used to determine obstacle location based on triangulation.
The ultrasonic transducer 140 can include a transceiver capable of transmitting a beacon signal which can be received by one or more ultrasonic transceivers. The use of an ultrasonic transducer 140 enables high precision tracking technology to be used within one's house, for example, in the case where one's home is outfitted with one or more ultrasonic transceivers. Accordingly, a home or other “smart” environment, for example one equipped with an ultrasonic transceiver, can detect when a user having the PSD 100 is within a particular range of the transceiver. Thus, determinations can be made as to the position of the PSD 100 and the position of obstacles relative to the PSD 100.
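The distance computation underlying such ultrasonic ranging is straightforward time-of-flight arithmetic. The sketch below is an illustration, not taken from the disclosure; it assumes the speed of sound in dry air at roughly 20 °C.

```python
# Speed of sound in dry air at approximately 20 °C (an assumption for
# this sketch; real devices would calibrate for temperature/humidity).
SPEED_OF_SOUND_M_PER_S = 343.0

def echo_distance_m(round_trip_s):
    """Distance to an obstacle from an ultrasonic echo's round-trip time.

    The pulse travels out to the obstacle and back, so the one-way
    distance is half the total path length.
    """
    if round_trip_s < 0:
        raise ValueError("round-trip time cannot be negative")
    return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2.0
```

An echo returning after 10 milliseconds corresponds to an obstacle roughly 1.7 meters away.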
The PSD 100 can also include one or more interface ports 145 used to physically connect devices and/or peripherals to the PSD 100. For example, the interface port 145 can be a standard wall jack to initiate telephone calls over the Public Switched Telephone Network (PSTN). The interface port 145 can also include a universal serial bus (USB) port, a FireWire (IEEE 1394) port, a parallel port, a COM port such as an RS-232 port, an Ethernet port, an audio port, or the like. Use of the interface port 145 for communicatively linking the PSD 100 with external devices can be advantageous in situations where wireless connectivity is unavailable, intermittent, or otherwise unsuitable for a particular purpose.
The PSD 100 also can include a variety of other components and sensors which have not been illustrated in
Each of the various components of the PSD 100 disclosed herein can be communicatively linked with one another using appropriate circuitry, whether through the memory 125, one or more additional memories (not shown), the processor 105, one or more additional interface processors or logic controllers (not shown), and/or the communications bus 150. For example, while each of the sensors described herein is depicted as being linked to the communications bus 150, it should be appreciated that each sensor can be configured to communicate with the processor 105 through a suitable interface, such as a digital input and/or output or through an intermediate interface processor, for example using an interrupt request of the processor.
Additionally, one skilled in the art will recognize that the various components disclosed herein can be embodied in various other forms and that the configuration disclosed and described with reference to
The physical arrangement of the PSD 200 has been provided for purposes of illustration only. As such, it should be appreciated that the various components can be located in any of a variety of different configurations. For example, the PSD 200 can include additional keys or controls disposed on the frontal portion or the sides of the unit.
According to one embodiment of the present invention, the physical arrangement of the PSD 200 can be conducive for use by visually impaired individuals or those that may have difficulty accessing and/or operating the various keys and/or controls of conventional mobile computing devices, such as the elderly and persons with physical disabilities or other infirmities. For example, the control keys 210 and the alphanumeric keys 215 of the PSD 200 can be larger in size than conventional cellular device keys and can be spaced a greater distance from one another with respect to both the width and length of the PSD 200. That is, the horizontal key spacing and the vertical key spacing can be greater than that found with conventional cellular devices. Further, the control keys 210 can include Braille markings for key identification purposes.
The presentation element 205 can include a tactile presentation mechanism, such as a Braille pad, a visual display, an audible presentation mechanism like a speaker, and the like. When the presentation element 205 includes a display screen, this display can be a liquid crystal display (LCD) implemented in either grayscale or color, a touch screen, or any other type of suitable display screen. The presentation element 205 can include a display screen that is larger than those found on conventional mobile computing devices and can have an increased contrast ratio if so desired.
The battery of the PSD 200 can be designed to operate for extended times. According to one arrangement, the battery can comprise electrical cells that release energy through chemical reactions. Alternatively, the “battery” powering the PSD 200 can utilize a fuel cell, such as a methane battery. Additionally, while the various enhancements disclosed herein may add size to the PSD 200, it is expected that the increased size would be an acceptable tradeoff for the increased functionality and ease of use provided by the PSD 200. Alternatively, illustrated components such as the control keys 210 can be replaced by other, smaller components, such as a microphone, in order to save space and decrease the size of the PSD 200.
As noted, the PSD 200 can include a variety of sensors. As shown in
As shown in
The PSD 200 can include one or more application programs which allow the user to access the functionality of the various systems and/or devices connected to the smart space control unit 305. In one embodiment, the PSD 200 can be a thin client and the smart space control unit 305 can function as an application server. The smart space control unit 305 can also be configured with a multitude of PSD 200 and/or user specific settings so that information exchanged between the PSD 200 and the smart space control unit 305 can be tailored for the needs, capabilities, and privileges of different users and/or PSDs.
Through the smart space control unit 305, the user of PSD 200 can access information pertaining to the smart space, including space layout, space pedestrian pathways, and space obstacles. For example, the smart space control unit 305 can include a server that broadcasts the layout of the smart space to the PSD 200 through a wireless communication means, such as through a wireless network communication like the 802.11 family of wireless networking protocols, a Bluetooth transmission, and the like.
It should be appreciated that the PSD 200 can communicate with the smart space control unit 305 using any of a variety of different communications mechanisms and that the PSD 200 is not limited to any specific communication mechanism. For example, the PSD 200 can initiate cellular telephone and/or conventional telephone calls to the smart space control unit 305 when the PSD 200 is not located within or proximate to the home within which the smart space control unit 305 is disposed. In another example, the PSD 200 can communicate with the home control unit using short range wireless communications when in range. In still another example, the PSD 200 can be linked to the smart space control unit 305 via one or more interface ports.
Further, the smart space control unit 305 can be communicatively linked to a communication system 320, where the communication system 320 can include a home intercom system, a line based computer network, a message service, a telephony system, an Internet connection, and the like. The capabilities of the communication system 320 can be utilized by a user of the PSD 200 through access granted via the smart space control unit 305. For example, the communication system 320 can communicatively link the smart space control unit 305 to a multitude of remotely located computing systems, such as a spatial relation system 360, a service provider 365, a navigation system 370, and the like. Web services, databases, and other remotely located computing and/or data resources can be provided by the service provider 365. In one embodiment, the PSD 200 can utilize included communication capabilities to directly communicate with the spatial relation system 360, the service provider 365, and the navigation system 370 without using the smart space control unit 305 as an intermediary.
The smart space control unit 305 and/or the PSD 200 can be communicatively linked to a multitude of interactive subsystems that can include at least one location beacon 310 and at least one dynamic beacon 315. The beacons 310 and 315 can be detected by the smart space control unit 305 and/or the PSD 200 and used for navigational and spatial relation purposes.
The location beacon 310 and the dynamic beacon 315 can consist of a transceiver or other mechanism that permits the PSD 200 to determine a location of the beacon 310 and/or beacon 315 relative to the PSD 200 using sensors of the PSD 200. The PSD 200 can also determine the identity, size, weight, and other object identification characteristics from information conveyed by the beacon 310 and the beacon 315. For example, the beacon 310 and the beacon 315 can include tags containing digitally embedded information that can be sensed and/or read by the RFID 135 and/or the ultrasonic transducer 140. These tags can be affixed to obstacles and/or objects within the smart space.
When location beacons 310 are attached to fixed points with known locations, the PSD 200 can triangulate the position of the PSD 200 using the location beacons 310 as reference points. When the location beacons 310 are affixed to static objects, such as a wall, a doorway, a staircase, a desk, a pedestrian walkway, and the like, the PSD 200 can use the location beacons 310 as obstacle identification points in order to guide a user so that the user is not impeded by the obstacles within the smart space.
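For illustration only (the function and coordinate conventions below are assumptions, not part of the disclosure), a position can be recovered from three non-collinear fixed beacons with known coordinates and measured distances: subtracting the three circle equations pairwise eliminates the quadratic terms and leaves a linear system in the unknown coordinates.

```python
# Illustrative 2D trilateration sketch: each argument is ((x, y), distance)
# for a fixed beacon at a known point. Subtracting the circle equations
# (x - xi)^2 + (y - yi)^2 = ri^2 pairwise yields two linear equations.
def trilaterate(b1, b2, b3):
    (x1, y1), r1 = b1
    (x2, y2), r2 = b2
    (x3, y3), r3 = b3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    den = e * a - b * d
    if den == 0:
        raise ValueError("beacons are collinear; position is ambiguous")
    x = (c * e - f * b) / den
    y = (c * d - a * f) / (b * d - a * e)
    return (x, y)
```

In practice measured distances are noisy, so a real implementation would use more than three beacons and a least-squares fit rather than an exact solve.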
Dynamic beacons 315 can be affixed to mobile objects, such as chairs, pets, people, portable appliances, and the like. The dynamic beacons 315 can be used to track the current position of the associated dynamic object so that the PSD 200 can locate the object and/or avoid the object as desired. For example, the dynamic beacon 315 can be affixed to a remote control unit, a set of keys, and/or a telephone so that the user of the PSD 200 can locate these commonly misplaced objects. In another example, the dynamic beacon 315 can be affixed to a vacuum cleaner, an ironing board, or a footrest so that a visually challenged person using the PSD 200 can be made aware of the presence of the associated object for obstacle avoidance purposes.
Obstacle positioning information can also be gathered through sensors contained within the smart space and conveyed to the PSD 200 via the smart space control unit 305. For example, a camera 330 or video system can intermittently capture video of the smart space. The smart space control unit 305 can analyze these video feeds to determine the location of the PSD 200 user as well as obstacles near the PSD 200 user. Results can be fed from the smart space control unit 305 to the PSD 200.
Additionally, surveillance system 335 data can be gathered by the smart space control unit 305 and used to determine the location of a PSD 200 user and obstacles near the PSD 200 user. Typical surveillance systems 335 can include motion sensors, pressure sensors, sound detectors, and the like. It should be noted that the PSD 200 is not limited to any particular object detection source and that data provided by multiple sources, including spatial relationship sensors of the PSD 200 and data gathered via smart space sensors, can be combined to improve the accuracy of the PSD 200.
It should be appreciated that while smart spaces have been described with reference to a single, centralized computer system, one or more computer systems can be included. For example, lighting can be controlled with one computer system while temperature is controlled by another, and appliances can be controlled by yet another computer system. The various computer systems may or may not communicate with one another so long as each is able to communicate with the PSD 200. Still, each system can be configured to communicate with the PSD 200 independently and operate on its own. For instance, each appliance can be a “smart” appliance having built-in communications and control mechanisms for being accessed remotely. In that case, each appliance need not communicate with other appliances or a centralized computing system so long as the appliance and/or system can communicate directly with the PSD 200.
The user input can be provided directly to the mobile computing device or can be provided to a remote system, such as a networked computer, and can be subsequently conveyed to the mobile computing device. Further, the user input can be processed by a remote computing device and/or the mobile computing device into a form readable by the mobile computing device. For example, the user can input an address and/or room number verbally; this input can be conveyed to a remote server, converted from speech to text, and translated into coordinate values that are conveyed back to the mobile computing device in a format comprehensible by the mobile computing device.
In step 510, the present location of the mobile computing device can be automatically determined using position finding capabilities of the mobile computing device. In step 515, a pedestrian travel path can be determined from the present location to the destination location. It should be noted that multiple travel paths can be computed initially, each of which can be used by the pedestrian to travel from the present location to the destination location. One of these potential travel paths can be selected as a preferred travel path based upon user preference, temporal constraints, static and dynamic obstacles, and the like.
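One plausible way to realize the path selection described in step 515 (a sketch under assumed data structures, not the patent's implementation) is a shortest-path search over a graph of walkway segments, where optional per-edge penalties can encode user preferences, temporal constraints, or known obstacles.

```python
import heapq

# Illustrative sketch: Dijkstra-style search over a walkway graph.
# graph: {node: [(neighbor, base_cost), ...]}. `penalty` optionally maps
# a directed edge (u, v) to an extra cost, e.g. a known obstacle on that
# segment or a user preference against stairs.
def best_path(graph, start, goal, penalty=None):
    penalty = penalty or {}
    frontier = [(0.0, start, [start])]  # (cost so far, node, path so far)
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, base in graph.get(node, []):
            if nxt not in seen:
                extra = penalty.get((node, nxt), 0.0)
                heapq.heappush(frontier, (cost + base + extra, nxt, path + [nxt]))
    return None  # destination unreachable
```

Adding a penalty to a segment (for instance, when a dynamic obstacle is detected on it) naturally diverts the recommended route onto an alternative pathway, mirroring the selection among multiple computed travel paths described above.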
In step 520, the mobile computing device can emit sensory indicators to guide pedestrians to the destination location. Sensory indicators can include audible indicators, visual indicators, tactile indicators, and the like. An audible indicator can include tonal warnings, speech cues, and the like. Visual indicators can include graphically displayed images, textual directions, and the like. Tactile indicators can include device vibrations, Braille pad presentations, heat sensations, low powered electric stimulations, and the like.
In step 525, the area proximate to the mobile computing device can be searched for obstacles. In one scenario, obstacles can be detected by surveillance systems, environmental sensors, and the like connected to computer systems remote from the mobile computing device; the remote computer system can then wirelessly convey the detections to the mobile computing device. In another scenario, the mobile computing device can include environmental sensors that detect nearby obstacles. When obstacles are detected in step 530, one or more sensory indicators can be emitted from the mobile computing device in step 535 to warn a pedestrian about the detected obstacle. When no obstacles are detected in step 530, step 535 can be skipped and the method can proceed to step 540.
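Whichever source reports them, detections from remote systems and onboard sensors can be pooled and filtered by distance to the device before any warning is emitted. The coordinate scheme, warning radius, and detection records below are assumptions for the sketch.

```python
import math

def nearby_obstacles(device_pos, detections, radius_m=3.0):
    """Filter pooled detections (from remote surveillance systems or
    onboard sensors) down to those within radius_m of the device."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return [d for d in detections if dist(device_pos, d["pos"]) <= radius_m]

# Detections from the two scenarios, merged into one list.
remote = [{"pos": (1.0, 1.0), "kind": "cart"}]
onboard = [{"pos": (10.0, 0.0), "kind": "door"}]
warnings = nearby_obstacles((0.0, 0.0), remote + onboard)
```

Here only the cart (about 1.4 m away) falls inside the 3 m warning radius, so only it would trigger a sensory indicator in step 535.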
In step 540, a determination can be made as to whether the mobile computing device has been moved. When the mobile computing device has been moved, the method can proceed to step 545, where the present location and the pedestrian travel path can be updated. After the update, the method can loop to step 520, where sensory indicators can be emitted to guide the pedestrian to the destination location. If the mobile computing device has not been moved, the method can proceed to step 550.
In step 550, a determination can be made as to whether the pedestrian has arrived at the destination. If so, the method can end in step 555, or the method can be repeated for a new destination by looping to step 510. If the pedestrian has not arrived at the destination, as determined by the location of the mobile computing device, the method can progress from step 550 to step 560. In step 560, a determination can be made as to whether a new, different destination has been entered. If not, the method can loop back to step 540, where a new determination as to whether the mobile computing device has been moved can be performed. If a new destination has been entered, the method can progress from step 560 to step 565, where the current navigation operation being performed by the mobile computing device can be canceled. Once canceled, the method can loop to step 510, where the present location of the mobile computing device can be determined. The new destination location and the present location can then be used to determine a pedestrian travel path.
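The control flow of steps 510 through 565 can be sketched as a loop over a device interface. The `FakeDevice` stand-in and its method names are invented so the loop can run end to end; they are not part of the specification.

```python
def navigate(device, destination):
    """Loop sketch of steps 510-565: localize, plan, guide, then watch
    for movement, arrival, or entry of a new destination."""
    position = device.locate()                       # step 510
    path = device.plan_path(position, destination)   # step 515
    while True:
        device.emit_guidance(path)                   # step 520
        for obstacle in device.scan_obstacles():     # steps 525-535
            device.warn(obstacle)
        if device.moved():                           # step 540
            position = device.locate()               # step 545
            path = device.plan_path(position, destination)
            continue
        if position == destination:                  # steps 550-555
            return "arrived"
        new_dest = device.new_destination()          # step 560
        if new_dest is not None:                     # step 565: cancel,
            return navigate(device, new_dest)        # restart at 510

class FakeDevice:
    """Minimal stand-in so the loop can execute."""
    def __init__(self, waypoints):
        self._positions = iter(waypoints)
        self.position = next(self._positions)
        self.log = []

    def locate(self):
        return self.position

    def plan_path(self, src, dst):
        return (src, dst)

    def emit_guidance(self, path):
        self.log.append(("guide", path))

    def scan_obstacles(self):
        return []

    def warn(self, obstacle):
        pass

    def moved(self):
        nxt = next(self._positions, None)
        if nxt is None:
            return False
        self.position = nxt
        return True

    def new_destination(self):
        return None

device = FakeDevice([(0, 0), (1, 1)])
result = navigate(device, (1, 1))
```

With two waypoints, guidance is emitted once per position and the loop terminates when the device stops moving at the destination.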
The present invention can be realized in hardware, software, or a combination of hardware and software. The present invention can be realized in a centralized fashion in one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
The present invention also can be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
This invention can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope of the invention.
|Nov 22, 2004||AS||Assignment|
Owner name: UNIVERSITY OF FLORIDA RESEARCH FOUNDATION, INC., F
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HELAL, ABDELSALAM A.;MOORE, STEVEN E.;RAMACHANDRAN, BALAJI;AND OTHERS;REEL/FRAME:015384/0819;SIGNING DATES FROM 20041014 TO 20041026