US20120310531A1 - Navigation system employing augmented labeling and/or indicia - Google Patents
- Publication number
- US20120310531A1 (application US13/183,613)
- Authority
- US
- United States
- Prior art keywords
- user
- parameter
- vehicle
- location
- information corresponding
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/50—Allocation or scheduling criteria for wireless resources
- H04W72/56—Allocation or scheduling criteria for wireless resources based on priority criteria
- H04W72/566—Allocation or scheduling criteria for wireless resources based on priority criteria of the information or information source or recipient
- H04W72/569—Allocation or scheduling criteria for wireless resources based on priority criteria of the information or information source or recipient of the traffic information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/50—Allocation or scheduling criteria for wireless resources
- H04W72/52—Allocation or scheduling criteria for wireless resources based on load
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/04—Wireless resource allocation
- H04W72/044—Wireless resource allocation based on the type of the allocated resource
- H04W72/0446—Resources in time domain, e.g. slots or frames
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/20—Control channels or signalling for resource management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/12—Wireless traffic scheduling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W72/00—Local resource management
- H04W72/12—Wireless traffic scheduling
- H04W72/1263—Mapping of traffic onto schedule, e.g. scheduled allocation or multiplexing of flows
- H04W72/1268—Mapping of traffic onto schedule, e.g. scheduled allocation or multiplexing of flows of uplink data flows
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Definitions
- the invention relates generally to navigation systems; and, more particularly, it relates to providing effective, timely, and detailed information to a user of such navigation systems.
- navigation systems have been of interest for a variety of applications.
- many vehicles include some form of navigational system to provide directions to a driver of a vehicle thereby assisting the driver to reach an endpoint destination of interest.
- typical prior art vehicular and other navigation systems often fail to provide adequate directions to a user.
- such navigation systems will often fail to provide directions with sufficient specificity until moments before a route adjustment must be made. For example, while driving a vehicle, an indication that a driver should make a particular turn onto a particular road often comes so shortly before the turn that the driver does not have time to make the proper route adjustment.
- such an instruction may be provided by the navigation system when the vehicle is within a proximity that is too close to an intersection at which a driver should turn. That is to say, the instruction may be provided too late, and the driver may not be able to position the vehicle into the proper position or lane to make the appropriate route adjustments safely.
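The timing problem described above can be sketched as a simple calculation: the faster the vehicle moves, the farther in advance of the intersection the instruction must be issued. The following is a hypothetical illustration; the function name and all parameter values are assumptions for the sketch, not values from the patent.

```python
# Hypothetical sketch: choose how far before an intersection a turn
# instruction should be announced so the driver can reposition the
# vehicle safely. All timing values are illustrative assumptions.

def announcement_distance_m(speed_mps: float,
                            reaction_s: float = 2.0,
                            lane_change_s: float = 4.0,
                            margin_s: float = 3.0) -> float:
    """Distance before the intersection at which the instruction
    should be issued: total preparation time multiplied by speed."""
    prep_time_s = reaction_s + lane_change_s + margin_s
    return speed_mps * prep_time_s

# At highway speed (30 m/s, about 108 km/h) the system should speak
# much earlier than at city speed (10 m/s, about 36 km/h).
highway = announcement_distance_m(30.0)   # 270.0 m
city = announcement_distance_m(10.0)      # 90.0 m
```

A fixed announcement distance, by contrast, is exactly the prior-art failure mode the passage describes: adequate at low speed but too late at high speed.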
- prior art navigation systems are typically implemented to provide such directions to a user in accordance with some form of animated depiction of the physical environment. That is to say, a typical prior art navigation system will often include cartoon-like animation representative of the physical environment. Such animation is inherently an inaccurate and imperfect representation of the physical environment, and the user may have difficulty translating the animated depiction to the actual physical environment. The resulting latency and delay can sacrifice safety, most importantly, and secondarily lead to frustration and delays for the user.
- the driver may fail to notice hazards, people, etc. within the actual physical environment and may unfortunately place him or herself or others in a detrimental position.
- FIG. 1 and FIG. 2 are diagrams illustrating various embodiments of communication systems.
- FIG. 3 is a diagram illustrating an alternative embodiment of a wireless communication system.
- FIG. 4 is a diagram illustrating an embodiment of a wireless communication device.
- FIG. 5 is a diagram illustrating an alternative embodiment of a wireless communication device.
- FIG. 6 is a diagram illustrating an embodiment of a navigation system with a display or projection system in a vehicular context.
- FIG. 7 is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within an in-dash and/or in-steering wheel display.
- FIG. 8 is a diagram illustrating an embodiment of a vehicle including at least one projector for effectuating a display or projection of information.
- FIG. 9A is a diagram illustrating an embodiment of a vehicle including at least one antenna for supporting wireless communications with at least one wireless communication system and/or network.
- FIG. 9B is a diagram illustrating an embodiment of a vehicle including at least one camera for supporting acquisition of image and/or video information.
- FIG. 10 is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a wireless communication device.
- FIG. 11A is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context.
- FIG. 11B is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context in conjunction with a wireless communication device.
- FIG. 12 is a diagram illustrating an embodiment of various types of navigation systems, implemented to support wireless communications, entering/exiting various wireless communication systems and/or networks.
- FIG. 13 is a diagram illustrating an embodiment of at least two projectors operative in accordance with a lenticular surface for effectuating three dimensional (3-D) display.
- FIG. 14 is a diagram illustrating an embodiment of at least two projectors operative in accordance with a non-uniform surface for effectuating 3-D display.
- FIG. 15 is a diagram illustrating an embodiment of an organic light emitting diode (OLED) as may be implemented in accordance with various applications for effectuating 3-D display.
- FIG. 16A, FIG. 16B, FIG. 16C, FIG. 17A, FIG. 17B, FIG. 17C, FIG. 18A, FIG. 18B, FIG. 19A, and FIG. 19B illustrate various embodiments of methods as may be performed in accordance with operation of various devices such as various wireless communication devices and/or various navigation systems.
- FIG. 20 is a diagram illustrating an embodiment of a vehicle including an audio system for effectuating directional audio indication.
- FIG. 21 is a diagram illustrating an embodiment of a navigation system with a display or projection system in a vehicular context, and particularly including at least one localized region of the field of vision in which labeling and/or indicia is provided.
- signals are transmitted between various communication devices therein.
- the goal of digital communications systems is to transmit digital data from one location, or subsystem, to another either error free or with an acceptably low error rate.
- data may be transmitted over a variety of communications channels in a wide variety of communication systems: magnetic media, wired, wireless, fiber, copper, and other types of media as well.
- FIG. 1 and FIG. 2 are diagrams illustrating various embodiments of communication systems, 100 , and 200 , respectively.
- in this embodiment of a communication system 100, a communication channel 199 communicatively couples a communication device 110 (including a transmitter 112 having an encoder 114 and including a receiver 116 having a decoder 118) situated at one end of the communication channel 199 to another communication device 120 (including a transmitter 126 having an encoder 128 and including a receiver 122 having a decoder 124) at the other end of the communication channel 199.
- either of the communication devices 110 and 120 may only include a transmitter or a receiver.
- the communication channel 199 may be implemented in a variety of ways (e.g., a satellite communication channel 130 using satellite dishes 132 and 134, a wireless communication channel 140 using towers 142 and 144 and/or local antennae 152 and 154, a wired communication channel 150, and/or a fiber-optic communication channel 160 using electrical to optical (E/O) interface 162 and optical to electrical (O/E) interface 164).
- error correction and channel coding schemes are often employed.
- these error correction and channel coding schemes involve the use of an encoder at the transmitter end of the communication channel 199 and a decoder at the receiver end of the communication channel 199 .
- ECC codes described can be employed within any such desired communication system (e.g., including those variations described with respect to FIG. 1 ), any information storage device (e.g., hard disk drives (HDDs), network information storage devices and/or servers, etc.) or any application in which information encoding and/or decoding is desired.
- video data encoding may generally be viewed as being performed at a transmitting end of the communication channel 199
- video data decoding may generally be viewed as being performed at a receiving end of the communication channel 199 .
- the communication device 110 may include only video data encoding capability
- the communication device 120 may include only video data decoding capability, or vice versa (e.g., in a uni-directional communication embodiment such as in accordance with a video broadcast embodiment).
- information bits 201 are provided to a transmitter 297 that is operable to perform encoding of these information bits 201 using an encoder and symbol mapper 220 (which may be viewed as being distinct functional blocks 222 and 224 , respectively) thereby generating a sequence of discrete-valued modulation symbols 203 that is provided to a transmit driver 230 that uses a DAC (Digital to Analog Converter) 232 to generate a continuous-time transmit signal 204 and a transmit filter 234 to generate a filtered, continuous-time transmit signal 205 that substantially comports with the communication channel 299 .
- continuous-time receive signal 206 is provided to an AFE (Analog Front End) 260 that includes a receive filter 262 (that generates a filtered, continuous-time receive signal 207 ) and an ADC (Analog to Digital Converter) 264 (that generates discrete-time receive signals 208 ).
- a metric generator 270 calculates metrics 209 (e.g., on either a symbol and/or bit basis) that are employed by a decoder 280 to make best estimates of the discrete-valued modulation symbols and information bits encoded therein 210 .
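The encode/channel/decode chain described above can be illustrated with the simplest possible ECC. The sketch below uses a 3x repetition code as a stand-in for the patent's unspecified error correction code, with a per-group majority vote playing the role of the metric generator and decoder; it is an illustration of the principle, not the patent's method.

```python
# Minimal sketch of the encode -> channel -> decode chain of FIG. 2,
# using a rate-1/3 repetition code as a stand-in ECC. The majority
# vote over each group of three bits plays the role of the metric
# generation and decoding stages.

def encode(bits):
    """Repeat each information bit three times (rate-1/3 code)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each group of three received bits."""
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

tx = encode([1, 0, 1])           # [1,1,1, 0,0,0, 1,1,1]
rx = list(tx)
rx[4] = 1                        # one bit flipped by the channel
assert decode(rx) == [1, 0, 1]   # corrected by majority decoding
```

Any single bit error within a group of three is corrected, which is precisely the property that motivates placing an encoder at one end of the channel 199 and a decoder at the other.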
- this diagram shows a processing module 280 a as including the encoder and symbol mapper 220 and all associated, corresponding components therein, and a processing module 280 b as including the metric generator 270 and the decoder 280 and all associated, corresponding components therein.
- processing modules 280 a and 280 b may be respective integrated circuits.
- other boundaries and groupings may alternatively be performed without departing from the scope and spirit of the invention.
- all components within the transmitter 297 may be included within a first processing module or integrated circuit, and all components within the receiver 298 may be included within a second processing module or integrated circuit.
- any other combination of components within each of the transmitter 297 and the receiver 298 may be made in other embodiments.
- such a communication system 200 may be employed for the communication of video data from one location, or subsystem, to another (e.g., from the transmitter 297 to the receiver 298 via the communication channel 299).
- FIG. 3 is a diagram illustrating an embodiment of a wireless communication system 300 .
- the wireless communication system 300 includes a plurality of base stations and/or access points 312 , 316 , a plurality of wireless communication devices 318 - 332 and a network hardware component 334 .
- the network hardware 334 which may be a router, switch, bridge, modem, system controller, etc., provides a wide area network connection 342 for the communication system 300 .
- the wireless communication devices 318 - 332 may be laptop host computers 318 and 326 , personal digital assistant hosts 320 and 330 , personal computer hosts 324 and 332 and/or cellular telephone hosts 322 and 328 .
- Wireless communication devices 322 , 323 , and 324 are located within an independent basic service set (IBSS) area and communicate directly (i.e., point to point). In this configuration, these devices 322 , 323 , and 324 may only communicate with each other. To communicate with other wireless communication devices within the system 300 or to communicate outside of the system 300 , the devices 322 , 323 , and/or 324 need to affiliate with one of the base stations or access points 312 or 316 .
- the base stations or access points 312 , 316 are located within basic service set (BSS) areas 311 and 313 , respectively, and are operably coupled to the network hardware 334 via local area network connections 336 , 338 . Such a connection provides the base station or access point 312 - 316 with connectivity to other devices within the system 300 and provides connectivity to other networks via the WAN connection 342 .
- each of the base stations or access points 312 - 316 has an associated antenna or antenna array.
- base station or access point 312 wirelessly communicates with wireless communication devices 318 and 320 while base station or access point 316 wirelessly communicates with wireless communication devices 326 - 332 .
- the wireless communication devices register with a particular base station or access point 312 , 316 to receive services from the communication system 300 .
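The reachability rule described above — IBSS devices communicate only with each other point-to-point, while devices registered with a base station or access point can reach any other registered device via the network hardware — can be sketched as a small predicate. The function and data-structure names below are hypothetical, introduced only for illustration.

```python
# Illustrative sketch of the IBSS vs. BSS reachability rule. Devices
# in the same IBSS talk directly; otherwise both ends must be
# registered with some base station/access point to be reachable via
# the network hardware. Names are hypothetical, not from the patent.

def can_communicate(a, b, ibss_group, registrations):
    """a, b: device ids; ibss_group: set of ids forming the IBSS;
    registrations: dict mapping device id -> access-point id."""
    if a in ibss_group and b in ibss_group:
        return True                      # direct point-to-point
    ap_a = registrations.get(a)
    ap_b = registrations.get(b)
    # Both registered somewhere: reachable via LAN/WAN connections.
    return ap_a is not None and ap_b is not None

ibss = {322, 323, 324}
regs = {318: 312, 320: 312, 326: 316}
assert can_communicate(322, 323, ibss, regs)       # inside the IBSS
assert not can_communicate(322, 318, ibss, regs)   # 322 not affiliated
assert can_communicate(318, 326, ibss, regs)       # via APs 312 and 316
```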
- base stations are used for cellular telephone systems (e.g., advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), and/or variations thereof, and like-type systems), while access points are used for in-home or in-building wireless networks (e.g., IEEE 802.11, Bluetooth, ZigBee, any other type of radio frequency based network protocol, and/or variations thereof).
- each wireless communication device includes a built-in radio and/or is coupled to a radio.
- FIG. 4 is a diagram illustrating an embodiment 400 of a wireless communication device that includes the host device 318 - 332 and an associated radio 460 .
- the radio 460 is a built-in component.
- the radio 460 may be built-in or an externally coupled component.
- the host device 318 - 332 includes a processing module 450 , memory 452 , a radio interface 454 , an input interface 458 , and an output interface 456 .
- the processing module 450 and memory 452 execute the corresponding instructions that are typically done by the host device. For example, for a cellular telephone host device, the processing module 450 performs the corresponding communication functions in accordance with a particular cellular telephone standard.
- the radio interface 454 allows data to be received from and sent to the radio 460 .
- the radio interface 454 provides the data to the processing module 450 for further processing and/or routing to the output interface 456 .
- the output interface 456 provides connectivity to an output display device such as a display, monitor, speakers, etc., such that the received data may be displayed.
- the radio interface 454 also provides data from the processing module 450 to the radio 460 .
- the processing module 450 may receive the outbound data from an input device such as a keyboard, keypad, microphone, etc., via the input interface 458 or generate the data itself.
- the processing module 450 may perform a corresponding host function on the data and/or route it to the radio 460 via the radio interface 454 .
- Radio 460 includes a host interface 462 , digital receiver processing module 464 , an analog-to-digital converter 466 , a high pass and low pass filter module 468 , an IF mixing down conversion stage 470 , a receiver filter 471 , a low noise amplifier 472 , a transmitter/receiver switch 473 , a local oscillation module 474 (which may be implemented, at least in part, using a voltage controlled oscillator (VCO)), memory 475 , a digital transmitter processing module 476 , a digital-to-analog converter 478 , a filtering/gain module 480 , an IF mixing up conversion stage 482 , a power amplifier 484 , a transmitter filter module 485 , a channel bandwidth adjust module 487 , and an antenna 486 .
- the antenna 486 may be a single antenna that is shared by the transmit and receive paths as regulated by the Tx/Rx switch 473 , or may include separate antennas for the transmit path and receive path.
- the antenna implementation will depend on the particular standard to which the wireless communication device is compliant.
- the digital receiver processing module 464 and the digital transmitter processing module 476 in combination with operational instructions stored in memory 475 , execute digital receiver functions and digital transmitter functions, respectively.
- the digital receiver functions include, but are not limited to, digital intermediate frequency to baseband conversion, demodulation, constellation demapping, decoding, and/or descrambling.
- the digital transmitter functions include, but are not limited to, scrambling, encoding, constellation mapping, modulation, and/or digital baseband to IF conversion.
- the digital receiver and transmitter processing modules 464 and 476 may be implemented using a shared processing device, individual processing devices, or a plurality of processing devices.
- Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions.
- the memory 475 may be a single memory device or a plurality of memory devices.
- Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information.
- the processing module 464 and/or 476 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
- the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
- the radio 460 receives outbound data 494 from the host device via the host interface 462 .
- the host interface 462 routes the outbound data 494 to the digital transmitter processing module 476 , which processes the outbound data 494 in accordance with a particular wireless communication standard (e.g., IEEE 802.11, Bluetooth, ZigBee, WiMAX (Worldwide Interoperability for Microwave Access), any other type of radio frequency based network protocol and/or variations thereof etc.) to produce outbound baseband signals 496 .
- the outbound baseband signals 496 will be digital base-band signals (e.g., have a zero IF) or digital low IF signals, where the low IF typically will be in the frequency range of one hundred kHz (kilo-Hertz) to a few MHz (Mega-Hertz).
- the digital-to-analog converter 478 converts the outbound baseband signals 496 from the digital domain to the analog domain.
- the filtering/gain module 480 filters and/or adjusts the gain of the analog signals prior to providing it to the IF mixing stage 482 .
- the IF mixing stage 482 converts the analog baseband or low IF signals into RF signals based on a transmitter local oscillation 483 provided by local oscillation module 474 .
- the power amplifier 484 amplifies the RF signals to produce outbound RF signals 498 , which are filtered by the transmitter filter module 485 .
- the antenna 486 transmits the outbound RF signals 498 to a targeted device such as a base station, an access point and/or another wireless communication device.
- the radio 460 also receives inbound RF signals 488 via the antenna 486 , which were transmitted by a base station, an access point, or another wireless communication device.
- the antenna 486 provides the inbound RF signals 488 to the receiver filter module 471 via the Tx/Rx switch 473 , where the Rx filter 471 bandpass filters the inbound RF signals 488 .
- the Rx filter 471 provides the filtered RF signals to low noise amplifier 472 , which amplifies the signals 488 to produce amplified inbound RF signals.
- the low noise amplifier 472 provides the amplified inbound RF signals to the IF mixing module 470 , which directly converts the amplified inbound RF signals into inbound low IF signals or baseband signals based on a receiver local oscillation 481 provided by local oscillation module 474 .
- the down conversion module 470 provides the inbound low IF signals or baseband signals to the filtering/gain module 468 .
- the high pass and low pass filter module 468 filters, based on settings provided by the channel bandwidth adjust module 487 , the inbound low IF signals or the inbound baseband signals to produce filtered inbound signals.
- the analog-to-digital converter 466 converts the filtered inbound signals from the analog domain to the digital domain to produce inbound baseband signals 490 , where the inbound baseband signals 490 will be digital base-band signals or digital low IF signals, where the low IF typically will be in the frequency range of one hundred kHz to a few MHz.
- the digital receiver processing module 464 based on settings provided by the channel bandwidth adjust module 487 , decodes, descrambles, demaps, and/or demodulates the inbound baseband signals 490 to recapture inbound data 492 in accordance with the particular wireless communication standard being implemented by radio 460 .
- the host interface 462 provides the recaptured inbound data 492 to the host device 318 - 332 via the radio interface 454 .
- the wireless communication device of the embodiment 400 of FIG. 4 may be implemented using one or more integrated circuits.
- the host device may be implemented on one integrated circuit
- the digital receiver processing module 464 , the digital transmitter processing module 476 and memory 475 may be implemented on a second integrated circuit
- the remaining components of the radio 460 , less the antenna 486 may be implemented on a third integrated circuit.
- the radio 460 may be implemented on a single integrated circuit.
- the processing module 450 of the host device and the digital receiver and transmitter processing modules 464 and 476 may be a common processing device implemented on a single integrated circuit.
- the memory 452 and memory 475 may be implemented on a single integrated circuit and/or on the same integrated circuit as the common processing modules of processing module 450 and the digital receiver and transmitter processing module 464 and 476 .
- any of the various embodiments of communication device that may be implemented within various communication systems can incorporate functionality to perform communication via more than one standard, protocol, or other predetermined means of communication.
- a single communication device designed in accordance with certain aspects of the invention, can include functionality to perform communication in accordance with a first protocol, a second protocol, and/or a third protocol, and so on.
- such protocols may include WiMAX (Worldwide Interoperability for Microwave Access), WLAN/WiFi (wireless local area network) 802.11 protocols such as 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, etc., the Bluetooth protocol, or any other predetermined means by which wireless communication may be effectuated.
- FIG. 5 is a diagram illustrating an alternative embodiment of a wireless communication device that includes the host device 318 - 332 and at least one associated radio 560 .
- the radio 560 is a built-in component.
- the radio 560 may be built-in or an externally coupled component.
- the components are typically housed in a single structure.
- the host device 318 - 332 includes a processing module 550 , memory 552 , radio interface 554 , input interface 558 and output interface 556 .
- the processing module 550 and memory 552 execute the corresponding instructions that are typically done by the host device. For example, for a cellular telephone host device, the processing module 550 performs the corresponding communication functions in accordance with a particular cellular telephone standard.
- the radio interface 554 allows data to be received from and sent to the radio 560 .
- the radio interface 554 provides the data to the processing module 550 for further processing and/or routing to the output interface 556 .
- the output interface 556 provides connectivity to an output display device such as a display, monitor, speakers, et cetera such that the received data may be displayed.
- the radio interface 554 also provides data from the processing module 550 to the radio 560 .
- the processing module 550 may receive the outbound data from an input device such as a keyboard, keypad, microphone, et cetera via the input interface 558 or generate the data itself.
- the processing module 550 may perform a corresponding host function on the data and/or route it to the radio 560 via the radio interface 554 .
- Radio 560 includes a host interface 562 , a baseband processing module 564 , memory 566 , a plurality of radio frequency (RF) transmitters 568 - 572 , a transmit/receive (T/R) module 574 , a plurality of antennae 582 - 586 , a plurality of RF receivers 576 - 580 , and a local oscillation module 5100 (which may be implemented, at least in part, using a VCO).
- the baseband processing module 564 in combination with operational instructions stored in memory 566 , execute digital receiver functions and digital transmitter functions, respectively.
- the digital receiver functions include, but are not limited to, digital intermediate frequency to baseband conversion, demodulation, constellation demapping, decoding, de-interleaving, fast Fourier transform, cyclic prefix removal, space and time decoding, and/or descrambling.
- the digital transmitter functions include, but are not limited to, scrambling, encoding, interleaving, constellation mapping, modulation, inverse fast Fourier transform, cyclic prefix addition, space and time encoding, and/or digital baseband to IF conversion.
- the baseband processing modules 564 may be implemented using one or more processing devices.
- Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions.
- the memory 566 may be a single memory device or a plurality of memory devices.
- Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information.
- the processing module 564 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
- the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
- the radio 560 receives outbound data 588 from the host device via the host interface 562 .
- the baseband processing module 564 receives the outbound data 588 and, based on a mode selection signal 5102 , produces one or more outbound symbol streams 590 .
- the mode selection signal 5102 will indicate a particular mode as are illustrated in the mode selection tables, which appear at the end of the detailed discussion. Such operation as described herein is exemplary with respect to at least one possible embodiment, and it is of course noted that the various aspects and principles, and their equivalents, of the invention may be extended to other embodiments without departing from the scope and spirit of the invention.
- the mode selection signal 5102 may indicate a frequency band of 2.4 GHz or 5 GHz, a channel bandwidth of 20 or 22 MHz (e.g., channels of 20 or 22 MHz width) and a maximum bit rate of 54 megabits-per-second.
- the channel bandwidth may extend up to 1.28 GHz or wider with supported maximum bit rates extending to 1 gigabit-per-second or greater.
- the mode selection signal will further indicate a particular rate ranging from 1 megabit-per-second to 54 megabits-per-second.
- the mode selection signal will indicate a particular type of modulation, which includes, but is not limited to, Barker Code Modulation, BPSK, QPSK, CCK, 16 QAM and/or 64 QAM.
- a code rate is supplied as well as number of coded bits per subcarrier (NBPSC), coded bits per OFDM symbol (NCBPS), and data bits per OFDM symbol (NDBPS).
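- To make the relationship among these parameters concrete, the following sketch derives NCBPS, NDBPS, and the resulting bit rate from the bits-per-subcarrier value and the code rate. It assumes 802.11a/g-style numerology (48 data subcarriers per 20 MHz channel, 4 microsecond OFDM symbol), which the text itself does not fix:

```python
from fractions import Fraction

def ofdm_rate(nbpsc, code_rate, data_subcarriers=48, symbol_us=4):
    """Derive NCBPS, NDBPS, and bit rate (Mbps) from the per-subcarrier
    modulation order (NBPSC) and the code rate; 802.11a/g-style
    numerology is an assumption here, not taken from the text."""
    ncbps = nbpsc * data_subcarriers          # coded bits per OFDM symbol
    ndbps = int(ncbps * Fraction(code_rate))  # data bits per OFDM symbol
    mbps = ndbps / symbol_us                  # one OFDM symbol every 4 us
    return ncbps, ndbps, mbps

# 64 QAM (6 coded bits per subcarrier) at code rate 3/4:
print(ofdm_rate(6, "3/4"))  # (288, 216, 54.0)
```

Under these assumptions, 64 QAM at code rate 3/4 reproduces the 54 megabits-per-second maximum bit rate cited above.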
- the mode selection signal may also indicate a particular channelization for the corresponding mode which for the information in table 1 is illustrated in table 2.
- table 2 includes a channel number and corresponding center frequency.
- the mode select signal may further indicate a power spectral density mask value which for table 1 is illustrated in table 3.
- the mode select signal may alternatively indicate rates within table 4, which has a 5 GHz frequency band, 20 MHz channel bandwidth and a maximum bit rate of 54 megabits-per-second. If this is the particular mode selected, the channelization is illustrated in table 5.
- the mode select signal 5102 may indicate a 2.4 GHz frequency band, 20 MHz channels and a maximum bit rate of 192 megabits-per-second as illustrated in table 6.
- in table 6, a number of antennae may be utilized to achieve the higher bit rates.
- the mode select would further indicate the number of antennae to be utilized.
- Table 7 illustrates the channelization for the set-up of table 6.
- Table 8 illustrates yet another mode option where the frequency band is 2.4 GHz, the channel bandwidth is 20 MHz and the maximum bit rate is 192 megabits-per-second.
- the corresponding table 8 includes various bit rates ranging from 12 megabits-per-second to 216 megabits-per-second utilizing 2-4 antennae and a spatial time encoding rate as indicated.
- Table 9 illustrates the channelization for table 8.
- the mode select signal 5102 may further indicate a particular operating mode as illustrated in table 10, which corresponds to a 5 GHz frequency band having 40 MHz channels and a maximum bit rate of 486 megabits-per-second. As shown in table 10, the bit rate may range from 13.5 megabits-per-second to 486 megabits-per-second utilizing 1-4 antennae and a corresponding spatial time code rate. Table 10 further illustrates a particular modulation scheme code rate and NBPSC values. Table 11 provides the power spectral density mask for table 10 and table 12 provides the channelization for table 10.
- based on the mode selection signal 5102, the baseband processing module 564 produces the one or more outbound symbol streams 590 from the outbound data 588. For example, if the mode selection signal 5102 indicates that a single transmit antenna is being utilized for the particular mode that has been selected, the baseband processing module 564 will produce a single outbound symbol stream 590. Alternatively, if the mode select signal indicates 2, 3 or 4 antennae, the baseband processing module 564 will produce 2, 3 or 4 outbound symbol streams 590, corresponding to the number of antennae, from the outbound data 588.
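- The mapping from outbound data to one, two, three, or four symbol streams can be sketched as follows; the round-robin split is a deliberate simplification (the real baseband module also scrambles, encodes, interleaves, and constellation-maps), and the function name is illustrative only:

```python
def to_symbol_streams(outbound_bits, num_antennae):
    """Split outbound data round-robin across one stream per enabled
    antenna, as the baseband module does when the mode selection signal
    indicates 1, 2, 3, or 4 transmit antennae. Simplified sketch: real
    processing also scrambles, encodes, interleaves, and maps symbols."""
    streams = [[] for _ in range(num_antennae)]
    for i, bit in enumerate(outbound_bits):
        streams[i % num_antennae].append(bit)
    return streams

print(to_symbol_streams([1, 0, 1, 1, 0, 0], 2))  # [[1, 1, 0], [0, 1, 0]]
```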
- a corresponding number of the RF transmitters 568-572 will be enabled to convert the outbound symbol streams 590 into outbound RF signals 592.
- the transmit/receive module 574 receives the outbound RF signals 592 and provides each outbound RF signal to a corresponding antenna 582-586.
- the transmit/receive module 574 receives one or more inbound RF signals via the antennae 582-586.
- the T/R module 574 provides the inbound RF signals 594 to one or more RF receivers 576-580.
- the RF receivers 576-580 convert the inbound RF signals 594 into a corresponding number of inbound symbol streams 596.
- the number of inbound symbol streams 596 will correspond to the particular mode in which the data was received (recall that the mode may be any one of the modes illustrated in tables 1-12).
- the baseband processing module 564 receives the inbound symbol streams 596 and converts them into inbound data 598, which is provided to the host device 318-332 via the host interface 562.
- the radio 560 includes a transmitter and a receiver.
- the transmitter may include a MAC module, a PLCP module, and a PMD module.
- the Medium Access Control (MAC) module which may be implemented with the processing module 564 , is operably coupled to convert a MAC Service Data Unit (MSDU) into a MAC Protocol Data Unit (MPDU) in accordance with a WLAN protocol.
- the Physical Layer Convergence Procedure (PLCP) Module which may be implemented in the processing module 564 , is operably coupled to convert the MPDU into a PLCP Protocol Data Unit (PPDU) in accordance with the WLAN protocol.
- the Physical Medium Dependent (PMD) module is operably coupled to convert the PPDU into a plurality of radio frequency (RF) signals in accordance with one of a plurality of operating modes of the WLAN protocol, wherein the plurality of operating modes includes multiple input and multiple output combinations.
- An embodiment of the Physical Medium Dependent (PMD) module includes an error protection module, a demultiplexing module, and a plurality of direct conversion modules.
- the error protection module which may be implemented in the processing module 564 , is operably coupled to restructure a PPDU (PLCP (Physical Layer Convergence Procedure) Protocol Data Unit) to reduce transmission errors producing error protected data.
- the demultiplexing module is operably coupled to divide the error protected data into a plurality of error protected data streams
- the plurality of direct conversion modules is operably coupled to convert the plurality of error protected data streams into a plurality of radio frequency (RF) signals.
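- The PMD transmit chain described above (error protection, then demultiplexing, then per-stream conversion) can be sketched as below; the repetition code and the labeled tuples standing in for RF signals are illustrative simplifications, not the actual error protection or direct conversion:

```python
def pmd_transmit(ppdu_bits, num_streams):
    """Sketch of the PMD module's transmit path: error-protect the PPDU,
    demultiplex into per-chain streams, then hand each stream to its own
    RF chain. The repetition code and labeled tuples are stand-ins."""
    # Error protection module: a toy repetition code standing in for the
    # real restructuring that reduces transmission errors.
    protected = [b for bit in ppdu_bits for b in (bit, bit)]
    # Demultiplexing module: one error-protected stream per RF chain.
    streams = [protected[i::num_streams] for i in range(num_streams)]
    # Direct conversion modules: each stream feeds its own RF signal
    # (modeled here as a labeled tuple rather than an actual waveform).
    return [("rf_chain_%d" % i, s) for i, s in enumerate(streams)]

print(pmd_transmit([1, 0], 2))  # [('rf_chain_0', [1, 0]), ('rf_chain_1', [1, 0])]
```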
- the wireless communication device of this diagram may be implemented using one or more integrated circuits.
- the host device may be implemented on one integrated circuit
- the baseband processing module 564 and memory 566 may be implemented on a second integrated circuit
- the remaining components of the radio 560, less the antennae 582-586, may be implemented on a third integrated circuit.
- the radio 560 may be implemented on a single integrated circuit.
- the processing module 550 of the host device and the baseband processing module 564 may be a common processing device implemented on a single integrated circuit.
- the memory 552 and memory 566 may be implemented on a single integrated circuit and/or on the same integrated circuit as the common processing modules of processing module 550 and the baseband processing module 564 .
- a wireless communication device may be constructed and implemented.
- more than one radio (e.g., multiple instantiations of the radio 460, the radio 560, a combination thereof, or even another implementation of a radio) may be included within a single wireless communication device to effectuate simultaneous transmission of two or more signals.
- multiple radios within a wireless communication device can effectuate simultaneous reception of two or more signals, or transmission of one or more signals at the same time as reception of one or more other signals (e.g., simultaneous transmission/reception).
- wireless communication devices may generally be referred to as WDEVs, DEVs, TXs, and/or RXs. It is noted that such wireless communication devices may be wireless stations (STAs), access points (APs), or any other type of wireless communication device without departing from the scope and spirit of the invention.
- wireless communication devices that are APs may be referred to as transmitting or transmitter wireless communication devices
- wireless communication devices that are STAs may be referred to as receiving or receiver wireless communication devices in certain contexts.
- a transmitting wireless communication device (e.g., an AP, or a STA operating as an 'AP' with respect to other STAs) initiates communications and/or operates as a network controller type of wireless communication device with respect to a number of other, receiving wireless communication devices (e.g., STAs), and the receiving wireless communication devices respond to and cooperate with the transmitting wireless communication device in supporting such communications.
- the terms transmitting wireless communication device(s) and receiving wireless communication device(s) may be employed to differentiate the operations as performed by such different wireless communication devices within a communication system
- all such wireless communication devices within such a communication system, whether of the transmitting or receiving type, may of course support bi-directional communications to and from other wireless communication devices within the communication system.
- FIG. 6 is a diagram illustrating an embodiment 600 of a navigation system with a display or projection system in a vehicular context.
- a novel approach and architecture is presented herein by which information is provided to a driver of a vehicle within the driver's field of vision. For example, instead of providing information to a driver in a way that the driver must take his or her eyes off of the road, information related to the driving experience is provided within the driver's field of vision (e.g., so the driver need not take his or her eyes off of the road). In addition, by providing such information within the field of vision, the navigation system becomes much more effectual.
- a navigation system operating in conjunction with one or more projectors, provides information to help ensure that the driver is able to keep his or her head up while driving.
- a navigation system is implemented using overlays or directions that are included within the actual field of vision. That is, a driver or user is provided with overlays or directions within the actual field of vision rather than on a display such as may be included within the dashboard of a vehicle.
- various feature labeling may also be included with respect to elements that are included within the field of vision. For example, a label for a building or other feature within the field of vision may be appropriately placed thereon. Street names may also be overlaid on the actual streets within the field of vision.
- any of a variety of types of information related to a driving experience may similarly be displayed within the field of vision.
- the actual vehicle's location (e.g., such as in accordance with a global positioning system (GPS) location)
- the vehicle's relative location (e.g., such as relative to a point of destination to which the vehicle is heading)
- the vehicle's speed (e.g., velocity)
- the vehicle's directionality (e.g., North, South, Southwest, etc.)
- time information (e.g., a clock)
- chronometer information (e.g., a timer)
- temperature information (e.g., external to the vehicle and/or internal of the vehicle)
- any other additional information which may be used to enhance a driver's experience may be included within the field of vision.
- a user is given wide latitude to select which particular information is to be provided within the field of vision.
- via a user interface, display, and/or control module of the vehicle, a user may select any group or combination of information to be displayed within the field of vision. It is noted that such user interface, display, and/or control module may be implemented in any of a variety of ways, including by using an already implemented navigation or GPS system of the vehicle.
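- A minimal sketch of such user selection follows; the field names are hypothetical, since the text leaves the exact information set open-ended:

```python
# Hypothetical field names; the supported set is left open-ended above.
AVAILABLE_FIELDS = {"gps_location", "relative_location", "speed",
                    "heading", "clock", "timer", "temperature"}

def select_hud_fields(requested):
    """Validate a user's chosen combination of in-vision information
    against the supported set, per the wide latitude described above."""
    unknown = set(requested) - AVAILABLE_FIELDS
    if unknown:
        raise ValueError("unsupported field(s): %s" % sorted(unknown))
    return set(requested)

print(sorted(select_hud_fields(["speed", "heading", "clock"])))
# ['clock', 'heading', 'speed']
```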
- a user may select or identify feature labels to be included within the field of vision.
- those labels are only included within the field of vision when the vehicle is within proximity of those features. For example, when the vehicle is near a particular street, such that the user should be able to see that street, the feature label would then be overlaid on the street within the actual field of vision. Analogously, when the vehicle is within a sufficient proximity of a particular feature such as a building, the label for that particular building would then be displayed within the field of vision.
- the user may also select the proximity to which various features will cause such labeling to appear.
- a user may select a proximity of X number of distance measure (e.g., where distance measure may be in meters, feet, etc. and/or any other measuring unit) to a particular feature as being the triggering event that will cause such labeling to appear within the field of vision.
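- The proximity-triggered labeling described above can be sketched as follows, assuming planar coordinates and a Euclidean distance check (a real system would use GPS coordinates and geodesic distance):

```python
import math

def labels_in_range(vehicle_pos, features, trigger_distance):
    """Return the feature labels whose distance from the vehicle falls
    within the user-selected trigger proximity. Planar Euclidean
    distance is an illustrative simplification."""
    vx, vy = vehicle_pos
    return [name for name, (fx, fy) in features.items()
            if math.hypot(fx - vx, fy - vy) <= trigger_distance]

# Hypothetical features and coordinates, in arbitrary distance units:
features = {"X Street": (30.0, 40.0), "Stadium": (300.0, 400.0)}
print(labels_in_range((0.0, 0.0), features, 100.0))  # ['X Street']
```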
- the windshields of the vehicle may provide background on which such information is projected.
- various imaging devices such as projectors may be implemented within a vehicle and can use various translucent surfaces around the driver as the backdrop(s) or background on which images are projected for visual consumption by the driver in a vehicular context. That is to say, the windshields and windows of the vehicle may themselves serve as the background on which such information may be projected.
- the glass or glasslike materials employed for windshields may serve as the actual background for projection of such labeling and/or indicia for consumption by the driver and/or passengers.
- labeling and/or indicia as described with respect to this embodiment, as well as with respect to other embodiments and/or diagrams, it is noted that such terminology of labeling and/or indicia may be viewed as generally referring to information that is provided to enhance the user experience.
- labeling and/or indicia may correspond to any of a variety of formats, including arrows, markers, labels, alphanumeric characters, etc. and/or any combination thereof, etc.
- such information may be provided in accordance with three-dimensional (3-D) projection such that the various labels etc. will appear as being extended out and into the field of vision.
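- The perspective placement of such 3-D labels can be sketched with a simple pinhole projection, assuming label anchors expressed in the driver's frame of reference (this is an illustrative model, not a method specified in the text):

```python
def project_to_windshield(point, focal=1.0):
    """Pinhole projection of a world-space label anchor (x, y, z in the
    driver's frame of reference, z measured ahead of the eye) onto a
    projection plane at distance `focal`. Farther anchors land nearer
    the center, giving labels their perspective depth in the field of
    vision. Illustrative assumption, not the specified method."""
    x, y, z = point
    if z <= 0:
        raise ValueError("anchor must lie ahead of the viewer")
    return (focal * x / z, focal * y / z)

# A label anchored 50 m ahead and 5 m to the right of the driver:
print(project_to_windshield((5.0, 0.0, 50.0)))  # (0.1, 0.0)
```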
- overlays or directions are provided by arrows indicating the path by which the driver should proceed.
- one arrow indicates to proceed forward until reaching the intersection of X Street and Y Avenue.
- a second arrow indicates that the driver should turn right on Y Avenue.
- labels or features within the field of vision e.g., labels of landmarks, buildings, stadiums, airports, parks, historical features, and/or any other features that a driver may come across, etc.
- additional information is provided to help a driver navigate in an attempt to reach his or her destination.
- Such labeling of these features may operate as guideposts or reference points that will help the driver navigate along the appropriate roads to reach his or her destination.
- labeling and/or indicia may be viewed as markers that reach out into the field of vision at perspective depths within the field of vision to highlight certain pathways, features, destinations, etc. for a driver.
- a user also has a wide degree of latitude in selecting when such labeling and/or indicia are to be provided to the user. For example, a user may select that desired information be provided at all times.
- such labeling and/or indicia may be provided when the vehicle is within certain proximity of the destination (e.g., which may be selected by the user) or only when the vehicle is to make a particular road change, lane change, turn, etc. along the path which will lead to the destination.
- labeling and/or indicia may be provided when selected by the user (e.g., when the user enables such labeling and/or indicia to be provided).
- such labeling and/or indicia may be provided within the field of vision as a function of the complexity of a particular segment or portion of the path that leads to the endpoint destination. For example, while driving along a country road in a predominantly rural environment, there may be very little need to provide very specific indicia related to the drive. As the reader will understand, while driving in a very rural environment in which there are very few roads, very few interchanges, etc., very little indication and information may need to be provided to the driver. However, while driving in a very congested, complex, urban, etc. environment, there may be a multiplicity of options by which a driver may take the wrong path.
- such labeling and/or indicia is provided within the actual field of vision and is provided with respect to the actual physical environment. That is to say, such labeling and/or indicia is not provided within an artificially reconstructed environment, but such labeling and/or indicia are within the actual field of vision of the driver.
- labeling and/or indicia is provided and tailored particularly for the frame of reference of the driver.
- labeling and/or indicia could alternatively be provided and tailored particularly with respect to the reference or any other location within the vehicle (e.g., with respect to the passenger seat next to the driver, a seat in the back of the vehicle, etc.).
- FIG. 7 is a diagram illustrating an embodiment 700 of a navigation system operative to include overlays/directions within an in-dash and/or in-steering wheel display.
- labeling and/or indicia is instead included within actual photographic image and/or video information that is provided via a display within the vehicle.
- actual photographic image and/or video may be provided via a display within the vehicle (e.g., such as via a small screen implemented within the dashboard of the vehicle, on the steering wheel of the vehicle, etc.), and such labeling and/or indicia is included within that actual photographic image and/or video information.
- such labeling and/or indicia will correspond particularly to the actual physical environment as represented by the actual photographic image and/or video information.
- any particular information which may be operative to enhance the experience of the driver or any occupant of the vehicle may be provided within the actual photographic image and/or video information.
- photographic image and/or video information may be acquired in real-time.
- photographic image and/or video information may be retrieved beforehand, such as from a database including such photographic image and/or video information, maps, etc., and such information may be stored within any desired memory device within the vehicle and/or the navigation system thereof.
- such information is retrieved from one or more networks (such as one or more wireless networks) whenever the vehicle is stationary/not being driven and has access to such one or more networks. This way, very recent/up-to-date (or at least relatively recent/up-to-date) information may be employed even if not retrieved exactly in real time.
- FIG. 8 is a diagram illustrating an embodiment 800 of a vehicle including at least one projector for effectuating a display or projection of information.
- This diagram illustrates how one or more projectors may be implemented within and around the vehicle for effectuating the display of labeling and/or indicia within the field of vision of the driver or a passenger of the vehicle.
- only a single projector is employed for displaying labeling and/or indicia for one or more people within the vehicle.
- two or more projectors may be employed to effectuate three-dimensional or visual stereo labeling and/or indicia within the field of vision.
- This embodiment particularly shows how various projectors may be implemented to use the windshields or windows of a vehicle as the background on which various labeling and/or indicia are projected for visual consumption by the driver or passengers of the vehicle.
- This diagram represents a top view of a generic vehicle, and the projectors may be implemented within the vehicle to ensure their protection from the elements, etc.
- various embodiments may implement the one or more projectors in entirely different positions within the vehicle. That is to say, one or more projectors may be appropriately placed to provide a better user experience for someone located within the driver position of the vehicle in one embodiment. In another embodiment, the one or more projectors may be appropriately placed to provide a better user experience for a passenger or non-driver of the vehicle.
- labeling and/or indicia may be employed to provide an enhanced experience for passengers of the vehicle. That is to say, in addition to merely providing directions and information for use by the driver of the vehicle to assist that driver in reaching the appropriate endpoint destination, labeling and/or indicia may be provided for consumption by the passengers of the vehicle.
- One embodiment may include providing information for consumption by participants of a tour who happen to be riding along in the vehicle. For example, within a relatively large vehicle such as a van or bus, there may be a very large number of windows by which various passengers of the vehicle may have respective fields of vision. Different and respective labeling and/or indicia may be provided for selective consumption by the various passengers in the vehicle.
- a first passenger riding near the back of the vehicle may have labeling and/or indicia for a particular feature within the physical environment projected into his respective field of vision that is different than the labeling and/or indicia for that very same particular feature within the physical environment from the perspective of a second passenger riding in the vehicle.
- labeling and/or indicia may be very useful in tour guide industries such that each of the respective participants of the tour can be provided with individual and selective information regarding the physical environment through which they are traveling for greater understanding and enjoyment of that physical environment.
- FIG. 9A is a diagram illustrating an embodiment 900 of a vehicle including at least one antenna for supporting wireless communications with at least one wireless communication system and/or network.
- one or more antennae may be included within a vehicle to help effectuate wireless communications with one or more wireless communication systems and/or networks.
- the appropriate and respective means, functionality, circuitry, etc. for supporting wireless communications with any desired wireless communication systems and/or networks may be included within the vehicle (e.g., satellite, wireless local area network (WLAN) such as WiFi, wide area network (WAN) such as WiMAX, etc.).
- a navigation system of the vehicle may receive information from any wireless communication system and/or network with which the vehicle may communicate.
- the navigation system of the vehicle may also access other communication systems and/or networks through any intervening wireless communication system and/or network with which the navigation system may communicate.
- the navigation system may be able to communicate wirelessly with an access point (AP) of a wireless local area network, and via that access point connectivity, the navigation system may be able to access the Internet to access or retrieve any information there from.
- various databases available via the Internet would be accessible by the navigation system of the vehicle.
- real-time information may be provided to the driver via the navigation system.
- real-time information regarding traffic flow, accidents, road closures, detours, etc. may be provided to the driver.
- any locally stored database corresponding to the navigation system on the vehicle may be adapted or modified as a function of new information received via such wireless communications.
- certain other embodiments may operate by which such photographic and/or video information may be updated, corrected, enhanced, etc. by information that is received via such wireless communications.
- such updating of any information locally stored for use by the navigation system may be performed without requiring specific interaction by the driver or user of the navigation system.
- selective updating of the photographic and/or video information employed by the navigation system may need only be performed when the vehicle is within a certain location. That is to say, locally stored photographic and/or video information corresponding to the physical environment of a given locale could be updated when the vehicle enters into or near that given locale.
- the updating of any locally stored information of the navigation system could be performed as a function of the location of the vehicle. As the vehicle moves around, any photographic and/or video information pertaining to its current location could be updated in real time.
- such updating of locally stored information of the navigation system may be triggered based on a comparison of such locally stored information to that which may be retrieved from a database with which the navigation system may communicate. For example, as the vehicle moves to a location for which the navigation system has some locally stored information, the navigation system would then automatically compare its locally stored information to other information corresponding to that same location which may be retrieved from some remote database (e.g., from a remote database accessible via the Internet).
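- A minimal sketch of such a comparison-triggered update follows, assuming each locale's imagery carries a version stamp that can be compared against the remote database's copy (the version stamps and dict-based stores are illustrative assumptions):

```python
def refresh_local_tiles(local, remote, current_locale):
    """When the vehicle enters a locale, compare the locally stored
    imagery's version stamp against the remote database's copy and
    replace the local copy only if the remote one is newer. Dicts of
    locale -> (version, data) stand in for real stores."""
    if current_locale not in remote:
        return local  # nothing to compare against
    remote_version, remote_data = remote[current_locale]
    local_version = local.get(current_locale, (0, None))[0]
    if remote_version > local_version:
        local = dict(local)  # keep the original store untouched
        local[current_locale] = (remote_version, remote_data)
    return local

local = {"downtown": (3, "imagery-v3")}
remote = {"downtown": (5, "imagery-v5")}
print(refresh_local_tiles(local, remote, "downtown"))
# {'downtown': (5, 'imagery-v5')}
```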
- the implementation of such image processing comparison functionality, circuitry, etc. may not be desirable in all embodiments. For example, in an embodiment in which the provisioning of such functionality, circuitry, etc. may be undesirable (e.g., for any number of reasons, including cost, complexity, space, etc.), information that is received via one or more wireless networks may simply be provided for consumption by the user “as is” (such as without undergoing any comparison operations).
- FIG. 9B is a diagram illustrating an embodiment 901 of a vehicle including at least one camera for supporting acquisition of image and/or video information.
- one or more cameras may be included within the vehicle to perform photo and/or video capture of the physical environment around the vehicle.
- By using one or more cameras that are physically implemented around the vehicle, very accurate and completely up-to-date information may be employed by the navigation system. For example, as the vehicle is approaching a particular intersection in which any given route adjustment may need to be made, photographic and/or video information may be acquired in real time by the one or more cameras implemented around the vehicle. Then, this currently acquired information may be used by the navigation system such that labeling and/or indicia may be included therein.
- various operational parameters and constraints by which these one or more cameras operate may be adaptive and/or configurable. For example, via a user interface of the vehicle's navigation system, a user may select the various operational parameters and constraints by which photographic and/or video information is acquired by the one or more cameras.
- the navigation system may include adaptive capability such that the one or more cameras perform photographic and/or video capture and compare that acquired information to that which is stored within memory of the navigation system or which may be retrieved via one or more communication networks with which the navigation system may communicate.
- the navigation system may update its respective information using that information which has been acquired in real time by the one or more cameras. It is also noted that such updating of information may be performed when a vehicle is not being driven (e.g., such as when it is parked).
- a navigation system, operating in conjunction with at least one camera implemented with respect to the vehicle, may provide for acquisition of imagery of the physical environment through which the vehicle passes, contributing significantly to the accuracy of the photographic and/or video information that is employed to provide directions and instructions to a user of the navigation system.
- the driver may be provided with much more accurate information, corresponding specifically to the environment in which the vehicle presently is, thereby minimizing any possibility that the driver of the vehicle will misinterpret instructions provided via the navigation system.
- the one or more cameras that may be implemented with respect to a vehicle may be implemented in any desired location.
- the one or more cameras will be appropriately implemented to provide for photographic and/or video information acquisition in the direction of the forward field of vision with respect to the perspective of the driver.
- additional cameras may be implemented to provide acquisition of such information in any directional view extending from the vehicle.
- multiple cameras may be implemented around the perimeter or periphery of the bus to allow for acquisition of such information from any perspective of any particular occupant of such a vehicle.
- any desired capability of the one or more cameras may be selected for and implemented within a given embodiment.
- the one or more cameras are three-dimensional (3-D) capable.
- the cameras may have any one or combination of a particular or preferred resolution, focus or auto focus capabilities, image acquisition size, widescreen capability, panoramic capability, etc.
- the operational capabilities and quality of the navigation system may be specifically tailored.
- In such an embodiment, which includes one or more cameras operative to perform photographic and/or image acquisition, various types of buffering may be performed. For example, such acquisition may be performed in a manner that only a last or most recent period of time of photographic and/or image information is kept (e.g., depending upon the amount of memory employed within such a system for buffering of photographic and/or image information).
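- Such keep-only-the-most-recent buffering can be sketched with a bounded ring buffer; bounding by frame count rather than by elapsed time is a simplifying assumption:

```python
from collections import deque

class FrameBuffer:
    """Keep only the most recent window of captured frames, discarding
    older imagery as memory allows. The window is expressed in frames
    here as a simplification; a real system would bound it by elapsed
    time and by the memory employed for buffering."""
    def __init__(self, max_frames):
        self.frames = deque(maxlen=max_frames)

    def capture(self, frame):
        self.frames.append(frame)  # the oldest frame drops automatically

buf = FrameBuffer(max_frames=3)
for f in ["f1", "f2", "f3", "f4", "f5"]:
    buf.capture(f)
print(list(buf.frames))  # ['f3', 'f4', 'f5']
```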
- FIG. 10 is a diagram illustrating an embodiment 1000 of a navigation system operative to include overlays/directions within a wireless communication device.
- the navigation system of this particular diagram is implemented within a wireless communication device (e.g., a wireless terminal 1010), such as may be a portable wireless communication device as may be employed by a user.
- Examples of such wireless communication devices or wireless terminals are many, including cell phones, personal digital assistants, laptop computers, and/or, generally speaking, any wireless communication device that may be portable.
- Such a wireless communication device includes a user interface that is operative to provide information to a user and may also be operative to receive input from the user.
- the user interface includes a display by which photographic image and/or video information may be displayed for consumption by the user.
- labeling and/or indicia may be included within such photographic image and/or video information that is provided via the display for consumption by the user in accordance with navigational directions and instructions.
- any desired additional information which may enhance the user experience may be included or overlaid with respect to the actual photographic and/or video information.
- Some examples of such user experience enhancing information may include direction or bearing (such as North, South, etc.) that the wireless communication device or user is moving or facing, the actual location of the wireless communication device or user, arrows indicating certain routes and/or route adjustments that will get the user to the appropriate endpoint destination.
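As one hypothetical illustration of the direction/bearing overlay information mentioned above, a heading in degrees can be mapped to a compass label; the function name and the 45-degree sectoring are illustrative assumptions:

```python
# Hypothetical helper mapping a heading in degrees to the compass label
# (North, South, etc.) that might accompany an overlay.
def bearing_label(heading_degrees):
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    # Each sector spans 45 degrees, centered on its label; the %360
    # normalizes headings outside [0, 360).
    index = int((heading_degrees % 360 + 22.5) // 45) % 8
    return labels[index]
```

For example, a heading of 95 degrees falls in the East sector and would be labeled "E".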
- FIG. 11A is a diagram illustrating an embodiment 1100 of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context.
- headset and/or eyeglass-based system may include functionality in which labeling and/or indicia may be projected into the user's field of vision. Such labeling and/or indicia may be provided in accordance with the operation of the navigation system.
- An appropriately implemented headset and/or eyeglasses having such functionality to include labeling and/or indicia within a user's field of vision will include the appropriate circuitry to effectuate such labeling and/or indicia.
- one or more projectors are implemented within the headset and/or eyeglass-based system such that the labeling and/or indicia may be projected onto the actual transparent or translucent surfaces through which a user views the respective field of vision.
- the labeling and/or indicia, being projected appropriately onto the transparent or translucent surface of the headset and/or eyeglass-based system, will effectuate such labeling and/or indicia as being projected into or overlaid upon the field of vision.
- labeling and/or indicia as may be provided for consumption by a user need not necessarily be provided within the visible spectrum.
- certain labeling and/or indicia may be provided within the non-visible spectrum.
- eyeglasses, headgear, etc. may enable a user to perceive infrared provided labeling and/or indicia.
- such infrared provided labeling and/or indicia could be provided for consumption by a user that would not be perceptible by others (e.g., those not employing such appropriately implemented eyeglasses, headgear, etc.).
- FIG. 11B is a diagram illustrating an embodiment 1101 of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context in conjunction with a wireless communication device.
- a headset and/or eyeglass-based system may include wireless communication capability.
- the component placed upon a user's head or in front of the user's eyes may include a wireless terminal therein.
- Such a wireless terminal or wireless communication device may be implemented to operate in accordance with any desired wireless communication protocol, standard, and/or recommended practice, etc. as may be desired.
- such a wireless terminal as included within a headset and/or eyeglass-based system may operate cooperatively with a handheld or portable wireless terminal of the user.
- the headset and/or eyeglass-based system and the portable wireless terminal may generally be viewed as a combined system.
- the handheld or portable wireless terminal may include significantly more processing capability, hardware, memory, etc. with respect to that which may be included within the component placed upon the user's head or in front of the user's eyes.
- a significantly reduced amount of processing capability, hardware, memory, etc. may be implemented within the component placed upon the user's head or in front of the user's eyes.
- FIG. 12 is a diagram illustrating an embodiment 1200 of various types of navigation systems, implemented to support wireless communications, entering/exiting various wireless communication systems and/or networks.
- certain such devices and/or systems may include wireless communication capability.
- such devices and/or systems may enter and exit various wireless network service areas.
- a vehicle including wireless communication capability may move among various wireless network service areas. Specifically, with respect to a time 1 and a time 2 , the vehicle is shown as being within two respective wireless network service areas. As the vehicle enters and exits various wireless networks, it may acquire information respectively from those wireless networks.
- such a user may also enter and exit such wireless network service areas. However, it is likely that such a user while moving on foot would move at a rate that is less than that of the vehicle. As such, the respective times at which such a user may communicate with the respective wireless network service areas may differ from that of the vehicle.
- wireless network service areas may include information specific to the locale in which the wireless network operates.
- One embodiment may correspond to the locale of a traffic intersection or road near which the wireless network operates.
- the wireless network may include at least one device therein (e.g., access point (AP)) that includes capability for monitoring traffic flow.
- an access point may include photographic and/or video acquisition capability (e.g., one or more cameras) such that information related to traffic flow is acquired and monitored.
- the navigation system could then retrieve information related to such traffic flow.
- information may be current information, historical information such as trends of traffic flow as a function of time of day (e.g., rush-hour, midday, etc.), etc.
- as the navigation system having such wireless communication capability effectuates communication with such a wireless network service area, such information may be retrieved and utilized within and in conjunction with the navigation system.
- such acquired information may be provided to a user as a function of the proximity of the navigation system or user to the endpoint destination. For example, when the user is within a particular distance of the endpoint destination (e.g., such as determined by a threshold or proximity which may be set by the user), only then is such information provided to the user.
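The proximity-threshold gating described above might be sketched as follows, assuming a great-circle (haversine) distance and a user-set threshold in kilometers; the names and units are illustrative:

```python
import math

# Sketch: provide destination-related information only once the user is
# within a user-set proximity threshold of the endpoint destination.
def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth (radius ~6371 km).
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def should_show(user, destination, threshold_km):
    # True only when the user is within the configured proximity.
    return haversine_km(*user, *destination) <= threshold_km
```

With a 50 km threshold, a user roughly 111 km away (one degree of latitude) would not yet receive the information.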
- such acquired information may be provided to a user as a function of the complexity of the current environment in which the user is. For example, while driving a vehicle, when approaching a particular interchange having a relatively high complexity, such capability would be automatically turned on.
- the settings and constraints by which such decision-making may be made could be user-defined, predetermined, etc.
- such acquired information may be provided as a function of the congestion or traffic in which the vehicle may currently be in or along the route that the vehicle is proceeding.
- one or more sensors may be included within the vehicle to detect the existence of one or more additional vehicles nearby (e.g., collision detection capability may be employed to detect proximity of one or more other vehicles).
- Such information acquired by sensors may be accompanied with monitoring of the velocity of the vehicle; for example, when the velocity of the vehicle is below a particular threshold and one or more other vehicles are detected within a particular proximity of the vehicle (which may indicate relatively congested traffic or a traffic jam), such information may be provided for consumption by the user.
- one or more cameras may be employed to acquire photographic and/or video information that may be used to determine congestion or traffic information, proximity of one or more other vehicles, etc.
- real-time acquisition of information from one or more wireless networks with which the navigation system may communicate may be used to provide information related to congestion or traffic in which the vehicle may currently be in or along the route that the vehicle is proceeding.
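A minimal sketch of the congestion heuristic described above (velocity below a threshold while one or more other vehicles are sensed nearby); the specific threshold values are illustrative assumptions, not taken from this disclosure:

```python
# Congestion heuristic: low velocity plus nearby vehicles detected by
# sensors (e.g., collision-detection capability) suggests a traffic jam.
def is_congested(velocity_kmh, nearby_distances_m,
                 velocity_threshold_kmh=20.0, proximity_m=10.0):
    # At least one other vehicle within the proximity window?
    vehicles_close = any(d <= proximity_m for d in nearby_distances_m)
    return velocity_kmh < velocity_threshold_kmh and vehicles_close
```

A vehicle creeping at 5 km/h with another vehicle 4 m away would be flagged; the same speed on an empty road would not.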
- FIG. 13 is a diagram illustrating an embodiment 1300 of at least two projectors operative in accordance with a lenticular surface for effectuating three dimensional (3-D) display.
- a lenticular surface may be viewed as an array or series of cylindrical molded lenses.
- Such a lenticular surface may be included within any of the transparent or translucent surfaces (e.g., any windshield or window of a vehicle, headset and/or eyeglasses, a transparent or translucent portion of a headset, etc.).
- Such a lenticular surface may be constructed such that it is generally transparent and non-perceptible to a user. That is to say, the cylindrical lenses of such a lenticular surface will not deleteriously affect the perceived quality or vision of a user when looking through such a lenticular surface.
- the cylindrical lenses may be visually non-perceptible to a user, yet tactilely felt by a user (e.g., when a user may slide a finger across the surface).
- the particular features of such a lenticular surface can provide one means by which three-dimensional imaging may be effectuated.
- in a navigation system implemented within a vehicle such that labeling and/or indicia is projected into a field of vision of the driver or any other passenger of the vehicle, a lenticular surface may be included within one or more of the transparent or translucent surfaces of the vehicle (e.g., any windshield or window of the vehicle), such that three-dimensional display of such labeling and/or indicia may be made.
- the use of such a lenticular surface will not obstruct the view or vision capabilities of a driver or passenger of the vehicle.
- the effect of three-dimensional projection of labeling and/or indicia may be made with respect to a field of vision of a driver or passenger of the vehicle.
- two or more projectors may be employed in different locations for effectuating three-dimensional imaging.
- a first projector could be implemented on a left-hand side and a second projector could be implemented on a right-hand side.
- Each respective projector is then operative respectively to project an image and/or video information towards the lenticular surface.
- Each of the respective cylindrical lenses of the lenticular surface then appropriately processes the respective received image and/or video information such that a driver or passenger of the vehicle will perceive a three-dimensional effect. That is to say, the cylindrical lenses and their curvature shape operate to refract light rays received via the at least two respective directions corresponding to the at least two respective projectors.
- the curvature shape of the respective cylindrical lenses along the lenticular surface effectively refracts the received light rays as a function of their angle of incidence at the surface.
- a lenticular surface may be designed to have a rear focal plane coincident with the backplane of the transparent or translucent surface.
- the rear focal plane of the lenticular surface may be designed to coincide with the backplane of the windshield (e.g., the surface of the windshield which is exposed to the outside, which is the backplane of the translucent or transparent surface through which a driver or passenger looks).
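The rear-focal-plane design constraint above can be sketched numerically. For a single refracting cylindrical surface of radius R on a material of refractive index n, parallel rays focus inside the medium at a distance n·R/(n−1) behind the surface, so choosing the sheet thickness equal to that distance places the rear focal plane at the backplane. This is a standard single-surface approximation used here for illustration, not a formula stated in this disclosure:

```python
# Illustrative lenticular-sheet design: thickness placing the focal
# plane of each cylindrical lenslet at the rear (backplane) surface.
def lenticular_thickness(radius_mm, n):
    # Focal distance inside a medium of index n behind a single
    # refracting surface of radius R: n * R / (n - 1).
    return n * radius_mm / (n - 1.0)
```

For a 1 mm lenslet radius in a typical n = 1.5 material, the required thickness works out to 3 mm.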
- such a lenticular surface may be implemented along any desired surface through which a user may look.
- such a lenticular surface may be implemented within eyeglasses, a headset type display, etc.
- three-dimensional imaging may be effectuated using such transparent or translucent material.
- FIG. 14 is a diagram illustrating an embodiment 1400 of at least two projectors operative in accordance with a non-uniform surface for effectuating 3-D display.
- This diagram depicts an alternative embodiment by which three-dimensional imaging may be effectuated.
- a non-uniform surface may be employed.
- two or more projectors may be implemented at different respective locations so that they respectively project photographic and/or video information towards the non-uniform surface. From the perspective of a driver or passenger of the vehicle, three-dimensional projected labeling and/or indicia will then appear to extend out into the respective field of vision.
- any desired surface may be employed to assist in effectuating three-dimensional imaging for projecting labeling and/or indicia into a user's field of vision.
- the use of a lenticular surface as described with respect to a previous embodiment and the sawtooth-appearing non-uniform surface of this respective embodiment are two possible examples.
- other such surfaces may also help effectuate three-dimensional imaging.
- FIG. 15 is a diagram illustrating an embodiment 1500 of an organic light emitting diode (organic LED or OLED) as may be implemented in accordance with various applications for effectuating 3-D display.
- An organic LED is a light emitting diode in which the emissive electroluminescent layer is a film of organic compounds which emit light in response to an electric current.
- An OLED may be implemented using a flexible type material such that the OLED may be placed over any desired surface having any of a variety of different shapes. For example, an OLED may be implemented over any translucent or transparent surface.
- transparent or translucent surfaces include those which may appear in a vehicular context (e.g., windshield, windows, sun roofs, moon roofs, etc.).
- examples of transparent or translucent surfaces include the lenses of eyeglasses and/or headwear.
- an OLED implemented using a flexible material may be employed within a variety of applications including those such as may be used within navigation systems to provide labeling and/or indicia to a user.
- indicia and/or labeling may be indicated within a user's respective field of vision.
- different OLEDs may be operative to emit light having different colors.
- a matrix type display structure in which different respectively located OLEDs are arranged to emit different respective colors (e.g., red, green, blue)
- a common OLED material may be selectively and differentially energized using different voltages appropriately applied across different portions of the overall OLED structure to effectuate the emission of different respective colors.
- the use of an OLED flexible material over a translucent or transparent surface allows for the inclusion of labeling and/or indicia within a user's field of vision.
- the composition of an OLED includes a layer of organic material situated between two electrodes, the anode and cathode, respectively.
- the organic molecules are electrically conductive as a result of delocalization of pi electrons caused by conjugation over all or part of the molecule. These materials have conductivity levels ranging from insulators to conductors, and therefore are considered organic semiconductors.
- the highest occupied and lowest unoccupied molecular orbitals (HOMO and LUMO) of such organic semiconductors may be viewed as being analogous to the valence and conduction bands of inorganic semiconductors.
- a voltage is applied across the OLED such that the anode is positive with respect to the cathode.
- a current of electrons flows through the device from cathode to anode, as electrons are injected into the LUMO of the organic layer at the cathode and withdrawn from the HOMO at the anode.
- This process may also be described as the injection of electron holes into the HOMO. Electrostatic forces bring the electrons and the holes toward each other, and they recombine forming an exciton, a bound state of the electron and hole. This will typically occur closer to the emissive layer, because in organic semiconductors, holes are generally more mobile than electrons.
- the decay of this excited state results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible spectrum.
- the frequency of this radiation depends on the bandgap of the material, particularly the difference in energy levels between HOMO and LUMO.
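The relationship above can be made concrete: with E = h·f and λ = c/f, the emitted wavelength follows directly from the HOMO-LUMO gap. A small sketch using SI constants (the example gap values are illustrative):

```python
# Emission wavelength from the HOMO-LUMO gap: lambda = h*c / E_gap.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def emission_wavelength_nm(gap_ev):
    # Wavelength in nanometres for a given HOMO-LUMO gap in eV.
    return H * C / (gap_ev * EV) * 1e9
```

A 2.5 eV gap corresponds to roughly 496 nm (blue-green), while a narrower 1.8 eV gap shifts the emission toward red at roughly 689 nm, consistent with the frequency depending on the bandgap.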
- Such OLEDs may be implemented using a variety of different materials.
- indium tin oxide (ITO) is one material that may be used as the anode material within OLEDs.
- the use of OLEDs on the transparent or translucent surfaces through which a user looks to perceive a field of vision when using a navigation system can provide for three-dimensional imaging of labeling and/or indicia to assist the user with directions or instructions.
- labeling and/or indicia may be provided within a field of vision of a user.
- labeling and/or indicia is provided with respect to the operation of a navigation system in which such labeling and/or indicia assists the user to navigate through a particular region or environment in an effort to arrive at a desired endpoint destination.
- the inclusion of such labeling and/or indicia is made in accordance with three-dimensional imaging.
- a variety of means may be employed to effectuate such three-dimensional imaging, including the use of OLEDs, the use of appropriate designs along transparent or translucent surfaces, etc.
- any desired information may be provided for consumption by a user to enhance the overall experience.
- labeling and/or indicia within a user's field of vision may accompany additional information such as directionality, location, vehicular velocity, time, estimated time of arrival, etc.
- additional information such as directionality, location, vehicular velocity, time, estimated time of arrival, etc.
- the improved accuracy provided by such photographic and/or video information can reduce or eliminate the amount of time for a user to translate what is depicted within the display of such a system to that of the actual physical environment.
- the use of such photographic and/or video information will provide much greater specificity thereby assisting a user to reach a desired endpoint destination hopefully more safely, quicker, and with less frustration for the user.
- FIG. 16A , FIG. 16B , FIG. 16C , FIG. 17A , FIG. 17B , FIG. 17C , FIG. 18A , FIG. 18B , FIG. 19A , and FIG. 19B illustrate various embodiments of methods as may be performed in accordance with operation of various devices such as various wireless communication devices and/or various navigation systems.
- the method 1600 begins by identifying a feature within an actual physical environment, as shown in a block 1610 .
- a feature may be any as described herein, including a landmark, a building, a stadium, an airport, a park, a historical feature, a geological feature, and/or any other feature within a physical environment.
- the method 1600 continues by projecting an overlay/label into a user's field of vision based on the identified feature, as shown in a block 1620 .
- Such projection may be made in accordance with three-dimensional (3D) projection.
- one or more projectors may be implemented within a vehicle to project labeling and/or indicia into the actual physical environment from the perspective of a user (e.g., a driver and/or passenger of the vehicle).
- the method 1601 begins by identifying at least one of a user's current position and an endpoint destination, as shown in a block 1611 .
- any of a variety of means may be employed to identify a user's current location.
- a user enters information corresponding to that user's current location via a user interface of a given device.
- certain location determining means, functionality, circuitry, etc. are employed that do not necessarily require the interaction of a user (e.g., global positioning system (GPS), detection of one or more fixed wireless access points within a wireless communication network [e.g., such as one or more access points (APs) within a wireless local area network (WLAN)], etc.).
- the method 1601 then operates by identifying a feature within an actual physical environment corresponding to a user's current location and/or an endpoint destination, as shown in a block 1621 .
- the feature identified may correspond particularly to a pathway between a user's current location and an endpoint destination. That is to say, the method 1601 may operate particularly to identify features based upon a user's current location, an endpoint destination, and/or one or more pathways between the user's current location and the endpoint destination.
- the method 1601 may operate to identify different respective features that may be encountered along different respective pathways between a user's current location and an endpoint destination.
- Such different pathways, and their respective features identified there along, could be presented to a user of a given device for selection by the user of a particular one of the possible pathways.
- selectivity of pathways such as in accordance with a method operating in accordance with navigation, may be based upon respective features that may be encountered along different respective pathways.
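The feature-based pathway selectivity described above might be sketched as a simple scoring over candidate pathways; the route names and the scoring rule (a bare feature count) are illustrative assumptions:

```python
# Score candidate pathways by the features identified along each, then
# select the best-scoring one (here, simply the most features).
def select_pathway(pathways):
    # 'pathways' maps a pathway name to the list of features found along it.
    return max(pathways, key=lambda name: len(pathways[name]))

routes = {
    "riverside": ["bridge", "stadium", "park"],
    "downtown": ["tower"],
}
```

In practice the candidate pathways and their features would instead be presented to the user for selection, as described above; automatic selection is just one option.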
- the method 1601 continues by projecting an overlay/label into the user's field of vision based on the identified feature, the user's current location, and/or the endpoint destination, as shown in a block 1631 .
- Any one or any combination of the identified feature, the user's current location, and the endpoint destination may govern, at least in part, an overlay/label that may be projected into a user's field of vision.
- any such overlay/label may take the form of any labeling and/or indicia that may correspond to any of a variety of formats, including arrows, markers, labels, alphanumeric characters, etc. and/or any combination thereof, etc.
- the method 1602 begins by receiving user input corresponding to an endpoint destination, as shown in a block 1610 .
- the method 1602 may operate in accordance with navigation functionality such that a user may input different types of information thereto.
- the method 1602 continues by identifying the user's current location and the endpoint destination, as shown in a block 1622 .
- the user's current location may be provided by information that is entered via a user interface by the user.
- other means may be employed to identify the user's current location (e.g., GPS, detection of one or more fixed wireless access points within a wireless communication network [e.g., such as one or more access points (APs) within a wireless local area network (WLAN)], etc.).
- the endpoint destination may be provided based upon information that is entered via a user interface by the user.
- the method 1602 then operates by determining a pathway between the user's current location and the endpoint destination, as shown in a block 1632 .
- more than one pathway is presented to a user, and a user is provided an opportunity to select one of those pathways.
- the method 1602 continues by identifying a feature within an actual physical environment corresponding to the pathway, as shown in a block 1642 .
- different respective features are identified as corresponding to different respective pathways between a user's current location and an endpoint destination.
- the method 1602 then operates by projecting an overlay/label into the user's field of vision based on the identified feature, as shown in a block 1652 .
- the projection of an overlay/label may be based upon additional considerations as well (e.g., the user's current location, the endpoint destination, etc.)
- the method 1700 begins by identifying a feature within media, as shown in a block 1710 .
- Various examples of such media may include photographic and/or video media depicting an actual physical environment.
- Such an actual physical environment may correspond to that environment in which a user currently is.
- the actual physical environment may correspond to an environment which a user intends or plans to enter (e.g., such as in accordance with following navigational directions between a current location and an endpoint destination).
- the method 1700 continues by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1720 .
- the media after having undergone processing for identification of at least one feature therein, may be output for consumption by a user with at least one overlay/label therein.
- the method 1701 begins by identifying a user's current location and/or an endpoint destination, as shown in a block 1711 .
- various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.).
- the method 1701 then operates by identifying at least one feature within media corresponding to the user's current location and/or an endpoint destination, as shown in a block 1721 .
- certain processing of that media may provide indication of respective features included therein.
- Various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within media.
- the identification of at least one feature within the media is based upon a pathway between the user's current location and the endpoint destination. As also described with respect to other embodiments, more than one pathway may be identified between a user's current location and an endpoint destination. In such instances, different respective features may be identified along the different respective pathways between the user's current location and the endpoint destination.
- the method 1701 continues by outputting the media with an overlay/label therein based on any one or any combination of the identified feature, the user's current location, and/or the endpoint destination, as shown in a block 1731 .
- the method 1702 begins by receiving user input corresponding to an endpoint destination, as shown in a block 1710 .
- Such information may be provided via a user interface of a given device.
- the method 1702 continues by identifying the user's current location and the endpoint destination, as shown in a block 1722 .
- various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.).
- the method 1702 then operates by determining a pathway between the user's current location and the endpoint destination, as shown in a block 1732 .
- more than one pathway may be identified between the user's current location and the endpoint destination, and those respective pathways may be provided to a user for selection of at least one thereof.
- the method 1702 continues by identifying at least one feature within media corresponding to the pathway, as shown in a block 1742 .
- certain processing of that media may provide indication of respective features included therein.
- Various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within media.
- the method 1702 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1752 .
- multiple overlays/labels may be included within the media in accordance with more than one respective identified feature.
- the method 1800 begins by acquiring media corresponding to an actual physical environment, as shown in a block 1810 .
- such acquisition is performed in accordance with accessing one or more databases via one or more networks, as shown in a block 1810 a .
- one or more databases may be accessed to acquire media corresponding to an actual physical location.
- Such an actual physical location may correspond to an environment in which a user currently is, an environment in which a user plans to be, etc.
- such acquisition is performed in accordance with one or more cameras, as shown in a block 1810 b .
- one or more cameras may be employed to perform real-time acquisition of such media.
- real-time acquired information will hopefully be extremely accurate, up-to-date, etc., and properly depict the actual physical environment.
- the method 1800 continues by processing the media in accordance with identifying a feature therein, as shown in a block 1820 .
- various types of pattern recognition, image processing, video processing, etc. may be employed for processing of media to identify one or more features therein.
- the method 1800 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1830 .
- multiple overlays/labels may be included within the media in accordance with more than one respective identified feature.
- the method 1801 begins by identifying a user's current location and/or an endpoint destination, as shown in a block 1811 .
- various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.).
- the method 1801 then operates by acquiring media based on one or both of the user's current location and the endpoint destination, as shown in a block 1821 .
- such acquisition is performed in accordance with accessing one or more databases via one or more networks, as shown in a block 1821 a .
- one or more databases may be accessed to acquire media corresponding to an actual physical location.
- Such an actual physical location may correspond to an environment in which a user currently is, an environment in which a user plans to be, etc.
- acquisition of media is particularly based on one or both of the user's current location and the endpoint destination.
- the search window is particularly tailored and/or narrowed based upon one or both of the user's current location and the endpoint destination.
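One way to sketch such narrowing of the search window is a bounding box spanned by the user's current location and the endpoint destination; the margin value and the geotag representation are illustrative assumptions:

```python
# Narrowed media search window: only media geotagged inside the box
# spanned by the current location and endpoint (plus a small margin)
# is considered for acquisition.
def in_search_window(media_latlon, current, endpoint, margin=0.01):
    lat_lo = min(current[0], endpoint[0]) - margin
    lat_hi = max(current[0], endpoint[0]) + margin
    lon_lo = min(current[1], endpoint[1]) - margin
    lon_hi = max(current[1], endpoint[1]) + margin
    lat, lon = media_latlon
    return lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi
```

Media far outside the corridor between the two endpoints is excluded up front, tailoring the acquisition as described.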
- different respective media may be acquired for more than one of the respective pathways. For example, a first set of media may be acquired for a first pathway, a second set of media may be acquired for a second pathway, etc.
- such acquisition is performed in accordance with one or more cameras, as shown in a block 1821 b .
- one or more cameras may be employed to perform real-time acquisition of such media.
- real-time acquired information will hopefully be extremely accurate, up-to-date, etc., and properly depict the actual physical environment.
- the method 1801 continues by processing the media in accordance with identifying a feature therein, as shown in a block 1831 .
- various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within the media.
- the method 1801 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1841 .
- multiple overlays/labels may be included within the media in accordance with more than one respective identified feature.
- the method 1900 begins by acquiring first information via a first network, as shown in a block 1910 .
- for example, any of a variety of types of information may be acquired via a given network, and certain networks may be operative to provide different information, different levels of specificity, etc.
- connectivity to those different respective networks may be made, by which different respective information may be acquired.
- such acquisition is performed in accordance with acquiring real-time traffic information, information corresponding to one or more features within a physical environment, media corresponding to an actual physical environment, etc., as shown in a block 1910 a.
- the method 1900 operates by acquiring second information via a second network, as shown in a block 1920 . That is to say, a user and/or device may move throughout a physical environment including moving in and out of different respective service areas of different respective wireless networks.
- the method 1900 then operates by selectively outputting the first information (and/or the second information) based on a user's current location and/or an endpoint destination, as shown in a block 1930 .
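The selective output of block 1930 can be sketched as filtering location-tagged records, acquired over one or more networks, against the user's current location and the endpoint destination; the record layout and distance threshold here are assumptions for illustration, not part of the disclosure:

```python
import math

def _dist(a, b):
    # Straight-line distance between two (x, y) location tags.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_output(records, current, destination, radius=1.0):
    """From information acquired via a first and/or second network, keep
    only records whose tagged location is near the user's current
    location or near the endpoint destination."""
    return [r for r in records
            if _dist(r["loc"], current) <= radius
            or _dist(r["loc"], destination) <= radius]
```

Records acquired via different networks as the user moves between service areas could simply be pooled before this selection step.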
- the method 1901 begins by receiving user input corresponding to a first operational mode, as shown in a block 1911 .
- a first operational mode may indicate any one or more different parameters corresponding to the type and/or manner of information to be provided to a user.
- a given operational mode may correspond to a particular display format, indicating what particular information to display, when to display such information (e.g., displaying first information at or during a first time, displaying second information at or during a second time, etc.), etc.
- the method 1901 then operates by providing one or more overlays/labels within media and/or a user's field of vision in accordance with the first operational mode, as shown in a block 1921 .
- a great deal of selectivity is provided by which such information may be provided to a user in any of a given number of applications.
- certain embodiments relate to providing such labeling and/or indicia within a user's field of vision, other embodiments relate to providing such labeling and/or indicia within media, and even other embodiments relate to providing such labeling and/or indicia within both a user's field of vision and media.
- a method may operate by not only projecting labeling and/or indicia within a user's field of vision, but a display (e.g., within the vehicle, on a portable device, etc.) may also output media including one or more overlays/labels therein.
- a redundant type system may operate in which not only is labeling and/or indicia projected within a user's field of vision, but a display is operative to output media including one or more overlays/labels therein simultaneously or in parallel with a projection of the labeling and/or indicia within the user's field of vision.
- the method 1901 continues by receiving user input corresponding to a second operational mode, as shown in a block 1931 .
- the second operational mode may include one or more overlapping operational parameters similar to the first operational mode.
- the second operational mode and the first operational mode are mutually exclusive such that they do not include any common operational parameters.
- the method 1901 then operates by providing one or more overlays/labels within media and/or a user's field of vision in accordance with the second operational mode, as shown in a block 1941 .
- different respective operational modes may govern the manner by which such information is provided. Also, by allowing for different respective operational modes, different operation may be effectuated at different times.
- various types of indicia may be used to trigger or select when to operate within different operational modes. For example, a first operational mode may be selected when a user is relatively far from an endpoint destination. Then, a second operational mode may be selected when that user is within a given proximity of the endpoint destination (e.g., within X number of miles).
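The proximity-triggered mode switch described above can be sketched as follows; the mode names and the threshold value are hypothetical, chosen only to illustrate the idea:

```python
def select_mode(distance_to_destination_miles, threshold_miles=5.0):
    """Pick an operational mode from the distance to the endpoint
    destination: a coarse overview mode when far away, and a detailed
    approach mode within the threshold."""
    if distance_to_destination_miles <= threshold_miles:
        return "approach"        # second operational mode: detailed overlays/labels
    return "route_overview"      # first operational mode: high-level guidance
```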
- FIG. 20 is a diagram illustrating an embodiment 2000 of a vehicle including an audio system for effectuating directional audio indication.
- a vehicle may be implemented to have an audio system that supports stereo, surround sound, and other functionality.
- a number of speakers may be implemented at different locations within the vehicle.
- sound may be provided from the speakers in such a manner as to have a given directionality.
- by using multiple audio channels, and oftentimes multiple corresponding speakers serviced by those audio channels, the perceptual effect of the sound coming from a particular location or direction may be achieved.
- an audio system may be coordinated such that audio prompts or audio indications may be provided via the audio system in such a way as to correspond to the respective location of a particular feature.
- a label associated with a feature located on a particular side of the vehicle may be audibly provided via the audio system such that the label is perceived as being sourced from the location of that particular feature.
- an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the forward, right-hand side portion of the vehicle.
- an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the forward, left-hand side portion of the vehicle.
- a given audio system having at least two speakers and associated functionality, circuitry, etc.
- DSP audio processing for performing any of a number of different audio processing operations (e.g., effectuating certain audio effects such as reverb or chorus effects, equalizer settings, matrixing of mono/stereo signals to different channels, parametric control of audio DSP effects, balance adjustment, fader adjustment, selectively driving one or more speakers, etc.)
- an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the rear of the vehicle.
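The directional effect in the preceding examples can be sketched with an equal-power pan law, a standard audio technique for two-speaker systems; the function and its parameters are illustrative, not part of the disclosure:

```python
import math

def pan_gains(azimuth_deg):
    """Map a horizontal direction relative to the driver (-90 = full
    left, +90 = full right) to constant-power left/right speaker gains,
    so a mono audio prompt is perceived as coming from that direction."""
    # Normalize azimuth to a pan position in [0, 1], then apply the
    # equal-power (sin/cos) pan law so total power stays constant.
    pan = (azimuth_deg + 90.0) / 180.0
    theta = pan * math.pi / 2.0
    return math.cos(theta), math.sin(theta)   # (left_gain, right_gain)
```

With more than two speakers, the same idea generalizes to matrixing the prompt across the channels nearest the feature's bearing.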
- coordination between various labeling and/or indicia which may be included and presented within various respective fields of vision and corresponding audio prompts associated with that labeling and/or indicia may further augment or enhance a driver experience.
- FIG. 21 is a diagram illustrating an embodiment 2100 of a navigation system with a display or projection system in a vehicular context, and particularly including at least one localized region of the field of vision in which labeling and/or indicia is provided.
- the reader is referred to FIG. 6 in comparison to this diagram.
- in this diagram, instead of providing such labeling and/or indicia across an entirety of the one or more respective fields of vision of the driver, such labeling and/or indicia is provided within a localized region (e.g., a subset of the field of vision). That is to say, instead of providing such labeling and/or indicia within any particular location of one or more fields of vision, such labeling and/or indicia is provided within only one or more localized regions.
- a driver of such a vehicle will then be able to focus attention only on those one or more localized regions instead of surveying all of the respective fields of vision in multiple directions.
- certain feature labeling may only be provided when visible within one or more of these localized regions. For example, a feature labeling associated with the street would appear only when that street may be perceived within a given localized region.
- a localized region may be appropriately selected from the perspective of the user to which such labeling and/or indicia is being provided.
- a driver of a vehicle may be provided a first respective localized region (e.g., located within the windshield in front of the driver)
- a front seat passenger may be provided a second respective localized region (e.g., located within the windshield in front of the front seat passenger)
- a backseat passenger may be provided a third respective localized region (e.g., located within a side window nearest to the backseat passenger, on the right-hand or left-hand side), etc.
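The per-occupant region selection above can be sketched as a simple lookup; the seat labels and region names are hypothetical, chosen only for illustration:

```python
# Hypothetical seat-to-region mapping; names are illustrative only.
LOCALIZED_REGIONS = {
    "driver": "windshield_front_driver",
    "front_passenger": "windshield_front_passenger",
    "rear_left_passenger": "side_window_rear_left",
    "rear_right_passenger": "side_window_rear_right",
}

def region_for(occupant):
    """Select the localized display region appropriate to an occupant's
    seating position, falling back to the driver's region."""
    return LOCALIZED_REGIONS.get(occupant, LOCALIZED_REGIONS["driver"])
```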
- a navigation system including various components, a wireless communication device, and/or any of a number of other devices.
- one or more modules and/or circuitries within a given device (e.g., such as a baseband processing module implemented in accordance with the baseband processing module as described with reference to FIG. 2)
- other components therein may operate to perform any of the various methods herein.
- modules and/or circuitries may be a single processing device or a plurality of processing devices.
- a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions.
- the operational instructions may be stored in a memory.
- the memory may be a single memory device or a plurality of memory devices.
- Such a memory device may be a read-only memory (ROM), random access memory (RAM), volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information.
- the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
- the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
- a memory stores, and a processing module coupled thereto executes, operational instructions corresponding to at least some of the steps and/or functions illustrated and/or described herein.
- connections or couplings between the various modules, circuits, functional blocks, components, devices, etc. within any of the various diagrams or as described herein may be differently implemented in different embodiments.
- such connections or couplings may be direct connections or direct couplings there between.
- such connections or couplings may be indirect connections or indirect couplings there between (e.g., with one or more intervening components there between).
- certain other embodiments may have some combinations of such connections or couplings therein such that some of the connections or couplings are direct, while others are indirect.
- Different implementations may be employed for effectuating communicative coupling between modules, circuits, functional blocks, components, devices, etc. without departing from the scope and spirit of the invention.
Abstract
Description
- The present U.S. Utility patent application claims priority pursuant to 35 U.S.C. §119(e) to the following U.S. Provisional patent application which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility patent application for all purposes:
- 1. U.S. Provisional Patent Application Ser. No. 61/491,838, entitled “Media communications and signaling within wireless communication systems,” (Attorney Docket No. BP22744), filed May 31, 2011, pending.
- The following standards/draft standards are hereby incorporated herein by reference in their entirety and are made part of the present U.S. Utility patent application for all purposes:
- 1. WD1:
Working Draft 1 of High-Efficiency Video Coding, Joint Collaborative Team on Video Coding (JCT-VC), of ITU-T SG16 WP3 and ISO/IEC JTC1/SC29/WG11, Thomas Wiegand, et al., 3rd Meeting: Guangzhou, CN, 7-15 Oct., 2010, Document: JCTVC-C403, 137 pages. - 2. ISO/IEC 14496-10—MPEG-4 Part 10, AVC (Advanced Video Coding), alternatively referred to as H.264/MPEG-4 Part 10 or AVC (Advanced Video Coding), ITU H.264/MPEG4-AVC, or equivalent.
- The following IEEE standards/draft IEEE standards are hereby incorporated herein by reference in their entirety and are made part of the present U.S. Utility patent application for all purposes:
- 1. IEEE Std 802.11™—2007, “IEEE Standard for Information technology—Telecommunications and information exchange between systems—Local and metropolitan area networks—Specific requirements; Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications,” IEEE Computer Society, IEEE Std 802.11™—2007, (Revision of IEEE Std 802.11-1999), 1233 pages.
- 2. IEEE Std 802.11n™—2009, “IEEE Standard for Information technology—Telecommunications and information exchange between systems—Local and metropolitan area networks—Specific requirements; Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications; Amendment 5: Enhancements for Higher Throughput,” IEEE Computer Society, IEEE Std 802.11n™—2009, (Amendment to IEEE Std 802.11™—2007 as amended by IEEE Std 802.11k™—2008, IEEE Std 802.11r™—2008, IEEE Std 802.11y™—2008, and IEEE Std 802.11r™—2009), 536 pages.
- 3. IEEE P802.11ac™/D1.0, May 2011, “Draft STANDARD for Information Technology—Telecommunications and information exchange between systems—Local and metropolitan area networks—Specific requirements, Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) specifications, Amendment 5: Enhancements for Very High Throughput for Operation in Bands below 6 GHz,” Prepared by the 802.11 Working Group of the 802 Committee, 263 total pages (pp. i-xxi, 1-242).
- 1. Technical Field of the Invention
- The invention relates generally to navigation systems; and, more particularly, it relates to providing effective, timely, and detailed information to a user of such navigation systems.
- 2. Description of Related Art
- For many years, navigation systems have been of interest for a variety of applications. For example, many vehicles include some form of navigational system to provide directions to a driver of a vehicle, thereby assisting the driver to reach an endpoint destination of interest. However, typical prior art vehicular and other navigation systems often fail to provide adequate directions to a user. For example, such navigation systems will often fail to provide directions with sufficient specificity in the moments immediately before some type of route adjustment must be made. For example, while driving a vehicle, an indication that a driver should make a particular turn onto a particular road is often provided so shortly before the route adjustment must be made that the driver does not have time to make the proper route adjustment. For example, such an instruction may be provided by the navigation system when the vehicle is within a proximity that is too close to an intersection at which a driver should turn. That is to say, the instruction may be provided too late, and the driver may not be able to position the vehicle into the proper position or lane to make the appropriate route adjustment safely.
- In typical prior art navigation systems, because of these errors that are oftentimes made, rerouting and establishing a new route to reach the endpoint destination are commonplace. Of course, this can lead to great frustration for the driver or passengers of the vehicle, and there may be associated lost time as well, as the driver must then take the alternate route to the endpoint destination. Moreover, in the typical situation in which a prior art navigation system provides such information in a non-timely manner, accidents or other unfortunate events are more likely to happen as the driver attempts to comply with very late given instructions.
- Also, prior art navigation systems typically are implemented to provide such directions to a user in accordance with some form of animated depiction of the physical environment. That is to say, a typical prior art navigation system will oftentimes include cartoon-like animation that is representative of the physical environment. As the reader will understand, such animation is oftentimes inherently an inaccurate and imperfect representation of the physical environment. This also may result in a delay or problem for the user of the navigation system in trying to translate the animated depiction of the physical environment to the actual physical environment. This also introduces a latency and delay which can result in a sacrifice of safety, most importantly, and also lead to frustration and delays for the user, secondarily. For example, when a driver of a vehicle is spending an inordinate amount of time trying to decipher and translate the directions provided by the navigation system, the driver may fail to notice hazards, people, etc. within the actual physical environment and may unfortunately place him or herself or others in a detrimental position.
- There appears to be an insatiable desire for better and more effective navigation systems for use by drivers, pedestrians, etc. The present state of the art does not provide an adequate solution that can overcome these and other deficiencies. As such, there seems to be a continual desire for more effective navigation systems and for improvements thereof.
-
FIG. 1 and FIG. 2 are diagrams illustrating various embodiments of communication systems. -
FIG. 3 is a diagram illustrating an alternative embodiment of a wireless communication system. -
FIG. 4 is a diagram illustrating an embodiment of a wireless communication device. -
FIG. 5 is a diagram illustrating an alternative embodiment of a wireless communication device. -
FIG. 6 is a diagram illustrating an embodiment of a navigation system with a display or projection system in a vehicular context. -
FIG. 7 is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within an in-dash and/or in-steering wheel display. -
FIG. 8 is a diagram illustrating an embodiment of a vehicle including at least one projector for effectuating a display or projection of information. -
FIG. 9A is a diagram illustrating an embodiment of a vehicle including at least one antenna for supporting wireless communications with at least one wireless communication system and/or network. -
FIG. 9B is a diagram illustrating an embodiment of a vehicle including at least one camera for supporting acquisition of image and/or video information. -
FIG. 10 is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a wireless communication device. -
FIG. 11A is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context. -
FIG. 11B is a diagram illustrating an embodiment of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context in conjunction with a wireless communication device. -
FIG. 12 is a diagram illustrating an embodiment of various types of navigation systems, implemented to support wireless communications, entering/exiting various wireless communication systems and/or networks. -
FIG. 13 is a diagram illustrating an embodiment of at least two projectors operative in accordance with a lenticular surface for effectuating three dimensional (3-D) display. -
FIG. 14 is a diagram illustrating an embodiment of at least two projectors operative in accordance with a non-uniform surface for effectuating 3-D display. -
FIG. 15 is a diagram illustrating an embodiment of an organic light emitting diode (OLED) as may be implemented in accordance with various applications for effectuating 3-D display. -
FIG. 16A, FIG. 16B, FIG. 16C, FIG. 17A, FIG. 17B, FIG. 17C, FIG. 18A, FIG. 18B, FIG. 19A, and FIG. 19B illustrate various embodiments of methods as may be performed in accordance with operation of various devices such as various wireless communication devices and/or various navigation systems. -
FIG. 20 is a diagram illustrating an embodiment of a vehicle including an audio system for effectuating directional audio indication. -
FIG. 21 is a diagram illustrating an embodiment of a navigation system with a display or projection system in a vehicular context, and particularly including at least one localized region of the field of vision in which labeling and/or indicia is provided. - Within communication systems, signals are transmitted between various communication devices therein. The goal of digital communications systems is to transmit digital data from one location, or subsystem, to another either error free or with an acceptably low error rate. As shown in
FIG. 1 , data may be transmitted over a variety of communications channels in a wide variety of communication systems: magnetic media, wired, wireless, fiber, copper, and other types of media as well. -
FIG. 1 and FIG. 2 are diagrams illustrating various embodiments of communication systems, 100 and 200, respectively. - Referring to
FIG. 1, this embodiment of a communication system 100 is a communication channel 199 that communicatively couples a communication device 110 (including a transmitter 112 having an encoder 114 and including a receiver 116 having a decoder 118) situated at one end of the communication channel 199 to another communication device 120 (including a transmitter 126 having an encoder 128 and including a receiver 122 having a decoder 124) at the other end of the communication channel 199. In some embodiments, either of the communication devices 110 and 120 may include only a transmitter or a receiver. There are several different types of media by which the communication channel 199 may be implemented (e.g., a satellite communication channel 130 using satellite dishes, a wireless communication channel 140 using towers and/or local antennae, a wired communication channel 150, and/or a fiber-optic communication channel 160 using electrical to optical (E/O) interface 162 and optical to electrical (O/E) interface 164). In addition, more than one type of media may be implemented and interfaced together thereby forming the communication channel 199. - To reduce transmission errors that may undesirably be incurred within a communication system, error correction and channel coding schemes are often employed. Generally, these error correction and channel coding schemes involve the use of an encoder at the transmitter end of the
communication channel 199 and a decoder at the receiver end of the communication channel 199. - Any of various types of ECC codes described can be employed within any such desired communication system (e.g., including those variations described with respect to
FIG. 1 ), any information storage device (e.g., hard disk drives (HDDs), network information storage devices and/or servers, etc.) or any application in which information encoding and/or decoding is desired. - Generally speaking, when considering a communication system in which video data is communicated from one location, or subsystem, to another, video data encoding may generally be viewed as being performed at a transmitting end of the
communication channel 199, and video data decoding may generally be viewed as being performed at a receiving end of the communication channel 199. - Also, while the embodiment of this diagram shows bi-directional communication being capable between the
communication devices 110 and 120, the communication device 110 may include only video data encoding capability, and the communication device 120 may include only video data decoding capability, or vice versa (e.g., in a uni-directional communication embodiment such as in accordance with a video broadcast embodiment). - Referring to the
communication system 200 of FIG. 2, at a transmitting end of a communication channel 299, information bits 201 (e.g., corresponding particularly to video data in one embodiment) are provided to a transmitter 297 that is operable to perform encoding of these information bits 201 using an encoder and symbol mapper 220 (which may be viewed as being distinct functional blocks) to generate a sequence of discrete-valued modulation symbols 203 that is provided to a transmit driver 230 that uses a DAC (Digital to Analog Converter) 232 to generate a continuous-time transmit signal 204 and a transmit filter 234 to generate a filtered, continuous-time transmit signal 205 that substantially comports with the communication channel 299. At a receiving end of the communication channel 299, continuous-time receive signal 206 is provided to an AFE (Analog Front End) 260 that includes a receive filter 262 (that generates a filtered, continuous-time receive signal 207) and an ADC (Analog to Digital Converter) 264 (that generates discrete-time receive signals 208). A metric generator 270 calculates metrics 209 (e.g., on either a symbol and/or bit basis) that are employed by a decoder 280 to make best estimates of the discrete-valued modulation symbols and information bits encoded therein 210. - Within each of the
transmitter 297 and the receiver 298, any desired integration of various components, blocks, functional blocks, circuitries, etc. therein may be implemented. For example, this diagram shows a processing module 280a as including the encoder and symbol mapper 220 and all associated, corresponding components therein, and a processing module 280 is shown as including the metric generator 270 and the decoder 280 and all associated, corresponding components therein. Such processing modules may be implemented in any of a number of ways; for example, all components within the transmitter 297 may be included within a first processing module or integrated circuit, and all components within the receiver 298 may be included within a second processing module or integrated circuit. Alternatively, any other combination of components within each of the transmitter 297 and the receiver 298 may be made in other embodiments. - As with the previous embodiment, such a
communication system 200 may be employed for the communication of video data from one location, or subsystem, to another (e.g., from the transmitter 297 to the receiver 298 via the communication channel 299). -
FIG. 3 is a diagram illustrating an embodiment of a wireless communication system 300. The wireless communication system 300 includes a plurality of base stations and/or access points, a plurality of wireless communication devices 318-332, and a network hardware component 334. Note that the network hardware 334, which may be a router, switch, bridge, modem, system controller, etc., provides a wide area network connection 342 for the communication system 300. Further note that the wireless communication devices 318-332 may be laptop host computers, personal digital assistant hosts, personal computer hosts, and/or cellular telephone hosts. -
To communicate with other wireless communication devices within the system 300 or to communicate outside of the system 300, the devices affiliate with one of the base stations or access points. - The base stations or
access points are located within basic service set (BSS) areas and are coupled to the network hardware 334 via local area network connections, which provide connectivity to other devices within the system 300 and provide connectivity to other networks via the WAN connection 342. To communicate with the wireless communication devices within its BSS, access point 312 wirelessly communicates with its associated wireless communication devices, while access point 316 wirelessly communicates with wireless communication devices 326-332. Typically, the wireless communication devices register with a particular base station or access point to receive services from the communication system 300. - Typically, base stations are used for cellular telephone systems (e.g., advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), and/or variations thereof) and like-type systems, while access points are used for in-home or in-building wireless networks (e.g., IEEE 802.11, Bluetooth, ZigBee, any other type of radio frequency based network protocol and/or variations thereof). Regardless of the particular type of communication system, each wireless communication device includes a built-in radio and/or is coupled to a radio.
-
FIG. 4 is a diagram illustrating an embodiment 300 of a wireless communication device that includes the host device 318-332 and an associated radio 460. For cellular telephone hosts, the radio 460 is a built-in component. For personal digital assistant hosts, laptop hosts, and/or personal computer hosts, the radio 460 may be built-in or an externally coupled component. - As illustrated, the host device 318-332 includes a
processing module 450,memory 452, aradio interface 454, aninput interface 458, and anoutput interface 456. Theprocessing module 450 andmemory 452 execute the corresponding instructions that are typically done by the host device. For example, for a cellular telephone host device, theprocessing module 450 performs the corresponding communication functions in accordance with a particular cellular telephone standard. - The
radio interface 454 allows data to be received from and sent to theradio 460. For data received from the radio 460 (e.g., inbound data), theradio interface 454 provides the data to theprocessing module 450 for further processing and/or routing to theoutput interface 456. Theoutput interface 456 provides connectivity to an output display device such as a display, monitor, speakers, etc., such that the received data may be displayed. Theradio interface 454 also provides data from theprocessing module 450 to theradio 460. Theprocessing module 450 may receive the outbound data from an input device such as a keyboard, keypad, microphone, etc., via theinput interface 458 or generate the data itself. For data received via theinput interface 458, theprocessing module 450 may perform a corresponding host function on the data and/or route it to theradio 460 via theradio interface 454. -
Radio 460 includes ahost interface 462, digital receiver processing module 464, an analog-to-digital converter 466, a high pass and lowpass filter module 468, an IF mixing downconversion stage 470, areceiver filter 471, alow noise amplifier 472, a transmitter/receiver switch 473, a local oscillation module 474 (which may be implemented, at least in part, using a voltage controlled oscillator (VCO)),memory 475, a digital transmitter processing module 476, a digital-to-analog converter 478, a filtering/gain module 480, an IF mixing upconversion stage 482, apower amplifier 484, atransmitter filter module 485, a channel bandwidth adjust module 487, and anantenna 486. Theantenna 486 may be a single antenna that is shared by the transmit and receive paths as regulated by the Tx/Rx switch 473, or may include separate antennas for the transmit path and receive path. The antenna implementation will depend on the particular standard to which the wireless communication device is compliant. - The digital receiver processing module 464 and the digital transmitter processing module 476, in combination with operational instructions stored in
memory 475, execute digital receiver functions and digital transmitter functions, respectively. The digital receiver functions include, but are not limited to, digital intermediate frequency to baseband conversion, demodulation, constellation demapping, decoding, and/or descrambling. The digital transmitter functions include, but are not limited to, scrambling, encoding, constellation mapping, modulation, and/or digital baseband to IF conversion. The digital receiver and transmitter processing modules 464 and 476 may be implemented using a shared processing device, individual processing devices, or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. Thememory 475 may be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. Note that when the processing module 464 and/or 476 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. - In operation, the
radio 460 receives outbound data 494 from the host device via the host interface 462. The host interface 462 routes the outbound data 494 to the digital transmitter processing module 476, which processes the outbound data 494 in accordance with a particular wireless communication standard (e.g., IEEE 802.11, Bluetooth, ZigBee, WiMAX (Worldwide Interoperability for Microwave Access), any other type of radio frequency based network protocol and/or variations thereof, etc.) to produce outbound baseband signals 496. The outbound baseband signals 496 will be digital base-band signals (e.g., have a zero IF) or digital low IF signals, where the low IF typically will be in the frequency range of one hundred kHz (kilo-Hertz) to a few MHz (Mega-Hertz). - The digital-to-analog converter 478 converts the outbound baseband signals 496 from the digital domain to the analog domain. The filtering/gain module 480 filters and/or adjusts the gain of the analog signals prior to providing them to the IF mixing stage 482. The IF mixing stage 482 converts the analog baseband or low IF signals into RF signals based on a transmitter local oscillation 483 provided by local oscillation module 474. The power amplifier 484 amplifies the RF signals to produce outbound RF signals 498, which are filtered by the transmitter filter module 485. The antenna 486 transmits the outbound RF signals 498 to a targeted device such as a base station, an access point and/or another wireless communication device. - The
radio 460 also receives inbound RF signals 488 via the antenna 486, which were transmitted by a base station, an access point, or another wireless communication device. The antenna 486 provides the inbound RF signals 488 to the receiver filter module 471 via the Tx/Rx switch 473, where the Rx filter 471 bandpass filters the inbound RF signals 488. The Rx filter 471 provides the filtered RF signals to the low noise amplifier 472, which amplifies the signals 488 to produce amplified inbound RF signals. The low noise amplifier 472 provides the amplified inbound RF signals to the IF mixing module 470, which directly converts the amplified inbound RF signals into inbound low IF signals or baseband signals based on a receiver local oscillation 481 provided by local oscillation module 474. The downconversion module 470 provides the inbound low IF signals or baseband signals to the filtering/gain module 468. The high pass and low pass filter module 468 filters, based on settings provided by the channel bandwidth adjust module 487, the inbound low IF signals or the inbound baseband signals to produce filtered inbound signals. - The analog-to-digital converter 466 converts the filtered inbound signals from the analog domain to the digital domain to produce inbound baseband signals 490, where the inbound baseband signals 490 will be digital base-band signals or digital low IF signals, where the low IF typically will be in the frequency range of one hundred kHz to a few MHz. The digital receiver processing module 464, based on settings provided by the channel bandwidth adjust module 487, decodes, descrambles, demaps, and/or demodulates the inbound baseband signals 490 to recapture inbound data 492 in accordance with the particular wireless communication standard being implemented by radio 460. The host interface 462 provides the recaptured inbound data 492 to the host device 318-332 via the radio interface 454. - As one of average skill in the art will appreciate, the wireless communication device of the
embodiment 400 of FIG. 4 may be implemented using one or more integrated circuits. For example, the host device may be implemented on one integrated circuit, the digital receiver processing module 464, the digital transmitter processing module 476 and memory 475 may be implemented on a second integrated circuit, and the remaining components of the radio 460, less the antenna 486, may be implemented on a third integrated circuit. As an alternate example, the radio 460 may be implemented on a single integrated circuit. As yet another example, the processing module 450 of the host device and the digital receiver and transmitter processing modules 464 and 476 may be a common processing device implemented on a single integrated circuit. Further, the memory 452 and memory 475 may be implemented on a single integrated circuit and/or on the same integrated circuit as the common processing modules of processing module 450 and the digital receiver and transmitter processing modules 464 and 476. - Any of the various embodiments of a communication device that may be implemented within various communication systems can incorporate functionality to perform communication via more than one standard, protocol, or other predetermined means of communication. For example, a single communication device, designed in accordance with certain aspects of the invention, can include functionality to perform communication in accordance with a first protocol, a second protocol, and/or a third protocol, and so on. These various protocols may be a WiMAX (Worldwide Interoperability for Microwave Access) protocol, a protocol that complies with a wireless local area network (WLAN/WiFi) (e.g., one of the IEEE (Institute of Electrical and Electronics Engineers) 802.11 protocols such as 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, etc.), a Bluetooth protocol, or any other predetermined means by which wireless communication may be effectuated.
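The digital transmitter functions of the radio 460 (scrambling, encoding, mapping, modulation) are undone by the digital receiver functions in reverse order. As a hedged illustration of that symmetry, the toy scrambler below uses a fixed XOR whitening byte, which is an invented placeholder and not the actual IEEE 802.11 scrambler sequence:

```python
# Toy illustration of transmitter/receiver symmetry in radio 460: a digital
# transmitter function (here, scrambling) has a receiver-side inverse
# (descrambling).  The fixed XOR key is an invented placeholder, not the real
# IEEE 802.11 scrambler polynomial output.

SCRAMBLE_KEY = 0x5A  # illustrative whitening byte

def scramble(data: bytes) -> bytes:
    """Transmitter-side scrambling: XOR every byte with the key."""
    return bytes(b ^ SCRAMBLE_KEY for b in data)

def descramble(data: bytes) -> bytes:
    """Receiver-side descrambling: XOR is self-inverse, so reapply it."""
    return scramble(data)

outbound = b"outbound data 494"
assert descramble(scramble(outbound)) == outbound  # round trip recovers the data
```

The same inverse pairing holds for the other stages (encoding/decoding, mapping/demapping), each applied in the opposite order on the receive path.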
-
FIG. 5 is a diagram illustrating an alternative embodiment of a wireless communication device that includes the host device 318-332 and at least one associated radio 560. For cellular telephone hosts, the radio 560 is a built-in component. For personal digital assistant hosts, laptop hosts, and/or personal computer hosts, the radio 560 may be built-in or an externally coupled component. For access points or base stations, the components are typically housed in a single structure. - As illustrated, the host device 318-332 includes a
processing module 550, memory 552, radio interface 554, input interface 558 and output interface 556. The processing module 550 and memory 552 execute the corresponding instructions for the functions typically performed by the host device. For example, for a cellular telephone host device, the processing module 550 performs the corresponding communication functions in accordance with a particular cellular telephone standard. - The radio interface 554 allows data to be received from and sent to the
radio 560. For data received from the radio 560 (e.g., inbound data), the radio interface 554 provides the data to the processing module 550 for further processing and/or routing to the output interface 556. The output interface 556 provides connectivity to an output display device such as a display, monitor, speakers, et cetera such that the received data may be displayed. The radio interface 554 also provides data from the processing module 550 to the radio 560. The processing module 550 may receive the outbound data from an input device such as a keyboard, keypad, microphone, et cetera via the input interface 558 or generate the data itself. For data received via the input interface 558, the processing module 550 may perform a corresponding host function on the data and/or route it to the radio 560 via the radio interface 554. -
Radio 560 includes a host interface 562, a baseband processing module 564, memory 566, a plurality of radio frequency (RF) transmitters 568-572, a transmit/receive (T/R) module 574, a plurality of antennae 582-586, a plurality of RF receivers 576-580, and a local oscillation module 5100 (which may be implemented, at least in part, using a VCO). The baseband processing module 564, in combination with operational instructions stored in memory 566, executes digital receiver functions and digital transmitter functions. The digital receiver functions include, but are not limited to, digital intermediate frequency to baseband conversion, demodulation, constellation demapping, decoding, de-interleaving, fast Fourier transform, cyclic prefix removal, space and time decoding, and/or descrambling. The digital transmitter functions include, but are not limited to, scrambling, encoding, interleaving, constellation mapping, modulation, inverse fast Fourier transform, cyclic prefix addition, space and time encoding, and/or digital baseband to IF conversion. The baseband processing module 564 may be implemented using one or more processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. The memory 566 may be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. 
Note that when the processing module 564 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. - In operation, the
radio 560 receives outbound data 588 from the host device via the host interface 562. The baseband processing module 564 receives the outbound data 588 and, based on a mode selection signal 5102, produces one or more outbound symbol streams 590. The mode selection signal 5102 will indicate a particular mode as illustrated in the mode selection tables, which appear at the end of the detailed discussion. Such operation as described herein is exemplary with respect to at least one possible embodiment, and it is of course noted that the various aspects and principles, and their equivalents, of the invention may be extended to other embodiments without departing from the scope and spirit of the invention. - For example, the
mode selection signal 5102, with reference to table 1, may indicate a frequency band of 2.4 GHz or 5 GHz, a channel bandwidth of 20 or 22 MHz (e.g., channels of 20 or 22 MHz width) and a maximum bit rate of 54 megabits-per-second. In other embodiments, the channel bandwidth may extend up to 1.28 GHz or wider with supported maximum bit rates extending to 1 gigabit-per-second or greater. In this general category, the mode selection signal will further indicate a particular rate ranging from 1 megabit-per-second to 54 megabits-per-second. In addition, the mode selection signal will indicate a particular type of modulation, which includes, but is not limited to, Barker Code Modulation, BPSK, QPSK, CCK, 16 QAM and/or 64 QAM. As is further illustrated in table 1, a code rate is supplied as well as the number of coded bits per subcarrier (NBPSC), coded bits per OFDM symbol (NCBPS), and data bits per OFDM symbol (NDBPS). - The mode selection signal may also indicate a particular channelization for the corresponding mode which, for the information in table 1, is illustrated in table 2. As shown, table 2 includes a channel number and corresponding center frequency. The mode select signal may further indicate a power spectral density mask value which for table 1 is illustrated in table 3. The mode select signal may alternatively indicate rates within table 4, which has a 5 GHz frequency band, 20 MHz channel bandwidth and a maximum bit rate of 54 megabits-per-second. If this is the particular mode selected, the channelization is illustrated in table 5. As a further alternative, the mode
select signal 5102 may indicate a 2.4 GHz frequency band, 20 MHz channels and a maximum bit rate of 192 megabits-per-second as illustrated in table 6. In table 6, a number of antennae may be utilized to achieve the higher bit rates. In this instance, the mode select would further indicate the number of antennae to be utilized. Table 7 illustrates the channelization for the set-up of table 6. Table 8 illustrates yet another mode option where the frequency band is 2.4 GHz, the channel bandwidth is 20 MHz and the maximum bit rate is 192 megabits-per-second. The corresponding table 8 includes various bit rates ranging from 12 megabits-per-second to 216 megabits-per-second utilizing 2-4 antennae and a spatial time encoding rate as indicated. Table 9 illustrates the channelization for table 8. The mode select signal 5102 may further indicate a particular operating mode as illustrated in table 10, which corresponds to a 5 GHz frequency band having 40 MHz channels and a maximum bit rate of 486 megabits-per-second. As shown in table 10, the bit rate may range from 13.5 megabits-per-second to 486 megabits-per-second utilizing 1-4 antennae and a corresponding spatial time code rate. Table 10 further illustrates a particular modulation scheme code rate and NBPSC values. Table 11 provides the power spectral density mask for table 10 and table 12 provides the channelization for table 10. - It is of course noted that other types of channels, having different bandwidths, may be employed in other embodiments without departing from the scope and spirit of the invention. For example, various other channels such as those having 80 MHz, 120 MHz, and/or 160 MHz of bandwidth may alternatively be employed such as in accordance with IEEE Task Group ac (TGac VHTL6).
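The quantities in these mode selection tables are related by simple arithmetic. For the legacy 20 MHz OFDM modes (assuming the 48 data subcarriers and 4 µs symbol duration of IEEE 802.11a/g; other modes use different values), NCBPS = NBPSC × 48, NDBPS = NCBPS × code rate, and the bit rate is NDBPS divided by the symbol duration. The standard channel-number-to-center-frequency mappings underlying channelization tables of this kind are likewise closed-form. A sketch under those assumptions:

```python
# Rate and channelization arithmetic behind mode-selection tables of this kind.
# Assumes the legacy 802.11a/g OFDM parameters: 48 data subcarriers per symbol
# and a 4 microsecond symbol duration.
from fractions import Fraction

DATA_SUBCARRIERS = 48
SYMBOL_DURATION_US = 4

def ofdm_rate_mbps(nbpsc, code_rate):
    """Return (NCBPS, NDBPS, bit rate in Mb/s) for one modulation/code-rate mode."""
    ncbps = nbpsc * DATA_SUBCARRIERS          # coded bits per OFDM symbol
    ndbps = int(ncbps * code_rate)            # data bits per OFDM symbol
    return ncbps, ndbps, ndbps / SYMBOL_DURATION_US

def center_freq_mhz(channel, band_ghz):
    """Standard IEEE 802.11 channel-number to center-frequency mapping (MHz)."""
    if band_ghz == 2.4:
        return 2484 if channel == 14 else 2407 + 5 * channel
    if band_ghz == 5:
        return 5000 + 5 * channel
    raise ValueError("unsupported band")

# 64-QAM (NBPSC = 6) at code rate 3/4 yields the 54 Mb/s maximum of table 1:
print(ofdm_rate_mbps(6, Fraction(3, 4)))   # -> (288, 216, 54.0)
print(center_freq_mhz(6, 2.4))             # -> 2437
```

The same arithmetic scales to the multi-antenna modes: with N spatial streams and a given spatial time code rate, the aggregate bit rate is the per-stream rate multiplied accordingly.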
- The
baseband processing module 564, based on the mode selection signal 5102, produces the one or more outbound symbol streams 590 from the outbound data 588. For example, if the mode selection signal 5102 indicates that a single transmit antenna is being utilized for the particular mode that has been selected, the baseband processing module 564 will produce a single outbound symbol stream 590. Alternatively, if the mode select signal indicates 2, 3 or 4 antennae, the baseband processing module 564 will produce 2, 3 or 4 outbound symbol streams 590 corresponding to the number of antennae from the outbound data 588. - Depending on the number of
outbound streams 590 produced by the baseband module 564, a corresponding number of the RF transmitters 568-572 will be enabled to convert the outbound symbol streams 590 into outbound RF signals 592. The transmit/receive module 574 receives the outbound RF signals 592 and provides each outbound RF signal to a corresponding antenna 582-586. - When the
radio 560 is in the receive mode, the transmit/receive module 574 receives one or more inbound RF signals via the antennae 582-586. The T/R module 574 provides the inbound RF signals 594 to one or more RF receivers 576-580. The RF receivers 576-580 convert the inbound RF signals 594 into a corresponding number of inbound symbol streams 596. The number of inbound symbol streams 596 will correspond to the particular mode in which the data was received (recall that the mode may be any one of the modes illustrated in tables 1-12). The baseband processing module 564 receives the inbound symbol streams 596 and converts them into inbound data 598, which is provided to the host device 318-332 via the host interface 562. - In one embodiment of
radio 560, it includes a transmitter and a receiver. The transmitter may include a MAC module, a PLCP module, and a PMD module. The Medium Access Control (MAC) module, which may be implemented with the processing module 564, is operably coupled to convert a MAC Service Data Unit (MSDU) into a MAC Protocol Data Unit (MPDU) in accordance with a WLAN protocol. The Physical Layer Convergence Procedure (PLCP) Module, which may be implemented in the processing module 564, is operably coupled to convert the MPDU into a PLCP Protocol Data Unit (PPDU) in accordance with the WLAN protocol. The Physical Medium Dependent (PMD) module is operably coupled to convert the PPDU into a plurality of radio frequency (RF) signals in accordance with one of a plurality of operating modes of the WLAN protocol, wherein the plurality of operating modes includes multiple input and multiple output combinations. - An embodiment of the Physical Medium Dependent (PMD) module includes an error protection module, a demultiplexing module, and a plurality of direct conversion modules. The error protection module, which may be implemented in the
processing module 564, is operably coupled to restructure a PPDU (PLCP Protocol Data Unit) to reduce transmission errors, producing error protected data. The demultiplexing module is operably coupled to divide the error protected data into a plurality of error protected data streams. The plurality of direct conversion modules is operably coupled to convert the plurality of error protected data streams into a plurality of radio frequency (RF) signals. - It is also noted that the wireless communication device of this diagram, as well as others described herein, may be implemented using one or more integrated circuits. For example, the host device may be implemented on one integrated circuit, the
baseband processing module 564 and memory 566 may be implemented on a second integrated circuit, and the remaining components of the radio 560, less the antennae 582-586, may be implemented on a third integrated circuit. As an alternate example, the radio 560 may be implemented on a single integrated circuit. As yet another example, the processing module 550 of the host device and the baseband processing module 564 may be a common processing device implemented on a single integrated circuit. Further, the memory 552 and memory 566 may be implemented on a single integrated circuit and/or on the same integrated circuit as the common processing modules of processing module 550 and the baseband processing module 564. - The previous diagrams and their associated written description illustrate some possible embodiments by which a wireless communication device may be constructed and implemented. In some embodiments, more than one radio (e.g., such as multiple instantiations of the
radio 460, the radio 560, a combination thereof, or even another implementation of a radio) is implemented within a wireless communication device. For example, a single wireless communication device can include multiple radios therein to effectuate simultaneous transmission of two or more signals. Also, multiple radios within a wireless communication device can effectuate simultaneous reception of two or more signals, or transmission of one or more signals at the same time as reception of one or more other signals (e.g., simultaneous transmission/reception). - Within the various diagrams and embodiments described and depicted herein, wireless communication devices may generally be referred to as WDEVs, DEVs, TXs, and/or RXs. It is noted that such wireless communication devices may be wireless stations (STAs), access points (APs), or any other type of wireless communication device without departing from the scope and spirit of the invention. Generally speaking, wireless communication devices that are APs may be referred to as transmitting or transmitter wireless communication devices, and wireless communication devices that are STAs may be referred to as receiving or receiver wireless communication devices in certain contexts.
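Returning to the transmitter embodiment of the radio 560 described above, the MSDU → MPDU → PPDU conversion chain and the PMD module's demultiplexing of error protected data into parallel streams can be sketched as successive framing and splitting operations. The header bytes and the round-robin splitting rule below are invented placeholders; the real field layouts and stream mapping are defined by the WLAN protocol and the selected operating mode:

```python
# Hypothetical sketch of the MSDU -> MPDU -> PPDU chain and the PMD module's
# demultiplexing of error-protected data into per-antenna streams.  Header
# bytes and the round-robin split are illustrative placeholders only.

def mac_module(msdu: bytes) -> bytes:
    """MAC: wrap the MAC Service Data Unit with a (placeholder) MAC header."""
    return b"MAC|" + msdu

def plcp_module(mpdu: bytes) -> bytes:
    """PLCP: prepend a (placeholder) preamble/header to form the PPDU."""
    return b"PLCP|" + mpdu

def demultiplex(data: bytes, num_streams: int) -> list[bytes]:
    """PMD: split error-protected data into num_streams parallel streams."""
    streams = [bytearray() for _ in range(num_streams)]
    for i, byte in enumerate(data):
        streams[i % num_streams].append(byte)
    return [bytes(s) for s in streams]

ppdu = plcp_module(mac_module(b"payload"))
print(ppdu)                        # -> b'PLCP|MAC|payload'
print(demultiplex(b"abcdef", 3))   # -> [b'ad', b'be', b'cf']
```

Each resulting stream would then be handed to one direct conversion module for conversion to an RF signal.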
- Of course, it is noted that, in the general nomenclature employed herein, a transmitting wireless communication device (e.g., such as an AP, or a STA operating as an ‘AP’ with respect to other STAs) initiates communications and/or operates as a network controller type of wireless communication device with respect to a number of other, receiving wireless communication devices (e.g., such as STAs), and the receiving wireless communication devices respond to and cooperate with the transmitting wireless communication device in supporting such communications.
- Of course, while this general nomenclature of transmitting wireless communication device(s) and receiving wireless communication device(s) may be employed to differentiate the operations as performed by such different wireless communication devices within a communication system, all such wireless communication devices within such a communication system may of course support bi-directional communications to and from other wireless communication devices within the communication system. In other words, the various types of transmitting wireless communication device(s) and receiving wireless communication device(s) may all support bi-directional communications to and from other wireless communication devices within the communication system.
- Various aspects and principles, and their equivalents, of the invention as presented herein may be adapted for use in various standards, protocols, and/or recommended practices (including those currently under development) such as those in accordance with IEEE 802.11x (e.g., where x is a, b, g, n, ac, ah, ad, af, etc.).
-
FIG. 6 is a diagram illustrating an embodiment 600 of a navigation system with a display or projection system in a vehicular context. With respect to navigation systems, a novel approach and architecture is presented herein by which information is provided to a driver of a vehicle within the driver's field of vision. For example, instead of providing information to a driver in a way that the driver must take his or her eyes off of the road, information related to the driving experience is provided within the driver's field of vision (e.g., so the driver need not take his or her eyes off of the road). In addition, by providing such information within the field of vision, the navigation system becomes much more effectual. - For example, by providing such information to a driver within the actual field of vision, much greater specificity and clarification may be provided to a driver in regard to directions. For example, when approaching a relatively complex interchange of roads, if specific information is provided regarding which of a number of potential roads is the correct one, the driver will be able to make better or more accurate decisions. Generally speaking, such a navigation system, operating in conjunction with one or more projectors, provides information to help ensure that the driver is able to keep his or her head up while driving.
- As may be seen with respect to this embodiment, a navigation system is implemented using overlays or directions that are included within the actual field of vision. For example, in the context of a vehicle application, a driver (or user) is provided with overlays or directions within the actual field of vision rather than on a display such as may be included within the dashboard of a vehicle. Various feature labeling may also be included with respect to elements that are included within the field of vision. For example, a label for a building or other feature within the field of vision may be appropriately placed thereon. Street names may also be overlaid on the actual streets within the field of vision.
- In addition, any of a variety of types of information related to a driving experience may similarly be displayed within the field of vision. For example, the vehicle's actual location (e.g., such as in accordance with a global positioning system (GPS) location), the vehicle's relative location (e.g., such as relative to a point of destination to which the vehicle is heading), the vehicle's speed (velocity), the vehicle's directionality (e.g., North, South, Southwest, etc.), time information (e.g., a clock), chronometer information (e.g., a timer), temperature information (e.g., external to the vehicle and/or internal to the vehicle), and/or any other additional information which may be used to enhance a driver's experience may be included within the field of vision.
- Of course, a user is given wide latitude to select which particular information is to be provided within the field of vision. Using a user interface, display, and/or control module of the vehicle, a user may select any group or combination of information to be displayed within the field of vision. It is noted that such user interface, display, and/or control module may be implemented in any of a variety of ways including by using an already implemented navigation or GPS system of the vehicle.
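The user-selectable grouping described above can be sketched as a simple configuration step: the user picks any combination of the available information items to be projected into the field of vision. The item names below are invented for illustration and do not come from the specification:

```python
# Hedged sketch of user-selectable overlay configuration: the user chooses any
# combination of information items to display in the field of vision.  The
# item names are illustrative placeholders.

AVAILABLE_ITEMS = {"gps_location", "relative_location", "speed",
                   "heading", "clock", "timer", "temperature"}

def select_overlay(requested):
    """Validate and normalize the user's chosen set of overlay items."""
    unknown = set(requested) - AVAILABLE_ITEMS
    if unknown:
        raise ValueError(f"unsupported items: {sorted(unknown)}")
    return sorted(set(requested))   # deduplicated, stable ordering

print(select_overlay(["speed", "heading", "clock"]))  # -> ['clock', 'heading', 'speed']
```

Such a selection could be captured through the vehicle's user interface, display, and/or control module, as the paragraph above notes.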
- Also, a user may select or identify feature labels to be included within the field of vision. Of course, with respect to feature labels, those labels are only included within the field of vision when the vehicle is within proximity of those features. For example, when the vehicle is near a particular street, such that the user should be able to see that street, the feature label would then be overlaid on the street within the actual field of vision. Analogously, when the vehicle is within a sufficient proximity of a particular feature such as a building, the label for that particular building would then be displayed within the field of vision. In certain embodiments, the user may also select the proximity at which various features will cause such labeling to appear. For example, a user may select a proximity of X number of distance measure (e.g., where distance measure may be in meters, feet, etc. and/or any other measuring unit) to a particular feature as being the triggering event that will cause such labeling to appear within the field of vision.
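The proximity trigger described above can be sketched as a distance check between the vehicle's GPS fix and a feature's coordinates: the label becomes visible once the great-circle distance falls below the user-selected threshold. The coordinates and the 200 m threshold below are invented for illustration:

```python
# Proximity trigger for feature labels: show a label once the vehicle is
# within a user-selected distance of the feature.  Uses the haversine
# great-circle distance; coordinates and thresholds are illustrative.
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def label_visible(vehicle, feature, threshold_m):
    """True when the vehicle is within threshold_m of the feature."""
    return haversine_m(*vehicle, *feature) <= threshold_m

vehicle = (40.7580, -73.9855)   # hypothetical GPS fix
feature = (40.7587, -73.9851)   # hypothetical building roughly 85 m away
print(label_visible(vehicle, feature, 200.0))   # -> True
```

In a running system this check would be re-evaluated as new GPS fixes arrive, so labels appear and disappear as the vehicle moves through the environment.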
- As the reader will understand, any other variety of different information may be displayed within the field of vision to enhance driver experience. To effectuate such labeling, overlays, identifiers, and/or directions, etc. within the actual field of vision, the windshields of the vehicle may provide the background on which such information is projected. For example, various imaging devices such as projectors may be implemented within a vehicle and can use various translucent surfaces around the driver as the backdrop(s) or background on which images are projected for visual consumption by the driver in a vehicular context. That is to say, the windshields and windows of the vehicle may themselves serve as the background on which such information may be projected. The glass or glasslike materials employed for windshields may serve as the actual background for projection of such labeling and/or indicia for consumption by the driver and/or passengers. With respect to such labeling and/or indicia as described with respect to this embodiment, as well as with respect to other embodiments and/or diagrams, it is noted that such terminology of labeling and/or indicia may be viewed as generally referring to information that is provided to enhance the user experience. For example, such labeling and/or indicia may correspond to any of a variety of formats, including arrows, markers, labels, alphanumeric characters, etc. and/or any combination thereof, etc.
- As the reader will understand with respect to other embodiments, such information may be provided in accordance with three-dimensional (3-D) projection such that the various labels etc. will appear as being extended out and into the field of vision.
- Referring specifically to this diagram, overlays or directions are provided by arrows indicating the path by which the driver should proceed. In the diagram, one arrow indicates to proceed forward until reaching the intersection of X Street and Y Avenue. A second arrow indicates that the driver should turn right on Y Avenue. As may be seen, by providing such directions particularly within the field of vision, there is a very small possibility that the driver will make wrong turns in trying to reach his or her destination. In addition, by providing such labels or features within the field of vision (e.g., labels of landmarks, buildings, stadiums, airports, parks, historical features, and/or any other features that a driver may come across, etc.), additional information is provided to help a driver navigate in an attempt to reach his or her destination. Such labeling of these features may operate as guideposts or reference points that will help the driver navigate along the appropriate roads to reach his or her destination. Generally speaking, such labeling and/or indicia may be viewed as markers that reach out into the field of vision at perspective depths within the field of vision to highlight certain pathways, features, destinations, etc. for a driver.
- With respect to providing such labeling and/or indicia within the field of vision, a user also has a wide degree of latitude in selecting when such labeling and/or indicia are to be provided to the user. For example, a user may select that desired information be provided at all times. In an alternative embodiment, such labeling and/or indicia may be provided when the vehicle is within a certain proximity of the destination (e.g., which may be selected by the user) or only when the vehicle is to make a particular road change, lane change, turn, etc. along the path which will lead to the destination. That is to say, when a particular action must be taken by the driver to ensure that the vehicle remains on the proper path to the endpoint destination, at that point such labeling and/or indicia would then be provided to the driver within the driver's field of vision. Of course, such labeling and/or indicia may be provided when selected by the user (e.g., when the user enables such labeling and/or indicia to be provided).
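The display policies just described (always on, near the destination, at a maneuver, or user-enabled) can be sketched as a small decision function; the policy names, parameters, and the 500 m default are illustrative assumptions, not terms from the specification:

```python
# Hedged sketch of the when-to-display policies described above.  The policy
# names and the 500 m default proximity are invented for illustration.
from enum import Enum, auto

class Policy(Enum):
    ALWAYS = auto()            # desired information provided at all times
    NEAR_DESTINATION = auto()  # only within a chosen distance of the endpoint
    AT_MANEUVER = auto()       # only when a road/lane change or turn is due
    USER_ENABLED = auto()      # only when the user has switched it on

def should_display(policy, dist_to_dest_m=None, near_m=500.0,
                   at_maneuver=False, user_enabled=False):
    """Decide whether labeling/indicia should currently be shown."""
    if policy is Policy.ALWAYS:
        return True
    if policy is Policy.NEAR_DESTINATION:
        return dist_to_dest_m is not None and dist_to_dest_m <= near_m
    if policy is Policy.AT_MANEUVER:
        return at_maneuver
    return user_enabled

print(should_display(Policy.NEAR_DESTINATION, dist_to_dest_m=300.0))  # -> True
```

A real system would re-run this decision on each navigation update, combining it with the proximity triggers for individual feature labels.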
- In even other embodiments, such labeling and/or indicia may be provided within the field of vision as a function of the complexity of a particular segment or portion of the path that leads to the endpoint destination. For example, while driving along a country road in a predominantly rural environment, there may be very little need to provide very specific indicia related to the drive. As the reader will understand, while driving in a very rural environment in which there are very few roads, very few interchanges, etc., very little indication and information may need to be provided to the driver. However, while driving in a very congested, complex, urban, etc. environment, there may be a multiplicity of options by which a driver may take the wrong path. As the reader will understand, while driving along a road in a large city and when approaching a very complex interchange of roads in which there are many different routes, lanes, exits, etc. that may be taken, greater specificity and labeling of the various roads, and more specifically the correct road on which the driver should proceed, will help ensure that the driver takes the correct road.
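The complexity-dependent behavior above can be sketched as a detail level that grows with the number of upcoming decision points (exits, forks, lane choices) in a look-ahead window; the thresholds and level names below are invented for illustration:

```python
# Hedged sketch of complexity-gated guidance: the amount of labeling grows
# with the number of upcoming decision points.  Thresholds are illustrative.

def indicia_level(decision_points_ahead: int) -> str:
    """Map upcoming route complexity to a guidance detail level."""
    if decision_points_ahead <= 1:
        return "minimal"    # e.g., a rural road with little to disambiguate
    if decision_points_ahead <= 4:
        return "standard"
    return "detailed"       # e.g., a complex urban interchange

print(indicia_level(0))   # -> minimal
print(indicia_level(7))   # -> detailed
```

The count of decision points could itself be derived from the map data for the route segment currently ahead of the vehicle.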
- As may be seen with respect to this diagram, such labeling and/or indicia is provided within the actual field of vision and is provided with respect to the actual physical environment. That is to say, such labeling and/or indicia is not provided within an artificially reconstructed environment; rather, such labeling and/or indicia are within the actual field of vision of the driver. Generally speaking, such labeling and/or indicia is provided and tailored particularly for the frame of reference of the driver. However, such labeling and/or indicia could alternatively be provided and tailored particularly with respect to the frame of reference of any other location within the vehicle (e.g., with respect to the passenger seat next to the driver, a seat in the back of the vehicle, etc.). For example, in a situation in which the driver does not necessarily want such labeling and/or indicia to be provided to him or her, yet a passenger in the vehicle does want such information provided to him or her, such labeling and/or indicia may be provided only to those occupants of the vehicle for whom such information is desired.
-
FIG. 7 is a diagram illustrating an embodiment 700 of a navigation system operative to include overlays/directions within an in-dash and/or in-steering wheel display. With respect to this diagram, instead of providing such labeling and/or indicia within the actual field of vision of a driver and/or occupant of the vehicle, such labeling and/or indicia is instead included within actual photographic image and/or video information that is provided via a display within the vehicle. For example, actual photographic image and/or video may be provided via a display within the vehicle (e.g., such as via a small screen implemented within the dashboard of the vehicle, on the steering wheel of the vehicle, etc.), and such labeling and/or indicia is included within that actual photographic image and/or video information. As the reader will understand, such labeling and/or indicia will correspond particularly to the actual physical environment as represented by the actual photographic image and/or video information. - As the reader will understand, by using actual photographic and/or video information corresponding to the actual physical environment, there is a much lower likelihood that a driver will incorrectly interpret the directions provided by such a navigation system. While this variant does not particularly project labeling and/or indicia into the driver's field of vision, the use of actual photographic and/or video information nevertheless does provide a very accurate depiction of the physical environment. There is significantly less likelihood that a driver will have difficulty in translating the directions provided by the navigation system. 
For example, the fact that actual photo and/or video information of the physical environment is itself employed as the medium within which labeling and/or indicia is placed (as opposed to some animated or cartoonlike representation of the physical environment) will effectively reduce the likelihood of misinterpretation of the directions provided by the navigation system.
- Analogous to the previous embodiment, any particular information which may be operative to enhance the experience of the driver or any occupant of the vehicle may be provided within the actual photographic image and/or video information. As will be seen with respect to other diagrams and/or embodiments described herein, such photographic image and/or video information may be acquired in real-time. Alternatively, such photographic image and/or video information may be retrieved beforehand, such as from a database including such photographic image and/or video information, maps, etc., and such information may be stored within any desired memory device within the vehicle and/or the navigation system thereof. In some instances, such information is retrieved from one or more networks (such as one or more wireless networks) whenever the vehicle is stationary/not being driven and has access to such one or more networks. This way, very recent/up-to-date (or at least relatively recent/up-to-date) information may be employed even if not retrieved exactly in real time.
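The refresh policy described above can be expressed as a simple predicate. The following is an illustrative sketch only (all names and thresholds are hypothetical, not part of the disclosed system): the cache is refreshed only when the vehicle is stationary, a network is reachable, and the stored information is older than some freshness bound.

```python
def should_refresh_cache(speed_mph: float, network_available: bool,
                         cache_age_hours: float, max_age_hours: float = 24.0) -> bool:
    """Decide whether locally stored map/imagery data should be refreshed,
    per the policy of updating only while parked and connected."""
    stationary = speed_mph == 0.0            # vehicle not being driven
    stale = cache_age_hours > max_age_hours  # stored data no longer recent
    return stationary and network_available and stale
```

Under this sketch, a parked vehicle with network access and a two-day-old cache would trigger a refresh, while a moving or offline vehicle would not.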
-
FIG. 8 is a diagram illustrating an embodiment 800 of a vehicle including at least one projector for effectuating a display or projection of information. This diagram illustrates how one or more projectors may be implemented within and around the vehicle for effectuating the display of labeling and/or indicia within the field of vision of the driver or a passenger of the vehicle. In some embodiments, only a single projector is employed for displaying labeling and/or indicia for one or more people within the vehicle. In an alternative embodiment, two or more projectors may be employed to effectuate three-dimensional or visual stereo labeling and/or indicia within the field of vision. This embodiment particularly shows how various projectors may be implemented to use the windshields or windows of a vehicle as the background on which various labeling and/or indicia are projected for visual consumption by the driver or passengers of the vehicle. - This diagram represents a top view of a generic vehicle, and the projectors may be implemented within the vehicle to ensure their protection from the elements, etc. Of course, various embodiments may implement the one or more projectors in entirely different positions within the vehicle. That is to say, one or more projectors may be appropriately placed to provide a better user experience for someone located within the driver position of the vehicle in one embodiment. In another embodiment, the one or more projectors may be appropriately placed to provide a better user experience for a passenger or non-driver of the vehicle.
- Also, as the reader will understand with respect to other embodiments described herein, the labeling and/or indicia may be provided within any directionality or field of vision of the driver or any passenger of the vehicle. That is to say, the labeling and/or indicia need not be provided only via the front windshield of the vehicle. Labeling and/or indicia may also be provided via the rear windshield, any side windows, any sun or moon roof, etc. that may provide a field of vision for the driver or any passenger of the vehicle.
- As such, such labeling and/or indicia may be employed to provide an enhanced experience for passengers of the vehicle. That is to say, in addition to merely providing directions and information for use by the driver of the vehicle to assist that driver in reaching the appropriate endpoint destination, labeling and/or indicia may be provided for consumption by the passengers of the vehicle. One embodiment may include providing information for consumption by participants of a tour who happen to be riding along in the vehicle. For example, within a relatively large vehicle such as a van or bus, there may be a very large number of windows by which various passengers of the vehicle may have respective fields of vision. Different and respective labeling and/or indicia may be provided for selective consumption by the various passengers in the vehicle. For example, a first passenger riding near the back of the vehicle may have labeling and/or indicia for a particular feature within the physical environment projected into his respective field of vision that is different than the labeling and/or indicia for that very same particular feature within the physical environment from the perspective of a second passenger riding in the vehicle. As the reader will understand, such labeling and/or indicia may be very useful in tour guide industries such that each of the respective participants of the tour can be provided with individual and selective information regarding the physical environment through which they are traveling for greater understanding and enjoyment of that physical environment.
-
FIG. 9A is a diagram illustrating an embodiment 900 of a vehicle including at least one antenna for supporting wireless communications with at least one wireless communication system and/or network. As may be seen within this diagram, one or more antennas may be included within a vehicle to help effectuate wireless communications with one or more wireless communication systems and/or networks. For example, the appropriate and respective means, functionality, circuitry, etc. for supporting wireless communications with any desired wireless communication systems and/or networks may be included within the vehicle (e.g., satellite, wireless local area network (WLAN) such as WiFi, wide area network (WAN) such as WiMAX, etc.). This diagram also shows a top view of the vehicle. The one or more antennas for supporting wireless communications may be placed in any desired pattern or configuration within the vehicle. - By including such wireless communication capability within a vehicle, a navigation system of the vehicle may receive information from any wireless communication system and/or network with which the vehicle may communicate. Of course, the navigation system of the vehicle may also access other communication systems and/or networks through any intervening wireless communication system and/or network with which the navigation system may communicate. For example, the navigation system may be able to communicate wirelessly with an access point (AP) of a wireless local area network, and via that access point connectivity, the navigation system may be able to access the Internet to access or retrieve any information therefrom. For example, various databases available via the Internet would be accessible by the navigation system of the vehicle.
- By being able to provide such real-time communications to the navigation system on the vehicle, real-time information may be provided to the driver via the navigation system. For example, real-time information regarding traffic flow, accidents, road closures, detours, etc. may be provided to the driver. Moreover, any locally stored database corresponding to the navigation system on the vehicle may be adapted or modified as a function of new information received via such wireless communications.
- Referring to a previous embodiment in which photographic and/or video information regarding the actual physical environment is employed by the navigation system, certain other embodiments may operate by which such photographic and/or video information may be updated, corrected, enhanced, etc. by information that is received via such wireless communications. For example, by providing such wireless communication capability for the navigation system of the vehicle, such updating of any information locally stored for use by the navigation system may be performed without requiring specific interaction by the driver or user of the navigation system. In addition, selective updating of the photographic and/or video information employed by the navigation system may need only to be performed when the vehicle is within a certain location. That is to say, locally stored photographic and/or video information corresponding to the physical environment of a given locale could be updated when the vehicle enters into or near that given locale. As such, the updating of any locally stored information of the navigation system could be performed as a function of the location of the vehicle. As the vehicle moves around, any photographic and/or video information pertaining to its current location could be updated in real time.
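One way to bound such location-driven updating is to quantize positions into coarse grid cells and refresh only the imagery tiles for the vehicle's current cell and its neighbors. The sketch below is a hypothetical illustration of that idea (the grid size, function names, and neighborhood radius are assumptions, not part of the disclosure):

```python
import math

def tile_key(lat: float, lon: float, tile_deg: float = 0.01) -> tuple:
    """Quantize a position into a coarse grid cell identifying its locale."""
    return (math.floor(lat / tile_deg), math.floor(lon / tile_deg))

def tiles_to_update(lat: float, lon: float,
                    tile_deg: float = 0.01, radius: int = 1) -> set:
    """Return the current cell plus its neighbors: the only locally stored
    imagery tiles worth refreshing for the vehicle's present location."""
    ci, cj = tile_key(lat, lon, tile_deg)
    return {(ci + di, cj + dj)
            for di in range(-radius, radius + 1)
            for dj in range(-radius, radius + 1)}
```

With `radius=1` the update set is the 3x3 block of cells around the vehicle, so moving across the map only ever touches a small, bounded portion of the stored imagery.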
- It is also noted that such updating of locally stored information of the navigation system may be triggered based on a comparison of such locally stored information to that which may be retrieved from a database with which the navigation system may communicate. For example, as the vehicle moves to a location for which the navigation system has some locally stored information, the navigation system would then automatically compare its locally stored information to other information corresponding to that same location that may be retrieved from some remote database (e.g., from a remote database accessible via the Internet). Of course, the implementation of such image processing comparison functionality, circuitry, etc. may not be desirable in all embodiments. For example, in an embodiment in which the provisioning of such functionality, circuitry, etc. may be undesirable (e.g., for any number of reasons, including cost, complexity, space, etc.), information that is received via one or more wireless networks may simply be provided for consumption by the user "as is" (such as without undergoing any comparison operations).
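A lightweight stand-in for full image comparison is to compare content digests: the remote database advertises a digest for each locale's imagery, and a mismatch against the local copy triggers a refresh. This is a hedged sketch of one possible trigger mechanism, not the comparison method the disclosure specifies:

```python
import hashlib

def needs_update(local_bytes: bytes, remote_digest: str) -> bool:
    """Trigger an update when the digest of the locally stored imagery
    differs from the digest advertised by the remote database."""
    return hashlib.sha256(local_bytes).hexdigest() != remote_digest
```

This avoids transferring the remote imagery itself unless the two copies actually differ; only the short digest crosses the wireless link during the check.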
-
FIG. 9B is a diagram illustrating an embodiment 901 of a vehicle including at least one camera for supporting acquisition of image and/or video information. As may be seen with respect to this diagram, one or more cameras may be included within the vehicle to perform photo and/or video capture of the physical environment around the vehicle. By using one or more cameras that are physically implemented around the vehicle, very accurate and completely up-to-date information may be employed by the navigation system. For example, as the vehicle is approaching a particular intersection in which any given route adjustment may need to be made, photographic and/or video information may be acquired in real time by the one or more cameras implemented around the vehicle. Then, this currently acquired information may be used by the navigation system such that labeling and/or indicia may be included therein. - The various operational parameters and constraints by which these one or more cameras operate may be adaptive and/or configurable. For example, via a user interface of the vehicle's navigation system, a user may select the various operational parameters and constraints by which photographic and/or video information is acquired by the one or more cameras. Alternatively, the navigation system may include adaptive capability such that the one or more cameras perform photographic and/or video capture and compare that acquired information to that which is stored within memory of the navigation system or which may be retrieved via one or more communication networks with which the navigation system may communicate. When the real-time acquired information differs from that which is either locally stored or retrieved from a remote database, the navigation system may update its respective information using that information which has been acquired in real time by the one or more cameras.
It is also noted that such updating of information may be performed when a vehicle is not being driven (e.g., such as when it is parked).
- As the reader will understand, a navigation system operating in conjunction with at least one camera implemented with respect to the vehicle may provide for acquisition of imagery of the physical environment through which the vehicle passes, contributing significantly to the accuracy of the photographic and/or video information that is employed to provide directions and instructions to a user of the navigation system. For example, the driver may be provided with much more accurate information, corresponding specifically to the environment in which the vehicle presently is, thereby minimizing any possibility that the driver of the vehicle will improperly translate instructions provided via the navigation system.
- As described with respect to other embodiments herein, using photographic and/or video information accompanied with labeling and/or indicia, and specifically using up-to-date and fully accurate photographic and/or video information, very accurate and comprehensive directions may be provided to the driver of the vehicle via the navigation system.
- Somewhat analogously as described with respect to other embodiments that include various components therein (such as the previous embodiment that includes one or more antennas), the one or more cameras that may be implemented with respect to a vehicle may be implemented in any desired location. In some embodiments, the one or more cameras will be appropriately implemented to provide for photographic and/or video information acquisition in the direction of the forward field of vision with respect to the perspective of the driver. In other embodiments, additional cameras may be implemented to provide acquisition of such information in any directional view extending from the vehicle.
- For example, with respect to an embodiment including a very large vehicle such as a bus, multiple cameras may be implemented around the perimeter or periphery of the bus to allow for acquisition of such information from any perspective of any particular occupant of such a vehicle.
- It is also noted that any desired capability of the one or more cameras may be selected for and implemented within a given embodiment. For example, in some instances, the one or more cameras are three-dimensional (3-D) capable. In other embodiments, the cameras may have any one or combination of a particular or preferred resolution, focus or auto focus capabilities, image acquisition size, widescreen capability, panoramic capability, etc. By implementing one or more cameras having such desired capabilities for a given application, the operational capabilities and quality of the navigation system may be specifically tailored.
- With respect to such an embodiment that includes one or more cameras operative to perform photographic and/or image acquisition, various types of buffering may be performed. For example, such acquisition may be performed in a manner such that only the last or most recent period of time of photographic and/or image information is kept (e.g., depending upon the amount of memory employed within such a system for buffering of photographic and/or image information).
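The "keep only the most recent period" buffering described above is naturally a ring buffer. The following minimal sketch (class and parameter names are hypothetical) bounds the retained frames by a frame count derived from frame rate and window length, standing in for the memory budget:

```python
from collections import deque

class FrameBuffer:
    """Retain only the most recent window of captured frames; the oldest
    frame is evicted automatically once the buffer is full."""
    def __init__(self, fps: int, seconds: int):
        self.frames = deque(maxlen=fps * seconds)

    def capture(self, frame) -> None:
        self.frames.append(frame)
```

A `deque` with `maxlen` set discards from the opposite end on append, so capture stays O(1) regardless of how long the camera has been running.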
-
FIG. 10 is a diagram illustrating an embodiment 1000 of a navigation system operative to include overlays/directions within a wireless communication device. The navigation system of this particular diagram is implemented within a wireless communication device (e.g., a wireless terminal 1010), such as may be a portable wireless communication device as may be employed by a user. Examples of such wireless communication devices or wireless terminals are many, including cell phones, personal digital assistants, laptop computers, and/or, generally speaking, any wireless communication device that may be portable. - Such a wireless communication device includes a user interface that is operative to provide information to a user and may also be operative to receive input from the user. For example, the user interface includes a display by which photographic image and/or video information may be displayed for consumption by the user. Somewhat analogous to previous embodiments, labeling and/or indicia may be included within such photographic image and/or video information that is provided via the display for consumption by the user in accordance with navigational directions and instructions. Also, somewhat analogous to certain previous embodiments, any desired additional information which may enhance the user experience may be included or overlaid with respect to actual photographic and/or video information. Some examples of such user experience enhancing information may include the direction or bearing (such as North, South, etc.) in which the wireless communication device or user is moving or facing, the actual location of the wireless communication device or user, arrows indicating certain routes and/or route adjustments that will get the user to the appropriate endpoint destination, etc.
-
FIG. 11A is a diagram illustrating an embodiment 1100 of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context. With respect to this diagram, an appropriately implemented headset and/or eyeglass-based system may include functionality in which labeling and/or indicia may be projected into the user's field of vision. Such labeling and/or indicia may be provided in accordance with the operation of the navigation system. An appropriately implemented headset and/or eyeglasses having such functionality to include labeling and/or indicia within a user's field of vision will include the appropriate circuitry to effectuate such labeling and/or indicia. - In one embodiment, one or more projectors are implemented within the headset and/or eyeglass-based system such that the labeling and/or indicia may be projected onto the actual transparent or translucent surfaces through which a user views the respective field of vision. When the user does view the field of vision, the labeling and/or indicia being projected appropriately onto the transparent or translucent surface of the headset and/or eyeglass-based system will effectuate such labeling and/or indicia as being projected into or overlaid on the field of vision.
- It is also noted, with respect to this diagram and embodiment 1100 as well as with respect to other diagrams and embodiments herein, that such labeling and/or indicia as may be provided for consumption by a user need not necessarily be provided within the visible spectrum. For example, by using appropriately implemented eyeglasses, headgear, etc., certain labeling and/or indicia may be provided within the non-visible spectrum. For example, such appropriately implemented eyeglasses, headgear, etc. may enable a user to perceive infrared-provided labeling and/or indicia. For example, in accordance with a night vision-based system, such infrared-provided labeling and/or indicia could be provided for consumption by a user that would not be perceptible by others (e.g., those not employing such appropriately implemented eyeglasses, headgear, etc.). -
FIG. 11B is a diagram illustrating an embodiment 1101 of a navigation system operative to include overlays/directions within a headset and/or eyeglasses context in conjunction with a wireless communication device. With respect to this diagram, a headset and/or eyeglass-based system may include wireless communication capability. For example, the component placed upon a user's head or in front of the user's eyes may include a wireless terminal therein. Such a wireless terminal or wireless communication device may be implemented to operate in accordance with any desired wireless communication protocol, standard, and/or recommended practice, etc. as may be desired. - In addition, such a wireless terminal as included within a headset and/or eyeglass-based system may operate cooperatively with a handheld or portable wireless terminal of the user. In such an embodiment, the headset and/or eyeglass-based system and the portable wireless terminal may generally be viewed as a combined system. For example, the handheld or portable wireless terminal may include significantly more processing capability, hardware, memory, etc. with respect to that which may be included within the component placed upon the user's head or in front of the user's eyes. By utilizing the wireless communications effectuated between the two respective components of the overall system, a significantly reduced amount of processing capability, hardware, memory, etc. may be implemented within the component placed upon the user's head or in front of the user's eyes.
-
FIG. 12 is a diagram illustrating an embodiment 1200 of various types of navigation systems, implemented to support wireless communications, entering/exiting various wireless communication systems and/or networks. As may be seen with respect to this embodiment, certain such devices and/or systems may include wireless communication capability. As the reader will understand, such devices and/or systems may enter and exit various wireless network service areas. In this diagram, it may be seen that a vehicle including wireless communication capability may move among various wireless network service areas. Specifically, with respect to a time 1 and a time 2, the vehicle is shown as being within two respective wireless network service areas. As the vehicle enters and exits various wireless networks, it may acquire information respectively from those wireless networks. - With respect to a pedestrian as depicted below the vehicle, such a user may also enter and exit such wireless network service areas. However, it is likely that such a user while moving on foot would move at a rate that is less than that of the vehicle. As such, the respective times at which such a user may communicate with the respective wireless network service areas may differ from those of the vehicle.
- Regardless of the particular mode of transportation of a user, as a device including such a navigation system having such wireless communication capability moves among various wireless network service areas, different respective information may be retrieved respectively therefrom and provided to the user of the navigation system. For example, certain wireless network service areas may include information specific to the locale in which the wireless network operates. One embodiment may correspond to the locale of a traffic intersection or road near which the wireless network operates. The wireless network may include at least one device therein (e.g., access point (AP)) that includes capability for monitoring traffic flow. For example, an access point may include photographic and/or video acquisition capability (e.g., one or more cameras) such that information related to traffic flow is acquired and monitored. As a navigation system enters such a wireless network service area, the navigation system could then retrieve information related to such traffic flow. Such information may be current information, historical information such as trends of traffic flow as a function of time of day (e.g., rush-hour, midday, etc.), etc. Then, when the navigation system having such wireless communication capability effectuates communication with such a wireless network service area, such information may be retrieved and utilized within and in conjunction with the navigation system.
- As the reader will understand, by including such capability whereby real-time retrieved information may be incorporated within a navigation system as a function of the movement of the navigation system, such greater detailed information will further enhance the user's experience.
- It is also noted that while such functionality and capability may be included within a navigation system having such wireless communication capability, there may be selectivity by which some real-time acquired information is provided to a user of the navigation system. For example, one embodiment may correspond to always providing such information to a user. Another embodiment may include selectivity in which such information is provided to the user only when switched on such as via a switch, a voice command, etc.
- In yet another embodiment, such acquired information may be provided to a user as a function of the proximity of the navigation system or user to the endpoint destination. For example, when the user is within a particular distance of the endpoint destination (e.g., such as determined by a threshold or proximity which may be set by the user), only then is such information provided to the user.
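The proximity test above (provide the extra information only once the user is within a settable distance of the endpoint destination) can be sketched with a great-circle distance check. This is a hedged illustration; the haversine formula and all names here are my assumptions, not the disclosed implementation:

```python
import math

def within_destination_radius(lat1: float, lon1: float,
                              lat2: float, lon2: float,
                              threshold_km: float) -> bool:
    """True when the haversine (great-circle) distance between the current
    position and the endpoint destination is within the user-set threshold."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    distance_km = 2 * r * math.asin(math.sqrt(a))
    return distance_km <= threshold_km
```

The navigation system would evaluate this predicate on each position fix and begin providing the acquired information only once it first returns true.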
- In yet another embodiment, such acquired information may be provided to a user as a function of the complexity of the current environment in which the user is. For example, while driving a vehicle, when approaching a particular interchange having a relatively high complexity, such capability would be automatically turned on. The settings and constraints by which such decision-making may be made could be user-defined, predetermined, etc.
- In yet another embodiment, such acquired information may be provided as a function of the congestion or traffic in which the vehicle may currently be or along the route that the vehicle is proceeding. For example, one or more sensors may be included within the vehicle to detect the existence of one or more additional vehicles nearby (e.g., collision detection capability may be employed to detect proximity of one or more other vehicles). Such information acquired by sensors may be accompanied by monitoring of the velocity of the vehicle; for example, when the velocity of the vehicle is below a particular threshold and one or more other vehicles are detected within a particular proximity of the vehicle (which may indicate relatively congested traffic or a traffic jam), such information may be provided for consumption by the user. In some embodiments, one or more cameras may be employed to acquire photographic and/or video information that may be used to determine congestion or traffic information, proximity of one or more other vehicles, etc. In addition, by employing such wireless communication capability, real-time acquisition of information from one or more wireless networks with which the navigation system may communicate may be used to provide information related to congestion or traffic in which the vehicle may currently be or along the route that the vehicle is proceeding.
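The low-velocity-plus-nearby-vehicles heuristic above can be sketched directly. The thresholds and names below are illustrative assumptions only; any real system would tune them (or let them be user-defined, as the text notes for other triggers):

```python
def congestion_detected(speed_mph: float, nearby_vehicle_distances_m,
                        speed_threshold_mph: float = 10.0,
                        proximity_m: float = 15.0,
                        min_neighbors: int = 1) -> bool:
    """Low vehicle speed combined with other vehicles detected close by
    (e.g., via collision-detection sensors) suggests congested traffic."""
    close = [d for d in nearby_vehicle_distances_m if d <= proximity_m]
    return speed_mph < speed_threshold_mph and len(close) >= min_neighbors
```

Either signal alone is ambiguous (slow at an empty red light, or close vehicles at highway speed), which is why the heuristic requires both conditions to hold.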
-
FIG. 13 is a diagram illustrating an embodiment 1300 of at least two projectors operative in accordance with a lenticular surface for effectuating three dimensional (3-D) display. Generally speaking, a lenticular surface may be viewed as an array or series of cylindrical molded lenses. Such a lenticular surface may be included within any of the transparent or translucent surfaces (e.g., any windshield or window of a vehicle, headset and/or eyeglasses, a transparent or translucent portion of a headset, etc.). - Such a lenticular surface may be constructed such that it is generally transparent and non-perceptible to a user. That is to say, the cylindrical lenses of such a lenticular surface will not deleteriously affect the perceived quality or vision of a user when looking through such a lenticular surface. The cylindrical lenses may be visually non-perceptible to a user, yet tactilely felt by a user (e.g., when a user may slide a finger across the surface).
- However, the particular features of such a lenticular surface can provide one means by which three-dimensional imaging may be effectuated. For example, with respect to an embodiment of a navigation system implemented within a vehicle such that labeling and/or indicia is projected into a field of vision of the driver or any other passenger of the vehicle, when one or more of the transparent or translucent surfaces of the vehicle (e.g., any windshield or window of the vehicle) is implemented in accordance with the lenticular surface, three-dimensional display of such labeling and/or indicia may be effectuated. Again, the use of such a lenticular surface will not obstruct the view or vision capabilities of a driver or passenger of the vehicle. However, by having such a lenticular surface implemented on the transparent or translucent surface, the effect of three-dimensional projection of labeling and/or indicia may be achieved with respect to a field of vision of a driver or passenger of the vehicle.
- As may be seen with respect to this diagram, two or more projectors may be employed in different locations for effectuating three-dimensional imaging. In this particular instance, a first projector could be implemented on a left-hand side and a second projector could be implemented on a right-hand side. Each respective projector is then operative respectively to project image and/or video information towards the lenticular surface. Each of the respective cylindrical lenses of the lenticular surface then appropriately processes the respective received image and/or video information such that a driver or passenger of the vehicle will perceive a three-dimensional effect. That is to say, the cylindrical lenses and their curvature shape operate to refract light rays received via the at least two respective directions corresponding to the at least two respective projectors. As the reader will understand, the curvature shape of the respective cylindrical lenses along the lenticular surface effectively refracts the received light rays as a function of their angle of incidence at the surface. In a preferred embodiment, such a lenticular surface may be designed to have a rear focal plane coincident with the backplane of the transparent or translucent surface. In the instance of the lenticular surface being implemented on a windshield, the rear focal plane of the lenticular surface may be designed to coincide with the backplane of the windshield (e.g., the surface of the windshield which is exposed to the outside, which is the backplane of the translucent or transparent surface through which a driver or passenger looks).
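As a rough first-order illustration of the rear-focal-plane design goal, a plano-convex cylindrical lenslet in the thin-lens approximation has focal length f = R / (n - 1), so choosing f equal to the sheet thickness places the focal plane at the back surface. This is a simplified sketch in air under stated assumptions (thin-lens, single refractive index, no account of the lenslet being bonded to thicker glass), not the disclosed optical design:

```python
def lenslet_radius_for_backplane_focus(sheet_thickness_mm: float,
                                       refractive_index: float = 1.5) -> float:
    """Required radius of curvature R so that the thin-lens focal length
    f = R / (n - 1) equals the sheet thickness, i.e. the rear focal plane
    coincides with the back surface ("backplane") of the sheet."""
    return sheet_thickness_mm * (refractive_index - 1.0)
```

For example, a 6 mm sheet at n = 1.5 would call for roughly a 3 mm radius of curvature under these simplifying assumptions; a real windshield-scale design would refine this with thick-lens optics.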
- It is noted that such a lenticular surface may be implemented along any desired surface through which a user may look. For example, in another embodiment, such a lenticular surface may be implemented within eyeglasses, a headset type display, etc. By using two or more appropriately placed projectors, three-dimensional imaging may be effectuated using such transparent or translucent material.
-
FIG. 14 is a diagram illustrating an embodiment 1400 of at least two projectors operative in accordance with a non-uniform surface for effectuating 3-D display. This diagram depicts an alternative embodiment by which three-dimensional imaging may be effectuated. Instead of employing a lenticular surface as with respect to a previous embodiment, a non-uniform surface may be employed. Somewhat analogous to the previous embodiment, two or more projectors may be implemented at different respective locations such that they respectively project photographic and/or video information towards the non-uniform surface. Therefore, from the perspective of a driver or passenger of the vehicle, three-dimensional projected labeling and/or indicia will appear to extend out into the respective field of vision. - Generally speaking, any desired surface may be employed to assist in effectuating three-dimensional imaging for projecting labeling and/or indicia into a user's field of vision. The lenticular surface as described with respect to a previous embodiment and the sawtooth-appearing non-uniform surface of this respective embodiment are two possible examples. Of course, other such surfaces that help effectuate three-dimensional imaging may alternatively be employed.
-
FIG. 15 is a diagram illustrating an embodiment 1500 of an organic light emitting diode (organic LED or OLED) as may be implemented in accordance with various applications for effectuating 3-D display. This diagram depicts yet another means by which three-dimensional imaging may be effectuated for an enhanced user experience. An organic LED is a light emitting diode in which the emissive electroluminescent layer is a film of organic compounds which emit light in response to an electric current. An OLED may be implemented using a flexible type material such that the OLED may be placed over any desired surface having any of a variety of different shapes. For example, an OLED may be implemented over any translucent or transparent surface. Again, as with respect to various other embodiments described herein, some examples of such transparent or translucent surfaces include those which may appear in a vehicular context (e.g., windshield, windows, sun roofs, moon roofs, etc.). With respect to eyeglasses and/or headset applications, examples of transparent or translucent surfaces include the lenses of the eyeglasses and/or headwear. Generally speaking, an OLED implemented using a flexible material may be employed within a variety of applications including those such as may be used within navigation systems to provide labeling and/or indicia to a user. - By appropriately implementing an OLED flexible material over a translucent or transparent surface through which a user views a respective field of vision, indicia and/or labeling may be indicated within a user's respective field of vision. As the reader will understand, different OLEDs may be operative to emit light having different colors. By employing a matrix type display structure in which different respectively located OLEDs are arranged to emit different respective colors (e.g., red, green, blue), such an overall display structure would then be operative to display any desired photographic and/or video information.
In certain alternative embodiments, a common OLED material may be selectively and differentially energized using different voltages appropriately applied across different portions of the overall OLED structure to effectuate the emission of different respective colors. Generally speaking, the use of an OLED flexible material over a translucent or transparent surface allows for the inclusion of labeling and/or indicia within a user's field of vision.
- The composition of an OLED includes a layer of organic material situated between two electrodes, the anode and cathode, respectively. The organic molecules are electrically conductive as a result of delocalization of pi electrons caused by conjugation over all or part of the molecule. These materials have conductivity levels ranging from insulators to conductors, and therefore are considered organic semiconductors. The highest occupied and lowest unoccupied molecular orbitals (HOMO and LUMO) of such organic semiconductors may be viewed as being analogous to the valence and conduction bands of inorganic semiconductors.
- During operation, a voltage is applied across the OLED such that the anode is positive with respect to the cathode. A current of electrons flows through the device from cathode to anode, as electrons are injected into the LUMO of the organic layer at the cathode and withdrawn from the HOMO at the anode. This process may also be described as the injection of electron holes into the HOMO. Electrostatic forces bring the electrons and the holes toward each other, and they recombine, forming an exciton, a bound state of the electron and hole. This will typically occur closer to the emissive layer, because in organic semiconductors, holes are generally more mobile than electrons. The decay of this excited state (e.g., decay of the exciton) results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible spectrum. As the reader will understand, the frequency of this radiation depends on the bandgap of the material, particularly the difference in energy levels between HOMO and LUMO. Such OLEDs may be implemented using a variety of different materials. For example, indium tin oxide (ITO) is one material that may be used as the anode material within OLEDs.
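The bandgap-to-color relationship described above can be made concrete with a short illustrative calculation. This sketch is not part of the disclosed system; the 2.3 eV example gap is an assumed value, and the computation simply applies the physical relation wavelength = h·c / E to the HOMO-LUMO energy difference.

```python
# Illustrative calculation (not part of the disclosure): relate an OLED
# material's HOMO-LUMO gap to the wavelength of the emitted light via
# lambda = h*c / E. The 2.3 eV example gap is an assumed value.

HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def emission_wavelength_nm(bandgap_ev: float) -> float:
    """Return the peak emission wavelength (nm) for a given bandgap (eV)."""
    if bandgap_ev <= 0:
        raise ValueError("bandgap must be positive")
    return HC_EV_NM / bandgap_ev

# A gap near 2.3 eV lands in the green part of the visible spectrum;
# wider gaps shift emission toward blue, narrower gaps toward red.
print(round(emission_wavelength_nm(2.3), 1))  # → 539.1
```

This is why, as described above, differently composed (or differently energized) OLED regions can serve as the red, green, and blue elements of a matrix-type display.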
- Generally speaking, the use of OLEDs on the transparent or translucent surfaces through which a user looks to perceive a field of vision when using a navigation system can provide for three-dimensional imaging of labeling and/or indicia to assist the user with directions or instructions.
- As the reader will understand with respect to the various embodiments described herein, a novel means is presented by which labeling and/or indicia may be provided within a field of vision of a user. In many instances, such labeling and/or indicia is provided with respect to the operation of a navigation system in which such labeling and/or indicia assists the user to navigate through a particular region or environment in an effort to arrive at a desired endpoint destination. In some embodiments, the inclusion of such labeling and/or indicia is made in accordance with three-dimensional imaging. A variety of means may be employed to effectuate such three-dimensional imaging including the use of OLEDs, the use of appropriately designed transparent or translucent surfaces, etc. In addition, any desired information may be provided for consumption by a user to enhance the overall experience. For example, in the context of a vehicular system, labeling and/or indicia within a user's field of vision may be accompanied by additional information such as directionality, location, vehicular velocity, time, estimated time of arrival, etc. By providing such information in a manner that does not require a driver or passenger of the vehicle to take his or her eyes away from the field of vision, such information may be provided to the user without incurring a potential safety risk.
- With respect to those embodiments that include labeling and/or indicia within or overlaid on photographic and/or video information, the improved accuracy provided by such photographic and/or video information can reduce or eliminate the time required for a user to translate what is depicted within the display of such a system to the actual physical environment. Again, the use of such photographic and/or video information will provide much greater specificity, thereby assisting a user to reach a desired endpoint destination more safely, more quickly, and with less frustration.
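As one concrete illustration of pairing a label with the additional information mentioned above (directionality, velocity, estimated time of arrival), the sketch below composes a single overlay string. The field layout, names, and values are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: compose an overlay/label that pairs a feature label
# with supplemental navigation information (direction, velocity, ETA).
# All field names and example values here are assumed for illustration.

def compose_overlay(label: str, direction: str, speed_mph: float,
                    eta_min: int) -> str:
    """Combine a feature label with supplemental navigation information."""
    return f"{label} | {direction} | {speed_mph:.0f} mph | ETA {eta_min} min"

print(compose_overlay("Exit 12", "NE", 55.0, 8))
# → Exit 12 | NE | 55 mph | ETA 8 min
```

A real system would render such a composite within the field of vision (or within displayed media) rather than printing it, but the composition step is the same.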
-
FIG. 16A, FIG. 16B, FIG. 16C, FIG. 17A, FIG. 17B, FIG. 17C, FIG. 18A, FIG. 18B, FIG. 19A, and FIG. 19B illustrate various embodiments of methods as may be performed in accordance with operation of various devices such as various wireless communication devices and/or various navigation systems. - Referring to
method 1600 of FIG. 16A, the method 1600 begins by identifying a feature within an actual physical environment, as shown in a block 1610. Such a feature may be any as described herein including a landmark, a building, a stadium, an airport, a park, a historical feature, a geological feature, and/or any other feature within a physical environment. The method 1600 continues by projecting an overlay/label into a user's field of vision based on the identified feature, as shown in a block 1620. Such projection may be made in accordance with three-dimensional (3D) projection. For example, one or more projectors may be implemented within a vehicle to project labeling and/or indicia into the actual physical environment from the perspective of a user (e.g., a driver and/or passenger of the vehicle). - Referring to
method 1601 of FIG. 16B, the method 1601 begins by identifying at least one of a user's current position and an endpoint destination, as shown in a block 1611. For example, any of a variety of means may be employed to identify a user's current location. In some embodiments, a user enters information corresponding to that user's current location via a user interface of a given device. In other embodiments, certain location determining means, functionality, circuitry, etc. are employed that do not necessarily require the interaction of the user (e.g., global positioning system (GPS), detection of one or more fixed wireless access points within a wireless communication network [e.g., such as one or more access points (APs) within a wireless local area network (WLAN)], etc.). - The
method 1601 then operates by identifying a feature within an actual physical environment corresponding to a user's current location and/or an endpoint destination, as shown in a block 1621. For example, the feature identified may correspond particularly to a pathway between a user's current location and an endpoint destination. That is to say, the method 1601 may operate particularly to identify features based upon a user's current location, an endpoint destination, and/or one or more pathways between the user's current location and the endpoint destination. For example, the method 1601 may operate to identify different respective features that may be encountered along different respective pathways between a user's current location and an endpoint destination. Such different pathways, and their respective features identified therealong, could be presented to a user of a given device for selection by the user of a particular one of the possible pathways. As may be seen, selectivity of pathways, such as in accordance with a method operating in accordance with navigation, may be based upon respective features that may be encountered along different respective pathways. - The
method 1601 continues by projecting an overlay/label into the user's field of vision based on the identified feature, the user's current location, and/or the endpoint destination, as shown in a block 1631. Any one or any combination of the identified feature, the user's current location, and the endpoint destination may govern, at least in part, an overlay/label that may be projected into a user's field of vision. As described also with respect to other diagrams and embodiments herein, any such overlay/label may take the form of any labeling and/or indicia that may correspond to any of a variety of formats, including arrows, markers, labels, alphanumeric characters, etc., and/or any combination thereof. - Referring to
method 1602 of FIG. 16C, the method 1602 begins by receiving user input corresponding to an endpoint destination, as shown in a block 1610. For example, the method 1602 may operate in accordance with navigation functionality such that a user may input different types of information thereto. - The
method 1602 continues by identifying the user's current location and the endpoint destination, as shown in a block 1622. The user's current location may be provided by information that is entered via a user interface by the user. Alternatively, other means may be employed to identify the user's current location (e.g., GPS, detection of one or more fixed wireless access points within a wireless communication network [e.g., such as one or more access points (APs) within a wireless local area network (WLAN)], etc.). The endpoint destination may be provided based upon information that is entered via a user interface by the user. - The
method 1602 then operates by determining a pathway between the user's current location and the endpoint destination, as shown in a block 1632. In certain embodiments, more than one pathway is presented to a user, and a user is provided an opportunity to select one of those pathways. - The
method 1602 continues by identifying a feature within an actual physical environment corresponding to the pathway, as shown in a block 1642. In certain embodiments, different respective features are identified as corresponding to different respective pathways between a user's current location and an endpoint destination. - The
method 1602 then operates by projecting an overlay/label into the user's field of vision based on the identified feature, as shown in a block 1652. In alternative embodiments, the projection of an overlay/label may be based upon additional considerations as well (e.g., the user's current location, the endpoint destination, etc.). - Referring to
method 1700 of FIG. 17A, the method 1700 begins by identifying a feature within media, as shown in a block 1710. Various examples of such media may include photographic and/or video media depicting an actual physical environment. Such an actual physical environment may correspond to that environment in which a user currently is. Alternatively, the physical environment may correspond to an environment in which a user intends or plans to enter (e.g., such as in accordance with following navigational directions between a current location and an endpoint destination). - The
method 1700 continues by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1720. For example, the media, after having undergone processing for identification of at least one feature therein, may be output for consumption by a user with at least one overlay/label therein. By providing such labeling and/or indicia particularly within media that corresponds particularly to an actual physical environment, highly accurate information may be provided to a user. - Referring to
method 1701 of FIG. 17B, the method 1701 begins by identifying a user's current location and/or an endpoint destination, as shown in a block 1711. Again, as with respect to other embodiments herein, various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.). - The
method 1701 then operates by identifying at least one feature within media corresponding to the user's current location and/or an endpoint destination, as shown in a block 1721. For example, within media including photographic and/or video media, certain processing of that media may provide indication of respective features included therein. Various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within media. In some embodiments, the identification of at least one feature within the media is based upon a pathway between the user's current location and the endpoint destination. As also described with respect to other embodiments, more than one pathway may be identified between a user's current location and an endpoint destination. In such instances, different respective features may be identified along the different respective pathways between the user's current location and the endpoint destination. - The
method 1701 continues by outputting the media with an overlay/label therein based on any one or any combination of the identified feature, the user's current location, and/or the endpoint destination, as shown in a block 1731. - Referring to
method 1702 of FIG. 17C, the method 1702 begins by receiving user input corresponding to an endpoint destination, as shown in a block 1710. Such information may be provided via a user interface of a given device. - The
method 1702 continues by identifying the user's current location and the endpoint destination, as shown in a block 1722. As also mentioned with respect to various embodiments, various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.). - The
method 1702 then operates by determining a pathway between the user's current location and the endpoint destination, as shown in a block 1732. In alternative embodiments, more than one pathway may be identified between the user's current location and the endpoint destination, and those respective pathways may be provided to a user for selection of at least one thereof. - The
method 1702 continues by identifying at least one feature within media corresponding to the pathway, as shown in a block 1742. For example, within media including photographic and/or video media, certain processing of that media may provide indication of respective features included therein. Various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within media. - The
method 1702 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1752. Of course, multiple overlays/labels may be included within the media in accordance with more than one respective identified feature. - Referring to
method 1800 of FIG. 18A, the method 1800 begins by acquiring media corresponding to an actual physical environment, as shown in a block 1810. - In certain embodiments, such acquisition is performed in accordance with accessing one or more databases via one or more networks, as shown in a
block 1810a. For example, via some wireless communication network and/or via the Internet, one or more databases may be accessed to acquire media corresponding to an actual physical location. Such an actual physical location may correspond to an environment in which a user currently is, an environment in which a user plans to be, etc. - In certain other embodiments, such acquisition is performed in accordance with one or more cameras, as shown in a
block 1810b. For example, one or more cameras may be employed to perform real-time acquisition of such media. As the reader will understand, such real-time acquired information will typically be highly accurate and up-to-date and will properly depict the actual physical environment. - The
method 1800 continues by processing the media in accordance with identifying a feature therein, as shown in a block 1820. Again, within this embodiment as well as within others, various types of pattern recognition, image processing, video processing, etc. may be employed for processing of media to identify one or more features therein. - The
method 1800 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1830. Of course, within this embodiment as within others, multiple overlays/labels may be included within the media in accordance with more than one respective identified feature. - Referring to
method 1801 of FIG. 18B, the method 1801 begins by identifying a user's current location and/or an endpoint destination, as shown in a block 1811. Again, various means may be employed to identify the user's current location (e.g., GPS, user provided information entered via a user interface, etc.) and the endpoint destination (e.g., user provided information entered via a user interface, etc.). - The
method 1801 then operates by acquiring media based on one or both of the user's current location and the endpoint destination, as shown in a block 1821. - In certain embodiments, such acquisition is performed in accordance with accessing one or more databases via one or more networks, as shown in a block 1821a. For example, via some wireless communication network and/or via the Internet, one or more databases may be accessed to acquire media corresponding to an actual physical location. Such an actual physical location may correspond to an environment in which a user currently is, an environment in which a user plans to be, etc. As the reader will understand, such acquisition of media is particularly based on one or both of the user's current location and the endpoint destination. In other words, from certain perspectives, the search window is particularly tailored and/or narrowed based upon one or both of the user's current location and the endpoint destination. In alternative embodiments in which multiple pathways may be identified corresponding to multiple paths between the user's current location and the endpoint destination, different respective media may be acquired for more than one of the respective pathways. For example, a first set of media may be acquired for a first pathway, a second set of media may be acquired for a second pathway, etc.
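The per-pathway acquisition just described (a first set of media for a first pathway, a second set for a second) can be sketched as follows. The segment-keyed "database", the file names, and the function name are all illustrative assumptions, not part of the disclosure; a real implementation would fetch imagery over a network.

```python
# Hypothetical sketch: acquire a separate set of media for each candidate
# pathway between the current location and the endpoint destination.
# MEDIA_DB stands in for one or more databases reachable via a network.

MEDIA_DB = {  # assumed mapping of road segment -> stored imagery
    ("A", "B"): "img_ab.jpg",
    ("B", "D"): "img_bd.jpg",
    ("A", "C"): "img_ac.jpg",
    ("C", "D"): "img_cd.jpg",
}

def acquire_media_for_pathway(pathway):
    """Fetch the stored media for each consecutive segment of one pathway."""
    segments = zip(pathway, pathway[1:])
    return [MEDIA_DB[seg] for seg in segments if seg in MEDIA_DB]

# Two assumed pathways between current location "A" and destination "D":
pathways = [["A", "B", "D"], ["A", "C", "D"]]
media_sets = [acquire_media_for_pathway(p) for p in pathways]
print(media_sets)
# → [['img_ab.jpg', 'img_bd.jpg'], ['img_ac.jpg', 'img_cd.jpg']]
```

Each resulting set could then be processed for features and presented so the user may select among the candidate pathways.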
- In certain other embodiments, such acquisition is performed in accordance with one or more cameras, as shown in a
block 1821b. For example, one or more cameras may be employed to perform real-time acquisition of such media. As the reader will understand, such real-time acquired information will typically be highly accurate and up-to-date and will properly depict the actual physical environment. - The
method 1801 continues by processing the media in accordance with identifying a feature therein, as shown in a block 1831. Again, various types of pattern recognition, image processing, video processing, etc. may be employed to identify one or more features within media. - The
method 1801 then operates by outputting the media with an overlay/label therein based on the identified feature, as shown in a block 1841. Of course, within this embodiment as within others, multiple overlays/labels may be included within the media in accordance with more than one respective identified feature. - Referring to
method 1900 of FIG. 19A, the method 1900 begins by acquiring first information via a first network, as shown in a block 1910. For example, any of a variety of types of information may be acquired via a given network, and certain networks may be operative to provide different information, different levels of specificity, etc. As the reader will understand, as a user and/or device moves throughout a physical environment, and particularly into and out of different respective networks (e.g., such as into and out of different respective wireless networks which may include any one or more, or any combination, of satellite networks, wireless local area networks (WLANs), wide area networks (WANs), and/or any other type of wireless networks), connectivity to those different respective networks may be made, and by which, different respective information may be acquired. - In certain embodiments, such acquisition is performed in accordance with acquiring real-time traffic information, information corresponding to one or more features within a physical environment, media corresponding to an actual physical environment, etc., as shown in a block 1910a.
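The coverage-dependent acquisition described above can be sketched with assumed data. The network names, coverage sets, and information contents below are illustrative assumptions, not actual system data; the point is simply that the information available to the device varies as it moves among network service areas.

```python
# Hypothetical sketch: as a device moves, it acquires information via
# whichever networks currently cover its location (first information via
# a first network, second information via a second). All data is assumed.

NETWORKS = {
    "WLAN_hotspot": {"coverage": {"downtown"}, "info": "real-time traffic"},
    "satellite": {"coverage": {"downtown", "highway", "suburbs"},
                  "info": "feature/media data"},
}

def acquire_available_info(current_location: str) -> dict:
    """Return {network: info} for every network covering the location."""
    return {name: net["info"] for name, net in NETWORKS.items()
            if current_location in net["coverage"]}

print(acquire_available_info("highway"))
# → {'satellite': 'feature/media data'}
```

Downtown, both networks (and thus both kinds of information) would be available; on the highway, only the wider-coverage network remains.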
- In certain embodiments, the
method 1900 operates by acquiring second information via a second network, as shown in a block 1920. That is to say, a user and/or device may move throughout a physical environment including moving in and out of different respective service areas of different respective wireless networks. - The
method 1900 then operates by selectively outputting the first information (and/or the second information) based on a user's current location and/or an endpoint destination, as shown in a block 1930. - Referring to
method 1901 of FIG. 19B, the method 1901 begins by receiving user input corresponding to a first operational mode, as shown in a block 1911. Such an operational mode may indicate any one or more different parameters corresponding to the type and/or manner of information to be provided to a user. For example, a given operational mode may correspond to a particular display format, indicating what particular information to display, when to display such information (e.g., displaying first information at or during a first time, displaying second information at or during a second time, etc.), etc. - The
method 1901 then operates by providing one or more overlays/labels within media and/or a user's field of vision in accordance with the first operational mode, as shown in a block 1921. For example, a great deal of selectivity is provided by which such information may be provided to a user in any of a given number of applications. For example, certain embodiments relate to providing such labeling and/or indicia within a user's field of vision, other embodiments relate to providing such labeling and/or indicia within media, and even other embodiments relate to providing such labeling and/or indicia within both a user's field of vision and media. For example, within the navigation system type application, a method may operate by not only projecting labeling and/or indicia within a user's field of vision, but a display (e.g., within the vehicle, on a portable device, etc.) may also output media including one or more overlays/labels therein. For example, a redundant type system may operate in which not only is labeling and/or indicia projected within a user's field of vision, but a display is operative to output media including one or more overlays/labels therein simultaneously or in parallel with a projection of the labeling and/or indicia within the user's field of vision. - The
method 1901 continues by receiving user input corresponding to a second operational mode, as shown in a block 1931. In certain embodiments, the second operational mode may include one or more overlapping operational parameters similar to the first operational mode. In other embodiments, the second operational mode and the first operational mode are mutually exclusive such that they do not include any common operational parameters. - The
method 1901 then operates by providing one or more overlays/labels within media and/or a user's field of vision in accordance with the second operational mode, as shown in a block 1941. As the reader will understand, different respective operational modes may govern the manner by which such information is provided. Also, by allowing for different respective operational modes, different operation may be effectuated at different times. As described elsewhere herein, various types of indicia may be used to trigger or select when to operate within different operational modes. For example, a first operational mode may be selected when a user is relatively far from an endpoint destination. Then, a second operational mode may be selected when that user is within a given proximity of the endpoint destination (e.g., within X number of miles). -
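The proximity-based mode selection just described (a first mode far from the destination, a second mode within X miles of it) can be sketched as follows. The 5-mile threshold and the mode names are illustrative assumptions; the disclosure leaves the proximity value unspecified.

```python
# Hypothetical sketch of proximity-triggered operational-mode selection:
# a first mode applies while far from the endpoint destination, and a
# second mode applies within a given proximity. The 5-mile threshold
# and the mode names are assumed for illustration.

PROXIMITY_MILES = 5.0  # assumed stand-in for "X number of miles"

def select_mode(distance_to_destination_miles: float) -> str:
    """Return the operational mode based on the remaining distance."""
    if distance_to_destination_miles <= PROXIMITY_MILES:
        return "second_mode"  # e.g., more detailed overlays near the endpoint
    return "first_mode"       # e.g., coarser overlays while far away

print(select_mode(20.0), select_mode(2.0))  # → first_mode second_mode
```

Other indicia (time of day, network availability, explicit user input as in blocks 1911 and 1931) could trigger mode changes in the same way.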
FIG. 20 is a diagram illustrating an embodiment 2000 of a vehicle including an audio system for effectuating directional audio indication. A vehicle may be implemented to have an audio system that supports stereo, surround sound, and other functionality. For example, a number of speakers may be implemented at different locations within the vehicle. Using appropriate audio signaling, sound may be provided from the speakers in such a manner as to have a given directionality. For example, within such audio systems, oftentimes multiple audio channels (and oftentimes multiple corresponding speakers, such as may be serviced by the multiple audio channels) are implemented for providing sound to the respective speakers of the audio system. Depending upon the particular manner in which the audio signaling is provided to the speakers, the perceptual effect of the sound coming from a particular location or direction may be created. In the context of a navigation system implemented within a vehicle, an audio system may be coordinated such that audio prompts or audio indications may be provided via the audio system in such a way as to correspond to the respective location of a particular feature. For example, a label associated with a feature located on a particular side of the vehicle may be audibly provided via the audio system such that the label is perceived as being sourced from the location of that particular feature. - As one possible example in the context of operation in conjunction with a navigation system, when approaching an intersection at which a driver should change route and turn to the right (e.g., based upon a given route that will lead the driver to an endpoint destination), an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the forward, right-hand side portion of the vehicle.
Alternatively, when approaching an intersection at which a driver should change route and turn to the left, such an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the forward, left-hand side portion of the vehicle. Generally speaking, such directional audio indications may be effectuated by operating a given audio system having at least two speakers and associated functionality, circuitry, etc. for performing any of a number of different audio processing operations (e.g., DSP audio processing [such as for effectuating certain audio effects such as reverb or chorus effects, equalizer settings, matrixing of mono/stereo signaling to different channels, parametric control of audio DSP effects, etc.], balance adjustment, fader adjustment, selectively driving one or more speakers, etc.).
- As another possible example in the context of operation in conjunction with a navigation system, consider an instance in which a driver has missed a turn (e.g., based upon a given route that will lead the driver to an endpoint destination); an audio prompt may be provided via the audio system such that, from the perspective of the driver, the audio prompt is perceived as directionally coming from the rear of the vehicle. As the reader will understand, coordination between various labeling and/or indicia which may be included and presented within various respective fields of vision and corresponding audio prompts associated with that labeling and/or indicia may further augment or enhance a driver experience.
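The directional audio indication described above can be sketched as a simple two-channel panning computation. An actual vehicle audio system would use multi-channel DSP across many speakers, so this constant-power stereo mapping, along with the bearing convention, is only an illustrative assumption.

```python
# Hypothetical sketch of directional audio: map a prompt's bearing
# relative to the driver (-90 = hard left, +90 = hard right) to
# left/right channel gains using a constant-power pan law.

import math

def channel_gains(bearing_deg: float) -> tuple[float, float]:
    """Return (left_gain, right_gain) for a bearing in [-90, +90] degrees."""
    theta = math.radians((bearing_deg + 90.0) / 2.0)  # maps to [0, 90] deg
    return math.cos(theta), math.sin(theta)

left, right = channel_gains(90.0)  # prompt from the forward right-hand side
print(round(left, 3), round(right, 3))  # → 0.0 1.0
```

A centered prompt (bearing 0) yields equal gains of about 0.707 per channel, preserving perceived loudness as the prompt's apparent direction moves.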
-
FIG. 21 is a diagram illustrating an embodiment 2100 of a navigation system with a display or projection system in a vehicular context, and particularly including at least one localized region of the field of vision in which labeling and/or indicia is provided. The reader is referred to FIG. 6 in comparison to this diagram. As can be seen with respect to this diagram, instead of providing such labeling and/or indicia across an entirety of the one or more respective fields of vision of the driver, such labeling and/or indicia is provided within a localized region (e.g., a subset of the field of vision). That is to say, instead of providing such labeling and/or indicia within any particular location of one or more fields of vision, such labeling and/or indicia is provided within only one or more localized regions. - As may be understood, a driver of such a vehicle will then be able to focus attention only on those one or more localized regions instead of surveying all of the respective fields of vision in multiple directions. If desired, certain feature labeling may only be provided when visible within one or more of these localized regions. For example, a feature labeling associated with a street would appear only when that street may be perceived within a given localized region.
- It is also noted that such a localized region may be appropriately selected from the perspective of the user to which such labeling and/or indicia is being provided. For example, a driver of a vehicle may be provided a first respective localized region (e.g., located within the windshield in front of the driver), a front seat passenger may be provided a second respective localized region (e.g., located within the windshield in front of the front seat passenger), and a backseat passenger may be provided a third respective localized region (e.g., located within a side window nearest to the backseat passenger, on the right-hand or left-hand side), etc.
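The localized-region behavior just described, with each viewer receiving labels only within his or her assigned region, can be sketched as follows. The region names, the one-dimensional windshield coordinates, and the label positions are all illustrative assumptions.

```python
# Hypothetical sketch of FIG. 21's localized regions: a label is shown
# only when its projected position falls within the localized region
# assigned to the viewer (driver, front seat passenger, etc.).
# Regions are assumed spans of a normalized windshield x-axis (0..1).

REGIONS = {
    "driver": (0.0, 0.4),
    "front_passenger": (0.6, 1.0),
}

def visible_labels(labels, viewer):
    """Return only the labels whose position lies in the viewer's region."""
    x_min, x_max = REGIONS[viewer]
    return [name for name, x in labels if x_min <= x <= x_max]

labels = [("Main St", 0.2), ("Stadium", 0.8)]
print(visible_labels(labels, "driver"))  # → ['Main St']
```

A backseat passenger's region would similarly be defined over the nearest side window rather than the windshield.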
- It is also noted that the various operations and functions as described with respect to various methods herein may be performed within a navigation system including various components, a wireless communication device, and/or any of a number of other devices. For example, within certain embodiments, one or more modules and/or circuitries within a given device (e.g., such as a baseband processing module implemented in accordance with the baseband processing module as described with reference to
FIG. 2 ) and/or other components therein may operate to perform any of the various methods herein. - It is noted that the various modules and/or circuitries (baseband processing modules and/or circuitries, encoding modules and/or circuitries, decoding modules and/or circuitries, etc., etc.) described herein may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. The operational instructions may be stored in a memory. The memory may be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory (ROM), random access memory (RAM), volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. It is also noted that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. In such an embodiment, a memory stores, and a processing module coupled thereto executes, operational instructions corresponding to at least some of the steps and/or functions illustrated and/or described herein.
- It is also noted that any of the connections or couplings between the various modules, circuits, functional blocks, components, devices, etc. within any of the various diagrams or as described herein may be differently implemented in different embodiments. For example, in one embodiment, such connections or couplings may be direct connections or direct couplings there between. In another embodiment, such connections or couplings may be indirect connections or indirect couplings there between (e.g., with one or more intervening components there between). Of course, certain other embodiments may have some combinations of such connections or couplings therein such that some of the connections or couplings are direct, while others are indirect. Different implementations may be employed for effectuating communicative coupling between modules, circuits, functional blocks, components, devices, etc. without departing from the scope and spirit of the invention.
- Various aspects of the present invention have also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
- Various aspects of the present invention have been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention.
- One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
- Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, various aspects of the present invention are not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.
TABLE 1: 2.4 GHz, 20/22 MHz channel BW, 54 Mbps max bit rate

Rate (Mbps) | Modulation | Code Rate | NBPSC | NCBPS | NDBPS | EVM (dB) | Sensitivity (dBm) | ACR (dB) | AACR (dB)
---|---|---|---|---|---|---|---|---|---
1 | Barker BPSK | | | | | | | |
2 | Barker QPSK | | | | | | | |
5.5 | CCK | | | | | | | |
6 | BPSK | 0.5 | 1 | 48 | 24 | −5 | −82 | 16 | 32
9 | BPSK | 0.75 | 1 | 48 | 36 | −8 | −81 | 15 | 31
11 | CCK | | | | | | | |
12 | QPSK | 0.5 | 2 | 96 | 48 | −10 | −79 | 13 | 29
18 | QPSK | 0.75 | 2 | 96 | 72 | −13 | −77 | 11 | 27
24 | 16-QAM | 0.5 | 4 | 192 | 96 | −16 | −74 | 8 | 24
36 | 16-QAM | 0.75 | 4 | 192 | 144 | −19 | −70 | 4 | 20
48 | 64-QAM | 0.666 | 6 | 288 | 192 | −22 | −66 | 0 | 16
54 | 64-QAM | 0.75 | 6 | 288 | 216 | −25 | −65 | −1 | 15
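The OFDM rows of Table 1 are internally consistent under the conventional 802.11a/g parameters: NCBPS equals the number of data subcarriers (48) times NBPSC, NDBPS equals NCBPS times the code rate, and the bit rate in Mbps equals NDBPS divided by the 4 µs OFDM symbol duration. The sketch below verifies this, assuming those standard constants (they are not stated in the table itself):

```python
# Consistency check for the OFDM rows of Table 1, assuming the
# standard 802.11a/g convention: 48 data subcarriers per OFDM
# symbol and a 4 microsecond symbol duration.

OFDM_ROWS = [
    # (rate_mbps, nbpsc, ncbps, ndbps, code_rate)
    (6,  1,  48,  24, 1/2),
    (9,  1,  48,  36, 3/4),
    (12, 2,  96,  48, 1/2),
    (18, 2,  96,  72, 3/4),
    (24, 4, 192,  96, 1/2),
    (36, 4, 192, 144, 3/4),
    (48, 6, 288, 192, 2/3),
    (54, 6, 288, 216, 3/4),
]

DATA_SUBCARRIERS = 48   # data-bearing subcarriers per OFDM symbol
SYMBOL_US = 4           # OFDM symbol duration in microseconds

for rate, nbpsc, ncbps, ndbps, code_rate in OFDM_ROWS:
    assert ncbps == DATA_SUBCARRIERS * nbpsc   # coded bits per symbol
    assert ndbps == round(ncbps * code_rate)   # data bits per symbol
    assert rate == ndbps / SYMBOL_US           # Mbps = data bits / 4 us
```

The same three relations hold for the OFDM rows of Table 4 below, which shares these columns.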
TABLE 2: Channelization for Table 1

Channel | Frequency (MHz)
---|---
1 | 2412
2 | 2417
3 | 2422
4 | 2427
5 | 2432
6 | 2437
7 | 2442
8 | 2447
9 | 2452
10 | 2457
11 | 2462
12 | 2467
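The channel plan in Table 2 follows the usual 2.4 GHz rule of 5 MHz channel spacing, i.e. f = 2407 + 5 × channel (MHz). This is the conventional 802.11b/g numbering, offered here as an observation about the table rather than something the text states:

```python
# Channel-to-frequency mapping from Table 2 (2.4 GHz band),
# checked against the conventional rule f = 2407 + 5 * channel.

TABLE_2 = {1: 2412, 2: 2417, 3: 2422, 4: 2427, 5: 2432, 6: 2437,
           7: 2442, 8: 2447, 9: 2452, 10: 2457, 11: 2462, 12: 2467}

for channel, mhz in TABLE_2.items():
    assert mhz == 2407 + 5 * channel  # 5 MHz spacing from 2412 MHz
```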
TABLE 3: Power Spectral Density (PSD) Mask for Table 1

PSD Mask 1

Frequency Offset | dBr
---|---
−9 MHz to 9 MHz | 0
+/−11 MHz | −20
+/−20 MHz | −28
+/−30 MHz and greater | −50
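One way to read Table 3 is as a step function of the absolute frequency offset from the channel center. The sketch below takes the breakpoints directly from the table; note that real 802.11 transmit masks interpolate linearly between breakpoints, so this step-wise reading is a simplification:

```python
# Step-function reading of the Table 3 PSD mask (breakpoints in MHz,
# limits in dBr relative to the in-band PSD). A simplification: the
# actual mask ramps linearly between breakpoints.

def psd_limit_dbr(offset_mhz: float) -> float:
    off = abs(offset_mhz)       # mask is symmetric about the center
    if off <= 9:
        return 0.0              # in-band region
    if off <= 11:
        return -20.0
    if off <= 20:
        return -28.0
    return -50.0                # 30 MHz and greater (floor)
```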
TABLE 4: 5 GHz, 20 MHz channel BW, 54 Mbps max bit rate

Rate (Mbps) | Modulation | Code Rate | NBPSC | NCBPS | NDBPS | EVM (dB) | Sensitivity (dBm) | ACR (dB) | AACR (dB)
---|---|---|---|---|---|---|---|---|---
6 | BPSK | 0.5 | 1 | 48 | 24 | −5 | −82 | 16 | 32
9 | BPSK | 0.75 | 1 | 48 | 36 | −8 | −81 | 15 | 31
12 | QPSK | 0.5 | 2 | 96 | 48 | −10 | −79 | 13 | 29
18 | QPSK | 0.75 | 2 | 96 | 72 | −13 | −77 | 11 | 27
24 | 16-QAM | 0.5 | 4 | 192 | 96 | −16 | −74 | 8 | 24
36 | 16-QAM | 0.75 | 4 | 192 | 144 | −19 | −70 | 4 | 20
48 | 64-QAM | 0.666 | 6 | 288 | 192 | −22 | −66 | 0 | 16
54 | 64-QAM | 0.75 | 6 | 288 | 216 | −25 | −65 | −1 | 15
TABLE 5: Channelization for Table 4

Channel | Frequency (MHz) | Country
---|---|---
240 | 4920 | Japan
244 | 4940 | Japan
248 | 4960 | Japan
252 | 4980 | Japan
8 | 5040 | Japan
12 | 5060 | Japan
16 | 5080 | Japan
36 | 5180 | USA/Europe
34 | 5170 | Japan
40 | 5200 | USA/Europe
38 | 5190 | Japan
44 | 5220 | USA/Europe
42 | 5210 | Japan
48 | 5240 | USA/Europe
46 | 5230 | Japan
52 | 5260 | USA/Europe
56 | 5280 | USA/Europe
60 | 5300 | USA/Europe
64 | 5320 | USA/Europe
100 | 5500 | USA/Europe
104 | 5520 | USA/Europe
108 | 5540 | USA/Europe
112 | 5560 | USA/Europe
116 | 5580 | USA/Europe
120 | 5600 | USA/Europe
124 | 5620 | USA/Europe
128 | 5640 | USA/Europe
132 | 5660 | USA/Europe
136 | 5680 | USA/Europe
140 | 5700 | USA/Europe
149 | 5745 | USA
153 | 5765 | USA
157 | 5785 | USA
161 | 5805 | USA
165 | 5825 | USA
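For the USA/Europe entries in Table 5, the center frequency follows the standard 5 GHz rule f = 5000 + 5 × channel (MHz); the Japanese 4.9 GHz entries use a separate plan and are excluded from this check. The channel list and rule below are the conventional ones, assumed rather than spelled out in the table:

```python
# USA/Europe (U-NII) channels from Table 5, checked against the
# standard 5 GHz rule f = 5000 + 5 * channel. The Japan 4.9 GHz
# channels in the table follow a different plan and are omitted.

UNII_CHANNELS = [36, 40, 44, 48, 52, 56, 60, 64,
                 100, 104, 108, 112, 116, 120, 124, 128, 132, 136, 140,
                 149, 153, 157, 161, 165]

def unii_center_mhz(channel: int) -> int:
    """Center frequency in MHz for a 5 GHz U-NII channel number."""
    return 5000 + 5 * channel

assert unii_center_mhz(36) == 5180
assert unii_center_mhz(165) == 5825
```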
TABLE 6: 2.4 GHz, 20 MHz channel BW, 192 Mbps max bit rate

Rate (Mbps) | TX Antennas | ST Code Rate | Modulation | Code Rate | NBPSC | NCBPS | NDBPS
---|---|---|---|---|---|---|---
12 | 2 | 1 | BPSK | 0.5 | 1 | 48 | 24
24 | 2 | 1 | QPSK | 0.5 | 2 | 96 | 48
48 | 2 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
96 | 2 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
108 | 2 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
18 | 3 | 1 | BPSK | 0.5 | 1 | 48 | 24
36 | 3 | 1 | QPSK | 0.5 | 2 | 96 | 48
72 | 3 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
144 | 3 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
162 | 3 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
24 | 4 | 1 | BPSK | 0.5 | 1 | 48 | 24
48 | 4 | 1 | QPSK | 0.5 | 2 | 96 | 48
96 | 4 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
192 | 4 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
216 | 4 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
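In Table 6 (and Table 8, which shares its rows) the aggregate bit rate scales linearly with the number of transmit antennas: rate = antennas × NDBPS / 4, under the same assumed 20 MHz, 4 µs OFDM symbol timing as before. A quick check over representative rows:

```python
# Spot check of the MIMO rate scaling in Table 6:
# aggregate Mbps = antennas * NDBPS / 4 (4 us OFDM symbol assumed).

rows = [
    # (rate_mbps, antennas, modulation, ndbps)
    (12,  2, "BPSK",   24),
    (96,  2, "64-QAM", 192),
    (144, 3, "64-QAM", 192),
    (162, 3, "64-QAM", 216),
    (216, 4, "64-QAM", 216),
]

for rate, antennas, _mod, ndbps in rows:
    assert rate == antennas * ndbps / 4  # each antenna carries one stream
```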
TABLE 7: Channelization for Table 6

Channel | Frequency (MHz)
---|---
1 | 2412
2 | 2417
3 | 2422
4 | 2427
5 | 2432
6 | 2437
7 | 2442
8 | 2447
9 | 2452
10 | 2457
11 | 2462
12 | 2467
TABLE 8: 5 GHz, 20 MHz channel BW, 192 Mbps max bit rate

Rate (Mbps) | TX Antennas | ST Code Rate | Modulation | Code Rate | NBPSC | NCBPS | NDBPS
---|---|---|---|---|---|---|---
12 | 2 | 1 | BPSK | 0.5 | 1 | 48 | 24
24 | 2 | 1 | QPSK | 0.5 | 2 | 96 | 48
48 | 2 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
96 | 2 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
108 | 2 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
18 | 3 | 1 | BPSK | 0.5 | 1 | 48 | 24
36 | 3 | 1 | QPSK | 0.5 | 2 | 96 | 48
72 | 3 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
144 | 3 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
162 | 3 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
24 | 4 | 1 | BPSK | 0.5 | 1 | 48 | 24
48 | 4 | 1 | QPSK | 0.5 | 2 | 96 | 48
96 | 4 | 1 | 16-QAM | 0.5 | 4 | 192 | 96
192 | 4 | 1 | 64-QAM | 0.666 | 6 | 288 | 192
216 | 4 | 1 | 64-QAM | 0.75 | 6 | 288 | 216
TABLE 9: Channelization for Table 8

Channel | Frequency (MHz) | Country
---|---|---
240 | 4920 | Japan
244 | 4940 | Japan
248 | 4960 | Japan
252 | 4980 | Japan
8 | 5040 | Japan
12 | 5060 | Japan
16 | 5080 | Japan
36 | 5180 | USA/Europe
34 | 5170 | Japan
40 | 5200 | USA/Europe
38 | 5190 | Japan
44 | 5220 | USA/Europe
42 | 5210 | Japan
48 | 5240 | USA/Europe
46 | 5230 | Japan
52 | 5260 | USA/Europe
56 | 5280 | USA/Europe
60 | 5300 | USA/Europe
64 | 5320 | USA/Europe
100 | 5500 | USA/Europe
104 | 5520 | USA/Europe
108 | 5540 | USA/Europe
112 | 5560 | USA/Europe
116 | 5580 | USA/Europe
120 | 5600 | USA/Europe
124 | 5620 | USA/Europe
128 | 5640 | USA/Europe
132 | 5660 | USA/Europe
136 | 5680 | USA/Europe
140 | 5700 | USA/Europe
149 | 5745 | USA
153 | 5765 | USA
157 | 5785 | USA
161 | 5805 | USA
165 | 5825 | USA
TABLE 10: 5 GHz, with 40 MHz channels and max bit rate of 486 Mbps

Rate | TX Antennas | ST Code Rate | Modulation | Code Rate | NBPSC
---|---|---|---|---|---
13.5 Mbps | 1 | 1 | BPSK | 0.5 | 1
27 Mbps | 1 | 1 | QPSK | 0.5 | 2
54 Mbps | 1 | 1 | 16-QAM | 0.5 | 4
108 Mbps | 1 | 1 | 64-QAM | 0.666 | 6
121.5 Mbps | 1 | 1 | 64-QAM | 0.75 | 6
27 Mbps | 2 | 1 | BPSK | 0.5 | 1
54 Mbps | 2 | 1 | QPSK | 0.5 | 2
108 Mbps | 2 | 1 | 16-QAM | 0.5 | 4
216 Mbps | 2 | 1 | 64-QAM | 0.666 | 6
243 Mbps | 2 | 1 | 64-QAM | 0.75 | 6
40.5 Mbps | 3 | 1 | BPSK | 0.5 | 1
81 Mbps | 3 | 1 | QPSK | 0.5 | 2
162 Mbps | 3 | 1 | 16-QAM | 0.5 | 4
324 Mbps | 3 | 1 | 64-QAM | 0.666 | 6
364.5 Mbps | 3 | 1 | 64-QAM | 0.75 | 6
54 Mbps | 4 | 1 | BPSK | 0.5 | 1
108 Mbps | 4 | 1 | QPSK | 0.5 | 2
216 Mbps | 4 | 1 | 16-QAM | 0.5 | 4
432 Mbps | 4 | 1 | 64-QAM | 0.666 | 6
486 Mbps | 4 | 1 | 64-QAM | 0.75 | 6
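The 40 MHz rates in Table 10 can be reproduced from the other columns, assuming the usual 108 data subcarriers of a 40 MHz OFDM channel and the same 4 µs symbol duration (conventions assumed here, not stated in the table):

```python
# Reproduce Table 10's aggregate rates from modulation parameters,
# assuming 108 data subcarriers per 40 MHz OFDM symbol and a
# 4 microsecond symbol duration.

def rate_mbps(antennas: int, nbpsc: int, code_rate: float) -> float:
    ndbps = 108 * nbpsc * code_rate  # data bits per OFDM symbol, per stream
    return antennas * ndbps / 4      # Mbps summed over all spatial streams

assert rate_mbps(1, 1, 1/2) == 13.5   # 1 antenna, BPSK, rate 1/2
assert rate_mbps(2, 6, 2/3) == 216.0  # 2 antennas, 64-QAM, rate 2/3
assert rate_mbps(4, 6, 3/4) == 486.0  # 4 antennas, 64-QAM, rate 3/4
```

Note that the 3-antenna, 64-QAM, rate-3/4 entry computes to 364.5 Mbps under this rule.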
TABLE 11: Power Spectral Density (PSD) mask for Table 10

PSD Mask 2

Frequency Offset | dBr
---|---
−19 MHz to 19 MHz | 0
+/−21 MHz | −20
+/−30 MHz | −28
+/−40 MHz and greater | −50
TABLE 12: Channelization for Table 10

Channel | Frequency (MHz) | Country
---|---|---
242 | 4930 | Japan
250 | 4970 | Japan
12 | 5060 | Japan
38 | 5190 | USA/Europe
36 | 5180 | Japan
46 | 5230 | USA/Europe
44 | 5220 | Japan
54 | 5270 | USA/Europe
62 | 5310 | USA/Europe
102 | 5510 | USA/Europe
110 | 5550 | USA/Europe
118 | 5590 | USA/Europe
126 | 5630 | USA/Europe
134 | 5670 | USA/Europe
151 | 5755 | USA
159 | 5795 | USA
Claims (26)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/183,613 US20120310531A1 (en) | 2011-05-31 | 2011-07-15 | Navigation system employing augmented labeling and/or indicia |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161491838P | 2011-05-31 | 2011-05-31 | |
US13/183,613 US20120310531A1 (en) | 2011-05-31 | 2011-07-15 | Navigation system employing augmented labeling and/or indicia |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120310531A1 true US20120310531A1 (en) | 2012-12-06 |
Family
ID=47261635
Family Applications (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/180,843 Abandoned US20120309321A1 (en) | 2011-05-31 | 2011-07-12 | Synchronized calibration for wireless communication devices |
US13/181,112 Active 2032-02-11 US8730930B2 (en) | 2011-05-31 | 2011-07-12 | Polling using B-ACK for occasional back-channel traffic in VoWIFI applications |
US13/183,613 Abandoned US20120310531A1 (en) | 2011-05-31 | 2011-07-15 | Navigation system employing augmented labeling and/or indicia |
US13/192,390 Expired - Fee Related US9049736B2 (en) | 2011-05-31 | 2011-07-27 | Video sub-reservation protocol in a wireless ecosystem |
US13/231,402 Abandoned US20120307746A1 (en) | 2011-05-31 | 2011-09-13 | Fair Channel Allocation for Multiple Clients |
US13/231,481 Active 2032-05-26 US8831091B2 (en) | 2011-05-31 | 2011-09-13 | Adaptive wireless channel allocation for media distribution in a multi-user environment |
US13/240,906 Active 2032-07-11 US9295076B2 (en) | 2011-05-31 | 2011-09-22 | Selective intra and/or inter prediction video encoding based on a channel rate |
US13/244,567 Abandoned US20120307885A1 (en) | 2011-05-31 | 2011-09-25 | Channel Condition Prediction Employing Transmit Queuing Model |
US14/723,610 Active 2032-07-03 US9807784B2 (en) | 2011-05-31 | 2015-05-28 | Video sub-reservation protocol in a wireless ecosystem |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/180,843 Abandoned US20120309321A1 (en) | 2011-05-31 | 2011-07-12 | Synchronized calibration for wireless communication devices |
US13/181,112 Active 2032-02-11 US8730930B2 (en) | 2011-05-31 | 2011-07-12 | Polling using B-ACK for occasional back-channel traffic in VoWIFI applications |
Family Applications After (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/192,390 Expired - Fee Related US9049736B2 (en) | 2011-05-31 | 2011-07-27 | Video sub-reservation protocol in a wireless ecosystem |
US13/231,402 Abandoned US20120307746A1 (en) | 2011-05-31 | 2011-09-13 | Fair Channel Allocation for Multiple Clients |
US13/231,481 Active 2032-05-26 US8831091B2 (en) | 2011-05-31 | 2011-09-13 | Adaptive wireless channel allocation for media distribution in a multi-user environment |
US13/240,906 Active 2032-07-11 US9295076B2 (en) | 2011-05-31 | 2011-09-22 | Selective intra and/or inter prediction video encoding based on a channel rate |
US13/244,567 Abandoned US20120307885A1 (en) | 2011-05-31 | 2011-09-25 | Channel Condition Prediction Employing Transmit Queuing Model |
US14/723,610 Active 2032-07-03 US9807784B2 (en) | 2011-05-31 | 2015-05-28 | Video sub-reservation protocol in a wireless ecosystem |
Country Status (1)
Country | Link |
---|---|
US (9) | US20120309321A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8467770B1 (en) * | 2012-08-21 | 2013-06-18 | Mourad Ben Ayed | System for securing a mobile terminal |
US20150079563A1 (en) * | 2013-09-17 | 2015-03-19 | Sony Corporation | Nonverbal audio cues during physical activity |
US9164281B2 (en) | 2013-03-15 | 2015-10-20 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US20150379871A1 (en) * | 2013-03-08 | 2015-12-31 | Honda Motor Co., Ltd. | Congestion sign detection method, program, and congestion sign detection device |
US9251715B2 (en) | 2013-03-15 | 2016-02-02 | Honda Motor Co., Ltd. | Driver training system using heads-up display augmented reality graphics elements |
US9378644B2 (en) | 2013-03-15 | 2016-06-28 | Honda Motor Co., Ltd. | System and method for warning a driver of a potential rear end collision |
US9393870B2 (en) | 2013-03-15 | 2016-07-19 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US9588340B2 (en) * | 2015-03-03 | 2017-03-07 | Honda Motor Co., Ltd. | Pedestrian intersection alert system and method thereof |
US9747898B2 (en) | 2013-03-15 | 2017-08-29 | Honda Motor Co., Ltd. | Interpretation of ambiguous vehicle instructions |
WO2018004858A3 (en) * | 2016-06-30 | 2018-07-26 | Intel Corporation | Road condition heads up display |
US10215583B2 (en) | 2013-03-15 | 2019-02-26 | Honda Motor Co., Ltd. | Multi-level navigation monitoring and control |
US10339711B2 (en) | 2013-03-15 | 2019-07-02 | Honda Motor Co., Ltd. | System and method for providing augmented reality based directions based on verbal and gestural cues |
Families Citing this family (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100316150A1 (en) * | 2009-06-05 | 2010-12-16 | Broadcom Corporation | Mixed mode operations within multiple user, multiple access, and/or MIMO wireless communications |
JP5204870B2 (en) * | 2011-04-25 | 2013-06-05 | 株式会社エヌ・ティ・ティ・ドコモ | Base station and resource allocation method in mobile communication system |
US20120309321A1 (en) * | 2011-05-31 | 2012-12-06 | Broadcom Corporation | Synchronized calibration for wireless communication devices |
CN103139180B (en) * | 2011-12-01 | 2016-08-03 | 华为技术有限公司 | A kind of method and apparatus promoting cell throughout based on streaming media service |
US9531990B1 (en) | 2012-01-21 | 2016-12-27 | Google Inc. | Compound prediction using multiple sources or prediction modes |
US8917608B2 (en) * | 2012-01-31 | 2014-12-23 | Qualcomm Incorporated | Low latency WiFi display using intelligent aggregation |
US8737824B1 (en) | 2012-03-09 | 2014-05-27 | Google Inc. | Adaptively encoding a media stream with compound prediction |
US20130336204A1 (en) * | 2012-06-13 | 2013-12-19 | Jen-Chieh Huang | Control method for adjusting queuing data volumn of wireless communications device by detecting data transfer speed at physical layer and related control module and machine-readable medium thereof |
US9781447B1 (en) | 2012-06-21 | 2017-10-03 | Google Inc. | Correlation based inter-plane prediction encoding and decoding |
US9185414B1 (en) | 2012-06-29 | 2015-11-10 | Google Inc. | Video encoding using variance |
US8897274B2 (en) * | 2012-08-08 | 2014-11-25 | St-Ericsson Sa | Successive interference cancellation stacked branch VAMOS receivers |
US9167268B1 (en) | 2012-08-09 | 2015-10-20 | Google Inc. | Second-order orthogonal spatial intra prediction |
US9344742B2 (en) | 2012-08-10 | 2016-05-17 | Google Inc. | Transform-domain intra prediction |
US9380298B1 (en) | 2012-08-10 | 2016-06-28 | Google Inc. | Object-based intra-prediction |
US9369732B2 (en) | 2012-10-08 | 2016-06-14 | Google Inc. | Lossless intra-prediction video coding |
US9549189B2 (en) * | 2012-11-06 | 2017-01-17 | Ittiam Systems (P) Ltd. | Method for media rate control in a video encoding system |
US9628790B1 (en) | 2013-01-03 | 2017-04-18 | Google Inc. | Adaptive composite intra prediction for image and video compression |
US9426196B2 (en) * | 2013-01-04 | 2016-08-23 | Qualcomm Incorporated | Live timing for dynamic adaptive streaming over HTTP (DASH) |
US9755707B2 (en) * | 2013-01-30 | 2017-09-05 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and apparatus for calibrating multiple antennas |
CN108599836B (en) * | 2013-02-22 | 2023-06-23 | 华为技术有限公司 | Subframe generation method and device, subframe determination method and user equipment |
US9191146B2 (en) * | 2013-03-13 | 2015-11-17 | Qualcomm Incorporated | Methods permitting a wireless system receiver to determine and report channel conditions to a system transmitter |
US9558508B2 (en) * | 2013-03-15 | 2017-01-31 | Microsoft Technology Licensing, Llc | Energy-efficient mobile advertising |
GB2575383B (en) * | 2013-04-26 | 2020-04-15 | Cisco Tech Inc | Selection of radio bearers for scheduling in a mobile communications network |
US9374578B1 (en) | 2013-05-23 | 2016-06-21 | Google Inc. | Video coding using combined inter and intra predictors |
EP2996436A4 (en) * | 2013-05-29 | 2016-06-01 | Huawei Tech Co Ltd | Data transmission method, device, apparatus and base station |
US9247251B1 (en) | 2013-07-26 | 2016-01-26 | Google Inc. | Right-edge extension for quad-tree intra-prediction |
US9231993B2 (en) * | 2013-09-06 | 2016-01-05 | Lg Display Co., Ltd. | Apparatus for transmitting encoded video stream and method for transmitting the same |
US9516358B2 (en) | 2013-11-26 | 2016-12-06 | At&T Intellectual Property I, L.P. | Method and apparatus for providing media content |
US9609343B1 (en) | 2013-12-20 | 2017-03-28 | Google Inc. | Video coding using compound prediction |
CN104703051B (en) * | 2014-01-06 | 2018-06-05 | 杭州海康威视数字技术股份有限公司 | Code stream sending method and device |
JP6276065B2 (en) * | 2014-02-26 | 2018-02-07 | パナソニック株式会社 | Wireless communication method and wireless communication device |
WO2015131935A1 (en) * | 2014-03-05 | 2015-09-11 | 2Kb Beteiligungs Gmbh | System and method for controlling video resolution depending on an upload transfer rate |
JP6525576B2 (en) * | 2014-12-17 | 2019-06-05 | キヤノン株式会社 | Control device, control system, control method, medical imaging apparatus, medical imaging system, imaging control method and program |
CN105791191A (en) * | 2014-12-18 | 2016-07-20 | 上海协同科技股份有限公司 | Method for realizing high-speed data communication for wireless narrowband, communication system of wireless narrow band and modem of wireless narrow band |
US10033495B2 (en) * | 2015-02-13 | 2018-07-24 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Communication device to generate and process a clear to send announcement frame |
US10575008B2 (en) * | 2015-06-01 | 2020-02-25 | Apple Inc. | Bandwidth management in devices with simultaneous download of multiple data streams |
US10225036B2 (en) * | 2015-06-11 | 2019-03-05 | California Institute Of Technology | Communication systems and methods of communicating utilizing cooperation facilitators |
CN104994360B (en) * | 2015-08-03 | 2018-10-26 | 北京旷视科技有限公司 | Video frequency monitoring method and video monitoring system |
EP3343934B1 (en) * | 2015-10-06 | 2021-03-10 | Sony Interactive Entertainment Inc. | Communication system, transmission device, receiving device, communication system control method and program |
US10470058B2 (en) * | 2016-05-07 | 2019-11-05 | Microsoft Technology Licensing, Llc | Single radio serving multiple wireless links |
US10517001B2 (en) | 2016-05-07 | 2019-12-24 | Microsoft Technology Licensing, Llc | Single radio switching between multiple wireless links |
US10642651B2 (en) * | 2016-06-23 | 2020-05-05 | Intel Corporation | Systems, methods and devices for standby power savings |
US10285215B2 (en) | 2016-10-21 | 2019-05-07 | International Business Machines Corporation | Dynamic quality of service (QoS) based channel in wireless network |
US10505859B2 (en) * | 2016-11-10 | 2019-12-10 | The Government Of The United States Of America, As Represented By The Secretary Of The Navy | Packet deadlines in a queue to control the age of information |
US10999602B2 (en) | 2016-12-23 | 2021-05-04 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US11259046B2 (en) | 2017-02-15 | 2022-02-22 | Apple Inc. | Processing of equirectangular object data to compensate for distortion by spherical projections |
US10924747B2 (en) | 2017-02-27 | 2021-02-16 | Apple Inc. | Video coding techniques for multi-view video |
US10374762B2 (en) * | 2017-02-28 | 2019-08-06 | At&T Intellectual Property I, L.P. | Use of underutilized bandwidth via radio access resource sharing |
US10579495B2 (en) | 2017-05-18 | 2020-03-03 | California Institute Of Technology | Systems and methods for transmitting data using encoder cooperation in the presence of state information |
US11093752B2 (en) | 2017-06-02 | 2021-08-17 | Apple Inc. | Object tracking in multi-view video |
US10754242B2 (en) | 2017-06-30 | 2020-08-25 | Apple Inc. | Adaptive resolution and projection format in multi-direction video |
US10270486B2 (en) * | 2017-06-30 | 2019-04-23 | Taiwan Semiconductor Manufacturing Co., Ltd. | Ultra-low power receiver |
US10945141B2 (en) * | 2017-07-25 | 2021-03-09 | Qualcomm Incorporated | Systems and methods for improving content presentation |
EP3633999A1 (en) | 2018-10-05 | 2020-04-08 | InterDigital CE Patent Holdings | Method to be implemented at a device able to run one adaptive streaming session, and corresponding device |
US10912105B2 (en) * | 2019-03-28 | 2021-02-02 | Intel Corporation | Apparatus, system and method of wireless video streaming |
US10966216B2 (en) | 2019-08-29 | 2021-03-30 | Cisco Technology, Inc. | Adaptive resource allocation for media streams over wireless |
JP2022107993A (en) * | 2021-01-12 | 2022-07-25 | ヤマハ株式会社 | Signal processing method, signal processing device, and signal processing program |
CN117061698B (en) * | 2023-10-12 | 2023-12-22 | 太一云境技术有限公司 | Hidden immersion type teleconference channel establishment method and system |
Citations (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5765116A (en) * | 1993-08-28 | 1998-06-09 | Lucas Industries Public Limited Company | Driver assistance system for a vehicle |
US6226592B1 (en) * | 1999-03-22 | 2001-05-01 | Veridian Erim International, Inc. | Method and apparatus for prompting a motor vehicle operator to remain within a lane |
US20010056326A1 (en) * | 2000-04-11 | 2001-12-27 | Keiichi Kimura | Navigation apparatus, method for map matching performed in the navigation apparatus, and computer-readable medium storing a program for executing the method |
US20020041229A1 (en) * | 2000-09-06 | 2002-04-11 | Nissan Motor Co., Ltd. | Lane-keep assisting system for vehicle |
US20030169491A1 (en) * | 2000-07-10 | 2003-09-11 | Eliyahu Bender | Impaired vision assist system and method |
US20040066376A1 (en) * | 2000-07-18 | 2004-04-08 | Max Donath | Mobility assist device |
US20040183663A1 (en) * | 2003-03-11 | 2004-09-23 | Nissan Motor Co., Ltd. | Lane deviation alarm system |
US20050154505A1 (en) * | 2003-12-17 | 2005-07-14 | Koji Nakamura | Vehicle information display system |
US20060022811A1 (en) * | 2004-07-28 | 2006-02-02 | Karsten Haug | Night vision device |
US20060151223A1 (en) * | 2002-11-16 | 2006-07-13 | Peter Knoll | Device and method for improving visibility in a motor vehicle |
US20070013495A1 (en) * | 2005-06-15 | 2007-01-18 | Denso Coropration | Vehicle drive assist system |
US20070176794A1 (en) * | 2005-10-13 | 2007-08-02 | Honeywell International Inc. | Synthetic Vision Final Approach Terrain Fading |
US20070198146A1 (en) * | 2004-05-19 | 2007-08-23 | Honda Motor Co., Ltd. | Traffic lane marking line recognition system for vehicle |
US20070233386A1 (en) * | 2006-03-29 | 2007-10-04 | Fuji Jukogyo Kabushiki Kaisha | Traffic lane deviation preventing system for a vehicle |
US20080192045A1 (en) * | 2007-02-09 | 2008-08-14 | Gm Global Technology Operations, Inc. | Holographic information display |
US20080246595A1 (en) * | 2007-04-03 | 2008-10-09 | Daniel William Sanders | Lane guide for motor vehicles |
US20090005961A1 (en) * | 2004-06-03 | 2009-01-01 | Making Virtual Solid, L.L.C. | En-Route Navigation Display Method and Apparatus Using Head-Up Display |
US20090002141A1 (en) * | 2005-07-18 | 2009-01-01 | Tazio Rinaldi | Visual device for vehicles in difficult climatic/environmental conditions |
US20090135092A1 (en) * | 2007-11-20 | 2009-05-28 | Honda Motor Co., Ltd. | In-vehicle information display apparatus |
US20090195414A1 (en) * | 2005-09-29 | 2009-08-06 | Thilo Riegel | Night Vision Device |
US20100001883A1 (en) * | 2005-07-19 | 2010-01-07 | Winfried Koenig | Display Device |
US20100253540A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Enhanced road vision on full windshield head-up display |
US20100253539A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Vehicle-to-vehicle communicator on full-windshield head-up display |
US20100253541A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Traffic infrastructure indicator on head-up display |
US20100289632A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Night vision on full windshield head-up display |
US20110301813A1 (en) * | 2010-06-07 | 2011-12-08 | Denso International America, Inc. | Customizable virtual lane mark display |
US20120174004A1 (en) * | 2010-12-30 | 2012-07-05 | GM Global Technology Operations LLC | Virtual cursor for road scene object lelection on full windshield head-up display |
US20120209472A1 (en) * | 2003-10-14 | 2012-08-16 | Donnelly Corporation | Vehicle vision system with night vision function |
Family Cites Families (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5260783A (en) * | 1991-02-21 | 1993-11-09 | Gte Laboratories Incorporated | Layered DCT video coder for packet switched ATM networks |
US6310857B1 (en) * | 1997-06-16 | 2001-10-30 | At&T Corp. | Method and apparatus for smoothing and multiplexing video data flows |
JPH11220711A (en) * | 1998-02-03 | 1999-08-10 | Fujitsu Ltd | Multipoint conference system and conference terminal |
US6438165B2 (en) * | 1998-03-09 | 2002-08-20 | Lg Electronics | Method and apparatus for advanced encoder system |
US6473607B1 (en) * | 1998-06-01 | 2002-10-29 | Broadcom Corporation | Communication device with a self-calibrating sleep timer |
US6747959B1 (en) * | 1998-10-07 | 2004-06-08 | At&T Corp. | Voice data integrated mulitaccess by self-reservation and blocked binary tree resolution |
US6334059B1 (en) * | 1999-01-08 | 2001-12-25 | Trueposition, Inc. | Modified transmission method for improving accuracy for e-911 calls |
US6625211B1 (en) * | 1999-02-25 | 2003-09-23 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for transforming moving picture coding system |
US6356764B1 (en) * | 1999-03-09 | 2002-03-12 | Micron Technology, Inc. | Wireless communication systems, interrogators and methods of communicating within a wireless communication system |
US6907073B2 (en) * | 1999-12-20 | 2005-06-14 | Sarnoff Corporation | Tweening-based codec for scaleable encoders and decoders with varying motion computation capability |
US7289570B2 (en) * | 2000-04-10 | 2007-10-30 | Texas Instruments Incorporated | Wireless communications |
WO2001082500A2 (en) * | 2000-04-22 | 2001-11-01 | Atheros Communications, Inc. | Methods for controlling shared access to wireless transmission systems and increasing throughput of the same |
KR100694034B1 (en) * | 2000-05-13 | 2007-03-12 | 삼성전자주식회사 | Apparatus for automatic detecting data rate |
US6950399B1 (en) * | 2000-07-06 | 2005-09-27 | Matsushita Electric Industrial Co., Ltd. | System and associated method for scheduling transport of variable bit-rate data over a network |
US6990113B1 (en) * | 2000-09-08 | 2006-01-24 | Mitsubishi Electric Research Labs., Inc. | Adaptive-weighted packet scheduler for supporting premium service in a communications network |
US7330877B2 (en) * | 2000-09-18 | 2008-02-12 | Sharp Laboratories Of America | Devices, softwares and methods for rescheduling multi-party sessions upon premature termination of session |
US6735422B1 (en) * | 2000-10-02 | 2004-05-11 | Baldwin Keith R | Calibrated DC compensation system for a wireless communication device configured in a zero intermediate frequency architecture |
EP1338125A2 (en) * | 2000-11-03 | 2003-08-27 | AT & T Corp. | Tiered contention multiple access (tcma): a method for priority-based shared channel access |
AU2002252339A1 (en) * | 2001-03-12 | 2002-09-24 | Hrl Laboratories, Llc | Priority-based dynamic resource allocation method and apparatus for supply-demand systems |
US7106715B1 (en) * | 2001-11-16 | 2006-09-12 | Vixs Systems, Inc. | System for providing data to multiple devices and method thereof |
US7203472B2 (en) * | 2002-03-15 | 2007-04-10 | Nokia Corporation | Method and apparatus providing calibration technique for RF performance tuning |
EP1359722A1 (en) * | 2002-03-27 | 2003-11-05 | BRITISH TELECOMMUNICATIONS public limited company | Data streaming system and method |
US6773349B2 (en) * | 2002-07-31 | 2004-08-10 | Intec, Inc. | Video game controller with integrated video display |
US7739718B1 (en) * | 2002-08-23 | 2010-06-15 | Arris Group, Inc. | System and method for automatically sensing the state of a video display device |
US8630168B2 (en) * | 2003-06-23 | 2014-01-14 | Intel Corporation | Adaptive use of a transmit opportunity |
KR100508746B1 (en) * | 2003-07-23 | 2005-08-17 | 삼성전자주식회사 | analog front end circuits and method for calibrating DC off-set thereof |
KR100570830B1 (en) * | 2003-07-29 | 2006-04-12 | 삼성전자주식회사 | method for medium access in wireless local area network system based on carrier sense multiple access with collision avoidance and apparatus thereof |
US7184721B2 (en) * | 2003-10-06 | 2007-02-27 | Texas Instruments Incorporated | Transmit power control in a wireless communication device |
US7697608B2 (en) * | 2004-02-03 | 2010-04-13 | Sony Corporation | Scalable MPEG video/macro block rate control |
EP1580914A1 (en) * | 2004-03-26 | 2005-09-28 | STMicroelectronics S.r.l. | Method and system for controlling operation of a network |
US7983160B2 (en) * | 2004-09-08 | 2011-07-19 | Sony Corporation | Method and apparatus for transmitting a coded video signal |
WO2006044672A2 (en) * | 2004-10-15 | 2006-04-27 | Meshnetworks, Inc. | System and method to facilitate inter-frequency handoff of mobile terminals in a wireless communication network |
US7784076B2 (en) | 2004-10-30 | 2010-08-24 | Sharp Laboratories Of America, Inc. | Sender-side bandwidth estimation for video transmission with receiver packet buffer |
JP4838143B2 (en) * | 2004-11-17 | 2011-12-14 | シャープ株式会社 | Transmitter |
US7787416B2 (en) * | 2004-11-18 | 2010-08-31 | Gidwani Sanjay M | Wireless network having real-time channel allocation |
US7349349B2 (en) * | 2004-11-23 | 2008-03-25 | International Business Machines Corporation | Method and system for efficient and reliable MAC-layer multicast wireless transmissions |
US20060209892A1 (en) * | 2005-03-15 | 2006-09-21 | Radiospire Networks, Inc. | System, method and apparatus for wirelessly providing a display data channel between a generalized content source and a generalized content sink |
JP2008537444A (en) * | 2005-04-22 | 2008-09-11 | オリンパス コミュニケーション テクノロジィ オブ アメリカ,インク. | Defragmentation of communication channel assignment |
CA2611160A1 (en) * | 2005-06-06 | 2006-12-14 | Mobidia, Inc. | System and method of controlling a mobile device using a network policy |
US8548048B2 (en) * | 2005-10-27 | 2013-10-01 | Qualcomm Incorporated | Video source rate control for video telephony |
US8842555B2 (en) * | 2005-10-21 | 2014-09-23 | Qualcomm Incorporated | Methods and systems for adaptive encoding of real-time information in packet-switched wireless communication systems |
US8780717B2 (en) * | 2006-09-21 | 2014-07-15 | General Instrument Corporation | Video quality of service management and constrained fidelity constant bit rate video encoding systems and method |
US7652993B2 (en) * | 2006-11-03 | 2010-01-26 | Sharp Laboratories Of America, Inc. | Multi-stream pro-active rate adaptation for robust video transmission |
US8630355B2 (en) * | 2006-12-22 | 2014-01-14 | Qualcomm Incorporated | Multimedia data reorganization between base layer and enhancement layer |
US7889756B2 (en) * | 2006-12-29 | 2011-02-15 | Nokia Corporation | Apparatus, methods and computer program products providing temporary link quality modification for multiradio control |
US20080175147A1 (en) * | 2007-01-18 | 2008-07-24 | Nokia Corporation | Admission control for packet connections |
US7853229B2 (en) * | 2007-08-08 | 2010-12-14 | Analog Devices, Inc. | Methods and apparatus for calibration of automatic gain control in broadcast tuners |
JP2009055542A (en) * | 2007-08-29 | 2009-03-12 | Toshiba Corp | Moving image encoder and moving image encoding method |
US7873020B2 (en) * | 2007-10-01 | 2011-01-18 | Cisco Technology, Inc. | CAPWAP/LWAPP multicast flood control for roaming clients |
KR101211432B1 (en) | 2007-12-27 | 2012-12-12 | 보드 오브 트러스티즈 오브 미시건 스테이트 유니버시티 | Method for estimating channel capacity and tuning coding rate for adaptive video transmission, and video transmission/receiving apparatus using them |
US8483270B2 (en) * | 2008-01-17 | 2013-07-09 | Ballistic Applications And Materials International, Llc | Method and system for adapting use of a radio link between a remotely controlled device and an operator control unit |
US8681709B2 (en) * | 2008-03-27 | 2014-03-25 | At&T Mobility Ii Llc | Dynamic allocation of communications resources |
US8005102B2 (en) * | 2008-03-31 | 2011-08-23 | Futurewei Technologies, Inc. | System and method for scheduling variable bit rate (VBR) streams in a wireless communications system |
US20100098047A1 (en) * | 2008-10-21 | 2010-04-22 | Tzero Technologies, Inc. | Setting a data rate of encoded data of a transmitter |
CN102273080A (en) * | 2008-12-03 | 2011-12-07 | 诺基亚公司 | Switching between DCT coefficient coding modes |
US20100296579A1 (en) * | 2009-05-22 | 2010-11-25 | Qualcomm Incorporated | Adaptive picture type decision for video coding |
KR101711657B1 (en) * | 2009-10-20 | 2017-03-02 | 한국전자통신연구원 | Method for managing resource in a high capacity wireless communication system |
US8687546B2 (en) * | 2009-12-28 | 2014-04-01 | Intel Corporation | Efficient uplink SDMA operation |
EP2545740B1 (en) * | 2010-03-08 | 2013-12-11 | Telefonaktiebolaget LM Ericsson (publ) | Methods and arrangements for redistributing resources for use in a radio communication system |
US8351331B2 (en) * | 2010-06-22 | 2013-01-08 | Microsoft Corporation | Resource allocation framework for wireless/wired networks |
US20120236931A1 (en) * | 2010-12-23 | 2012-09-20 | Qualcomm Incorporated | Transform coefficient scan |
US20120236115A1 (en) * | 2011-03-14 | 2012-09-20 | Qualcomm Incorporated | Post-filtering in full resolution frame-compatible stereoscopic video coding |
US20120309321A1 (en) * | 2011-05-31 | 2012-12-06 | Broadcom Corporation | Synchronized calibration for wireless communication devices |
- 2011
- 2011-07-12 US US13/180,843 patent/US20120309321A1/en not_active Abandoned
- 2011-07-12 US US13/181,112 patent/US8730930B2/en active Active
- 2011-07-15 US US13/183,613 patent/US20120310531A1/en not_active Abandoned
- 2011-07-27 US US13/192,390 patent/US9049736B2/en not_active Expired - Fee Related
- 2011-09-13 US US13/231,402 patent/US20120307746A1/en not_active Abandoned
- 2011-09-13 US US13/231,481 patent/US8831091B2/en active Active
- 2011-09-22 US US13/240,906 patent/US9295076B2/en active Active
- 2011-09-25 US US13/244,567 patent/US20120307885A1/en not_active Abandoned
- 2015
- 2015-05-28 US US14/723,610 patent/US9807784B2/en active Active
Patent Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5765116A (en) * | 1993-08-28 | 1998-06-09 | Lucas Industries Public Limited Company | Driver assistance system for a vehicle |
US6226592B1 (en) * | 1999-03-22 | 2001-05-01 | Veridian Erim International, Inc. | Method and apparatus for prompting a motor vehicle operator to remain within a lane |
US20010056326A1 (en) * | 2000-04-11 | 2001-12-27 | Keiichi Kimura | Navigation apparatus, method for map matching performed in the navigation apparatus, and computer-readable medium storing a program for executing the method |
US20030169491A1 (en) * | 2000-07-10 | 2003-09-11 | Eliyahu Bender | Impaired vision assist system and method |
US6977630B1 (en) * | 2000-07-18 | 2005-12-20 | University Of Minnesota | Mobility assist device |
US20040066376A1 (en) * | 2000-07-18 | 2004-04-08 | Max Donath | Mobility assist device |
US20020041229A1 (en) * | 2000-09-06 | 2002-04-11 | Nissan Motor Co., Ltd. | Lane-keep assisting system for vehicle |
US20060151223A1 (en) * | 2002-11-16 | 2006-07-13 | Peter Knoll | Device and method for improving visibility in a motor vehicle |
US20040183663A1 (en) * | 2003-03-11 | 2004-09-23 | Nissan Motor Co., Ltd. | Lane deviation alarm system |
US20120209472A1 (en) * | 2003-10-14 | 2012-08-16 | Donnelly Corporation | Vehicle vision system with night vision function |
US20050154505A1 (en) * | 2003-12-17 | 2005-07-14 | Koji Nakamura | Vehicle information display system |
US20070198146A1 (en) * | 2004-05-19 | 2007-08-23 | Honda Motor Co., Ltd. | Traffic lane marking line recognition system for vehicle |
US20090005961A1 (en) * | 2004-06-03 | 2009-01-01 | Making Virtual Solid, L.L.C. | En-Route Navigation Display Method and Apparatus Using Head-Up Display |
US20060022811A1 (en) * | 2004-07-28 | 2006-02-02 | Karsten Haug | Night vision device |
US20070013495A1 (en) * | 2005-06-15 | 2007-01-18 | Denso Corporation | Vehicle drive assist system |
US20090002141A1 (en) * | 2005-07-18 | 2009-01-01 | Tazio Rinaldi | Visual device for vehicles in difficult climatic/environmental conditions |
US20100001883A1 (en) * | 2005-07-19 | 2010-01-07 | Winfried Koenig | Display Device |
US20090195414A1 (en) * | 2005-09-29 | 2009-08-06 | Thilo Riegel | Night Vision Device |
US20070176794A1 (en) * | 2005-10-13 | 2007-08-02 | Honeywell International Inc. | Synthetic Vision Final Approach Terrain Fading |
US20070233386A1 (en) * | 2006-03-29 | 2007-10-04 | Fuji Jukogyo Kabushiki Kaisha | Traffic lane deviation preventing system for a vehicle |
US20080192045A1 (en) * | 2007-02-09 | 2008-08-14 | Gm Global Technology Operations, Inc. | Holographic information display |
US20080246595A1 (en) * | 2007-04-03 | 2008-10-09 | Daniel William Sanders | Lane guide for motor vehicles |
US20090135092A1 (en) * | 2007-11-20 | 2009-05-28 | Honda Motor Co., Ltd. | In-vehicle information display apparatus |
US20100253540A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Enhanced road vision on full windshield head-up display |
US20100253539A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Vehicle-to-vehicle communicator on full-windshield head-up display |
US20100253541A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Traffic infrastructure indicator on head-up display |
US20100289632A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Night vision on full windshield head-up display |
US20110301813A1 (en) * | 2010-06-07 | 2011-12-08 | Denso International America, Inc. | Customizable virtual lane mark display |
US20120174004A1 (en) * | 2010-12-30 | 2012-07-05 | GM Global Technology Operations LLC | Virtual cursor for road scene object selection on full windshield head-up display |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8467770B1 (en) * | 2012-08-21 | 2013-06-18 | Mourad Ben Ayed | System for securing a mobile terminal |
US20150379871A1 (en) * | 2013-03-08 | 2015-12-31 | Honda Motor Co., Ltd. | Congestion sign detection method, program, and congestion sign detection device |
US9646492B2 (en) * | 2013-03-08 | 2017-05-09 | Honda Motor Co., Ltd. | Congestion sign detection method, program, and congestion sign detection device |
US9452712B1 (en) | 2013-03-15 | 2016-09-27 | Honda Motor Co., Ltd. | System and method for warning a driver of a potential rear end collision |
US9251715B2 (en) | 2013-03-15 | 2016-02-02 | Honda Motor Co., Ltd. | Driver training system using heads-up display augmented reality graphics elements |
US9378644B2 (en) | 2013-03-15 | 2016-06-28 | Honda Motor Co., Ltd. | System and method for warning a driver of a potential rear end collision |
US9393870B2 (en) | 2013-03-15 | 2016-07-19 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US9400385B2 (en) | 2013-03-15 | 2016-07-26 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US9164281B2 (en) | 2013-03-15 | 2015-10-20 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US9747898B2 (en) | 2013-03-15 | 2017-08-29 | Honda Motor Co., Ltd. | Interpretation of ambiguous vehicle instructions |
US10215583B2 (en) | 2013-03-15 | 2019-02-26 | Honda Motor Co., Ltd. | Multi-level navigation monitoring and control |
US10339711B2 (en) | 2013-03-15 | 2019-07-02 | Honda Motor Co., Ltd. | System and method for providing augmented reality based directions based on verbal and gestural cues |
US20150079563A1 (en) * | 2013-09-17 | 2015-03-19 | Sony Corporation | Nonverbal audio cues during physical activity |
US9588340B2 (en) * | 2015-03-03 | 2017-03-07 | Honda Motor Co., Ltd. | Pedestrian intersection alert system and method thereof |
WO2018004858A3 (en) * | 2016-06-30 | 2018-07-26 | Intel Corporation | Road condition heads up display |
US10696308B2 (en) | 2016-06-30 | 2020-06-30 | Intel Corporation | Road condition heads up display |
Also Published As
Publication number | Publication date |
---|---|
US9295076B2 (en) | 2016-03-22 |
US20120307746A1 (en) | 2012-12-06 |
US20150327266A1 (en) | 2015-11-12 |
US9049736B2 (en) | 2015-06-02 |
US20120307806A1 (en) | 2012-12-06 |
US20120307885A1 (en) | 2012-12-06 |
US9807784B2 (en) | 2017-10-31 |
US8730930B2 (en) | 2014-05-20 |
US20120307747A1 (en) | 2012-12-06 |
US20120309321A1 (en) | 2012-12-06 |
US20120307814A1 (en) | 2012-12-06 |
US8831091B2 (en) | 2014-09-09 |
US20120307884A1 (en) | 2012-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120310531A1 (en) | Navigation system employing augmented labeling and/or indicia | |
Anaya et al. | Vulnerable road users detection using v2x communications | |
US10134286B1 (en) | Selecting vehicle pickup location | |
US9888394B2 (en) | Route recommendations | |
US8718797B1 (en) | System and method for establishing communication channels between on-board unit of vehicle and plurality of nodes | |
JP6122233B1 (en) | Visible light signal generation method, signal generation apparatus, and program | |
US20200007845A1 (en) | Image processing apparatus and image processing method | |
US20200005642A1 (en) | Method and apparatus for moving a parked vehicle for an emergency vehicle in autonomous driving system | |
US10507760B2 (en) | Enhanced vehicle authentication system platform providing real time visual based unique taxi vehicle authentication for customers and drivers and methods of implementing the same | |
EP3738327B1 (en) | Target vehicle selection and message delivery in vehicular systems | |
JP6337646B2 (en) | In-vehicle video system, video transfer system, video transfer method, and video transfer program | |
KR20060119746A (en) | Method and apparatus for providing transportation status information and using it | |
US11522615B2 (en) | Transmission method, reception method, transmission device, and reception device | |
JP6602303B2 (en) | System and method for supporting augmented reality | |
JP5890294B2 (en) | Video processing system | |
CN110546950A (en) | Imaging element and electronic device including the same | |
JP7291848B2 (en) | Systems and methods for providing data flow for sensor sharing | |
US10877288B2 (en) | Imaging device and imaging method | |
US20210134156A1 (en) | Vehicle having dangerous situation notification function and control method thereof | |
CN110663203B (en) | Receiving apparatus and receiving method | |
US10608681B2 (en) | Transmission device and communication system | |
ES2315991T3 (en) | Apparatus, and associated method, for facilitating network selection by a roaming mobile node | |
CN1732489A (en) | Method and apparatus for determining the location of a wireless device | |
CN110999131B (en) | Communication device, communication system, and communication method | |
CN106679688A (en) | Navigation method for driving direction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AGARWAL, PEYUSH;RAJAKARUNANAYAKE, YASANTHA N.;REEL/FRAME:026597/0653 Effective date: 20110714 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 |
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 |
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |