WO2001043104A1 - Methodology, apparatus, and system for electronic visualization of traffic conditions - Google Patents


Info

Publication number
WO2001043104A1
WO2001043104A1 (PCT/US2000/033433)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
display
user
data
presentation
Prior art date
Application number
PCT/US2000/033433
Other languages
French (fr)
Inventor
David Sitrick
Lawrence Gust
Original Assignee
David Sitrick
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by David Sitrick filed Critical David Sitrick
Publication of WO2001043104A1 publication Critical patent/WO2001043104A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096716 Systems involving transmission of highway information, e.g. weather, speed limits where the received information does not generate an automatic action on the vehicle control
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733 Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
    • G08G1/096741 Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096775 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096783 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element

Definitions

  • This invention relates to a methodology, apparatus, and system for electronic visualization of traffic conditions and in-vehicle as well as external-to-vehicle-based subsystems and networked combinations thereof.
  • a common example of this would be commuters driving into work via expressways or other highways, where they join the highway at some point close to their origination point and exit the highway at some point close to their destination, meanwhile sharing the highway with a plurality of other vehicles also going in the same general direction.
  • Another problem with the prior art is that if an operator is unfamiliar with the traffic in a given area, for instance a new visitor or professional driver that has just arrived in a city, he or she may not be familiar with the local colloquialisms and common names for roadways that otherwise appear in an atlas as route numbers. For example, a visitor to Washington, D.C. might have difficulty knowing which road is the Beltway if they have never been to Washington before. Similarly, in the Bay area of California, finding the Almaden expressway, or in the Chicago area, finding the Eisenhower expressway, can be difficult for operators of vehicles that have never been to those particular locations before, and invariably traffic reports are given in terms of the common names for the rights of way.
  • an apparatus and method is provided to allow an operator of a vehicle to have an early warning of traffic conditions that may affect his or her travel: traffic conditions in the immediate vicinity of the operator's vehicle, and traffic conditions somewhat beyond the immediate vicinity of the operator's vehicle but still of significance to the operator at the time. It is another object of the present invention to convey detailed information about very localized traffic conditions to persons other than operators of vehicles, such as emergency vehicle crews, or persons involved in collecting information about traffic conditions for relay, for example to the news stations previously mentioned.
  • This relay process can be direct from the source of the images, or data about traffic conditions, or it may be relayed through a series of steps.
  • FIG. 1 is a system diagram of one embodiment in accordance with the present invention
  • FIG. 2 is a structural diagram of an image sensor in accordance with the present invention
  • FIG. 3 is a structural drawing of an image sensor wherein camera body 300 comprises image sensor 310 and lens 320 and an infrared filter 325;
  • FIG. 4 shows another embodiment of the invention indicating the use of a vehicle speed sensor input to control the retractable support;
  • FIG. 5 shows a detailed view of an alternate embodiment of the pan and tilt mechanism of the present invention
  • FIG. 6 is a schematic diagram of one embodiment of the present invention in operation
  • FIG. 7 is a schematic drawing of another embodiment of the present invention in operation, where road hazard 700 exists on a thoroughfare ahead of a first vehicle 745, a second vehicle 765 and a third vehicle 775;
  • FIG. 8 is an illustration of an example of a display provided in accordance with the present invention, illustrating multiple images from ahead in the traffic flow and operator controls;
  • FIG. 9 is a detailed schematic illustration of a retractable support as in one embodiment of the present invention.
  • FIG. 10 is an illustration showing additional detail of the mechanism of FIG. 9 wherein the retractable support comprises a retractable support housing 1040 that emits a retractable support column 1040, upon which is mounted a pan and tilt means 1020, which is in turn affixed to an image sensing means 1000 having an angular field of view 1010;
  • FIG. 11 is an even more detailed view of the image sensing means and the pan and tilt means of FIG. 10;
  • FIG. 12 is a schematic illustration showing one embodiment of the present invention utilizing a speed sensor input similar to the one shown in FIG. 4;
  • FIG. 13 shows an alternate embodiment of the speed sensing retractable support controller means similar to those shown in FIGS. 12 and 4;
  • FIG. 14 shows an alternate embodiment of the retractable support wherein the retractable support 1420 variably positions the image sensing means 1410 off to the side of vehicle 1400;
  • FIG. 15 shows one embodiment of the present invention wherein the retractable support structure is activated by means of hydraulic piston
  • FIG. 16 shows a schematic drawing of a multi-sensor apparatus as in one embodiment of the present invention wherein there are a plurality of image sensing means 1610, 1615, and 1620, and other sensing means 1625 coupled to processor 1630;
  • FIG. 17 shows an embodiment of the present invention as utilized on watercraft;
  • FIG. 18 shows details of one implementation of the mounting means as in FIG. 17;
  • FIG. 19 shows one embodiment of the present invention wherein the system comprises a receiver 1970 coupled to a processing means 1930 which is in turn coupled to a display means 1940 within the vehicle 1900;
  • FIG. 20 shows an alternate embodiment of the present invention wherein the system comprises an antenna 2060 coupled to receiving means 2070, which in turn is coupled to processing means 2030 which is, in turn, coupled to a display means 2040;
  • FIG. 21 shows another embodiment of the present invention as used in a central station
  • FIG. 22 shows an embodiment of the present invention wherein the image sensing means 2210 is affixed to vehicle 2200 via the fixed support 2220;
  • FIG. 23 shows another embodiment of the present invention, wherein the image sensing means 2310 may be operatively elevated above the position of vehicle 2300 via the support element 2320;
  • FIG. 24 shows a schematic of one multiple vehicle embodiment of the present invention.
  • FIG. 25 illustrates an alternate embodiment of the first vehicle as shown in FIG. 24, wherein the combination image sensing means and transmission means 2590 is affixed to vehicle 2500 via support 2520;
  • FIG. 26 shows an alternate embodiment of the present invention wherein emergency vehicle 2600 is equipped with an image sensing and transmission apparatus 2690 which provides a transmitted signal 2695;
  • FIG. 27 shows an alternate embodiment of the present invention wherein a number of image sensing means 2710, 2720, 2730 are arranged at strategic locations along a roadway 2700;
  • FIG. 28 illustrates schematically the central station 2805 comprising a variety of sources of information including textual information 2800 and visual information 2810;
  • FIG. 29 is an example embodiment of the present invention, wherein an image sensing means 2910 is affixed via support structure 2920 to a road sign 2930 located adjacent the right-of-way 2940;
  • FIGS. 30A and 30B are representative examples of the types of display information that may be presented in accordance with one embodiment of the present invention
  • FIG. 31A is a schematic representation of an alternate embodiment of the present invention wherein a combination image sensing and transmission apparatus 3110 is located adjacent the roadway 3170;
  • FIG. 31B is an alternate embodiment to that depicted in FIG. 31A wherein the image sensing means 3115 is directly coupled to a processing means 3145 which is in turn directly coupled to billboard 3150 thereby providing for integrated information display responsive to the image detected by image sensing means 3115 of the roadway 3180;
  • FIG. 32 shows one embodiment of the signal receiving and processing element of the present invention
  • FIG. 33 represents a simplified block diagram of the receiving and processing systems of the present invention
  • FIG. 34 shows one embodiment of the present invention illustrating detail of the image processing and formatting means 3350 as shown in FIGS. 32 and 33;
  • FIG. 35 is a schematic representation of a multiple transmitter embodiment of the present invention wherein the user's vehicle 3500 receives a plurality of signals via antenna 3510 from sources including a first transmitter 3520 and a second transmitter 3530;
  • FIG. 36 shows a schematic illustration of the operation of an embodiment of the present invention wherein a building 3630 blocks the view of roadway 3640 some distance from the user's vehicle 3600;
  • FIG. 37 shows a schematic representation of the one embodiment of the present invention as used in watercraft
  • FIG. 38A is a schematic representation of traffic congestion involving an emergency vehicle 3800, the user's view of which is blocked;
  • FIG. 38B shows a representation of one possible display as shown by display means 3850 of the present invention wherein the information received by image sensing means 3830 is directly displayed for the user;
  • FIG. 38C shows a different display of the present invention that would be presented to the user on display means 3850 which comprises a schematic view derived from information received from image sensing means 3830 and processed to produce the overhead schematic view and other user information as shown in the illustration;
  • FIGS. 39A and 39B represent an alternate embodiment of the system as depicted in FIGS. 38A, 38B, and 38C wherein the emergency vehicle 3800 is again blocked from direct view of the user in vehicle 3810 by intermediate vehicle 3820;
  • FIG. 40 is another example embodiment of the present invention wherein a first vehicle 4000 comprises a support means 4010, image sensing means 4020, and a transmission means 4030;
  • FIG. 41 shows one embodiment of the present invention which utilizes multiple display means, 4110, 4120, and 4130, all coupled to processing means 4140;
  • FIG. 42 shows one embodiment of the present invention that supports a relay mode of operation;
  • FIGS. 43A and 43B show alternate embodiments of the relay mode of operation
  • FIG. 43C illustrates a flow chart for the relay operation handoff for FIGS. 43A and 43B
  • FIG. 44 shows an example of a display in accordance with the present invention, wherein a road map is shown indicating the location of the user's vehicle 4400, locations of junctions 4410 superimposed on the road map, and known emergency vehicle locations 4430; and
  • FIG. 45 illustrates a display provided responsive to information from any of the plurality of received signals, and illustrates a directional indication 4510 superimposed on the display indicating to the user that the most efficient way to bypass this particular congestion would be, in the illustrated example, to drive to the left.
  • FIG. 1 is a system diagram of one embodiment of the present invention.
  • Vehicle 100 comprises sensing means 110 affixed to support means 120 affixed to vehicle 100, wherein the sensing means couples data to processing means 130 and display means 140.
  • Data sensed by sensing means 110 and processed by the processing means 130 is relayed via logical display link 150 to display means 140 for display to the user.
  • the processing means can alternatively be positioned with the display to receive and process the sensed data for display.
  • the display means may comprise a CRT display such as manufactured by major consumer electronics manufacturers including Sony Corp. (Japan), JVC Corp (Japan), Zenith Corp. (Chicago, IL), and ASC Systems (St. Claire Shores, MI).
  • Other display technologies include liquid crystal displays, available through vendors such as NEC Electronics (Santa Clara, CA), Toshiba America Electronic Components (Irvine, CA), and King Bright Corp. (City of Industry, CA); plasma displays, available from ASC Systems and Displays Inc. (Lewiston, PA); and LED displays, such as those provided by Colorado Microdisplay Inc. (Boulder, CO), King Bright Corp., and Bivar Inc. (Irvine, CA).
  • FIG. 2 illustrates a structural diagram of an image sensor in accordance with the present invention.
  • Camera body 200 comprises image sensor means 210, which may be a CCD array, or other image sensing device.
  • Camera body 200 also comprises lens 220.
  • Sensing of a physical object 230 is performed by lens 220 gathering reflected light from the object 230 and projecting the image 240 of that object on the image sensing means 210.
  • the image sensing means further comprises a moveable filter as shown in FIG. 3.
  • FIG. 3 is a structural drawing of an image sensor wherein camera body 300 comprises image sensor 310 and lens 320 and an infrared filter 325.
  • the infrared filter 325 may be in a bypass position 330 or in an active position 340, rotating pivotally around pivot 350.
  • the CCD array which is sensitive to both visible light spectrum and infrared spectrum may optionally have an infrared filter interposed between said sensor and lens assembly 320, thereby restricting or eliminating the infrared spectrum from arriving at the image sensor thereby limiting the sensor to only sensing visible light.
  • the filter may be deployed in the bypass position thereby allowing infrared energy to travel through lens 320 and impinge incident upon sensor 310.
  • CCD arrays suitable for this invention are available from a number of commercial sources, including Edmund Scientific (Barrington, NJ), Hewlett Packard (San Jose, CA), Samuel Video Components USA (Rochelle Park, NJ), and Orbit Semiconductor (Sunnyvale, CA).
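The day/night filter logic described above can be sketched as follows. This is an illustrative sketch only, assuming a simple ambient-light threshold; the `FilterPosition` names and the 10-lux threshold are assumptions for the example, not details from the patent.

```python
from enum import Enum

class FilterPosition(Enum):
    ACTIVE = "active"   # filter interposed between lens and sensor: visible light only
    BYPASS = "bypass"   # filter pivoted aside: infrared energy reaches the sensor

def select_filter_position(ambient_lux: float,
                           low_light_threshold: float = 10.0) -> FilterPosition:
    """Return the IR-filter position for a given ambient light level.

    In low light the filter is moved to the bypass position so the CCD,
    which is sensitive to both visible and infrared spectra, can use
    infrared energy; in daylight the filter is made active.
    """
    if ambient_lux < low_light_threshold:
        return FilterPosition.BYPASS
    return FilterPosition.ACTIVE
```

A controller would sample an ambient-light sensor periodically and drive the pivot 350 to the returned position.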
  • FIG. 4 shows another embodiment of the invention indicating the use of a vehicle speed sensor input to control a retractable support.
  • Vehicle 400 comprises a sensing means 410, a retractable support means 420, a processing means 430, a display means 440, and a wiring harness 460.
  • Image data is processed for display by the processor 430 and is coupled (wirelessly or via wiring harness) to the display 440. Alternatively, data can be conveyed via logical display link 450 from sensing means 410 to display 440.
  • wheel speed sensor 470, or engine computer 480 may be coupled via wiring harness 460 to the retractable support means 420. This permits either or both of wheel sensor 470 and engine computer 480 to operatively control retractable support means 420 responsive to vehicle speed.
  • FIG. 5 shows a detailed view of an alternate embodiment of a pan and tilt mechanism in accordance with the present invention.
  • Camera body 500 is mounted on an assembly comprising a tilt mechanism and a pan mechanism.
  • the tilt mechanism comprises a tilt motor 520 driving a tilt pinion 530 which operates in cooperation with a tilt rack 540.
  • the pan motor 550 operates a pan pinion 560 which operates in cooperation with a pan rack 570.
  • the support means comprising the pan and tilt mechanism is attached to support 580. Operating the tilt motor 520 in a forward direction will tilt the camera body 500 through a tilt axis, thereby positioning lens assembly 510 of the camera body 500 to view or sense image data above the center line of sight of the camera.
  • Operating tilt motor 520 in a backward direction will tilt the camera body 500 through a tilt axis, thereby positioning lens assembly 510 of the camera body 500 to view or sense image data below the center line of sight of the camera.
  • Pan motor 550 permits the image sensor assembly 500 to rotate around a vertical axis coincident with the axis of support 580 thereby allowing the camera body 500 and lens assembly 510 to face in various heading directions with respect to the support 580.
  • Motors and actuators suitable for use with the pan and tilt mechanism of the present invention are available from a number of manufacturers including vendors such as MGC Inc.
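As a hedged illustration of the rack-and-pinion drive described above, the sketch below converts a requested tilt angle into stepper-motor steps. The gear geometry (`rack_mm_per_deg`, `pinion_radius_mm`, `steps_per_rev`) is assumed purely for the example and is not specified in the patent.

```python
import math

def motor_steps_for_tilt(tilt_deg: float,
                         rack_mm_per_deg: float = 0.5,
                         pinion_radius_mm: float = 6.0,
                         steps_per_rev: int = 200) -> int:
    """Steps the tilt motor must turn to achieve the requested tilt angle.

    The pinion's rotation is converted to linear rack travel, which in
    turn tilts the camera body about its tilt axis.
    """
    rack_travel_mm = tilt_deg * rack_mm_per_deg
    pinion_circumference_mm = 2 * math.pi * pinion_radius_mm
    revolutions = rack_travel_mm / pinion_circumference_mm
    return round(revolutions * steps_per_rev)
```

A positive step count tilts the line of sight above the camera's center line, a negative count below it, mirroring the forward/backward motor directions described above.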
  • FIG. 6 is a schematic diagram of one embodiment of the present invention in operation.
  • a road hazard 600 is present on a thoroughfare ahead of a first vehicle 610, a second vehicle 650, and an intermediate vehicle 660.
  • the first vehicle 610 comprises a first sensing means 620 and a first transceiver means 630.
  • the first sensing means 620 is coupled to a first display means 623 via first logical display link 626.
  • Data corresponding to information sensed by the first sensing means 620 is transmitted by the first transceiver means 630 as a first transmitted signal 635.
  • the second vehicle 650 comprises a second sensing means 640 and a second transceiver means 670 both of which are operatively coupled via a second logical display link 690 to a second display means 680 within second vehicle 650.
  • the first transmitted signal 635 carries information content that is displayed on second display means 680 responsive to the reception by the second transceiver means 670. This permits a display generation on the second display means 680 irrespective of the fact that intermediate vehicle 660 may be blocking a direct line of sight between the second vehicle 650 and the road hazard 600.
  • Radio frequency transceivers suitable for use with the present invention are available from a variety of vendors including Motorola Corp. (Schaumburg, IL).
  • a stationary image sensing means 720 additionally comprising a stationary transceiver means 730 are fixed to stationary support 710, which in the illustrative embodiment, can be a road sign.
  • the stationary transceiver means 730 transmits a transmitted signal 735 in the direction opposite to that of traffic flow, and/or to a logical central controller which displays and/or rebroadcasts the signal.
  • the first vehicle 745 comprises a first vehicle transceiver 750 and a first image sensing means 740.
  • the first vehicle 745 also comprises a first vehicle display 748 coupled to the first vehicle transceiver 750 and the first image sensing means 740.
  • the first vehicle's transceiver 750 produces a first vehicle's transmission 755 which is transmitted opposite to the direction of traffic flow.
  • Second vehicle 765 comprises a second vehicle reception and sensing means 760 logically linked to second vehicle's display 768.
  • the second vehicle's reception and sensing means 760 has the ability to receive the first vehicle's transmission 755 and the stationary transmitted signals 735 and to discriminate between the two received signals and to generate a display on the second vehicle's display 768 responsive to the data in the transmitted signal 735 and the data in the first vehicle's transmission 755.
  • the third vehicle 775 has a third vehicle's reception and sensing means 770 coupled to the third vehicle's display 778 and is also capable of independently receiving and discriminating transmitted signals 735 and first vehicle's transmission 755 for generating a display responsive thereto.
  • the selection and discrimination means is detailed later in FIG. 32.
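The receive-and-discriminate behavior described for the second and third vehicles might be sketched as below. Since the actual discrimination means is detailed in FIG. 32, this is only an assumed model: each received signal is taken to carry a source class and a distance-ahead estimate, and the receiver keeps the nearest signal per source class for display.

```python
from dataclasses import dataclass

@dataclass
class ReceivedSignal:
    source_id: str           # e.g. "stationary-735" or "vehicle-755" (illustrative)
    source_class: str        # "stationary" or "vehicle"
    distance_ahead_m: float  # estimated distance ahead of the receiving vehicle
    image: bytes             # sensed image payload

def discriminate(signals: list[ReceivedSignal]) -> dict[str, ReceivedSignal]:
    """For each source class, select the closest signal ahead of the vehicle.

    Models a receiver that can distinguish the stationary transmitted
    signal from a leading vehicle's transmission and generate a display
    responsive to both.
    """
    selected: dict[str, ReceivedSignal] = {}
    for sig in signals:
        best = selected.get(sig.source_class)
        if best is None or sig.distance_ahead_m < best.distance_ahead_m:
            selected[sig.source_class] = sig
    return selected
```

The display processor would then compose one image per selected source, much as the multi-window display of FIG. 8 does.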
  • FIG. 8 is an illustration of a display in accordance with the present invention, illustrating multiple images from ahead in the traffic flow and operator controls.
  • the display means 800 includes a display comprised of a textual information window 810, a first visual image 820, and a second visual image 830.
  • the first visual image 820 may have a first legend 822 superimposed upon said image.
  • the second visual image 830 may have a second legend 832 superimposed upon it.
  • Legends 822 and 832 may be produced by a character generator circuit such as those available from Hyperception Inc. (Dallas, TX).
  • General purpose display driver integrated circuits are available from companies including Telcom Semiconductor, Inc. (Mountain View, CA), Silicon Motion, Inc.
  • the legends 822 and 832 may comprise identification indicating the source of the visual image and/or additionally may comprise other information such as transmitted by the source of the information or generated internally by the processing means of the present invention.
  • the user controls of display means 800 include a tilt control 840, a pan control 850, a zoom control 860, and a focus control 870. In the illustrated embodiment, these controls are shown as thumbwheels permitting the user to easily adjust respectively tilt, pan, zoom, and focus of the image sensing means. In an alternate embodiment these controls may be implemented as buttons, or as graphical user interface elements on an interactive touch screen display.
  • Touch screens are commercially available from a number of manufacturers, including Microtouch Systems, Inc. (Methuen, MA), Omron Electronics (Schaumburg, IL), MSI Technologies LLC (Englewood, CO), Advanced Input Devices (Coeur d'Alene, ID), (Eden Prairie, MN), Micro Mo Electronics (Clearwater, FL), Nidec America (Canton, MA), and Dado Corp. (Summerset, NJ).
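The superimposition of legends 822 and 832 onto their images can be illustrated with a minimal character-generator sketch. Representing a frame as a list of equal-length strings is an assumption made purely for illustration; a real character generator writes pixels into a video frame.

```python
def superimpose_legend(frame: list[str], legend: str) -> list[str]:
    """Return a copy of the frame with the legend written into the top row,
    as a character generator would superimpose source-identifying text."""
    if not frame:
        return [legend]
    out = list(frame)
    row = out[0]
    if len(legend) >= len(row):
        out[0] = legend
    else:
        out[0] = legend + row[len(legend):]
    return out
```

The legend string itself could identify the image source (e.g. a transmitting vehicle or roadside sensor) or carry other information supplied by the source or generated by the processing means.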
  • FIG. 9 is a detailed schematic illustration of a retractable support as one embodiment of the present invention.
  • Vehicle 900 comprises a retractable support means 940 which provides for positioning an image sensing means 910 in either of a retracted position 920 and a deployed position 930.
  • FIG. 10 is an illustration showing additional detail of the mechanism of FIG. 9 wherein the retractable support comprises a retractable support housing 1040 that emits a retractable support column 1040, upon which is mounted a pan and tilt means 1020, which is in turn affixed to an image sensing means 1000 having an angular field of view 1010.
  • FIG. 11 is an even more detailed view of the image sensing means and the pan and tilt means of FIG. 10.
  • the image sensing means 1100 has a first line of sight 1110 and a second line of sight 1120, with a tilt range 1130 between the first and second lines of sight.
  • the image sensing means has a pan range 1140 as illustrated in FIG. 11.
  • the image sensing means 1100 is thereby provided means to tilt the line of sight within the tilt range 1130 and to pan in different directions within the pan range 1140 relative to an axis extending through support column 1150.
  • the tilt mechanism permits the image sensing means to look up and look down, and the pan means permits the image sensing means to rotate and essentially "look around."
  • FIG. 12 is a schematic illustration showing one embodiment of the present invention utilizing a speed sensor input similar to the one shown in FIG. 4.
  • vehicle 1200 comprises a sensing means 1210 affixed to the vehicle via a retractable support means 1220.
  • Data from the sensing means 1210 is relayed via the sense data relay link 1225 to the processing means 1230.
  • the processing means then produces image data responsive to the data from the sensing means and relays the image data via an image data relay link 1235 to display means 1240.
  • the vehicle's speed sensing means 1250 which in the illustrated embodiment, is a vehicle wheel speed sensor, is coupled to a retractable support controller 1260.
  • the retractable support controller 1260 issues a control signal via control signal link 1270 to the retractable support means 1220 which responds operatively thereto.
  • the operation of the retractable support means is thereby responsive to the speed of the vehicle as detected by the vehicle speed sensing means 1250.
  • this can be used to either deploy, or retract, the camera if the vehicle speed approaches certain thresholds which are programmably determined.
  • the system may prevent the image sensing means from being deployed on the retractable support by instructing the retractable support to retract if the vehicle's speed is above 5 mph.
  • the retractable support may deploy to a preset point if the vehicle's ground speed is under 30 mph then deploy even further if the vehicle's speed is above 30 mph.
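The two threshold examples above can be sketched as follows. The 5 mph and 30 mph thresholds come from the text; the function names and position labels are illustrative assumptions, and a real controller would likely add hysteresis around each threshold.

```python
def deployment_blocked(speed_mph: float) -> bool:
    """First example: prevent deployment (keep retracted) above 5 mph."""
    return speed_mph > 5.0

def support_position(speed_mph: float) -> str:
    """Second example: deploy to a preset point under 30 mph,
    then deploy even further above 30 mph."""
    if speed_mph > 30.0:
        return "deployed-extended"
    return "deployed-preset"
```

Either policy would run in the retractable support controller 1260, driven by the wheel speed sensor input, with the thresholds programmably determined as the text describes.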
  • FIG. 13 shows an alternate embodiment of the speed sensing retractable support controller means similar to those shown in FIGS. 12 and 4.
  • the image sensing means 1310 is affixed via a retractable support means 1320 to the vehicle 1300.
  • the vehicle 1300 also comprises processing means 1330 and display means 1340.
  • the image sensing means 1310 additionally comprises a windspeed detection apparatus 1350 coupled to a windspeed detection controller 1360.
  • the windspeed detection controller 1360 issues control signals to the retractable support 1320 via link 1370.
  • the link 1370 permits the operation of the retractable support 1320 to be controlled responsive to the detected ambient wind condition.
  • the link 1370 can be direct control (wired or wireless) of the retractable support, or can be coupled to processing means 1330 which controls the retractable support.
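A minimal sketch of the wind-responsive control above, assuming a single structural wind limit; the 40 mph figure is an assumed example, not a value specified in the patent.

```python
def wind_safe_to_deploy(wind_speed_mph: float, limit_mph: float = 40.0) -> bool:
    """Decide whether the retractable support may remain deployed.

    The windspeed detection controller would retract the support whenever
    the detected ambient wind exceeds the structural limit.
    """
    return wind_speed_mph <= limit_mph
```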
  • FIG. 14 shows an alternate embodiment of the retractable support wherein the retractable support 1420 variably positions the image sensing means 1410 off to the side of vehicle 1400.
  • FIG. 15 shows one embodiment of the present invention wherein the retractable support structure is activated by means of hydraulic piston.
  • Image sensing means 1510 is affixed to one end of a boom 1520 which is affixed at its opposite end 1522 to vehicle 1500.
  • Hydraulic cylinder 1524 positioned between vehicle 1500 and boom 1520 permits the boom 1520 to be raised and lowered around pivot 1522 with respect to the vehicle.
  • Tilt means 1526 compensates for the motion of the raising and lowering of the boom by rotating image sensing means 1510 correspondingly.
  • Image sensing means 1510 provides data to processing means 1530 which then provides image data to display means 1540 for display to the user.
  • FIG. 16 shows a schematic drawing of a multi-sensor apparatus as in one embodiment of the present invention wherein there are a plurality of image sensing means 1610, 1615, and 1620, and other sensing means 1625 coupled to processor 1630.
  • Processor 1630 responsive to signals from the various sensing means produces a display output signal for display to the user on display means 1640.
  • some of the image sensing means (such as image sensing means 1610) may be supported on the vehicle 1600 via retractable support 1650.
  • some of the image sensing means (for example, 1615 and 1620, as illustrated) are supported via a fixed support means (respectively, 1660 and 1670).
  • sensing means 1625 is an antenna connected to the receiving means 1680. The antenna may be fixed or retractable.
  • the processing means 1630 includes means to discriminate the various signals supplied to it from the various sensing means and to select portions of those signals for integration into a display presentation for the user.
  • FIG. 17 shows an embodiment of the present invention as utilized on watercraft.
  • Watercraft 1700 is shown with image sensing means 1710 mounted on support mast 1720.
  • the image sensing means 1710 is coupled to processing means 1730 and thence to display means 1740 for producing an integrated display presentation for the user.
  • FIG. 18 shows details of one implementation of the mounting means as in FIG. 17.
  • Image sensing means 1810 mounted on support mast 1820 is capable of the pan and tilt motions as shown previously in FIGS. 5 and 11.
  • the tilt mechanism permits locating the image sensing means line of sight 1870 variably up and down through range 1850 as shown in the illustration.
  • the pan mechanism permits image sensing means line of sight to be rotated through a range 1830 as shown on the illustration in the direction 1840 as shown by the arrows.
  • an additional axis of freedom is supplied, namely the ability to roll about the axis of the line of sight 1870 in the directions 1860 as shown by the arrows.
  • Permitting the image sensing means 1810 to be additionally adjusted in the direction of rotation 1860 allows the image sensing means 1810 to compensate for the variable roll of the watercraft.
  • FIG. 19 shows one embodiment of the present invention wherein the system comprises a receiver 1970 coupled to a processing means 1930 which is in turn coupled to a display means 1940 within the vehicle 1900. Also coupled to the receiving means 1970 is antenna 1960.
  • the antenna 1960 provides for receiving transmissions from sources external to the vehicle, decoding of the received transmissions via the receiver 1970, and passing said decoded data to processing means 1930 for integration into a display presentation for the user.
  • FIG. 20 shows an alternate embodiment of the present invention wherein the system comprises an antenna 2060 coupled to receiving means 2070, which in turn is coupled to processing means 2030 which is, in turn, coupled to a display means 2040.
  • signals are received by antenna 2060 from sources outside the present system and are decoded by receiving means 2070 producing data that is then relayed to processing means 2030 for integration into a display presentation on display 2040 to the user.
  • the system is not required to be physically present within a vehicle.
  • FIG. 21 shows another embodiment of the present invention as used in a central station.
  • a plurality of signal receiving means (such as 2160, 2165, and 2168) are coupled to a receiving decoder 2170. Examples include a radio frequency antenna 2160, a microwave antenna 2165, and telecommunications links 2168.
  • the receiving and decoding means 2170 relays received and decoded data to the processing means 2130.
  • the processing means 2130 generates data for a plurality of individual display elements 2145 that comprise the display means 2140. This permits an operator to receive information communicated from a variety of sources (as shown in the illustrated example, telecommunications, radio frequency, and microwave links) and to have the system integrate that data and present it for display.
  • the system of the present invention is not limited to the communications link varieties as shown.
  • Other additional communication links include infrared communications, satellite communications, fiber optics, etc.
  • manufacturers providing components for microwave communications include Sprague-Goodman Electronics (Westbury, NY), Atlantic Coast Instruments (Brick, NJ), and Microwave Communication Laboratories (St. Petersburg, FL).
  • the individual display elements 2145 in the illustrated embodiment may comprise any combination of CRT displays, LED displays, LCD displays, plasma displays, or other types of displays.
  • FIG. 22 shows an alternate embodiment of the present invention wherein the image sensing means 2210 is affixed to vehicle 2200 via the fixed support 2220.
  • the image sensing means 2210 relays signals to processor 2230 which decodes signals and produces an integrated display presentation for display to the user on display 2240.
  • the image sensing means 2210 has a fixed field of view 2250 determined by the mounting of image sensing means 2210 on support 2220 and the relative orientation of vehicle 2200.
  • FIG. 23 shows another embodiment of the present invention wherein the image sensing means 2310 may be operatively elevated above the position of vehicle 2300 via the support element 2320, thereby permitting the field of view 2350 to encompass objects such as vehicle 2360 which would otherwise be invisible to the operator of vehicle 2300 due to interfering vehicle 2340.
  • FIG. 24 shows a schematic of one multiple vehicle embodiment of the present invention.
  • a first vehicle 2400 comprises an image sensing means 2410, a processing means 2430, a display means 2440, and transmission means 2435.
  • the image sensing means 2410 having a field of view 2450 senses data corresponding to an image and supplies that information to transmitter 2435.
  • Transmitter 2435 transmits said information via transmitted signal 2437 to the antenna 2460 of second vehicle 2490.
  • the received signal via antenna 2460 is relayed through receiving means 2470 which receives and decodes the data in transmitted signal 2437 and provides that data to processing unit 2475.
  • Processing unit 2475 generates a local display presentation for the user which is displayed on display means 2480.
  • the image sensing means 2410 and transmission means 2435 are separate means that are coupled.
  • FIG. 25 illustrates an alternate embodiment of the first vehicle as shown in FIG. 24.
  • the combination image sensing means and transmission means 2590 is affixed to vehicle 2500 via support 2520.
  • the combination image sensing and transmission means 2590 provides a transmitted signal 2595 as well as a signal coupled to processing element 2530.
  • Processing element 2530 produces an integrated display presentation on display means 2540 for the user.
  • the signal transmission means and the image sensing means may be combined into one unit.
  • FIG. 26 shows an alternate embodiment of the present invention wherein emergency vehicle 2600 is equipped with an image sensing and transmission apparatus 2690 which provides a transmitted signal 2695.
  • the transmitted signal 2695 is received by antenna 2660 and decoded by receiver 2670 for processing by processing element 2680 and subsequent display to the user of an integrated display presentation on display means 2685, said display means located in or on second vehicle 2650.
  • This embodiment illustrates the usefulness of the present invention wherein the emergency vehicle 2600 can relay important information regarding traffic or road conditions that are significantly ahead of the user in the vehicle 2650.
  • FIG. 27 shows an alternate embodiment of the present invention wherein a number of image sensing means 2710, 2720, 2730 are arranged at strategic locations along a roadway 2700.
  • the data relayed from image sensing means 2710, 2720, and 2730 are processed by central station 2740 and the information content is then relayed via transmitter over transmitted signal 2760 to one or more vehicles 2770 equipped to receive said signals.
  • the individual sensing means can further comprise transmitter subsystems and provide for direct communication to the vehicles.
  • the central station may comprise a plurality of different types of receiving means, and a plurality of individual displays, permitting an operator to observe and control the operation of central station 2740.
  • Also, as shown in FIG. 27, image sensing sources 2710, 2720, 2730 may be deployed on a wide variety of supports.
  • image sensing means 2710 is deployed on top of a building structure 2715 located adjacent to the roadway 2700.
  • image sensing means 2720 is located on a sign overpass 2725 which bridges roadway 2700.
  • image sensing means 2730 may simply be located adjacent to roadway 2700.
  • FIG. 28 illustrates schematically the central station 2805 comprising a variety of sources of information including textual information 2800 and visual information 2810.
  • the visual information may be derived from roadside image sensing means, such as those illustrated in FIG. 27, or mobile image sensing means, such as those illustrated in FIG. 26. Additionally, the textual information may be generated internally to the central station 2805, or externally from the central station.
  • the sources of information are assembled by processing unit 2820 into a signal to be transmitted by transmitter 2830.
  • the transmitted signal, 2835 conveys any combination of visual and textual information from the central station 2805 to the user's vehicle 2840.
  • the user's vehicle 2840 comprises an antenna 2850 coupled to a receiving and decoding means 2860 and a processing means 2870.
  • the processing means 2870 is further coupled to a display means 2880.
  • the processing means 2870 has the ability to interpret the various types of information conveyed by transmitted signal 2835 and to selectively incorporate elements of that transmitted signal into an integrated display presentation 2890.
  • the integrated display presentation 2890 can comprise one or both of visual information 2893 corresponding to a processed version of the visual content 2810 and an overlay of processed textual information 2896 corresponding to the textual information 2800.
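The combination of a visual layer with a textual overlay described for presentation 2890 can be sketched as follows. This is a minimal illustration using a character grid as a stand-in for image data; all function names are hypothetical, not from the patent.

```python
# Sketch of an integrated display presentation: a visual layer (cf. 2893)
# with textual information (cf. 2896) superimposed on it.

def make_visual(width, height, fill="."):
    """Create a blank character-grid stand-in for the visual layer."""
    return [[fill] * width for _ in range(height)]

def overlay_text(frame, row, col, text):
    """Superimpose textual information onto the visual layer in place."""
    for i, ch in enumerate(text):
        if 0 <= row < len(frame) and 0 <= col + i < len(frame[0]):
            frame[row][col + i] = ch
    return frame

def render(frame):
    """Flatten the grid into a printable display presentation."""
    return "\n".join("".join(line) for line in frame)

display = make_visual(20, 4)
overlay_text(display, 0, 2, "SLOW: HAZARD AHEAD")
print(render(display))
```

In a real system the visual layer would be a decoded video frame and the overlay would be rendered glyphs, but the compositing order (visual first, text on top) is the same.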
  • the system of the present invention thereby permits the user of the system to discriminate and interpret a wide variety of data transmitted to the user's vehicle.
  • FIG. 29 illustrates another embodiment of the present invention, wherein an image sensing means 2910 is affixed via support structure 2920 to a road sign 2930 located adjacent the right-of-way 2940.
  • the information content sensed by image sensing means 2910 is then transmitted via transmitter 2960 to vehicles such as the user's vehicle 2980, via the transmitted signal 2970.
  • the transmitted signal 2970 is received, decoded, and processed within the user's vehicle 2980 and displayed on display 2990.
  • the information content relayed from image sensing means 2910 and thus displayed on the user's display means 2990 permits the user to decide whether to stay on right-of-way 2940 or to take exit 2950, responsive to the traffic conditions thus visually observed on display means 2990.
  • FIGS. 30A and 30B are representative examples of the types of display information that may be presented by an embodiment of the present invention.
  • FIG. 30A illustrates the display of a schematic overview 3010 of traffic congestion in the vicinity of road hazard 3020.
  • other areas of information that may be included in the display comprise a warning or advisement area 3045, and a user control area 3055.
  • the warning or advisement area 3045 may comprise urgent warnings displayed in a textual or graphic form 3030 and helpful information displayed independently in a textual or graphic form 3040.
  • the user control area 3055 may comprise a plurality of user controls 3050, which control the operation of the system and/or modify attributes of the display presentation. In the illustrated example, these user controls include the ability to display distances in miles or kilometers.
  • FIG. 30B shows an alternate display presentation of substantially the same information as shown in FIG. 30A wherein the display comprises a visual representation 3070 of the area in which the traffic congestion occurs responsive to road hazard 3020.
  • superimposed on top of the visual presentation 3070 are useful warnings in a textual or graphical form including speed warning 3030 and directional information 3040.
  • user control elements such as user control 3050 may be superimposed on top of the display presentation.
  • FIG. 31A is a schematic representation of an alternate embodiment of the present invention wherein a combination image sensing and transmission apparatus 3110 is located adjacent the roadway 3170.
  • the apparatus 3110 transmits data representative of the image sensed by the image sensing means via transmitted signal 3120 to receiver 3130.
  • Receiver 3130 relays the transmitted and decoded data to processing means 3140 which produces an integrated display presentation for display on video billboard 3150.
  • Video billboard 3150 then provides a display usable by users in one or more vehicles 3160 on the road 3170, thereby giving the operators of said vehicles advance warning of traffic conditions as observed by image sensing means 3110. Note that in this illustrative embodiment, no portion of the system need be located within or on the user's vehicle.
  • the transmitted signal 3120 can be received by a receiver within a vehicle and processed and displayed within the vehicle (such as shown in FIGS. 1, 6, 7, 8, etc.)
  • FIG. 31B illustrates another embodiment of the invention similar to that depicted in FIG. 31A wherein the image sensing means 3115 is directly coupled to a processing means 3145, which is in turn directly coupled to billboard 3150, thereby providing an integrated information display to the users responsive to the image of roadway 3180 detected by image sensing means 3115.
  • FIG. 32 shows one embodiment of the signal receiving and processing element in accordance with the present invention.
  • one or more antennas 3200 and 3201 are coupled via connections 3205 and 3206 to receiver/decoder 3210.
  • Receiver/decoder 3210 then decodes the signals thus received and relays them to processing means 3350 via a plurality of data connections 3220.
  • one or more image sensing means 3223 may be coupled via data connections 3226 to the processing means 3350.
  • the processing means 3350 further comprises a selector subsystem 3330 and a formatting and processing subsystem 3340.
  • the selector subsystem 3330 selects from the available data inputs 3220 and 3226 and produces one or more selected output signals 3333 and optionally 3336 for input to the processing and formatting means 3340.
  • the processing and formatting means accepts the plurality of selected data signals and performs processing and formatting operations on that data producing an integrated display presentation.
  • the integrated display presentation is conveyed to display 3360 (for presentation) via link 3345.
  • Processing and formatting operations may include operations such as superimposing textual information on top of video information, producing schematic displays of traffic patterns and congestion, producing warning information responsive to signals received via antennas 3200 and 3201, and integrating received information with data signals from image sensing means 3223.
  • the receiver 3210 includes within each of the output signals 3220, an identification of which of the antennas (3200, 3201) the resultant data signal was received from. This in turn permits the processing means 3350 to discriminate between the different data signals received on the basis of transmission methodology and identification.
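The source-identification behavior described above can be sketched as follows. Each decoded output frame carries the identification of the antenna it arrived on, so downstream processing can discriminate between data signals by transmission source. All names are illustrative assumptions, not from the patent.

```python
# Sketch of receiver output tagging: decoded data is paired with the id of
# the antenna (cf. 3200, 3201) on which it was received, and the processor
# can then select data signals by source.

def decode_with_source(raw_frames):
    """raw_frames: iterable of (antenna_id, payload) pairs from the receiver."""
    for antenna_id, payload in raw_frames:
        yield {"source": antenna_id, "data": payload}

def select_by_source(frames, wanted):
    """Discriminate between data signals on the basis of their source id."""
    return [f for f in frames if f["source"] in wanted]

frames = list(decode_with_source([(3200, "rf image"), (3201, "microwave image")]))
rf_only = select_by_source(frames, {3200})
print(rf_only)
```

The same tagging scheme extends naturally to identifying transmission methodology (RF, microwave, telecommunications) rather than individual antennas.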
  • FIG. 33 represents a simplified block diagram of the receiving and processing systems of the present invention. As shown in FIG. 33, the antenna 3300 receives signals broadcast to it and conveys those signals via link 3305 to receiver 3310. Receiver 3310 then relays the received signal through decoder 3320. The receiver 3310 and decoder 3320 together comprise receiver/decoder 3370. The output of the receiver/decoder 3370 is a data signal that is then coupled to the image processing and format means 3350. The image processing and format means 3350 then integrates the data signals as received into a display and then couples that display information to display means 3360 for display to a user. Specific commercially available receivers, decoders, image processing and formatting subsystems, and displays as detailed above herein are also applicable to FIGS. 32 and 33.
  • FIG. 34 shows one embodiment of the present invention showing detail of the image processing and formatting means 3350 as shown in FIGS. 32 and 33.
  • the received data is representative of an impaired visual that has noise patterns established within it as shown in visual 3410.
  • This impaired visual is characteristic of signals received via conventional wireless technology in the environment of other transmitters or other sources of electromagnetic interference.
  • the impaired visual 3410 is then processed by the image processing and format means 3350 and specifically undergoes a noise reduction step 3450 producing a restored image 3420 which is representative of the image content without the noise imposed by interference.
  • FIG. 34 shows one example of types of image processing that can be performed by the image processing and format means 3350.
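The patent does not specify the noise-reduction algorithm of step 3450; one common choice for suppressing the impulsive noise characteristic of electromagnetic interference is a median filter. The following is a minimal sketch over a grayscale image represented as nested lists; the function name is an assumption.

```python
# Illustrative noise-reduction step (cf. 3450): a 3x3 median filter that
# replaces each interior pixel with the median of its neighborhood,
# suppressing isolated interference spikes.

def median_filter(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # borders are left unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            window.sort()
            out[y][x] = window[4]  # median of the 9 samples
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],   # single interference spike
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # → 10 (spike removed)
```

A production implementation would operate on full video frames, but the principle of restoring image 3420 from impaired visual 3410 is the same.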
  • FIG. 35 is a schematic representation of the multiple transmitter embodiment of the present invention wherein the user's vehicle 3500 receives a plurality of signals via antenna 3510 from sources including a first transmitter 3520 and a second transmitter 3530 wherein each of the transmitters 3520 and 3530 comprise image sensing means directly coupled to signal transmission means.
  • the transmitter 3520 transmits data representative of the image perceived in field of view 3560.
  • the transmitter 3530 transmits data representative of the field of view 3550.
  • the user's vehicle 3500 on right of way 3540 receives signals via antenna 3510 and the processing and formatting means 3512 collects and integrates the received signals into an integrated display presentation which is then displayed to the user on display means 3514.
  • the display may include a representation of a picture in a picture wherein the field of view 3560 is contained within the image of the field of view at 3550 or vice versa, or may comprise alternatingly switching between the visual received from transmitter 3520 and the visual received from transmitter 3530, or in yet another alternate embodiment, may include a split screen mode of operation where visuals received from each of the transmitters 3520 and 3530 are shown on respective portions of the display screen 3514.
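The three display modes just described (picture-in-picture, alternating, and split screen) can be sketched over two character-grid visuals standing in for the fields of view 3550 and 3560. All function names are illustrative assumptions.

```python
# Sketch of the three multi-source display modes: picture-in-picture,
# alternating, and split screen.

def pip(main, inset, row=0, col=0):
    """Picture-in-picture: overlay the inset view onto the main view."""
    out = [r[:] for r in main]
    for y, line in enumerate(inset):
        for x, ch in enumerate(line):
            if row + y < len(out) and col + x < len(out[0]):
                out[row + y][col + x] = ch
    return out

def alternate(a, b, tick):
    """Alternating mode: switch between the two visuals over time."""
    return a if tick % 2 == 0 else b

def split(a, b):
    """Split-screen mode: each visual occupies half the display width."""
    half = len(a[0]) // 2
    return [ra[:half] + rb[half:] for ra, rb in zip(a, b)]

view_a = [["A"] * 8 for _ in range(4)]  # e.g. field of view 3560
view_b = [["B"] * 8 for _ in range(4)]  # e.g. field of view 3550
print("".join(split(view_a, view_b)[0]))  # → AAAABBBB
```

The choice among modes could itself be a user control of the kind shown in FIGS. 30A and 30B.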
  • FIG. 36 shows a schematic illustration of the operation of an embodiment of the present invention wherein a building 3630 blocks the view of roadway 3640 some distance from the user's vehicle 3600.
  • the present invention includes combination sensing and transmitting means 3620 on or near building 3630 having a field of view 3660 which includes that portion of roadway 3640 that is obscured from the user.
  • the combination transmitter sensing means 3620 transmits a signal 3625 that is received at antenna 3610 of the user's vehicle 3600.
  • the received signal is decoded and processed by processing means 3612 and the processing means 3612 produces an integrated display for presentation to the user on display means 3614 within the vehicle, wherein the presentation is of the roadway 3640 in the area covered by the field of view 3660.
  • FIG. 37 shows a schematic representation of one embodiment of the present invention as used in watercraft.
  • Watercraft 3700 incorporates an antenna member 3710 which is coupled to a receiver 3720.
  • Receiver 3720 is in turn coupled to a processing and integration means 3730 which, responsive to the data from receiver 3720, produces an integrated display for presentation to the user.
  • the integrated display data is communicated to display 3740 for presentation thereupon.
  • the system of the present invention permits the operator of the watercraft to have the benefit of remotely located image sensing means, for example, giving a visual of an approach to a harbor or a docking area, or navigation of a difficult stretch of waterway.
  • the image sensors may also or alternatively include infrared or other types of information derived from radar, such as maritime radar, thus permitting the generation of a schematic view of a watercraft within a waterway to be displayed on the display means 3740 without requiring the user's watercraft 3700 to possess radar means.
  • FIGS. 38A, 38B, and 38C represent one embodiment of the present invention.
  • FIG. 38A is a schematic representation of traffic congestion involving an emergency vehicle 3800. The user's view (from vehicle 3810) is blocked by intermediate vehicle 3820, which is located between the emergency vehicle 3800 and the user's vehicle 3810.
  • image means 3830 affixed to support means 3840 produces a display on display means 3850 on the interior of the user's vehicle 3810 thereby providing a display on the display means of areas that would otherwise be obscured from direct line of sight of the user in vehicle 3810 by intermediate vehicle 3820.
  • FIG. 38B shows a representation of one possible display as shown by display means 3850 wherein the information received by image sensing means 3830 is directly displayed for the user.
  • FIG. 38C shows a different display that would be presented to the user on display means 3850 which comprises a schematic view derived from information received from image sensing means 3830 and processed to produce the overhead schematic view and other user information as shown in the illustration.
  • FIGS. 39A and 39B represent an alternate embodiment of the system as depicted in FIGS. 38A, 38B, and 38C wherein the emergency vehicle 3800 is again blocked from the direct view of the user in vehicle 3810 by intermediate vehicle 3820.
  • As in FIGS. 38A-38C, the image sensing means 3830 is able to provide the user a view similar to that shown in FIG. 38B of the traffic congestion ahead of the user's vehicle.
  • the user's vehicle is equipped with a receiving antenna 3870 coupled to receiver and processing means 3860.
  • Receiving and processing means 3860 combines the image information received from sensing means 3830 with the transmitted signal 3885 from combination sensing means and transmitter 3880, located adjacent to the roadway as shown in the illustration.
  • the combined information may be presented to the user in the form as shown in the FIG. 39B, wherein a smaller inset portion of the display includes a visual representation of data from sensing means 3830, as shown in 3900, and also shows a schematic view of the traffic congestion as image 3910.
  • textual and graphic information 3920 can be superimposed on the combined display.
  • FIGS. 38A-C and FIGS. 39A-B therefore demonstrate, inter alia, that in accordance with the present invention, one or more sources of information content can be utilized in generating displays.
  • the types of displays that can be generated include a plurality of direct visual representations, generated schematic representations, textual information, graphical information, and any combination of the above.
  • FIG. 40 is another exemplary embodiment of the present invention, wherein a first vehicle 4000 comprises a support means 4010, image sensing means 4020, and a transmission means 4030.
  • the image sensing means 4020 relays data to transmitter 4030 which in turn transmits said data as transmitted signal 4035.
  • Transmitted signal 4035 is received by a plurality of vehicles 4065 and 4075 via antenna means 4060 and 4070, respectively.
  • In vehicle 4065, the signal received via antenna 4060 is processed by processing means 4068 and displayed for a user on display 4062.
  • In vehicle 4075, the signal received by antenna means 4070 is processed by processing means 4078 and displayed for the user on display means 4072.
  • FIG. 41 shows one embodiment of the present invention which utilizes multiple display means, 4110, 4120, and 4130, all coupled to processing means 4140.
  • the processing means 4140 is also coupled to receiving means 4150 and image sensing means 4160.
  • the processing means 4140 operates on data supplied by image sensing means 4160 and data supplied by receiving means 4150 to generate a plurality of integrated displays for the user. Each of these integrated displays is routed to one or more of the display means 4110, 4120, and 4130.
  • display 4110 may display a visible light image of the data received from image sensing means 4160, and display means 4120 may display a visible light representation of an infrared scene sensed by image sensing means 4160.
  • FIG. 42 shows one embodiment of the present invention that supports a relay mode of operation. In FIG. 42, image sensing means 4210 having a field of view 4200 is affixed to support 4215.
  • support 4215 is located on top of roadway exit sign 4217, although this location of sensing means 4210 is not a requirement of the invention.
  • Image sensing means 4210 relays data signals to transmitter 4220.
  • Transmitter 4220 transmits a first signal 4225.
  • signal 4225 is received by vehicle 4260, which is sufficiently close to transmitter 4220 as to be within receivable range.
  • the signal 4225 is received by antenna and receiving means 4230 and is subsequently repeated via signal transmitter 4240.
  • vehicle 4270 is sufficiently far away from transmitter 4220 as to preclude reception of first transmitted signal 4225. However, vehicle 4270 is within reception range of the transmitter 4240 atop vehicle 4260.
  • transmitted signal 4245 is received by vehicle 4270 on antenna 4250.
  • the data provided by image sensing means 4210 is first relayed from the first transmitter 4220 to the vehicle 4260 where it is in turn relayed by transmitter 4240 as a second transmitted signal before arriving at the user's vehicle 4270.
  • the vehicle 4260 has its own sensor, a receiver, and a transmitter (as in 4240 in FIG. 42)
  • the relay can be additive, conveying (relaying) both received data and locally sensed data. This permits the invention to be used by users in vehicles significantly distant from a low power transmitter.
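The additive relay just described, in which a vehicle rebroadcasts received data together with its own locally sensed data, can be sketched as follows. The data structures and identifiers are illustrative assumptions, not from the patent.

```python
# Sketch of the additive relay mode: the relaying vehicle (cf. 4260) combines
# the frames it received from an upstream transmitter (cf. 4220) with its own
# sensor frame before retransmitting, extending the effective range of a
# low-power transmitter.

def relay(received_frames, local_frame):
    """Rebroadcast both the received data and the locally sensed data."""
    return received_frames + [local_frame]

upstream = [{"origin": 4220, "data": "exit-sign camera view"}]
rebroadcast = relay(upstream, {"origin": 4260, "data": "relay vehicle view"})
print([f["origin"] for f in rebroadcast])  # → [4220, 4260]
```

A downstream vehicle (cf. 4270) receiving the rebroadcast thus obtains data from every hop in the chain, not just its nearest neighbor.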
  • FIGS. 43A-43B illustrate land-based cell operation relative to a moving vehicle.
  • the signals 4325 and 4335 are within range of, and are received at, antenna 4310 for display on display 4315.
  • the vehicle 4305 is in range of signal 4335 as received by antenna 4310 for display on display 4315.
  • FIG. 43C is a flow chart of a control program to implement the cell handoff operation for the system controller of FIGS. 43 A and 43B.
  • FIG. 44 shows an example of a display in accordance with the present invention, wherein a road map is shown indicating the location of the user's vehicle 4400, locations of junctions 4410 superimposed on the road map, and known emergency vehicle locations 4430. Also shown are indications of travel time between junctions 4410, where an example outbound travel time is shown at 4440.
  • the present invention allows the user to touch the location on the screen, for example the location of the emergency vehicle 4430, at which point the view corresponding to the traffic congestion in that area is shown as in FIG. 45.
  • Positioning information can be locally computed or detected and/or based on a GPS-based system.
  • FIG. 45, responsive to information from any of the plurality of received signals, shows a directional indication 4510 superimposed on the display, indicating to the user that the most efficient way to bypass this particular congestion would be, in the illustrated example, to drive to the left.
  • FIG. 46 illustrates an alternative embodiment of the present invention, wherein a plurality of sources of traffic condition data communicate that data, and wherein one or more vehicles have receivers for receiving the communicated traffic condition data from one or more of a plurality of sources, and provide processing and display internal to the vehicle to provide a display of traffic conditions responsive to the traffic condition data, in a manner as described consistent with the detailed description herein elsewhere.
  • a plurality of sensor subsystems 4660a, 4660b, and 4660c, each having an antenna 4620, provide for sensing traffic conditions, imaging, and other information, and transmit respective traffic condition data A, B, and C (which can also contain optional location data for the respective sensor subsystem originating the signal), to provide communicated traffic condition data.
  • an aircraft 4670 is illustrated as having a sensor subsystem 4672 (imaging sensor) and 4673 (preprocessor and/or transmitter) which provide output communication via antenna 4674 of traffic condition data D.
  • the traffic condition data (A, B, C, D) can then be communicated to a centrally located relay or accumulation center 4650, which then rebroadcasts the traffic condition data signals A, B, C, D, and/or can be communicated directly from the sensor subsystems 4660a, 4660b, 4660c and the airborne sensor subsystem and transmitter 4672, 4673, 4674 to vehicles with receivers, such as vehicles 4610 and 4611.
  • vehicles 4610 and 4611 each have respective receiving subsystems 4612 and 4613, which provide for receipt of the traffic condition data communications (A, B, C, D) for processing and display internal to the respective vehicles 4610 and 4611.
  • the communicated data can be received directly from the originating data sources, and/or can be received from a centralized communication center 4650 or a relay network setup off of the centralized or other type of gathering center 4650, in a manner consistent with that described above herein relative to the relay and other gathering redistribution communications. Additionally, the central center 4650 can provide for local display.
  • while particular sources of traffic condition data are illustrated in FIG. 46, it is to be understood that any type of traffic condition data communication can be included herein, such as other types of stationary fixed sensors, other types of moving sensors, whether airborne or ground-driven, as well as buried sensors, infrared communication data, and radar communication data.
  • All of these sources of traffic condition data can be provided for in accordance with the present invention, and either the central system 4650 or the vehicles receiving the signal can contain the appropriate processing to provide for selection of relevant communicated traffic condition data for the vehicle. These decisions can be based on user input, positional data such as GPS for the respective vehicle, relative position of sensor subsystems based on location data associated therewith, etc.
  • In FIG. 47 there is illustrated a control receiver processing and display subsystem, such as 4612 or 4613 in FIG. 46.
  • Multiple sources of data communication are illustrated, including sensor traffic condition data 4710 (A), sensor communication data 4720 (B), traffic condition communication data 4730 (C), traffic condition data 4740 (D), and global positioning signal 4750 (GPS) originating from satellite communication. Additional communicated data sources can also be provided, including radar, ground sensors, etc.
  • Antenna 4760 such as on the vehicles 4610 or 4611, or the central system 4650, all of FIG. 46, provides for initial receiving and coupling of the communicated data signals (A, B, C, D, GPS) for coupling to receiver 4770.
  • Receiver 4770 provides for receiver decoding and filtering of the incoming communicated data, including source selection and identification, and provides decoded received data to the processor 4780.
  • the processor 4780 provides for signal processing and filtering, and formatting for display presentation of the received decoded data.
  • a user input device 4785 such as a keypad, joystick, touchpad, or other input means, provides for user input and partial selection and control to permit user control of displayed information, including selection of sources, types of display, etc.
  • the processor provides display data output to the display 4790 which provides a display presentation in accordance with the processor output.
  • the display presentation is of the type as described elsewhere herein, and any of the types of displays as mentioned elsewhere herein may be utilized, including CRT, LCD, electroluminescent, etc.
  • FIG. 48 illustrates the software pseudo-code functional operation and flow for the control system of FIG. 47, where there is a GPS communication data signal. Note that FIG. 47 is an enhancement of and alternative embodiment to the system as shown in FIG. 19.
  • the system detects the GPS system signal and feeds that signal to the receiver and processor.
  • the processor determines the vehicle (e.g., automobile) position based on the GPS communicated data.
  • the processor utilizes the GPS-determined automobile position from step 4820 in conjunction with other communicated and stored data, such as the relative locations associated with other sensor signal communicated data, computes which sensor subsystems are relevant for this vehicle at its present position, and determines the distance limits, both ahead and, where appropriate, in other directions, from which to receive sensor data.
  • the processor causes the selection of only the relevant respective signals from the incoming communicated traffic condition data (e.g., A, B, C, D).
  • the processor provides formatting of the appropriate data and processing necessary to generate a display of a particular selected relevant one of the sensor subsystem traffic condition data. In the illustrated embodiment of FIG. 48, this is shown as displaying the closest, in a forward direction, traffic conditions for display.
  • the processor 4780 of FIG. 47, in conjunction with the user input 4785, provides the user with the option to select from other relevant traffic conditions for display.
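The selection and display steps following step 4820 can be sketched roughly as follows. This is a minimal illustrative sketch, not the disclosed implementation: the function names, the mile-marker route model, and the sensor positions are all assumptions introduced here.

```python
# Illustrative sketch of the FIG. 48 flow: given the GPS-derived vehicle
# position, compute which sensor subsystems are relevant, then select the
# closest one in the forward direction for display.

def select_relevant_sensors(vehicle_pos, sensors, limit_ahead):
    """Return sensors ahead of the vehicle within the distance limit,
    nearest first.

    vehicle_pos: GPS-derived position along the route (e.g., mile marker)
    sensors: dict of sensor id -> position along the same route
    """
    ahead = [(pos - vehicle_pos, sid)
             for sid, pos in sensors.items()
             if 0 < pos - vehicle_pos <= limit_ahead]
    return [sid for _, sid in sorted(ahead)]

def closest_ahead(vehicle_pos, sensors, limit_ahead):
    """Select the closest relevant sensor, in a forward direction, if any."""
    relevant = select_relevant_sensors(vehicle_pos, sensors, limit_ahead)
    return relevant[0] if relevant else None

sensors = {"A": 12.0, "B": 18.5, "C": 9.0, "D": 40.0}  # mile markers
print(closest_ahead(10.0, sensors, limit_ahead=20.0))  # prints: A
```

The user-input option then amounts to letting the user pick any other entry from the ordered relevant list rather than the first.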
  • The system comprises a display for use by a vehicle operator, a means to receive signals that represent traffic conditions of interest to the operator, and a processing means, coupled to the receiving means, that takes the information and formats it in a manner suitable for display to the user.
  • This system can be used, for example, to receive traffic telemetry information or estimated arrival times compiled from traffic congestion information, and to display that information on a display that an operator in a vehicle can use to make decisions about how to pilot that vehicle.
  • the display can be any kind of display technology such as, but not limited to, a liquid crystal display (LCD), a CRT, a plasma display, a display built of light-emitting diodes (LEDs), a projector driver, a half-mirrored display, or a heads-up display that projects the image to the user directly on the windshield or screen of the vehicle that the user is operating.
  • Other methods that can be used include a half-mirror display which would allow the user to look through the display and see the environment directly outside the vehicle but also see the reflection of another type of display superimposed on that view.
  • the delivery method to the user of the display can result in printing a representation of the traffic conditions on paper.
  • Another method of display is an audible system, which, upon receipt of information about traffic conditions, speaks selected portions of that information to the user in a language that the user understands.
  • the processing means that takes the signals from the receiving means and formats them for display can have a number of additional functions.
  • The information received via radio frequency, microwave, satellite, cellular phone, or any of the other common wireless transfer means has to be converted in some way so that the user can actually understand that particular data. It can be displayed as schematic information showing a stylized road and stylized vehicles; it can be actual image content showing the view of the right-of-way ahead of the operator in pictorial form; or it can be a combination of the two. If it is a pictorial of the roadway, an image processing operation may be performed on that image content to make it more suitable for display. Such an image processing operation might include, for example, adjusting the brightness or contrast of the display to correct for bright highlights as a result of sun or some other light source reflecting off of the vehicles in the traffic flow.
  • noise reduction can be required to remove a herringbone or other kinds of obvious image noise that might be present in the image.
  • the processing can integrate a series of images over time, to result in a different representation of the traffic flow that the user might find useful. Filtering operations can be performed to sharpen edges or accentuate certain attributes of an image especially in the conditions of poor weather when fog, rain or snow may otherwise obscure important features of the road ahead.
  • the image processing can be geared to remove altogether artifacts such as highlights or reflections or distracting road signs from the field of view. Further, the image data that is being received need not be characteristic of visible light. It can be representative of thermal, sonar, ultrasound, or other imaging types.
  • the image data can originate from a thermal imager which provides information about the temperature of its surroundings.
  • This processing is commonly used in applications such as night vision or night spotting scopes where infrared energy is converted from the invisible spectrum to the visible spectrum.
  • This processing can be used in the present invention so that the user can have an image of what is ahead of them in the roadway even in the absence of visible-light-based images.
  • infrared is a particularly useful technique to use. The range of infrared at night can far exceed the range shown by a user's headlights.
  • the processing means can perform several different image processing operations concurrently or simultaneously. For instance, converting from infrared to visible light and performing noise reduction is a very common combination. Similarly, brightness and contrast correction are generally performed at the same time. Other image processing operations are also within the scope of this invention. There is nothing that limits the class of image processing operations to those explicitly listed herein. Detailed information on reasonable image processing steps to perform on images to make them more intelligible to a user or to have them convey more information is readily available in image processing textbooks, such as Computer Graphics and Applications by Foley and Van Dam, and Image Processing by Gonzalez, and many other titles which are readily available from academic publishers and from professional organizations such as the IEEE.
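As an illustration of the kinds of image processing operations named above, a brightness/contrast adjustment and a simple 3x3 mean filter (a stand-in for noise reduction) might look like the following sketch. The function names and the list-of-rows frame representation are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch only: brightness/contrast correction and a simple
# noise-reduction filter. A frame is a list of pixel rows (0-255 gray values).

def adjust_brightness_contrast(frame, brightness=0, contrast=1.0):
    """Scale pixels about mid-gray (contrast), then shift them (brightness),
    clamping the result to the displayable 0-255 range."""
    return [[min(255, max(0, int((p - 128) * contrast + 128 + brightness)))
             for p in row]
            for row in frame]

def mean_filter(frame):
    """3x3 mean filter as a simple stand-in for noise reduction; border
    pixels are left unchanged."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(frame[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) // 9
    return out
```

The two operations compose naturally, mirroring the "performed at the same time" combinations described above: filter first, then correct brightness and contrast on the result.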
  • the receiving means actually can also acquire multiple data types simultaneously from an external source, wherein the processing means is utilized to combine them. For example, (1) an image of what is ahead in the right-of-way might be combined with textual legends indicating the distance to an obstruction or another vehicle, or (2) information about the location of the operator's vehicle with respect to a specific element being displayed.
  • Estimated travel times, recommended speed, recommended side of the road to be on, or which lane to be in can either be combined in the receiving means and relayed to the processing means, or they can be combined in the processing means as received.
  • the user can select a portion, or portions, of the data received for display; certain information might be interesting when the operator of the vehicle is making rapid progress, while other kinds of information might be interesting when the vehicle is not making progress.
  • the user's requirements can change over time, and the present invention provides the user with the ability to select which kinds of information or image content are displayed. Further, the user can select to display only the data, or only the image information, or only parts of one or both, even though both can be received.
  • the display means comprises a plurality of displays, operable to provide a plurality of selectable display modes comprising combinations and permutations, wherein different images can be displayed on each display, the same images can be displayed on each of the displays, the same data or different data can be displayed on the various displays, or there can be a combination where image content is displayed on one and other received data is displayed on another display, in any combination.
  • Means are provided for the user to select which elements of the received information end up on which displays.
  • the user configures the system to show particular types of information on particular physical displays by selecting from a list of predefined correspondences between the information displays and the physical displays, user preferred correspondences between the information displays and the physical displays, and a user selectable mode screen.
  • the system shows on one portion of one screen a miniature view, or thumbnail view, of each of the kinds of information displays that the system can generate, and on another portion of the same display are miniature or thumbnail representations of the available display screens.
  • the user uses a pointing device to click on the miniature information display and then clicks on the miniature available display screen, thereby assigning that information display to that available display screen.
  • the user uses the pointing device to drag a representation of the thumbnail of the kind of information display on top of the representation of the available display screen.
  • This dragging methodology is intuitive and a similar mechanism is present in conventional microcomputer operating systems.
  • the user selects a thumbnail of a kind of information screen and then draws a connection line between that information display thumbnail and a thumbnail representing an available display screen.
  • the user selects through a keypad apparatus a letter or numeral corresponding to a particular type of information display that the system can generate, and a corresponding letter or numeral corresponding to an available display screen to assign that information display type to.
  • the letters or numerals may be replaced by directional arrows, or a forward/backward 'next'-type of selector.
  • the means to select comprises a 'next' and a 'previous' button, which allows the user, for each available display screen, to sequence through the types of information that can be displayed on that screen. The user continues to activate either the next or previous button until the desired display type appears on the available display screen.
  • One variant of this embodiment replaces the 'previous' and 'next' buttons with a wheel control that the user rotates in order to sequence through the available types of information displays.
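The 'next'/'previous' (or wheel) selection mechanism described above can be sketched roughly as follows; the class, the list of information display types, and the wrap-around behavior are illustrative assumptions.

```python
# Rough sketch: each physical screen holds an index into the list of
# information display types, and the buttons sequence through that list,
# wrapping at either end.

INFO_TYPES = ["schematic map", "camera view", "travel times", "congestion"]

class ScreenSelector:
    """Tracks which information display type is assigned to each screen."""

    def __init__(self, num_screens):
        self.assigned = [0] * num_screens  # index into INFO_TYPES per screen

    def next(self, screen):
        """'Next' button (or wheel turned forward) for one screen."""
        self.assigned[screen] = (self.assigned[screen] + 1) % len(INFO_TYPES)
        return INFO_TYPES[self.assigned[screen]]

    def previous(self, screen):
        """'Previous' button (or wheel turned backward) for one screen."""
        self.assigned[screen] = (self.assigned[screen] - 1) % len(INFO_TYPES)
        return INFO_TYPES[self.assigned[screen]]
```

A wheel control maps onto the same two operations, one per direction of rotation.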
  • the information that is being received by the system in the operator's vehicle can come from a number of different sources either individually or simultaneously.
  • a traffic emergency warning beacon placed on a roadside or wayside relaying information about the vicinity where the beacon is placed.
  • This can be a permanent installation or a temporary installation. If it tends to be near an interchange where traffic congestion is very common it may be permanent. If it happens to be placed near where road construction is being performed, it might be a temporary kind of beacon.
  • the data and or image content can be relayed by a transmitter on an emergency vehicle.
  • the emergency vehicle may be en route to a traffic incident or it may be on site.
  • This feature of the invention allows the operator of the vehicle to respond to an emergency vehicle that may be in the nearby vicinity of the operator's vehicle. Such response may include, for example, getting out of the way of the emergency vehicle.
  • the image information and other data received by the system in the operator's vehicle need not come from a roadside source or from on board the user's vehicle or another vehicle that happens to be on the roadway or in the air.
  • the information can also be provided from a central traffic monitoring location or a central data collection point. Data relayed from a central source may include both image and data content (for example, information about distances to particular exits, or travel times from exits to exits).
  • the processing means can use this information to display continuously updated estimated time of arrival to the user based on gross overall traffic conditions as reported from the central location.
  • this information can be derived from any localized beacons such as a fixed or temporary wayside beacon, or a beacon on an emergency vehicle, or from signals received from other vehicles or aircraft or satellites.
  • a central station reports travel time from a reference point in a route to the location of each known beacon, and also to the location of each known exit. As the user travels along the route and passes each beacon, the system in the user's vehicle will detect a signal strength for each beacon that gets stronger as the user's vehicle comes within proximity of each beacon, and fades as the user's vehicle travels away from said beacon.
  • the user's vehicle can utilize information from a central station as a rough estimate of travel times and then receive more refined or updated information from roadside beacons as the user travels. As shown on the flow chart of FIG.
  • the user enters a desired trip itinerary into the system, which may include the start location, the destination location, and any waypoints the user expects to pass through on the user's route.
  • the next step 4382 resets an elapsed time counter to zero.
  • the next step 4383 determines whether a signal is received from a central station. If such a signal is received from the central station, the next step 4384 is to store and retain the estimated travel time to each exit and to each beacon, as provided by the central station, within a memory of the system. Once this store-and-retain step has completed, and also in the case of there being no signal received from the central station, the next step 4385 is to determine whether a signal has been received from any beacon.
  • beacons may be roadside beacons or beacons located on emergency vehicles or other vehicles in close proximity to the user's vehicle. If there is no signal received from any beacon, then the next step 4386 is to update the displays in the user's vehicle responsive to the current information that is known by the system and the elapsed time counter. Processing then continues at step 4383 with the determination of whether there was a signal received from the central station.
  • the next step 4387 is to determine whether the strongest signal that is being received from a beacon corresponds to the same beacon that was last the strongest signal. If this is not the case, this is an indication that the user has traveled out of the range of one beacon and into the range of a second beacon.
  • the system in that situation makes note of the fact that the current closest beacon, that being the beacon with the strongest signal, is now considered the current beacon.
  • the next step 4389 is the determination of whether this particular beacon has updated information.
  • In step 4390, the stored and retained estimated travel time per exit and beacon provided by the central station is then updated using the information from the current beacon.
  • the next step 4386 is then to update the displays in the user's vehicle responsive to the current stored information and the elapsed time. Processing resumes again at step 4383, looking for a signal from the central station.
  • the elapsed time counter will provide the system a means to give the user a rough estimation of what the remaining travel time is, while the user is en route. If there is no signal received from the central station but the user does receive information from roadside beacons along the way, the roadside beacons provide current and up to date information that permits the system to then provide updated information regarding the total travel time.
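A condensed sketch of the flow of steps 4383 through 4390 might look like the following; the message dictionaries and field names are assumptions, since the disclosure does not specify a data format.

```python
# Condensed sketch: seed travel-time estimates from the central station,
# then refine them whenever a new, stronger beacon comes into range.

def update_estimates(central_msg, beacon_msgs, state):
    """One pass over received messages; state holds 'estimates'
    (exit/beacon -> travel time) and 'current_beacon'."""
    if central_msg:  # steps 4383-4384: store central-station estimates
        state["estimates"].update(central_msg["travel_times"])
    if beacon_msgs:  # step 4385: any beacon in range?
        strongest = max(beacon_msgs, key=lambda m: m["strength"])
        if strongest["id"] != state["current_beacon"]:  # steps 4387-4388
            state["current_beacon"] = strongest["id"]
        if "travel_times" in strongest:  # steps 4389-4390: refine estimates
            state["estimates"].update(strongest["travel_times"])
    return state  # step 4386: caller refreshes the displays from state
```

Called in a loop alongside the elapsed time counter, this yields the behavior described above: rough central estimates that roadside beacons progressively refine.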
  • Video cameras are commonly available from video camera manufacturers, such as Sony and JVC and Ikegami.
  • the transmitting means, or radio frequency transmitters, are commonly available through a variety of vendors, including Motorola and other companies that specialize in radio communications and two-way radios. It is not usually a requirement for the system to use a two-way radio link. A single-direction transmitter on a beacon would be sufficient for most embodiments.
  • information can also be conveyed to the operator's vehicle via an infrared receiver, or a visible light receiver, or a GPS receiver, or a cellular network receiver, or satellite receiver.
  • a plurality of simultaneous receptions via any of the above means can be received and processed.
  • An example is simultaneously receiving remote information via a cellular network, and receiving local information via an infrared receiver.
  • the processing means has a way to discriminate between signals that arrive simultaneously. The user selects which image content or data information is desired, and the processing means automatically generates and relays video signals to the display (or displays) without requiring the user to do anything special to receive it from one source or another.
  • the first means to discriminate relies upon detecting which transceiver received the incoming signal. For example, if the system is receiving signals from both an infrared transceiver and a radio frequency transceiver, the system is aware, based on which input a signal arrives on, of the source of that particular signal.
  • An additional means to determine or discriminate the source of the signal is an ID code that is transmitted along with the message or data or image from each transmitter that identifies the source and also the type of information being conveyed.
  • the ID code may indicate that the source is from a helicopter hovering over the traffic area, from a roof-top camera, from a camera on a particular billboard, from an image gathering source in a mobile or emergency vehicle, or from a central station.
  • the types of information that might be conveyed include for example, video information, image information, congestion information, traffic pattern information, travel time information, road condition information, the sensor's location, weather condition information, construction alerts, a warning of road hazards or of disabled vehicles or locations where vehicles such as a farm vehicle might be moving slowly.
  • the sources and data types permit the system to intelligently select the best source and type of information for the user based on the desired image display.
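The ID code mechanism could, for example, be carried as a small header that the receiver parses before routing the payload; the delimiter-based format below is purely an assumption for illustration, as no wire format is disclosed.

```python
# Assumed wire format for illustration only: each transmission carries an
# ID code naming its source and the type of information conveyed.

def parse_id_code(message):
    """Split a 'SOURCE|TYPE|payload' string into its fields."""
    source, info_type, payload = message.split("|", 2)
    return {"source": source, "type": info_type, "payload": payload}
```

With the source and type fields recovered, the system can apply the selection rules described in the following paragraphs.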
  • traffic congestion information that is relayed from a mobile vehicle in the immediate proximity of the user is more likely to be accurate than the same type of information available from a central reporting center, and therefore, the system could intelligently permit on-site or near-by information to override information provided by a central information center.
  • the information provided from a mobile vehicle such as a view of traffic congestion or information about road hazards, might be overridden by the similar type of information being transmitted from an emergency vehicle in the area.
  • the system may elect to override purely informational displays requested by the user, such as travel times or congestion information, and display road hazard information or adverse weather condition information that require the more immediate attention of the user.
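The override behavior described in the preceding paragraphs can be sketched as a priority rule: safety-critical message types preempt the user's requested display, and among comparable messages the nearer or more authoritative source wins. The ranks and type names below are illustrative assumptions.

```python
# Illustrative priority rule: emergency vehicles override mobile vehicles,
# which override the central station; hazard/weather messages override
# purely informational displays requested by the user.

SOURCE_RANK = {"central station": 0, "mobile vehicle": 1, "emergency vehicle": 2}
URGENT_TYPES = {"road hazard", "adverse weather"}

def choose_message(messages, user_requested_type):
    """Pick which simultaneously received message to display."""
    urgent = [m for m in messages if m["type"] in URGENT_TYPES]
    pool = urgent or [m for m in messages if m["type"] == user_requested_type]
    if not pool:
        return None
    return max(pool, key=lambda m: SOURCE_RANK[m["source"]])
```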
  • the system can have a number of video generation circuits, such as video adapters, that are commonly used for personal computing. Examples are VGA devices manufactured by ATI, Matrox, or S3. All of them provide graphics chips that are specifically designed to generate high resolution displays (examples of these are available on the World Wide Web). In addition, there are a number of older display generation devices which will work at the lower resolutions that are probably more typical of what would be in a moving vehicle.
  • the display that the user observes is actually located in the user's vehicle. It may be located so that the operator of the vehicle can see it, or so that another occupant of the operator's vehicle is able to monitor the display. In jurisdictions where an operator-visible display is not permitted by law, having the display visible to a co-pilot is the advised method of operation. Alternatively, or additionally, audio or speech can be utilized instead of a visible display.
  • the display may actually be external to the user's vehicle such as a roadside billboard video display, or a system of indicator lights (or display elements (e.g., character, live)) either in the roadway, above the roadway, or to the side of the roadway giving information, instead of or in addition to a display within the user vehicle.
  • Roadside displays may also include textual displays that would relay information for the user to read as the user is travelling.
  • the display is incorporated in a central monitoring location where that information is used by one or more users at the central monitoring location, such as to determine, for example, emergency response to traffic conditions, responses to temporary congestion on the basis of construction, or normal and abnormal traffic flows.
  • the invention includes an image or other kind of sensing means (that is supported in one of a plurality of positions with respect to the operator's vehicle) coupled to a processing means which provides a display output responsive to information received from the sensing means for display to the operator.
  • a camera, or other image sensing apparatus is supported on a vehicle providing a view of traffic around the vehicle, and relays that view via the processing means to a display for viewing by the operator or other person of that vehicle.
  • This allows the operator to have a viewpoint of traffic conditions different from the operator's normal perspective of sitting within the vehicle. For example, the operator's normal perspective might be obstructed from seeing something that is ahead of the operator in the traffic flow, because a large truck was in front of the operator's vehicle.
  • the operator can see around or over the truck, thereby giving the operator a visual representation of what exists beyond the immediate obstruction.
  • the sensing means or camera can be on a fixed mount.
  • the support can be a retractable system that allows the camera to be selectively elevated sufficiently high above the operator's vehicle to be able to see over large obstructions, and/or it can also be extended out from either side of the operator's vehicle thus allowing the operator to see around a traffic obstruction that is viewable from a different angle.
  • the support can be a boom arm that allows a plurality of these motions to be accomplished by lifting the sensor above the top of the vehicle and independently or simultaneously translating or shifting off to one side.
  • the retractable support comprises a hydraulically actuated means to lift the camera up.
  • it can be an electric motor driving a mechanism similar to a retractable radio antenna for a car.
  • it can be implemented with a scissors lift device, telescoping rod arrangement, a single arm boom, or multiple arm jointed boom similar in geometry to what you might find on a snorkel fire engine.
  • Retractable supports are commercially available from many sources. Examples include all major automobile manufacturers and parts suppliers that supply retractable antennas for major automobile manufactures. On an industrial scale, there are also numerous commercial companies that supply the hydraulic lifts that lift up microwave dishes for the mini-cam vans, and have a similar product on a smaller scale that can be used for smaller vehicles.
  • One such support comprises a plurality of nested cylindrical tubes of sequential decreasing diameter, having alternately swaged ends to prevent the tubes from separating from one another, said tubes configured in a telescoping arrangement, with an electric motor (such as a 12-volt motor commonly used to operate automobile accessories) that drives a take-up spool onto which a semi-rigid support is coiled with one end of said semi-rigid support affixed to the smallest cylindrical tube.
  • an electric motor such as a 12-volt motor commonly used to operate automobile accessories
  • the semi-rigid support is unwound from the spool and thrust into the telescoping arrangement of tubes forcing said tubes to extend telescopically.
  • the motor operates in the reverse direction, said support is withdrawn from the tubes and wound around the spool, thus forcing said tubes to collapse and shorten telescopically into one another.
  • the support means also includes a means to allow the camera or sensing device to be further altered in position and orientation, such as the ability to pan the sensing means from left to right to allow it to see (e.g., sense, detect, perceive) information to one side or the other of the vehicle; to tilt the sensing means up or down, allowing it to sense information from close to the vehicle to far away from the vehicle, or perhaps even an aerial view from above the vehicle.
  • the support also has the ability to roll the camera from side to side, either to present a different kind of image via the sensing means or to compensate for any motion that might be imparted by the support means.
  • a zoom mechanism allows the camera or sensing means to frame and focus on an area of interest responsive to the operator of the vehicle.
  • the sensing means can relay data to the processing means via a wired or wireless technology or via fiber optic cable. Data can also be conveyed using the support as an electrical means to communicate. Alternatively, a transmitter in the sensing means relays data down to a receiving means actually on the vehicle. Data transmission can be infrared or RF or microwave or visible light or any of the transmission technologies that have been previously discussed herein. Specific examples of fiber-optic cables include cable produced by Nippon Sheet Glass of Japan, or Owens-Corning Corporation, which is a U.S. company. Types of wired conveyance include cables of coaxial or other variants that are produced by Belden Corporation, a U.S. company. The most likely common variant is a standard 75 Ohm coaxial cable as used in video distribution. Infrared transmitters and receivers are available in discrete form through such vendors as RadioShack, as well as from a number of other vendors or manufacturers, such as Jameco, Digikey, and other electronic parts suppliers.
  • RF, radio-frequency, transmitters and receivers are as already discussed earlier herein, such as from Motorola, Qualcomm, Sony Corporation, Analog Devices, etc. These are all vendors that produce products that are designed for use in wireless transmission.
  • the sensing means detects visible light images using for example, a camera or CCD array or vidicon tube. All of these technologies are capable of producing signals representative of a visible light image. This visible light image may require some processing to ideally present the information to the user. For example, a selected portion of the image might be of interest so the user can electronically zoom in on a section of the image. In a similar fashion the user can apply brightness or contrast correction to make out fine detail, or filtering to bring out edges or resolve differences between elements of the image.
  • Some image sensing means are more sensitive than others to noise induced by the operation of the vehicle itself or by sources of interference in or around the vehicle.
  • Performing noise reducing processing on the data from the sensing means can improve the displayed image.
  • Some image sensing means tend to have sensitivity in the infrared regions of the spectrum. This is useful from the standpoint of a preferred embodiment of the invention in that it allows the system to see heat signatures of vehicles or persons or other elements representative of traffic or human beings around the user's vehicle. However, since the average person cannot see in infrared, the image received by the CCD array is processed and converted to provide a perceivable visual display representative of a visible light equivalent so that the user can actually observe the infrared data.
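The infrared-to-visible conversion mentioned above amounts to remapping raw IR intensities into displayable gray levels; a linear rescaling such as the following sketch is one simple possibility (the function name and the linear mapping are assumptions; real systems typically use calibrated palettes).

```python
# Sketch only: linearly rescale raw IR samples into 0-255 display gray
# levels so the user can observe the infrared data.

def ir_to_gray(ir_values, ir_min, ir_max):
    """Map raw IR intensities onto the displayable 0-255 range,
    clamping values outside the [ir_min, ir_max] window."""
    span = max(ir_max - ir_min, 1)
    return [min(255, max(0, (v - ir_min) * 255 // span)) for v in ir_values]
```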
  • image processing operations can be performed simultaneously to meet the user's needs.
  • the user is able to select any or all of them simultaneously.
  • multiple-function sensors or multiple separate sensors can be utilized.
  • the sensing means is not limited to sensing just image content; there can also or alternatively be a data source within view of the sensing means.
  • the sensing means comprises a visible light image sensor or camera, it can receive a coded transmission via visible light sent from a roadside beacon.
  • Information content can be relayed via a flashing strobe off an emergency vehicle or via a roadside sign.
  • the flashes are sensed by the camera and information is extracted by the processing means for further formatting and display for the user.
  • This is analogous to using a signal light to communicate via Morse code between ships at sea in World War II.
  • the information thus transmitted includes information about an obstruction in a road, detours, recommended alternate paths for travel, estimated time to a particular exit, travel time between points, etc.
  • the processing means selectively chooses some of this information, under control of the user, to display selected portions of it on a screen.
  • the user might be interested in a detour or possible alternate paths, or might be particularly interested in arrival time at a destination.
  • the processing means does not need to convert all the data that is received or sensed into a form suitable for display.
  • the system might selectively choose to ignore extremely bright heat signatures of exhausts of vehicles, thereby performing a threshold function where received infrared signals above a certain threshold are ignored or not processed and not conveyed to the user, as they contain irrelevant information.
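The threshold function described above might be sketched as follows; the cutoff value is an arbitrary assumption for illustration.

```python
# Sketch: infrared samples above a cutoff (e.g., hot exhaust plumes) are
# dropped before display, as they contain irrelevant information.

EXHAUST_THRESHOLD = 200  # arbitrary units, for illustration only

def suppress_hot_spots(ir_row, threshold=EXHAUST_THRESHOLD):
    """Zero out over-threshold infrared samples so they are not conveyed
    to the user."""
    return [p if p <= threshold else 0 for p in ir_row]
```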
  • the support means is mounted to a vehicle
  • its deployment is responsive to the vehicle's speed sensing input such that the support will not deploy unless the vehicle is moving below a preset speed threshold.
  • the retractable support can also be set to deploy (to elevate) the camera only when the vehicle is slower than a reference speed or stopped. This would function as an additional safety feature in that the operator would not be able to deploy the camera or be distracted by a display of traffic conditions around the operator's vehicle unless the vehicle is already slowed or stopped as a result of congestion and traffic.
  • the support is sufficiently strong to allow deployment and operation of the camera even when the vehicle is moving at higher speeds.
  • the position and elevation of the deployed camera can also be controlled as related to the speed of the vehicle. For example, it might be desirable to deploy a camera at a height of several feet above the vehicle if the vehicle is moving relatively slowly, whereas it may be desirable to deploy the camera at a height of ten or more feet above the vehicle if the vehicle is moving faster, or vice-versa, or both, thereby permitting the sensing means to see both closer and further ahead in the flow of traffic from the vehicle's position.
  • the support means deployment is responsive to other factors such as relative wind speed. The processing determines the support means deployment based upon factors such as the vehicle's speed and the direction of the ambient wind.
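The speed- and wind-responsive deployment logic described in the bullets above might look like the following sketch. The specific thresholds and the mapping from speed to camera height are illustrative assumptions only; the specification leaves these as design options.

```python
# Sketch of deployment control: the support stays retracted when the
# vehicle is moving fast or the relative wind is strong, and otherwise
# deploys higher in slower traffic to see further ahead.
# All numeric values below are hypothetical design choices.

DEPLOY_SPEED_LIMIT_MPH = 25   # deploy only below this vehicle speed
MAX_RELATIVE_WIND_MPH = 40    # retract if relative wind is too strong

def deployment_height(vehicle_speed_mph, relative_wind_mph):
    """Return camera height in feet, or 0 to keep the support retracted."""
    if relative_wind_mph > MAX_RELATIVE_WIND_MPH:
        return 0                      # unsafe wind: stay retracted
    if vehicle_speed_mph >= DEPLOY_SPEED_LIMIT_MPH:
        return 0                      # moving too fast: stay retracted
    # slower traffic -> higher vantage point to see further ahead
    return 10 if vehicle_speed_mph < 10 else 4

print(deployment_height(5, 10))   # stopped in congestion -> prints 10
print(deployment_height(20, 10))  # slow but moving -> prints 4
print(deployment_height(5, 50))   # high wind -> prints 0
```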
  • the retractable support allows the camera to retract down to a rest position which is still an operable position for the system.
  • the sensing means can be retracted to a position that is still capable of looking forward or to one side of the vehicle.
  • the retracting means may actually retract the sensing means within the vehicle thereby protecting it from theft or vandalism when not in operation.
  • the retractable support is under direct control of the operator thus allowing the operator to make decisions about whether or not to deploy the imaging sensor, and where to deploy the imaging sensor, relative to the vehicle.
  • the methods of controlling the deployment of the retractable support are combined to provide the system with multiple modes of operation, wherein the system is responsive to vehicle speed and relative wind speed in one mode, and is also responsive to a user control in another mode, and where the operator can override either or both the vehicle speed and wind speed sensing apparatus in yet another mode.
  • the retractable support is affixed to the user's vehicle.
  • the retractable support that positions the sensing means is independent of the vehicle and is affixed with respect to a fixed structure such as a garage, a parking structure or a roadside fixture.
  • the support for the sensing means can include any wayside structure such as a lighthouse, a street light, a marker buoy, an exit or entrance sign, some other informational sign, a toll booth, a fixed structure adjacent to the right-of-way, a moveable platform (moveable with respect to the right-of-way), a crossing signal for railroad tracks, a power pole or other adjacent utility poles providing a vantage with sufficient altitude to be of use, with sufficient support structure such as attached to concrete (or other construction) of lane dividers, weighted down, aircraft based, underground based, or other means not expressly specified here.
  • the sensing means is a camera with either a fixed focus lens, or a variable focus lens supporting different depths of field.
  • the camera has an automatic exposure correction system such as commonly available on consumer camcorders and similar products where the lens opening is automatically adjusted to accommodate for ambient light and other lighting conditions.
  • the same exposure technology can be applied in an analogous fashion for an infrared sensing apparatus to automatically adjust for the total amount of signal received in an infrared spectrum.
  • the sensing means has a plurality of sensors, of homogeneous or heterogeneous types.
  • a sensor comprising both an optical sensing apparatus (such as a television camera) and a CCD array (for sensing infrared imaging) may be part of the same sensing means.
  • one CCD array with functionality in both areas of the spectrum could be used.
  • a plurality of homogeneous sensors such as an array of CCD arrays (or an array of other imaging sensors) looking in the same direction or in slightly different directions provide a wider field of view without requiring panning or tilting operations by the user.
  • multiple sensors are used to produce a stereoscopic image representative of traffic around the operator's vehicle, thus allowing the operator to use depth perception on suitable displays, such as heads up displays supporting depth perception (or with suitable display processing on a conventional display), to make assessments about the relative positions of vehicles thus displayed.
  • the processing can combine the stereo images or multi-viewpoint images thus obtained by a plurality of sensors to create and modify the display presentation. For example, processing can be used to add highlighting or to assign color to objects that are closer and might be of more significance to the operator of the vehicle.
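One way the highlighting of closer objects might be sketched: with two horizontally offset sensors, an object's disparity (the horizontal shift between the two views) is larger when the object is closer, so objects above a disparity cutoff can be flagged for emphasis on the display. The data layout, cutoff, and color scheme below are assumptions for illustration only.

```python
# Sketch: flag near objects for highlighting based on stereo disparity.
# NEAR_DISPARITY_PX is a hypothetical cutoff in pixels.

NEAR_DISPARITY_PX = 12

def annotate_objects(objects):
    """objects: list of (label, left_view_x, right_view_x) positions."""
    annotated = []
    for label, left_x, right_x in objects:
        disparity = abs(left_x - right_x)     # larger -> closer object
        color = "red" if disparity > NEAR_DISPARITY_PX else "white"
        annotated.append((label, color))
    return annotated

print(annotate_objects([("truck", 100, 120), ("car", 300, 305)]))
# [('truck', 'red'), ('car', 'white')]
```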
  • the electronic visualization system in accordance with the present invention is used to present an electronic representation of traffic flow in and around location of a vehicle.
  • sensing means designed to detect vehicles from an elevated perspective position relative to the operator's vehicle. This allows the sensing means to detect vehicles that would ordinarily not be visible to the operator of the vehicle from the typical operator's position.
  • by elevating a sensing means above the operator's vehicle, a new bird's eye perspective of the traffic situation is gained; the content from the vehicle sensing means is communicated to a processing means in the user's vehicle, from which the processing creates for the user a visualization representative of traffic patterns around the vehicle.
  • the data that is sensed by the vehicle sensing means can be representative of video data, individual still images, or other sensor data signals, conveyed to the processing means.
  • the processing means can provide for display of the data directly as video and it can perform processing such as feature detection and content recognition in order to generate the electronic visualization display representative of the traffic.
  • the data from the overhead perspective can come from a sensing means actually attached to the vehicle via some support (which can be a retractable support), and/or it can come from a sensing means attached to a different vehicle or to a vehicle that is not part of the traffic flow, such as a helicopter hovering over the traffic flow, or a sensor attached to some fixed or moveable structure in the vicinity of the traffic flow.
  • the data from the sensing means is communicated to the processing means of the user's vehicle.
  • Each sensing means can be stationary or mobile.
  • the processor accepts image or data content from all of these different sources, or a user's selection of these sources, and provides for combining them as required or otherwise integrating the information for a coherent display to the user.
  • This display can involve overlays and/or picture-in-picture visual presentations, both well known in the art in video and image presentation.
  • the images received from multiple sensors can be simultaneously displayed as a mosaic of smaller images, allowing the user to select a view of any one of them for larger display.
  • the processing means can be programmed or controlled by the user to sequence through a plurality of different modes of display.
  • the user can select cycling at a user controlled rate between different views available to the user's vehicle from other sensing means, or perhaps from multiple imaging sensors that are on the user's vehicle itself. The cycling operation continues until the user selects a particular view to stay with, and it periodically updates the set of available images as nearby vehicle sensing sources become available and other sources become unavailable.
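The cycling mode just described might be sketched as follows. The generator-based structure, the callback names, and the step limit are illustrative assumptions; the point is that the available-view list is re-queried on each tick (sources come and go) and cycling stops when the user locks a view.

```python
# Sketch of view cycling: step through the currently available views,
# refreshing the list each tick, until the user locks one view.
from itertools import count

def cycle_views(get_available_views, is_locked, max_steps=6):
    """Yield the view to display at each tick until the user locks on."""
    for tick in count():
        views = get_available_views()   # re-query: sources come and go
        if not views or tick >= max_steps:
            return
        view = views[tick % len(views)]
        yield view
        if is_locked(view):
            return

views_now = ["own-camera", "vehicle-ahead", "roadside-beacon"]
shown = list(cycle_views(lambda: views_now, lambda v: v == "vehicle-ahead"))
print(shown)  # ['own-camera', 'vehicle-ahead']
```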
  • a system on another occupant's vehicle can transmit image content or data to other vehicles nearby in a traffic flow, including the operator's vehicle, which receives data from the other vehicle's sensing means via wireless transmission.
  • This operation conveys image data from another vehicle to the user via one transmission "hop".
  • the user's vehicle may also include a transmitter that relays information received and/or sensed (via a sensor local to the user's vehicle) to other vehicles in the traffic flow.
  • the invention thus grants an operator of a vehicle a view of traffic significantly forward of his or her position based on having images relayed through multiple "hops" via multiple sensing and relaying means from other vehicles or from other fixed locations.
  • this invention implements a cooperative network where image data beyond the view of a user in the traffic flow can be relayed from quite some distance from the user's vehicle via a series of "hops" either from mobile platforms such as other vehicles in the traffic or through relatively stationary platforms such as buildings or roadside fixtures.
  • the processing means provides additional functions, such as data compression, data encryption, and/or data origination tagging.
  • Data compression is used to reduce the amount of data transmitted to a reasonable amount to send via any of the previously mentioned transmission and reception means (e.g., radio frequency, infrared, and so forth).
  • Data encryption can be used if the image content was being sensed for a purpose other than immediate supply to other operators of vehicles in the area (for example, relayed to a central monitoring area where the information is used and subsequently resold in some fashion).
  • Data origination tagging is simply a way of identifying the origin of the data being transmitted from a given vehicle. For example, if a vehicle operator is using the present invention to transmit image data that is being acquired by an image sensing means mounted to that operator's vehicle, it would be tagged with an identification indicating which vehicle (or that vehicle's location) is sending that data and its absolute location as determined by position relative to local way points, fixtures or landmarks, or relative to the global positioning system. Data origination tagging allows other receivers in the traffic flow to use tag information to identify where in the traffic flow this transmission originates from, and if it is relevant for their own respective use.
  • Other data origination information that can be transmitted includes the relative and absolute direction of flow. For example, if a given user's vehicle is travelling in a northbound lane, other northbound vehicles might be very interested in receiving transmissions from a northbound vehicle ahead of them, while they would obviously have no interest in receiving information from northbound vehicles that are somewhere south of them and therefore behind them. Southbound vehicles would likewise probably have little interest in information transmitted from northbound vehicles regardless of their relative location to the southbound vehicle.
  • Data origination tagging is a powerful feature because it allows the user to select from the plurality of images and data that arrive at the operator's vehicle from a variety of different sources, and to collapse those data sources down into a manageable number for subsequent processing and display.
  • information is transmitted from an operator's vehicle in a broadcast fashion thus allowing other vehicles to selectively discriminate between the totality of signals being received to determine whether any particular transmission is relevant.
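The origination-tagging and receiver-side relevance discrimination described above might be sketched as follows. Each broadcast carries the sender's position and heading, and a receiver keeps only transmissions from same-direction vehicles ahead of it. The field names, the milepost position scheme, and the two-heading simplification are assumptions for illustration.

```python
# Sketch: filter received broadcasts using their origination tags.
# Keep only transmissions from vehicles ahead of us going our direction.

def is_relevant(tag, my_position_mi, my_heading):
    """tag: dict with 'position_mi' (milepost) and 'heading' ('N'/'S')."""
    if tag["heading"] != my_heading:
        return False                                  # oncoming: ignore
    if my_heading == "N":
        return tag["position_mi"] > my_position_mi    # ahead of us
    return tag["position_mi"] < my_position_mi

broadcasts = [
    {"vehicle_id": "A", "position_mi": 12.4, "heading": "N"},
    {"vehicle_id": "B", "position_mi": 9.8,  "heading": "N"},  # behind us
    {"vehicle_id": "C", "position_mi": 13.0, "heading": "S"},  # oncoming
]
relevant = [t["vehicle_id"] for t in broadcasts
            if is_relevant(t, my_position_mi=10.0, my_heading="N")]
print(relevant)  # ['A']
```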
  • a visualization or early warning traffic system present in the vehicle can be made highly directional, such as a hooded light source or directional communication beacon directed to a particular direction from the user's vehicle, or to particular reception sites located with respect to the operator's vehicle (for example, a particular roadside collection point or central monitoring location).
  • the image data can be communicated in a direction downwind to other craft such as those that would be required by the rules of navigation on the water to yield to the sailboat, if they are equipped with similar systems, thus allowing the operators of the other vehicles to take action to avoid collision or otherwise operate their vehicles in a safer manner.
  • data is communicated to the user's vehicle and processed to provide for the display to the user of a schematic view of traffic conditions surrounding the operator's vehicle.
  • This can be accomplished by using (1) a number of data or image content sensors located at different positions with respect to the right of way and (2) a central (or distributed) collection apparatus comprising processing and transmission means that use the data collected by the various sensors and present it in a broadcast fashion such that vehicles within the traffic flow could receive it and format it in a display suitable for the user.
  • a roadway containing sensors detecting positions of vehicles can relay that data to a central location where that data is assembled into an instantaneous mapping of traffic on that roadway.
  • That mapping is then transmitted or broadcast to the vehicle safety early warning systems, which use the position of the operator's vehicle relative to roadside beacons, or landmarks, or the global positioning system, or another positioning system, or sensing the ID's of the roadside sensors, resulting in a comprehensive display of nearby traffic at whatever resolution the user might require, integrated in with a methodology to allow the user to select a destination for a given roadway, such as a particular highway exit.
  • the user is given a comprehensive display indication (representative of the traffic flow around the vehicle) of the amount of traffic that would have to be navigated in order to reach the destination, and any road hazards or other obstructions that might be sensed by the network of sensors in and around the highway.
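The distillation of a broadcast traffic mapping into a destination-specific summary, as described above, might be sketched as follows. The data layout (milepost-keyed segments with vehicle counts and optional hazards) is an assumption introduced for illustration.

```python
# Sketch: from a broadcast instantaneous mapping of the roadway, keep
# only the stretch between the vehicle's own milepost and the user's
# chosen exit, and summarize the traffic and hazards on it.

def summarize_route(mapping, my_milepost, exit_milepost):
    segment = [e for e in mapping
               if my_milepost < e["milepost"] <= exit_milepost]
    return {
        "vehicles_ahead": sum(e["vehicle_count"] for e in segment),
        "hazards": [e["hazard"] for e in segment if e.get("hazard")],
    }

mapping = [
    {"milepost": 11, "vehicle_count": 8},
    {"milepost": 12, "vehicle_count": 14, "hazard": "stalled vehicle"},
    {"milepost": 15, "vehicle_count": 3},   # beyond the chosen exit
]
print(summarize_route(mapping, my_milepost=10, exit_milepost=13))
# {'vehicles_ahead': 22, 'hazards': ['stalled vehicle']}
```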
  • the system can even instruct the user either via display or audio, such as that the middle lane of three lanes of traffic happens to be moving the fastest in the area where the user is located, and that when safe to do so, the user should effect a lane change to be in the middle lane to make the best forward progress.
  • This type of system can also be implemented without a separate network of road sensors, such as by using the previously disclosed imaging sensors on roadsides or structures or mobile platforms to collect the same data representative of the overall traffic patterns.
  • the processing means on board the user's vehicles can distill those details that are relevant to the particular user from the data being broadcast regarding the instantaneous traffic flow.
  • information about non-traffic factors that may affect traffic but are not directly caused by traffic is relayed locally or from a central location.
  • for example, local flooding conditions, local icing conditions on bridges, or other very local weather patterns might be relayed to the user's vehicle from a sensing apparatus located roadside or within the right-of-way, thus giving the user advance warning of those hazardous conditions.
  • sensors embedded in an overpass may detect whether or not the road surface has sufficient moisture on it and is sufficiently cold to have frozen over. This information can be broadcast from a localized emergency beacon thereby warning travellers about to arrive at that overpass to slow down.
  • toll plaza booth availability for toll roads where given lanes of traffic are spread out over a plurality of toll booths. There may be an indication that one or more booths are inoperative at the given time or have an unusual traffic pattern associated with them. That information allows the user to select a more expedient toll booth lane to be in. Having that knowledge in advance before arriving at the toll plaza would make the transit of the toll plaza more efficient for the user.
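The toll-booth selection just described reduces to a small decision rule: among operative booths, pick the one with the shortest reported queue. The tuple format below is an assumption for illustration.

```python
# Sketch: choose the most expedient toll booth from broadcast status.

def best_booth(booths):
    """booths: list of (booth_id, operative, queue_length) tuples."""
    open_booths = [b for b in booths if b[1]]        # drop inoperative
    return min(open_booths, key=lambda b: b[2])[0] if open_booths else None

status = [("lane-1", True, 7), ("lane-2", False, 0), ("lane-3", True, 2)]
print(best_booth(status))  # lane-3
```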
  • the display need not be limited to a display actually present in an operator's vehicle.
  • the display can also or alternatively be located separate from any user's vehicle, such as housed in a structure, for example, for study of local traffic conditions by police or news media, or to analyze whether a municipality needs to effect a change in a local right-of-way. It might be used in an alternate embodiment as part of a central monitoring station where overall traffic patterns are analyzed and reported.
  • the sensing means can be local to the display (such as one mounted on or affixed to that structure) thereby providing a fixed local view.
  • the sensing means for such a fixed roadside display can be located some distance from the actual display itself, giving users advance warning of an area significantly distant.
  • the case of a fixed structure blocking direct view of the path of the road is commonly evidenced by expressways winding through urban areas. Those blocking buildings prove to be ideal mounting points for fixed cameras to observe traffic patterns in that they would allow a direct view (to users approaching the buildings) of obstructions that may be beyond them.
  • the structure supporting an image sensing means is isolated from shock or vibration. If the sensing means is actually affixed via a fixed or retractable support on a vehicle, the preferred embodiment is to have the assembly shock mounted in a fashion that reduces the amount of vehicle vibration coupled to the sensor.
  • jitter reduction may be performed by the processing means to reduce the appearance of vibration in a sensed image.
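One simple form the jitter reduction mentioned above might take: estimate the vertical shift that best aligns the new frame with the previous one, then shift the new frame back by that amount before display. Real systems use richer motion estimation; this 1-D row-profile version is purely illustrative and all names are assumptions.

```python
# Sketch of electronic jitter reduction via 1-D profile alignment:
# find the vertical shift minimizing the mean absolute difference
# between the previous frame's row-brightness profile and the new one's.

def best_shift(prev_rows, new_rows, max_shift=2):
    def cost(s):
        pairs = [(prev_rows[i], new_rows[i + s])
                 for i in range(len(prev_rows))
                 if 0 <= i + s < len(new_rows)]
        return sum(abs(a - b) for a, b in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=cost)

prev = [10, 50, 90, 50, 10]   # brightness profile per row
new  = [50, 90, 50, 10, 0]    # same scene shifted up by one row
print(best_shift(prev, new))  # prints -1 (compensating shift)
```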
  • the mounting for the sensing means may also include gyroscopic stabilization. This might also be used if the sensing means is located in a mobile platform such as a helicopter. There is no technical reason other than cost that would prohibit this from being used in a user's vehicle, which would also allow a very stable image to be provided to the user.
  • a retractable support structure located on the operator's vehicle which allows an image sensing means to be positioned at a height above the operator's vehicle, enables (for example) a safer traffic passing operation where the vehicle operator would be able to look beyond the vehicle directly in front of his or her vehicle to determine whether or not it is safe to pass.
  • the vehicle ahead that is blocking the operator's vision would have its own sensing apparatus and transmitter and would relay directly an image of what is located in front of it back to the operator wishing to pass.
  • This method of relay might be via a broadcast transmitter, or in an alternate embodiment a modulated light source such as an LED or an infrared emitter transmitting data backwards from the blocking vehicle to the operator's vehicle.
  • This process of relaying image or data content backwards through the traffic flow from one vehicle to the next is another important aspect of the present invention, because it allows vehicles further back in the traffic flow to have the benefit of image and data content representative of traffic conditions in front of them, thereby giving them advance warning of road blockages and/or hazards that would otherwise be invisible to the operators of those vehicles.
  • a plurality of vehicles (each equipped with image sensors and processing means and transceivers) have the ability to communicate images and data via the transceivers between one another.
  • the group of vehicles each have a relative orientation within the traffic flow and also a relative position with respect to one another.
  • the processing means further comprises a selection system that allows each vehicle operator to select, for display presentation, data comprising either images or data content from other vehicles in a selected direction from the user's vehicle, or from the imaging sensor that is part of the operator's own vehicle.
  • a plurality of cameras or sensors with transceiving means send information back and forth to each other and receive data from the closest beacon or camera position.
  • another application of this technology is using it as a black box to capture telemetry of a vehicle. For example, there are situations where police squad cars or cruisers have cameras mounted to view (out the front windshield) the activities involved in a chase or an apprehension or an arrest. These signals can be processed and used in accordance with the present invention.
  • the present invention can be used with other types of public or commercial vehicles, for example, cameras/sensors/transmitters could be put on taxicabs or on the cabs of trucks for a commercial hauler to allow them to capture video information about the operation of the vehicle.
  • the captured video can be transmitted and/or recorded to a memory of some sort continuously while the vehicle is in operation.
  • the memory can record over the same sections over and over, except in the event of an accident, thereby providing 10 to 90 minutes (design option) of context to any accident investigators that might happen upon the scene. So even if the operator of the vehicle is incapacitated by the accident, if the device survives, then the company or next of kin might have an indication of what happened.
  • signals and telemetry from the vehicle such as pressure in the brake lines, manifold pressure in the engine, whether or not the engine was actually running at the time of impact, whether there was a steering input being applied, whether the vehicle was in panic braking, or any of those kinds of factual information.
  • this type of technology is very useful and can be employed as a logical extension of what already exists in on-board computers that are monitoring engines and systems in cars, adding telemetry and external inputs, such as video and/or sensing apparatus.
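The "black box" recording described in the bullets above amounts to a fixed-size ring buffer that continuously overwrites its oldest entries and is frozen on an accident trigger, preserving the minutes leading up to the event. The sketch below is illustrative only; the buffer capacity and trigger mechanism are design options, as the text notes.

```python
# Sketch of black-box recording: frames (video and/or telemetry) go into
# a ring buffer that drops its oldest entries; on impact, recording is
# frozen so the pre-accident context survives for investigators.
from collections import deque

class BlackBox:
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)  # oldest entries drop off
        self.frozen = False

    def record(self, frame):
        if not self.frozen:
            self.frames.append(frame)

    def on_impact(self):
        self.frozen = True                    # preserve the buffer as-is

box = BlackBox(capacity=3)
for t in range(5):
    box.record(f"frame-{t}")
box.on_impact()
box.record("frame-after-impact")  # ignored: buffer is frozen
print(list(box.frames))  # ['frame-2', 'frame-3', 'frame-4']
```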

Abstract

An apparatus and method is provided to allow an operator of a vehicle to have an early warning of traffic conditions that may affect his or her travel: traffic conditions in the immediate vicinity of the operator's vehicle, and traffic conditions somewhat further from the immediate vicinity of the operator's vehicle but still of significance to the operator at the time. An electronic vehicle safety early warning system includes a receiver which receives content signals representative of surrounding traffic conditions, a processing subsystem which processes the signals to format the content signals for display, and a display apparatus which provides a visual display presentation of the surrounding traffic conditions.

Description

METHODOLOGY, APPARATUS, AND SYSTEM FOR ELECTRONIC VISUALIZATION OF TRAFFIC CONDITIONS
RELATED APPLICATIONS Not Applicable
FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT Not Applicable.
BACKGROUND OF THE INVENTION:
This invention relates to a methodology, apparatus, and system for electronic visualization of traffic conditions and in-vehicle as well as external-to-vehicle-based subsystems and networked combinations thereof.
Users often travel in private vehicles such as cars (buses, trucks, vans, etc.) or boats which are confined to certain rights-of-way, namely roads or waterways. The nature of these rights-of-way is that the individual vehicles tend to accumulate in patterns comprising traffic, where one present vehicle is said to be following a vehicle that is in front of it; likewise there are vehicles behind the present vehicle. Even though each vehicle may have its own individual destination, the path from an origination point to the destination point often involves many vehicles travelling over substantially the same right of way. A common example of this would be commuters driving into work via expressways or other highways, where they join the highway at some point close to their origination point and exit the highway at some point close to their destination, meanwhile sharing the highway with a plurality of other vehicles also going in the same general direction.
As any commuter is familiar with, the collection of vehicles - especially during peak travel times - becomes a difficult traffic problem to navigate. Often the operator of a vehicle finds himself or herself stuck immediately behind a stopped vehicle in a solid mass of traffic without knowing what the circumstances are that caused the traffic to be stopped and whether there is any action that the operator can take that would improve that operator's ability to get to their destination in a timely manner. Prior art solutions generally involved "rubbernecking", which is the process of the operator trying to see outside of his or her vehicle in order to determine which lane is the optimal lane to be in; listening to radio traffic reports, which are relayed from mass media concerns such as news radio stations; and alternatively using the newer mobile or cellular phones to call an information traffic service.
The problem with the prior art is that, in the case of the operator simply trying to see what is around his or her vehicle, the scope of visibility is too limited. The operator can only see a short distance in any direction from his or her vehicle and that line of sight may be blocked by other vehicles that happen to be in the way, or structures, or bends of the road preventing the operator from actually getting any useful information out of looking ahead.
The problems with mass media and traffic services are the opposite. The ability to look anywhere within a road system is certainly available, but the information, because it has to be presented to a mass market, is not detailed enough for a given vehicle operator. For example, a traffic report might indicate that a particular highway is busy from point A to point B. Often that traffic report information is delayed by the amount of time that it takes the report to arrive at the news service, be properly added to their transmission program, and subsequently be relayed to the operators of vehicles, during which time the road situation may have changed. Furthermore, the vehicle operator may only be interested in the ability to get to point C, where reported points A and B are irrelevant.
Another problem with the prior art is that if an operator is unfamiliar with the traffic in a given area, for instance a new visitor or professional driver that has just arrived in a city, he or she may not be familiar with the local colloquialisms and common names for roadways that otherwise appear in an atlas as route numbers. For example, a visitor to Washington, D.C. might have difficulty knowing which road is the Beltway if they have never been to Washington before. Similarly, in the Bay area of California, finding the Almaden expressway, or in the Chicago area, finding the Eisenhower expressway, can be difficult for operators of vehicles that have never been to those particular locations before, and invariably traffic reports are given in terms of the common names for the rights of way. Due to the limited market, these kinds of information are not available in certain areas that might be more rural or not serviced by a large news organization, or rights of way that are not as commonly traversed such as waterways. As a result, the existing solutions for the problem of informing operators of road conditions that may affect their travel fall short of the ideal of giving the operator the amount of detail of information and the amount of resolution of information that would result in the operator having a better operating experience and getting to his or her destination in a more efficient manner.
SUMMARY OF THE INVENTION: In accordance with the present invention, an apparatus and method is provided to allow an operator of a vehicle to have an early warning of traffic conditions that may affect his or her travel, traffic conditions in the immediate vicinity of the operator's vehicle and traffic conditions that are a little bit further than the immediate vicinity of the operator's vehicle but still of significance to the operator at the time. It is another object of the present invention to convey detailed information about very localized traffic conditions to persons other than operators of vehicles, such as emergency vehicle crews, or persons involved in the collecting of information about traffic conditions for relay, for example to the news stations previously mentioned.
It is another object of the invention to provide an apparatus and a method that allows traffic conditions in a defined (e.g., forward) direction of travel of the user, or ideally in any direction of travel from an operator's vehicle, to be relayed to the user(s), such as from other vehicles, or possibly from fixed or stationary points located at some distance from the operator's vehicle or from the user. This relay process can be direct from the source of the images, or data about traffic conditions, or it may be relayed through a series of steps.
These and other aspects and attributes of the present invention will be discussed with reference to the following drawings and accompanying specification.
BRIEF DESCRIPTION OF THE DRAWINGS: FIG. 1 is a system diagram of one embodiment in accordance with the present invention; FIG. 2 is a structural diagram of an image sensor in accordance with the present invention;
FIG. 3 is a structural drawing of an image sensor wherein camera body 300 comprises image sensor 310 and lens 320 and an infrared filter 325; FIG. 4 shows another embodiment of the invention indicating the use of a vehicle speed sensor input to control the retractable support;
FIG. 5 shows a detailed view of an alternate embodiment of the pan and tilt mechanism of the present invention;
FIG. 6 is a schematic diagram of one embodiment of the present invention in operation;
FIG. 7 is a schematic drawing of another embodiment of the present invention in operation, where road hazard 700 exists on a thoroughfare ahead of a first vehicle 745, a second vehicle 765 and a third vehicle 775;
FIG. 8 is an illustration of an example of a display provided in accordance with the present invention, illustrating multiple images from ahead in the traffic flow and operator controls;
FIG. 9 is a detailed schematic illustration of a retractable support as in one embodiment of the present invention;
FIG. 10 is an illustration showing additional detail of the mechanism of FIG. 9 wherein the retractable support comprises a retractable support housing 1040 that emits a retractable support column 1040, upon which is mounted a pan and tilt means 1020, which is in turn affixed to an image sensing means 1000 having an angular field of view 1010;
FIG. 11 is an even more detailed view of the image sensing means and the pan and tilt means of FIG. 10;
FIG. 12 is a schematic illustration showing one embodiment of the present invention utilizing a speed sensor input similar to the one shown in FIG. 4;
FIG. 13 shows an alternate embodiment of the speed sensing retractable support controller means similar to those shown in FIGS. 12 and 4; FIG. 14 shows an alternate embodiment of the retractable support wherein the retractable support 1420 variably positions the image sensing means 1410 off to the side of vehicle 1400;
FIG. 15 shows one embodiment of the present invention wherein the retractable support structure is activated by means of hydraulic piston;
FIG. 16 shows a schematic drawing of a multi-sensor apparatus as in one embodiment of the present invention wherein there are a plurality of image sensing means 1610, 1615, and 1620, and other sensing means 1625 coupled to processor 1630;
FIG. 17 shows an embodiment of the present invention as utilized on watercraft; FIG. 18 shows details of one implementation of the mounting means as in FIG.
17;
FIG. 19 shows one embodiment of the present invention wherein the system comprises a receiver 1970 coupled to a processing means 1930 which is in turn coupled to a display means 1940 within the vehicle 1900; FIG. 20 shows an alternate embodiment of the present invention wherein the system comprises an antenna 2060 coupled to receiving means 2070, which in turn is coupled to processing means 2030 which is, in turn, coupled to a display means 2040;
FIG. 21 shows another embodiment of the present invention as used in a central station; FIG. 22 shows an embodiment of the present invention wherein the image sensing means 2210 is affixed to vehicle 2200 via the fixed support 2220;
FIG. 23 shows another embodiment of the present invention, wherein the image sensing means 2310 may be operatively elevated above the position of vehicle 2300 via the support element 2320; FIG. 24 shows a schematic of one multiple vehicle embodiment of the present invention;
FIG. 25 illustrates an alternate embodiment of the first vehicle as shown in FIG. 24, wherein the combination image sensing means and transmission means 2590 is affixed to vehicle 2500 via support 2520; FIG. 26 shows an alternate embodiment of the present invention wherein emergency vehicle 2600 is equipped with an image sensing and transmission apparatus 2690 which provides a transmitted signal 2695;
FIG. 27 shows an alternate embodiment of the present invention wherein a number of image sensing means 2710, 2720, 2730 are arranged at strategic locations along a roadway 2700;
FIG. 28 illustrates schematically the central station 2805 comprising a variety of sources of information including textual information 2800 and visual information 2810;
FIG. 29 is an example embodiment of the present invention, wherein an image sensing means 2910 is affixed via support structure 2920 to a road sign 2930 located adjacent the right-of-way 2940;
FIGS. 30A and 30B are representative examples of the types of display information that may be presented in accordance with one embodiment of the present invention; FIG. 31A is a schematic representation of an alternate embodiment of the present invention wherein a combination image sensing and transmission apparatus 3110 is located adjacent the roadway 3170;
FIG. 31B is an alternate embodiment to that depicted in FIG. 31A wherein the image sensing means 3115 is directly coupled to a processing means 3145 which is in turn directly coupled to billboard 3150, thereby providing for integrated information display responsive to the image detected by image sensing means 3115 of the roadway 3180;
FIG. 32 shows one embodiment of the signal receiving and processing element of the present invention; FIG. 33 represents a simplified block diagram of the receiving and processing systems of the present invention;
FIG. 34 shows one embodiment of the present invention illustrating detail of the image processing and formatting means 3350 as shown in FIGS. 32 and 33;
FIG. 35 is a schematic representation of a multiple transmitter embodiment of the present invention wherein the user's vehicle 3500 receives a plurality of signals via antenna 3510 from sources including a first transmitter 3520 and a second transmitter 3530;
FIG. 36 shows a schematic illustration of the operation of an embodiment of the present invention wherein a building 3630 blocks the view of roadway 3640 some distance from the user's vehicle 3600;
FIG. 37 shows a schematic representation of the one embodiment of the present invention as used in watercraft;
FIG. 38A is a schematic representation of traffic congestion involving an emergency vehicle 3800, the user's view of which is blocked; FIG. 38B shows a representation of one possible display as shown by display means 3850 of the present invention wherein the information received by image sensing means 3830 is directly displayed for the user;
FIG. 38C shows a different display of the present invention that would be presented to the user on display means 3850 which comprises a schematic view derived from information received from image sensing means 3830 and processed to produce the overhead schematic view and other user information as shown in the illustration;
FIGS. 39A and 39B represent an alternate embodiment of the system as depicted in FIGS. 38A, 38B, and 38C wherein the emergency vehicle 3800 is again blocked from direct view of the user in vehicle 3810 by intermediate vehicle 3820; FIG. 40 is another example embodiment of the present invention wherein a first vehicle 4000 comprises a support means 4010, image sensing means 4020, and a transmission means 4030;
FIG. 41 shows one embodiment of the present invention which utilizes multiple display means, 4110, 4120, and 4130, all coupled to processing means 4140; FIG. 42 shows one embodiment of the present invention that supports a relay mode of operation;
FIGS. 43A and 43B show alternate embodiments of the relay mode of operation;
FIG. 43C illustrates a flow chart for the relay operation handoff for FIGS. 43A and 43B; FIG. 44 shows an example of a display in accordance with the present invention, wherein a road map is shown indicating the location of the user's vehicle 4400, locations of junctions 4410 superimposed on the road map, and known emergency vehicle locations 4430; and
FIG. 45 illustrates a display provided responsive to information from any of the plurality of received signals, and illustrates a directional indication 4510 superimposed on the display indicating to the user that the most efficient way to bypass this particular congestion would be, in the illustrated example, to drive to the left.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT:
While this invention is susceptible of embodiment in many different forms, there is shown in the drawing, and will be described herein in detail, specific embodiments thereof with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the invention to the specific embodiments illustrated.
FIG. 1 is a system diagram of one embodiment of the present invention. Vehicle 100 comprises sensing means 110 affixed via support means 120 to vehicle 100, wherein the sensing means couples data to processing means 130 and display means 140. Data sensed by sensing means 110 and processed by the processing means 130 is relayed via logical display link 150 to display means 140 for display to the user. The processing means can alternatively be positioned with the display to receive and process the sensed data for display.
The display means may comprise a CRT display such as manufactured by major consumer electronics manufacturers including Sony Corp. (Japan), JVC Corp. (Japan), Zenith Corp. (Chicago, IL), and ASC Systems (St. Claire Shores, MI). Other display technologies include liquid crystal displays, available through vendors such as NEC Electronics (Santa Clara, CA), Toshiba America Electronic Components (Irvine, CA), and King Bright Corp. (City of Industry, CA); plasma displays, available from ASC Systems and Displays Inc. (Lewiston, PA); and LED displays, such as provided by Colorado Microdisplay Inc. (Boulder, CO), King Bright Corp., and Bivar Inc. (Irvine, CA).

FIG. 2 illustrates a structural diagram of an image sensor in accordance with the present invention. Camera body 200 comprises image sensor means 210, which may be a CCD array or other image sensing device. Camera body 200 also comprises lens 220. Sensing of a physical object 230 is performed by lens 220 gathering reflected light from the object 230 and projecting the image 240 of that object onto the image sensing means 210. In a preferred embodiment of the invention, the image sensing means further comprises a moveable filter as shown in FIG. 3.

FIG. 3 is a structural drawing of an image sensor wherein camera body 300 comprises image sensor 310, lens 320, and an infrared filter 325. In operation, the infrared filter 325 may be in a bypass position 330 or in an active position 340, rotating pivotally around pivot 350. In this preferred embodiment, the CCD array, which is sensitive to both the visible light spectrum and the infrared spectrum, may optionally have the infrared filter interposed between the sensor and lens assembly 320, thereby restricting or eliminating the infrared spectrum arriving at the image sensor and limiting the sensor to sensing only visible light.
Alternatively, the filter may be deployed in the bypass position, thereby allowing infrared energy to travel through lens 320 and impinge upon sensor 310. CCD arrays suitable for this invention are available from a number of commercial sources, including Edmund Scientific (Barrington, NJ), Hewlett Packard (San Jose, CA), Samuel Video Components USA (Rochelle Park, NJ), and Orbit Semiconductor (Sunnyvale, CA).
FIG. 4 shows another embodiment of the invention indicating the use of a vehicle speed sensor input to control a retractable support. Vehicle 400 comprises a sensing means 410, a retractable support means 420, a processing means 430, a display means 440, and a wiring harness 460. Image data is processed for display by the processor 430 and is coupled (wirelessly or via wiring harness) to the display 440. Alternatively, data can be conveyed via logical display link 450 from sensing means 410 to display 440. As shown in FIG. 4, wheel speed sensor 470, or engine computer 480, may be coupled via wiring harness 460 to the retractable support means 420. This permits either or both of wheel sensor 470 and engine computer 480 to operatively control retractable support means 420 responsive to vehicle speed.
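The speed-responsive control of the retractable support described above (and elaborated with example thresholds of 5 mph and 30 mph in connection with FIG. 12) might be sketched in software as follows. This is an illustrative sketch only; the function name, the mode flag, and the three-position model are assumptions, not part of the disclosed apparatus.

```python
# Sketch of speed-responsive retractable support control.
# The 5 mph and 30 mph thresholds come from the specification's examples;
# everything else (names, the "staged" mode) is illustrative.

RETRACT_ABOVE_MPH = 5    # example 1: keep camera retracted above 5 mph
PARTIAL_BELOW_MPH = 30   # example 2: preset deployment point under 30 mph

def support_position(speed_mph, mode="retract_when_fast"):
    if mode == "retract_when_fast":
        # Example 1: retract whenever the vehicle exceeds 5 mph.
        return "retracted" if speed_mph > RETRACT_ABOVE_MPH else "deployed"
    # Example 2: deploy to a preset point under 30 mph,
    # then deploy even further above 30 mph.
    return "preset" if speed_mph < PARTIAL_BELOW_MPH else "fully_deployed"

a = support_position(3)                  # slow: camera may be deployed
b = support_position(40)                 # fast: camera retracted
c = support_position(20, mode="staged")  # under 30 mph: preset point
d = support_position(45, mode="staged")  # over 30 mph: full extension
```

The thresholds being "programmably determined," as the text notes, corresponds here to the two module-level constants being configurable rather than fixed.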
FIG. 5 shows a detailed view of an alternate embodiment of a pan and tilt mechanism in accordance with the present invention. Camera body 500 is mounted on an assembly comprising a tilt mechanism and a pan mechanism. The tilt mechanism comprises a tilt motor 520 driving a tilt pinion 530 which operates in cooperation with a tilt rack 540. The pan motor 550 operates a pan pinion 560 which operates in cooperation with a pan rack 570. The support means comprising the pan and tilt mechanism is attached to support 580. Operating the tilt motor 520 in a forward direction will tilt the camera body 500 through a tilt axis, thereby positioning lens assembly 510 of the camera body 500 to view or sense image data above the center line of sight of the camera. Operating the tilt motor 520 in a backward direction will tilt the camera body 500 through a tilt axis, thereby positioning lens assembly 510 of the camera body 500 to view or sense image data below the center line of sight of the camera. Pan motor 550 permits the image sensor assembly 500 to rotate around a vertical axis coincident with the axis of support 580, thereby allowing the camera body 500 and lens assembly 510 to face in various heading directions with respect to the support 580. Motors and actuators suitable for use with the pan and tilt mechanism of the present invention are available from a number of manufacturers including vendors such as MGC Inc.
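The forward/backward tilt and the pan rotation described above might be modeled as follows; this is a hedged sketch in which the class name, the step size, and the command strings are assumptions for illustration, not the disclosed mechanism.

```python
# Illustrative model of the FIG. 5 pan/tilt head: "forward" tilts the lens
# above the center line of sight, "backward" tilts it below, and panning
# rotates the camera about the vertical axis of the support.

TILT_STEP_DEG = 1.0  # degrees of tilt per motor increment (assumed)

class PanTiltHead:
    def __init__(self):
        self.tilt_deg = 0.0  # + above / - below the center line of sight
        self.pan_deg = 0.0   # heading relative to the support axis

    def run_tilt_motor(self, direction):
        # Forward raises the line of sight; backward lowers it.
        step = TILT_STEP_DEG if direction == "forward" else -TILT_STEP_DEG
        self.tilt_deg += step

    def run_pan_motor(self, degrees):
        # Rotate about the vertical axis; wrap heading into [0, 360).
        self.pan_deg = (self.pan_deg + degrees) % 360.0

head = PanTiltHead()
head.run_tilt_motor("forward")
head.run_tilt_motor("forward")
head.run_tilt_motor("backward")  # net tilt: one step above center line
head.run_pan_motor(-90)          # rotate 90 degrees counterclockwise
```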
FIG. 6 is a schematic diagram of one embodiment of the present invention in operation. A road hazard 600 is present on a thoroughfare ahead of a first vehicle 610, a second vehicle 650, and an intermediate vehicle 660. The first vehicle 610 comprises a first sensing means 620 and a first transceiver means 630. The first sensing means 620 is coupled to a first display means 623 via first logical display link 626. Data corresponding to information sensed by the first sensing means 620 is transmitted by the first transceiver means 630 as a first transmitted signal 635. The second vehicle 650 comprises a second sensing means 640 and a second transceiver means 670, both of which are operatively coupled via a second logical display link 690 to a second display means 680 within second vehicle 650. The first transmitted signal 635 carries information content that is displayed on the second display means 680 responsive to reception by the second transceiver means 670. This permits a display generation on the second display means 680 irrespective of the fact that intermediate vehicle 660 may be blocking a direct line of sight between the second vehicle 650 and the road hazard 600. Radio frequency transceivers suitable for use with the present invention are available from a variety of vendors including Motorola Corp. (Schaumburg, IL).

FIG. 7 is a schematic drawing of another embodiment of the present invention in operation, where road hazard 700 exists on a thoroughfare ahead of a first vehicle 745, a second vehicle 765, and a third vehicle 775. A stationary image sensing means 720, additionally comprising a stationary transceiver means 730, is fixed to stationary support 710, which in the illustrative embodiment can be a road sign. The stationary transceiver means 730 transmits a transmitted signal 735 in the direction opposite to that of traffic flow, and/or to a logical central controller which displays and/or rebroadcasts the signal.
The first vehicle 745 comprises a first vehicle transceiver 750 and a first image sensing means 740. The first vehicle 745 also comprises a first vehicle display 748 coupled to the first vehicle transceiver 750 and the first image sensing means 740. The first vehicle's transceiver 750 produces a first vehicle's transmission 755 which is transmitted opposite of traffic flow. Second vehicle 765 comprises a second vehicle reception and sensing means 760 logically linked to second vehicle's display 768. The second vehicle's reception and sensing means 760 has the ability to receive the first vehicle's transmission 755 and the stationary transmitted signals 735 and to discriminate between the two received signals and to generate a display on the second vehicle's display 768 responsive to the data in the transmitted signal 735 and the data in the first vehicle's transmission 755. In an analogous manner, the third vehicle 775 has a third vehicle's reception and sensing means 770 coupled to the third vehicle's display 778 and is also capable of independently receiving and discriminating transmitted signals 735 and first vehicle's transmission 755 for generating a display responsive thereto. The selection and discrimination means is detailed later in FIG. 32.
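One way the reception and discrimination described above could be modeled is sketched below. The source tags, the priority ordering, and the selection rule are illustrative assumptions, not the actual selection and discrimination circuit detailed in FIG. 32.

```python
# Illustrative sketch: a receiver holding several decoded transmissions
# (e.g., stationary signal 735 and first vehicle's transmission 755)
# discriminates among them and selects one for display generation.

def select_for_display(received, priority=("stationary", "vehicle")):
    """Return the highest-priority valid signal, keyed by its source tag."""
    by_source = {sig["source"]: sig for sig in received if sig.get("valid")}
    for source in priority:
        if source in by_source:
            return by_source[source]
    return None  # nothing decodable was received

signals = [
    {"source": "vehicle", "payload": "view from vehicle 745", "valid": True},
    {"source": "stationary", "payload": "view from road sign 710", "valid": True},
]
chosen = select_for_display(signals)
```

Under this assumed priority rule, the stationary roadside feed is preferred when both signals decode cleanly; a real discriminator could equally weigh signal quality or distance to the hazard.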
FIG. 8 is an illustration of a display in accordance with the present invention, illustrating multiple images from ahead in the traffic flow and operator controls. The display means 800 includes a display comprised of a textual information window 810, a first visual image 820, and a second visual image 830. The first visual image 820 may have a first legend 822 superimposed upon said image. In an analogous fashion, the second visual image 830 may have a second legend 832 superimposed upon it. Legends 822 and 832 may be produced by a character generator circuit such as available from Hyperception Inc. (Dallas, TX). Alternatively, general purpose display driver integrated circuits are available from companies including Telcom Semiconductor, Inc. (Mountain View, CA), Silicon Motion, Inc. (San Jose, CA), Toshiba America Electronic Components (Irvine, CA), Harris Semiconductor (Melbourne, FL), Chip Supply, Inc. (Orlando, FL), Maxim Integrated Products (Sunnyvale, CA), and NEC Electronics (Santa Clara, CA). The legends 822 and 832 may comprise identification indicating the source of the visual image and/or may additionally comprise other information, such as information transmitted by the source or generated internally by the processing means of the present invention. The user controls of display means 800 include a tilt control 840, a pan control 850, a zoom control 860, and a focus control 870. In the illustrated embodiment, these controls are shown as thumbwheels permitting the user to easily adjust, respectively, the tilt, pan, zoom, and focus of the image sensing means. In an alternate embodiment, these controls may be implemented as buttons, or as graphical user interface elements on an interactive touch screen display.
Touch screens are commercially available from a number of manufacturers, including Microtouch Systems, Inc. (Methuen, MA), Omron Electronics (Schaumburg, IL), MSI Technologies LLC (Englewood, CO), Advanced Input Devices (Coeur d'Alene, ID), Micro Mo Electronics (Clearwater, FL), Nidec America (Canton, MA), and Dado Corp. (Summerset, NJ).
FIG. 9 is a detailed schematic illustration of a retractable support as in one embodiment of the present invention. Vehicle 900 comprises a retractable support means 940 which provides for positioning an image sensing means 910 in either a retracted position 920 or a deployed position 930.
FIG. 10 is an illustration showing additional detail of the mechanism of FIG. 9 wherein the retractable support comprises a retractable support housing 1040 from which extends a retractable support column 1040, upon which is mounted a pan and tilt means 1020, which is in turn affixed to an image sensing means 1000 having an angular field of view 1010.
FIG. 11 is an even more detailed view of the image sensing means and the pan and tilt means of FIG. 10. As shown in FIG. 11, the image sensing means 1100 has a first line of sight 1110 and a second line of sight 1120, with a tilt range 1130 between the first and second lines of sight, respectively. In a similar fashion, the image sensing means has a pan range 1140 as illustrated in FIG. 11. The image sensing means 1100 is thereby provided means to tilt the line of sight within the tilt range 1130 and to pan in different directions within the pan range 1140 relative to an axis extending through support column 1150. The tilt mechanism permits the image sensing means to look up and look down, and the pan means permits the image sensing means to rotate and essentially "look around."
FIG. 12 is a schematic illustration showing one embodiment of the present invention utilizing a speed sensor input similar to the one shown in FIG. 4. As shown, vehicle 1200 comprises a sensing means 1210 affixed via a retractable support means 1220 to the vehicle 1200. Data from the sensing means 1210 is relayed via the sense data relay link 1225 to the processing means 1230. The processing means then produces image data responsive to the data from the sensing means and relays the image data via an image data relay link 1235 to display means 1240. The vehicle's speed sensing means 1250, which in the illustrated embodiment is a vehicle wheel speed sensor, is coupled to a retractable support controller 1260. The retractable support controller 1260 issues a control signal via control signal link 1270 to the retractable support means 1220, which responds operatively thereto. In the illustrated embodiment, the operation of the retractable support means is thereby responsive to the speed of the vehicle as detected by the vehicle speed sensing means 1250. In practice, this can be used to either deploy or retract the camera as the vehicle speed crosses certain thresholds which are programmably determined. For example, the system may prevent the image sensing means from being deployed on the retractable support by instructing the retractable support to retract if the vehicle's speed is above 5 mph. In an alternative embodiment, the retractable support may deploy to a preset point if the vehicle's ground speed is under 30 mph, and deploy even further if the vehicle's speed is above 30 mph.

FIG. 13 shows an alternate embodiment of the speed sensing retractable support controller means similar to those shown in FIGS. 12 and 4. The image sensing means 1310 is affixed via a retractable support means 1320 to the vehicle 1300. The vehicle 1300 also comprises processing means 1330 and display means 1340.
The image sensing means 1310 additionally comprises a windspeed detection apparatus 1350 coupled to a windspeed detection controller 1360. The windspeed detection controller 1360 issues control signals to the retractable support 1320 via link 1370. The link 1370 permits the operation of the retractable support 1320 to be controlled responsive to the detected ambient wind condition. The link 1370 can provide direct control (wired or wireless) of the retractable support, or can be coupled to processing means 1330 which controls the retractable support. In the illustrative embodiment, this can give a more accurate indication of the true wind force that might be acting upon the image sensing means 1310 and the retractable support 1320, because it includes information that would not be available from an engine computer or ground speed sensor as shown in FIGS. 4 and 12.

FIG. 14 shows an alternate embodiment of the retractable support wherein the retractable support 1420 variably positions the image sensing means 1410, illustrating the option of positioning the image sensing means off to the side of vehicle 1400.
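The wind-responsive control described for FIG. 13 might be sketched as follows. The wind limit, the hysteresis band, and the function name are assumptions for illustration only; the specification does not disclose particular values.

```python
# Illustrative sketch of wind-responsive support control per FIG. 13:
# retract when measured airspeed at the sensor exceeds a safe limit,
# and redeploy only once the wind has dropped well below that limit.

SAFE_WIND_MPH = 45.0   # assumed structural limit for the deployed support
HYSTERESIS_MPH = 5.0   # assumed band to avoid oscillating near the limit

def support_command(wind_mph, currently_deployed):
    if currently_deployed and wind_mph > SAFE_WIND_MPH:
        return "retract"
    if not currently_deployed and wind_mph < SAFE_WIND_MPH - HYSTERESIS_MPH:
        return "deploy"
    return "hold"

cmd1 = support_command(50.0, currently_deployed=True)   # over the limit
cmd2 = support_command(42.0, currently_deployed=True)   # within the limit
cmd3 = support_command(38.0, currently_deployed=False)  # calm enough again
```

The hysteresis band reflects the text's point that the measured airspeed captures gusts and headwinds an engine computer or wheel sensor cannot see, so the raw reading may fluctuate around any single threshold.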
FIG. 15 shows one embodiment of the present invention wherein the retractable support structure is activated by means of a hydraulic piston. Image sensing means 1510 is affixed to one end of a boom 1520 which is affixed at its opposite end 1522 to vehicle 1500. Hydraulic cylinder 1524, positioned between vehicle 1500 and boom 1520, permits the boom 1520 to be raised and lowered around pivot 1522 with respect to the vehicle. Tilt means 1526 compensates for the motion of the raising and lowering of the boom by rotating image sensing means 1510 correspondingly. Image sensing means 1510 provides data to processing means 1530, which then provides image data to display means 1540 for display to the user.

FIG. 16 shows a schematic drawing of a multi-sensor apparatus as in one embodiment of the present invention wherein there are a plurality of image sensing means 1610, 1615, and 1620, and other sensing means 1625 coupled to processor 1630. Processor 1630, responsive to signals from the various sensing means, produces a display output signal for display to the user on display means 1640. As shown in the illustration, some of the image sensing means (such as image sensing means 1610) may be supported on the vehicle 1600 via retractable support 1650. Alternatively, some of the image sensing means (for example, 1615 and 1620, as illustrated) are supported via a fixed support means (respectively, 1660 and 1670). In the illustrated embodiment, sensing means 1625 is an antenna connected to the receiving means 1680. The antenna may be fixed or retractable. The processing means 1630 includes means to discriminate the various signals supplied to it from the various sensing means and to select portions of those signals for integration into a display presentation for the user.
FIG. 17 shows an embodiment of the present invention as utilized on watercraft. Watercraft 1700 is shown with image sensing means 1710 mounted on support mast 1720. The image sensing means 1710 is coupled to processing means 1730 and thence to display means 1740 for producing an integrated display presentation for the user.
FIG. 18 shows details of one implementation of the mounting means as in FIG. 17. Image sensing means 1810 mounted on support mast 1820 is capable of the pan and tilt motions as shown previously in FIGS. 5 and 11. The tilt mechanism permits locating the image sensing means line of sight 1870 variably up and down through range 1850 as shown in the illustration. The pan mechanism, as previously disclosed, permits the image sensing means line of sight to be rotated through a range 1830 in the direction 1840 as shown by the arrows. However, given the nature of watercraft, an additional axis of freedom is supplied, namely the ability to rotate about the axis of the line of sight 1870 in the directions 1860 as shown by the arrows. By permitting the image sensing means 1810 to be additionally adjusted in the direction of rotation 1860, the image sensing means 1810 can compensate for the variable roll of the watercraft.
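The roll compensation described above amounts to counter-rotating the sensor about its line of sight by the hull's roll angle, clamped to whatever travel the mechanism allows. A minimal sketch, with the travel limit assumed purely for illustration:

```python
# Illustrative sketch of watercraft roll compensation per FIG. 18:
# rotating the sensor by the negative of the hull's roll angle keeps
# the horizon level in the sensed image, within the mechanism's travel.

def roll_compensation(hull_roll_deg, max_travel_deg=30.0):
    """Counter-rotation command, clamped to the assumed travel range."""
    target = -hull_roll_deg
    return max(-max_travel_deg, min(max_travel_deg, target))

r1 = roll_compensation(12.0)   # hull rolled 12 deg: counter-rotate -12 deg
r2 = roll_compensation(-50.0)  # beyond the travel range: clamp at the limit
```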
FIG. 19 shows one embodiment of the present invention wherein the system comprises a receiver 1970 coupled to a processing means 1930 which is in turn coupled to a display means 1940 within the vehicle 1900. Also coupled to the receiving means 1970 is antenna 1960. The antenna 1960 provides for receiving transmissions from sources external to the vehicle, decoding of the received transmissions via the receiver 1970, and passing said decoded data to processing means 1930 for integration into a display presentation for the user.
FIG. 20 shows an alternate embodiment of the present invention wherein the system comprises an antenna 2060 coupled to receiving means 2070, which in turn is coupled to processing means 2030 which is, in turn, coupled to a display means 2040. In the illustrated embodiment, signals are received by antenna 2060 from sources outside the present system and are decoded by receiving means 2070 producing data that is then relayed to processing means 2030 for integration into a display presentation on display 2040 to the user. As shown in the drawing, the system is not required to be physically present within a vehicle.
FIG. 21 shows another embodiment of the present invention as used in a central station. A plurality of signal receiving means (such as 2160, 2165, and 2168) are coupled to a receiving decoder 2170. Examples include a radio frequency antenna 2160, a microwave antenna 2165, and telecommunications links 2168. The receiving and decoding means 2170 relays received and decoded data to the processing means 2130. The processing means 2130 generates data for a plurality of individual display elements 2145 that comprise the display means 2140. This permits an operator to receive information communicated from a variety of sources (as shown in the illustrated example, telecommunications, radio frequency, and microwave links) and to have the system integrate that data and present it for display. The system of the present invention is not limited to the communications link varieties as shown. Other additional communication links include infrared communications, satellite communications, fiber optics, etc. Examples of manufacturers providing components for microwave communications include Sprague-Goodman Electronics (Westbury, NY), Atlantic Coast Instruments (Brick, NJ), and Microwave Communication Laboratories (St. Petersburg, FL). The individual display elements 2145 in the illustrated embodiment may comprise any combination of CRT displays, LED displays, LCD displays, plasma displays, or other types of displays.
FIG. 22 shows an alternate embodiment of the present invention wherein the image sensing means 2210 is affixed to vehicle 2200 via the fixed support 2220. The image sensing means 2210 relays signals to processor 2230, which decodes the signals and produces an integrated display presentation for display to the user on display 2240. In the illustrative embodiment, the image sensing means 2210 has a fixed field of view 2250 determined by the mounting of image sensing means 2210 on support 2220 and the relative orientation of vehicle 2200.
FIG. 23 shows another embodiment of the present invention wherein the image sensing means 2310 may be operatively elevated above the position of vehicle 2300 via the support element 2320, thereby permitting the field of view 2350 to encompass objects such as vehicle 2360 which would otherwise be invisible to the operator of vehicle 2300 due to interfering vehicle 2340.
FIG. 24 shows a schematic of one multiple vehicle embodiment of the present invention. A first vehicle 2400 comprises an image sensing means 2410, a processing means 2430, a display means 2440, and transmission means 2435. The image sensing means 2410, having a field of view 2450, senses data corresponding to an image and supplies that information to transmitter 2435. Transmitter 2435 transmits said information via transmitted signal 2437 to the antenna 2460 of second vehicle 2490. The signal received via antenna 2460 is relayed through receiving means 2470, which receives and decodes the data in transmitted signal 2437 and provides that data to processing unit 2475. Processing unit 2475 generates a local display presentation for the user which is displayed on display means 2480. In the illustrated embodiment, the image sensing means 2410 and transmission means 2435 are separate means that are coupled.
FIG. 25 illustrates an alternate embodiment of the first vehicle as shown in FIG. 24. The combination image sensing means and transmission means 2590 is affixed to vehicle 2500 via support 2520. The combination image sensing and transmission means 2590 provides a transmitted signal 2595 as well as a signal coupled to processing element 2530. Processing element 2530 produces an integrated display presentation on display means 2540 for the user. As shown in the illustrated embodiment, the signal transmission means and the image sensing means may be combined into one unit.

FIG. 26 shows an alternate embodiment of the present invention wherein emergency vehicle 2600 is equipped with an image sensing and transmission apparatus 2690 which provides a transmitted signal 2695. The transmitted signal 2695 is received by antenna 2660 and decoded by receiver 2670 for processing by processing element 2680 and subsequent display to the user of an integrated display presentation on display means 2685, said display means located in or on second vehicle 2650. This embodiment illustrates the usefulness of the present invention wherein the emergency vehicle 2600 can relay important information regarding traffic or road conditions that are significantly ahead of the user in the vehicle 2650.

FIG. 27 shows an alternate embodiment of the present invention wherein a number of image sensing means 2710, 2720, 2730 are arranged at strategic locations along a roadway 2700. The data relayed from image sensing means 2710, 2720, and 2730 are processed by central station 2740, and the information content is then relayed via transmitter over transmitted signal 2760 to one or more vehicles 2770 equipped to receive said signals. This permits the central station to consolidate information from a variety of sources and to present the consolidated information, or alternatively, the raw information from the image sensing stations, to vehicles on the roadway.
Additionally, or alternatively, the individual sensing means can further comprise transmitter subsystems and provide for direct communication to the vehicles. As shown in FIG. 21, the central station may comprise a plurality of different types of receiving means, and a plurality of individual displays, permitting an operator to observe and control the operation of central station 2740. Also, as shown in FIG. 27, the image sensing sources 2710, 2720, 2730 may be deployed on a wide variety of supports. In the illustrated embodiment, image sensing means 2710 is deployed on top of a building structure 2715 located adjacent to the roadway 2700. Also, as shown in the illustrated example, image sensing means 2720 is located on a sign overpass 2725 which bridges roadway 2700. Alternatively, as shown in the illustrative embodiment, image sensing means 2730 may simply be located adjacent to roadway 2700.
FIG. 28 illustrates schematically the central station 2805 comprising a variety of sources of information including textual information 2800 and visual information 2810. The visual information may be derived from roadside image sensing means, such as those illustrated in FIG. 27, or mobile image sensing means, such as those illustrated in FIG. 26. Additionally, the textual information may be generated internally to the central station 2805, or externally from the central station. The sources of information are assembled by processing unit 2820 into a signal to be transmitted by transmitter 2830. The transmitted signal 2835 conveys any combination of visual and textual information from the central station 2805 to the user's vehicle 2840. The user's vehicle 2840 comprises an antenna 2850 coupled to a receiving and decoding means 2860 and a processing means 2870. The processing means 2870 is further coupled to a display means 2880. The processing means 2870 has the ability to interpret the various types of information conveyed by transmitted signal 2835 and to selectively incorporate elements of that transmitted signal into an integrated display presentation 2890. The integrated display presentation 2890 can comprise one or both of visual information 2893, corresponding to a processed version of the visual content 2810, and an overlay of processed textual information 2896 corresponding to the textual information 2800. The system of the present invention thereby permits the user of the system to discriminate and interpret a wide variety of data transmitted to the user's vehicle.
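The assembly of an integrated display presentation from visual content and a textual overlay, as in presentation 2890, might be sketched as follows. The layered dictionary structure and the function name are illustrative assumptions, not the format used by the disclosed processing means.

```python
# Illustrative sketch: combine the visual and textual elements carried by
# transmitted signal 2835 into a layered display presentation, with the
# text overlaid on the visual content as in presentation 2890.

def build_presentation(visual, textual):
    """Layer any available visual content and textual overlay for display."""
    presentation = {"layers": []}
    if visual is not None:
        presentation["layers"].append({"kind": "visual", "content": visual})
    if textual is not None:
        # Appended last so it renders on top of the visual layer.
        presentation["layers"].append({"kind": "text_overlay", "content": textual})
    return presentation

p = build_presentation("camera view of roadway", "ACCIDENT 2 MI AHEAD")
kinds = [layer["kind"] for layer in p["layers"]]
```

Because either argument may be absent, the same routine also covers the text's "one or both" case: a text-only advisory or a raw camera view each yields a one-layer presentation.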
FIG. 29 illustrates another embodiment of the present invention, wherein an image sensing means 2910 is affixed via support structure 2920 to a road sign 2930 located adjacent the right-of-way 2940. The information content sensed by image sensing means 2910 is then transmitted via transmitter 2960 to vehicles such as the user's vehicle 2980, via the transmitted signal 2970. The transmitted signal 2970 is received, decoded, and processed within the user's vehicle 2980 and displayed on display 2990. In the illustrated embodiment, the information content relayed from image sensing means 2910 and thus displayed on the user's display means 2990 permits the user to make a decision on whether to stay on right-of-way 2940 or to take exit 2950 responsive to the traffic conditions that are thus visually observed on display means 2990. FIGS. 30A and 30B are representative examples of the types of display information that may be presented by an embodiment of the present invention. FIG. 30A illustrates the display of a schematic overview 3010 of traffic congestion in the vicinity of road hazard 3020. In addition to the schematic representation 3010, other areas of information that may be included in the display comprise a warning or advisement area 3045, and a user control area 3055. The warning or advisement area 3045 may comprise urgent warnings displayed in a textual or graphic form 3030 and helpful information displayed independently in a textual or graphic form 3040. Additionally, the user control area 3055 may comprise a plurality of user controls 3050, which control the operation of the system and/or modify attributes of the display presentation. In the illustrated example, these user controls include the ability to display distances in miles or kilometers.
FIG. 30B shows an alternate display presentation of substantially the same information as shown in FIG. 30A wherein the display comprises a visual representation 3070 of the area in which the traffic congestion occurs responsive to road hazard 3020. In the illustrated embodiment, superimposed on top of the visual presentation 3070 are useful warnings in a textual or graphical form including speed warning 3030 and directional information 3040. Additionally, user control elements such as user control 3050 may be superimposed on top of the display presentation.
FIG. 31A is a schematic representation of an alternate embodiment of the present invention wherein a combination image sensing and transmission apparatus 3110 is located adjacent the roadway 3170. The apparatus 3110 transmits data representative of the image sensed by the image sensing means via transmitted signal 3120 to receiver 3130. Receiver 3130 relays the transmitted and decoded data to processing means 3140 which produces an integrated display presentation for display on video billboard 3150. Video billboard 3150 then provides a display usable by users in one or more vehicles 3160 on the road 3170, thereby giving the operators of said vehicles advance warnings of traffic conditions as observed by image sensing means 3110. Note that in the illustrative embodiment, no portion of the system need be located within or on the user's vehicle. In addition, the transmitted signal 3120 can be received by a receiver within a vehicle and processed and displayed within the vehicle (such as shown in FIGS. 1, 6, 7, 8, etc.)
FIG. 31B illustrates another embodiment of the invention similar to that depicted in FIG. 31A wherein the image sensing means 3115 is directly coupled to a processing means 3145 which is in turn directly coupled to billboard 3150, thereby providing an integrated information display to the users responsive to the image of the roadway 3180 detected by image sensing means 3115. Again, as in FIG. 31A, note that no portion of the present invention needs to be physically present in any of the users' vehicles.
FIG. 32 shows one embodiment of the signal receiving and processing element in accordance with the present invention. As shown in FIG. 32, one or more antennas 3200 and 3201 are coupled via connections 3205 and 3206 to receiver/decoder 3210. Receiver/decoder 3210 then decodes the signals thus received and relays them to processing means 3350 via a plurality of data connections 3220. Optionally, one or more image sensing means 3223 may be coupled via data connections 3226 to the processing means 3350. The processing means 3350 is further comprised of a selector subsystem 3330 and formatting and processing subsystem 3340. The selector subsystem 3330 selects from the available data inputs 3220 and 3226 and produces one or more selected output signals 3333 and optionally 3336 for input by the processing and formatting means 3340. The processing and formatting means accepts the plurality of selected data signals and performs processing and formatting operations on that data producing an integrated display presentation. The integrated display presentation is conveyed to display 3360 (for presentation) via link 3345. Processing and formatting operations may include operations such as superimposing textual information on top of video information, producing schematic displays of traffic patterns and congestion, producing warning information responsive to signals received via antennas 3200 and 3201, and integrating received information with data signals from image sensing means 3223. In a preferred embodiment, the receiver 3210 includes within each of the output signals 3220 an identification of which of the antennas (3200, 3201) the resultant data signal was received from. This in turn permits the processing means 3350 to discriminate between the different data signals received on the basis of transmission methodology and identification.
FIG. 33 represents a simplified block diagram of the receiving and processing systems of the present invention. As shown in FIG. 33, the antenna 3300 receives signals broadcast to it and conveys those signals via link 3305 to receiver 3310. Receiver 3310 then relays the received signal through decoder 3320. The receiver 3310 and decoder 3320 together comprise receiver/decoder 3370. The output of the receiver/decoder 3370 is a data signal that is then coupled to the image processing and format means 3350. The image processing and format means 3350 then integrates the data signals as received into a display and then couples that display information to display means 3360 for display to a user. Specific commercially available receivers, decoders, image processing and formatting subsystems and displays as detailed above herein are also applicable to FIGS. 32 and 33.
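The source discrimination of the preferred embodiment, in which the receiver labels each output signal with the antenna it arrived on so that the selector subsystem can choose among inputs, may be sketched as follows. The dictionary structure and function names are illustrative assumptions rather than a prescribed interface.

```python
def tag_frames(antenna_id, frames):
    """Label each decoded frame with the antenna it arrived on, as in the
    embodiment where the receiver identifies the source antenna within
    each of its output signals."""
    return [{"src": antenna_id, "payload": f} for f in frames]

def select_frames(tagged, wanted_sources):
    """Selector-subsystem sketch: pass through only frames whose source
    the processing and formatting means has been told to use."""
    return [t for t in tagged if t["src"] in wanted_sources]
```

A processing means could thereby discriminate between signals received on different antennas, e.g., preferring a line-of-sight roadside transmitter over a relayed signal.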
FIG. 34 shows one embodiment of the present invention showing detail of the image processing and formatting means 3350 as shown in FIGS. 32 and 33. In this embodiment, the received data is representative of an impaired visual that has noise patterns established within it as shown in visual 3410. This impaired visual is characteristic of signals received via conventional wireless technology in the environment of other transmitters or other sources of electromagnetic interference. The impaired visual 3410 is then processed by the image processing and format means 3350 and specifically undergoes a noise reduction step 3450 producing a restored image 3420 which is representative of the image content without the noise imposed by interference. FIG. 34 shows one example of the types of image processing that can be performed by the image processing and format means 3350. Other examples as cited elsewhere in the specification include enhancing the contrast, adjusting brightness, adjusting color balance, filtering to remove impulse noise, removing artifacts that may be introduced by the particular kind of image sensor that is being used, spectrum conversion from a non-visible to a visible spectrum, etc.
FIG. 35 is a schematic representation of the multiple transmitter embodiment of the present invention wherein the user's vehicle 3500 receives a plurality of signals via antenna 3510 from sources including a first transmitter 3520 and a second transmitter 3530 wherein each of the transmitters 3520 and 3530 comprise image sensing means directly coupled to signal transmission means. The transmitter 3520 transmits data representative of the image perceived in field of view 3560. The transmitter 3530 transmits data representative of the field of view 3550. The user's vehicle 3500 on right-of-way 3540 receives signals via antenna 3510 and the processing and formatting means 3512 collects and integrates the received signals into an integrated display presentation which is then displayed to the user on display means 3514. The display may include a picture-in-picture representation wherein the field of view 3560 is contained within the image of the field of view 3550 or vice versa, or may comprise alternately switching between the visual received from transmitter 3520 and the visual received from transmitter 3530, or in yet another alternate embodiment, may include a split screen mode of operation where visuals received from each of the transmitters 3520 and 3530 are shown on respective portions of the display screen 3514.
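The picture-in-picture and split screen modes just described amount to simple frame compositing, which may be sketched as follows; the list-of-rows frame representation and the corner placement default are assumptions for illustration.

```python
def picture_in_picture(main, inset, y0=0, x0=0):
    """Composite the inset frame (e.g., one transmitter's field of view)
    into the main frame at row y0, column x0."""
    out = [row[:] for row in main]
    for y, row in enumerate(inset):
        for x, v in enumerate(row):
            out[y0 + y][x0 + x] = v
    return out

def split_screen(left, right):
    """Show two visuals side by side on respective portions of the display."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]
```

A processing and formatting means could select among these modes responsive to user controls.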
FIG. 36 shows a schematic illustration of the operation of an embodiment of the present invention wherein a building 3630 blocks the view of roadway 3640 some distance from the user's vehicle 3600. The present invention includes combination sensing and transmitting means 3620 on or near building 3630 having a field of view 3660 which includes that portion of roadway 3640 that is obscured from the user. The combination transmitter sensing means 3620 transmits a signal 3625 that is received at antenna 3610 of the user's vehicle 3600. The received signal is decoded and processed by processing means 3612 and the processing means 3612 produces an integrated display for presentation to the user on display means 3614 within the vehicle, wherein the presentation is of the roadway 3640 in the area covered by the field of view 3660.
FIG. 37 shows a schematic representation of one embodiment of the present invention as used in watercraft. Watercraft 3700 incorporates an antenna member 3710 which is coupled to a receiver 3720. Receiver 3720 is in turn coupled to a processing and integration means 3730 which, responsive to the data from receiver 3720, produces an integrated display for presentation to the user. The integrated display data is communicated to display 3740 for presentation thereupon. In the illustrated embodiment, the system of the present invention permits the operator of the watercraft to have the benefit of remotely located image sensing means, for example, giving a visual of an approach to a harbor or a docking area, or navigation of a difficult stretch of waterway. The image sensors may also or alternatively include infrared or other types of information derived from radar, such as maritime radar, thus permitting the generation of a schematic view of a watercraft within a waterway to be displayed on the display means 3740 without requiring the user's watercraft 3700 to possess radar means.
FIGS. 38A, 38B, and 38C represent one embodiment of the present invention. FIG. 38A is a schematic representation of traffic congestion involving an emergency vehicle 3800. The user's view (from vehicle 3810) is blocked by intermediate vehicle 3820 which is located between the emergency vehicle 3800 and the user's vehicle 3810. Utilizing the system of the present invention, image sensing means 3830 affixed to support means 3840 produces a display on display means 3850 on the interior of the user's vehicle 3810, thereby providing a display of areas that would otherwise be obscured from the direct line of sight of the user in vehicle 3810 by intermediate vehicle 3820.
FIG. 38B shows a representation of one possible display as shown by display means 3850 wherein the information received by image sensing means 3830 is directly displayed for the user. FIG. 38C shows a different display that would be presented to the user on display means 3850 which comprises a schematic view derived from information received from image sensing means 3830 and processed to produce the overhead schematic view and other user information as shown in the illustration. FIGS. 39A and 39B represent an alternate embodiment of the system as depicted in FIGS. 38A, 38B, and 38C wherein the emergency vehicle 3800 is again blocked from the direct view of the user in vehicle 3810 by intermediate vehicle 3820. As in FIGS. 38A, 38B, and 38C, the image sensing means 3830 is able to provide the user a view similar to that shown in FIG. 38B of the traffic congestion ahead of the user's vehicle. Further, the user's vehicle is equipped with a receiving antenna 3870 coupled to receiver and processing means 3860. Receiving and processing means 3860 combines the image information received from sensing means 3830 with the transmitted signal 3885 from combination sensing means and transmitter 3880, located adjacent to the roadway as shown in the illustration. The combined information may be presented to the user in the form shown in FIG. 39B, wherein a smaller inset portion of the display includes a visual representation of data from sensing means 3830, as shown in 3900, and also shows a schematic view of the traffic congestion as image 3910. In addition, textual and graphic information 3920 can be superimposed on the combined display.
FIGS. 38A-C and FIGS. 39A-B therefore demonstrate, inter alia, that in accordance with the present invention, one or more sources of information content can be utilized in generating displays. The types of displays that can be generated include a plurality of direct visual representations, generated schematic representations, textual information, graphical information, and any combination of the above.
FIG. 40 is another exemplary embodiment of the present invention, wherein a first vehicle 4000 comprises a support means 4010, image sensing means 4020, and a transmission means 4030. The image sensing means 4020 relays data to transmitter 4030 which in turn transmits said data as transmitted signal 4035. Transmitted signal 4035 is received by a plurality of vehicles 4065 and 4075 via antenna means 4060 and 4070, respectively. In vehicle 4065, the signal received via antenna 4060 is processed by processing means 4068 and displayed for a user on display 4062. In vehicle 4075, the signal received by antenna means 4070 is processed by processing means 4078 and displayed for the user on display means 4072. As illustrated in FIG. 40, the present invention permits the operators of vehicles 4065 and 4075 to have the benefit of the field of view of image sensor 4020 notwithstanding the fact that intermediate vehicle 4045 blocks their direct line of sight. FIG. 41 shows one embodiment of the present invention which utilizes multiple display means, 4110, 4120, and 4130, all coupled to processing means 4140. The processing means 4140 is also coupled to receiving means 4150 and image sensing means 4160. The processing means 4140 operates on data supplied by image sensing means 4160 and data supplied by receiving means 4150 to generate a plurality of integrated displays for the user. Each of these integrated displays is routed to one or more of the display means 4110, 4120, and 4130. This embodiment of the present invention permits a user to see multiple views of a traffic congestion situation and to receive more information simultaneously than could be conveyed via a single display means. In one embodiment, display 4110 may display a visible light image of the data received from image sensing means 4160 and display means 4120 may display a visible light representation of an infrared scene sensed by image sensing means 4160.
Note that although a single image sensing means 4160 is shown in the illustration, the present invention also can operate when there is a plurality of image sensing means, each providing individual respective input to processing means 4140. FIG. 42 shows one embodiment of the present invention that supports a relay mode of operation. In FIG. 42, image sensing means 4210 having a field of view 4200 is affixed to support 4215. In the illustrated example, support 4215 is located on top of roadway exit sign 4217, although this location of sensing means 4210 is not a requirement of the invention. Image sensing means 4210 relays data signals to transmitter 4220. Transmitter 4220 transmits a first signal 4225. In the illustrated example, signal 4225 is received by vehicle 4260 which is sufficiently close to transmitter 4220 as to be within receivable range. The signal 4225 is received by antenna and receiving means 4230 and is subsequently repeated via signal transmitter 4240. In the illustrated example, vehicle 4270 is sufficiently far away from transmitter 4220 as to preclude reception of first transmitted signal 4225. However, vehicle 4270 is within reception range of the transmitter 4240 atop vehicle 4260. As a result, transmitted signal 4245 is received by vehicle 4270 on antenna 4250. Thus the data provided by image sensing means 4210 is first relayed from the first transmitter 4220 to the vehicle 4260 where it is in turn relayed by transmitter 4240 as a second transmitted signal before arriving at the user's vehicle 4270. Although in the illustrated embodiment one interstitial vehicle 4260 is shown, in practice any number of vehicles may relay successive signals to other vehicles within their respective transmission ranges. Furthermore, where the vehicle (as in FIG. 41) has both its own sensor and a receiver, and a transmitter (as in 4240 in FIG. 42), the relay can be additive, conveying (relaying) both received data and locally sensed data.
This permits the invention to be used by users in vehicles significantly distant from a low power transmitter.
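The multi-hop relay propagation just described may be sketched as a reachability computation; the one-dimensional positions and the uniform radio range are simplifying assumptions made for illustration.

```python
def relay_coverage(tx_pos, vehicles, radio_range):
    """Return the set of vehicles that ultimately receive a transmission,
    assuming every vehicle that receives the signal repeats it (the relay
    mode of FIG. 42). Positions are scalar distances along the roadway."""
    received = set()
    frontier = [tx_pos]           # positions from which the signal is broadcast
    while frontier:
        src = frontier.pop()
        for name, pos in vehicles.items():
            if name not in received and abs(pos - src) <= radio_range:
                received.add(name)
                frontier.append(pos)  # this vehicle repeats the signal onward
    return received
```

In the example below, vehicle B is beyond the direct range of the roadside transmitter but receives the signal via vehicle A's repeat, exactly as vehicle 4270 receives signal 4245 via vehicle 4260.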
FIGS. 43A-43B illustrate land-based cell operation relative to a moving vehicle. In FIGS. 43A-43B, there are three land-based transmitters 4320, 4330, and 4340, each transmitting respective content signals 4325, 4335, and 4345. In FIG. 43A, the signals 4325 and 4335 are within the range of and are received at antenna 4310 for display on display 4315. In FIG. 43B, the vehicle 4305 is in range of signal 4335 as received by antenna 4310 for display on display 4315.
FIG. 43C is a flow chart of a control program to implement the cell handoff operation for the system controller of FIGS. 43A and 43B.
FIG. 44 shows an example of a display in accordance with the present invention, wherein a road map is shown indicating the location of the user's vehicle 4400, locations of junctions 4410 superimposed on the road map, and known emergency vehicle locations 4430. Also shown are indications of travel time between junctions 4410, where an example outbound travel time is shown at 4440. The present invention allows the user to touch the location on the screen, for example the location of the emergency vehicle 4430, at which point the view corresponding to the traffic congestion in that area is shown as in FIG. 45. Positioning information can be locally computed or detected and/or based on a GPS-based system.
FIG. 45, responsive to information from any of the plurality of received signals, shows a directional indication 4510 superimposed on the display indicating to the user that the most efficient way to bypass this particular congestion would be, in the illustrated example, to drive to the left. FIG. 46 illustrates an alternative embodiment of the present invention, wherein a plurality of sources of traffic condition data communicate that data, and wherein one or more vehicles have receivers for receiving the communicated traffic condition data from one or more of a plurality of sources, and provide processing and display internal to the vehicle to provide a display of traffic conditions responsive to the traffic condition data, in a manner as described consistent with the detailed description herein elsewhere.
Referring now to FIG. 46, a plurality of sensor subsystems 4660a, 4660b, and 4660c, each having an antenna 4620, provide for sensing traffic conditions, imaging and other information, and transmit respective traffic condition data A, B, and C (which can also contain optional location data for the respective sensor subsystem originating the signal), to provide communicated traffic condition data. Additionally, an aircraft 4670 is illustrated as having a sensor subsystem 4672 (imaging sensor) and 4673 (preprocessor and/or transmitter) which provide output communication via antenna 4674 of traffic condition data D. The traffic condition data (A, B, C, D) can then be communicated to a centrally located relay or accumulation center 4650 which then rebroadcasts the traffic communication data signals A, B, C, D, and/or can be communicated directly from the sensor subsystems 4660a, 4660b, 4660c, and airborne sensor subsystem and transmitter 4672, 4673, and 4674, to directly communicate traffic condition data (A, B, C, D) to vehicles with receivers, such as vehicles 4610 and 4611. As illustrated, vehicles 4610 and 4611 each have respective receiving subsystems 4612 and 4613, which provide for receipt of the traffic condition data communications (A, B, C, D) for processing and display internal to the respective vehicles 4610 and 4611. The communicated data can be received directly from the originating data sources, and/or can be received from a centralized communication center 4650 or a relay network set up off of the centralized or other type of gathering center 4650, in a manner consistent with that described above herein relative to the relay and other gathering redistribution communications. Additionally, the central center 4650 can provide for local display. In addition to the traffic condition data illustrated in FIG.
46, it is to be understood that any type of traffic condition data communication can be included herein, such as other types of stationary fixed sensors, other types of moving sensors, whether airborne or ground-driven, as well as buried sensors, infrared communication data, and radar communication data. All of these sources of traffic condition data can be provided for in accordance with the present invention, and either the central system 4650 or the vehicles receiving the signal can contain the appropriate processing to provide for selection of relevant communicated traffic condition data for the vehicle. These decisions can be based on user input, positional data such as GPS for the respective vehicle, relative position of sensor subsystems based on location data associated therewith, etc.
Referring to FIG. 47, there is illustrated a control receiver processing and display subsystem, such as 4612 or 4613 in FIG. 46. Multiple sources of data communication are illustrated, including sensor traffic condition data 4710 (A), sensor communication data 4720 (B), traffic condition communication data 4730 (C), traffic condition data 4740 (D), and global positioning signal 4750 (GPS) originating from satellite communication. Additional communicated data sources can also be provided, including radar, ground sensors, etc. Antenna 4760, such as on the vehicles 4610 or 4611, or the central system 4650, all of FIG. 46, provides for initial receiving and coupling of the communicated data signals (A, B, C, D, GPS) for coupling to receiver 4770. Receiver 4770 provides for receiver decoding and filtering of the incoming communicated data, including source selection and identification, and provides decoded received data to the processor 4780. The processor 4780 provides for signal processing and filtering, and formatting for display presentation of the received decoded data. A user input device 4785, such as a keypad, joystick, touchpad, or other input means, provides for user input, selection, and control to permit user control of displayed information, including selection of sources, types of display, etc. The processor provides display data output to the display 4790 which provides a display presentation in accordance with the processor output. The display presentation is of the type as described elsewhere herein, and any of the types of displays as mentioned elsewhere herein may be utilized, including CRT, LCD, electroluminescent, etc.
FIG. 48 illustrates the software pseudo-code for the operation of the control system of FIG. 47. Note that FIG. 47 is an enhancement and alternative embodiment to the system as shown in FIG. 19. FIG. 48 illustrates the pseudo-code functional operation and flow for the control system of FIG. 47, where there is a GPS communication data signal. At step 4810, the system detects the GPS system signal and feeds that signal to the receiver and processor. At step 4820, the processor determines the vehicle (e.g., automobile) position based on the GPS communicated data. At step 4830, the processor utilizes the GPS determined automobile position from step 4820 in conjunction with other communicated and stored data, such as the communicated relative locations of the other sensor signals, and computes which sensor subsystems are relevant for this vehicle for its present position, determining the distance limits, ahead and, where appropriate, behind, from which to receive sensor data. At step 4840, the processor causes the selection of only the relevant respective signals from the incoming communicated traffic condition data (e.g., A, B, C, D). At step 4850, the processor provides formatting of the appropriate data and processing necessary to generate a display of a particular selected relevant one of the sensor subsystem traffic condition data. In the illustrated embodiment of FIG. 48, this is shown as displaying the closest, in a forward direction, traffic conditions for display. At step 4860, the processor 4780 of FIG. 47, in conjunction with the user input 4785, provides the user with the option to select from other relevant traffic conditions for display.
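Steps 4820 through 4850 may be sketched as follows: from the GPS-derived position and heading, select the closest sensor subsystem ahead of the vehicle within the distance limit. The planar (x, y) coordinates, the heading convention, and the dot-product test for "ahead" are simplifying assumptions for illustration, not the patent's specified algorithm.

```python
import math

def closest_forward_sensor(vehicle_pos, heading_deg, sensors, max_dist):
    """Select the name of the closest sensor ahead of the vehicle within
    max_dist, or None. sensors maps name -> (x, y) location; heading_deg
    is measured counterclockwise from the +x axis."""
    hx = math.cos(math.radians(heading_deg))
    hy = math.sin(math.radians(heading_deg))
    best, best_d = None, max_dist
    for name, (sx, sy) in sensors.items():
        dx, dy = sx - vehicle_pos[0], sy - vehicle_pos[1]
        d = math.hypot(dx, dy)
        # "Ahead" means the sensor lies in the half-plane toward the heading.
        if 0 < d <= best_d and dx * hx + dy * hy > 0:
            best, best_d = name, d
    return best
```

The selected sensor's traffic condition data would then be formatted for display (step 4850), with the user able to override the selection via input device 4785 (step 4860).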
In a preferred embodiment of the present invention, there is a display for use by a vehicle operator, a means to receive signals that represent traffic conditions that the operator is interested in, and a processing means coupled to the receiving means that takes the information and formats it in a manner suitable for display to the user. This system can be used, for example, to receive traffic telemetry information or estimated arrival times compiling traffic congestion information, and to display that information so that an operator in a vehicle can use it to make decisions about how to pilot that vehicle. The display can be any kind of display technology such as, but not limited to, a liquid crystal display (LCD), a CRT, a plasma display, a display built of light-emitting diodes (LEDs), a projector, a half-mirrored display, or a heads-up display that projects the image to the user directly on the windshield or screen of the vehicle that the user is operating. Other methods that can be used include a half-mirror display, which would allow the user to look through the display and see the environment directly outside the vehicle but also see the reflection of another type of display superimposed on that view. The delivery method to the user can also comprise printing a representation of the traffic conditions on paper.
Another method of display is an audible system, which, upon receipt of information about traffic conditions, speaks selected portions of that information to the user in a language that the user understands.
All of these methods result in information, received from outside of the vehicle, being received by the user and allowing the user to make decisions about the operation of the vehicle based on that information. The direction of primary interest to the user is obviously directly in front of the user or in the direction of travel of the user, but it isn't always the case that only the forward direction is interesting. It may be significant for the operator of the vehicle to know that an emergency vehicle is somewhere behind him or her and trying to get through. This is especially important in light of today's quieter vehicles where sound from outside is significantly less audible than has been the case in previous generations of vehicles. The present invention is compatible with single and/or multiple direction information communication and display.
It is also possible that the user might be interested in traffic conditions flowing in the direction contrary to the way that they are operating their vehicle, to solve the problem of the well-known "gaper's block" where an accident or some other incident in the opposing flow of traffic causes one's own flow of traffic to slow down. The processing means that takes the signals from the receiving means and formats them for display can have a number of additional functions.
Clearly the information received via radio frequency, microwave, satellite, cellular phone, or any of the other common wireless transfer means has to be converted in some way so that the user can actually understand that particular data. It can be displayed as schematic information showing a stylized road and stylized vehicles; it can be actual image content showing the view of the right-of-way ahead of the operator in a pictorial form; or it can be a combination of the two. If it is a pictorial of the roadway, an image processing operation may be performed on that image content to make it more suitable for display. Such an image processing operation might include, for example, adjusting the brightness or contrast of the display to correct for bright highlights as a result of sun or some other light source reflecting off of the vehicles in the traffic flow.
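The brightness and contrast correction named above is conventionally a per-pixel linear transform with clamping; the following sketch assumes 8-bit grayscale pixels in a list-of-rows layout, which is an illustrative choice only.

```python
def adjust_brightness_contrast(img, brightness=0, contrast=1.0):
    """Per-pixel correction for glare or highlights:
    out = clamp(contrast * pixel + brightness) into the 0-255 range."""
    def clamp(v):
        return max(0, min(255, round(v)))
    return [[clamp(contrast * p + brightness) for p in row] for row in img]
```

Lowering contrast (contrast < 1.0) compresses bright highlights from sun reflections, while a negative brightness offset darkens an overexposed scene.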
Depending upon the source of the image, noise reduction can be required to remove a herringbone or other kinds of obvious image noise that might be present in the image. The processing can integrate a series of images over time, to result in a different representation of the traffic flow that the user might find useful. Filtering operations can be performed to sharpen edges or accentuate certain attributes of an image, especially in conditions of poor weather when fog, rain or snow may otherwise obscure important features of the road ahead. The image processing can be geared to remove altogether artifacts such as highlights or reflections or distracting road signs from the field of view. Further, the image data that is being received need not be characteristic of visible light. It can be representative of thermal, sonar, ultrasound, or other imaging types. For example, the image data can originate from a thermal imager which provides information about the temperature of its surroundings. This processing is commonly used in applications such as night vision or night spotting scopes where infrared energy is converted from the invisible spectrum to the visible spectrum. This processing can be used in the present invention so that the user can have an image of what is ahead of them on the roadway even in the absence of visible light based images. Given that most features of traffic flows that might be of interest to a user generally involve heat, infrared is a particularly useful technique to use. The range of infrared at night can far exceed the range shown by a user's headlights. If a person or an animal is in the road, or another vehicle, they tend to give off heat in the infrared spectrum which would be easily detectable at quite some distance and thereby give the operator of the vehicle an early warning as to whether there is a road obstruction or some other hazard that might affect the operation of the vehicle.
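The infrared-to-visible spectrum conversion described above is, at its simplest, a mapping of sensed temperatures onto displayable intensity levels; the temperature range and grayscale mapping below are assumed values chosen only to illustrate the idea.

```python
def thermal_to_visible(temps, t_min=-10.0, t_max=40.0):
    """Map sensed temperatures (degrees C, an assumed operating range)
    onto 0-255 gray levels so a thermal scene can be shown on an
    ordinary display; readings outside the range are clamped."""
    span = t_max - t_min
    def level(t):
        t = min(max(t, t_min), t_max)
        return round((t - t_min) / span * 255)
    return [[level(t) for t in row] for row in temps]
```

A warm object such as a person or animal on the road maps to a bright region against a cooler, darker background, giving the early warning effect described.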
The processing means can perform several different image processing operations concurrently or simultaneously. For instance, converting from infrared to visible light and performing noise reduction is a very common combination. Similarly, brightness and contrast correction are generally performed at the same time. Other image processing operations are also within the scope of this invention. There is nothing that limits the class of image processing operations to those explicitly listed herein. Detailed information on reasonable image processing steps to perform on images to make them more intelligible to a user or have them convey more information is readily available in image processing textbooks, such as Computer Graphics and Applications by Foley and Van Dam, and Image Processing by Gonzalez, and many other titles which are readily available from academic publishers and from professional organizations such as the IEEE.
The receiving means can also acquire multiple data types simultaneously from an external source, wherein the processing means is utilized to combine them. For example, (1) an image of what is ahead in the right-of-way might be combined with textual legends indicating the distance to an obstruction or another vehicle, or (2) with information about the location of the operator's vehicle with respect to a specific element being displayed.
Estimated travel times, recommended speed, recommended side of the road to be on, or which lane to be in, can either be combined in the receiving means and relayed to the processing means, or they can be combined in the processing means as received.
The user can select a portion, or portions, of the data received for display; certain information might be of interest when the operator of the vehicle is making rapid progress, while other kinds of information might be of interest when the vehicle is not making progress. The user's requirements can change over time, and the present invention provides the user with the ability to select which kinds of information or image content are displayed. Further, the user can select to display only the data, or only the image information, or only parts of one or both, even though both can be received.
In an alternate embodiment (e.g., see FIG. 41), the display means comprises a plurality of displays, operable to provide a plurality of selectable display modes comprising combinations and permutations, wherein different images can be displayed on each display, the same images can be displayed on each of the displays, the same data or different data can be displayed on the various displays, or there can be a combination where image content is displayed on one and other received data is displayed on another display, in any combination. Means are provided for the user to select which elements of the received information end up on which displays. In the preferred embodiment, the user configures the system to show particular types of information on particular physical displays by selecting from a list of predefined correspondences between the information displays and the physical displays, user preferred correspondences between the information displays and the physical displays, and a user selectable mode screen. In one embodiment of the user selectable mode, the system shows on one portion of one screen a miniature view, or thumbnail view, of each of the kinds of information displays that the system can generate, and on another portion of the same display are miniature or thumbnail representations of the available display screens. The user uses a pointing device to click on the miniature information display and then clicks on the miniature available display screen, thereby assigning that information display to that available display screen. In an alternate embodiment, the user uses the pointing device to drag a representation of the thumbnail of the kind of information display on top of the representation of the available display screen. This dragging methodology is intuitive and a similar mechanism is present in conventional microcomputer operating systems.
In yet another embodiment, the user selects a thumbnail of a kind of information screen and then draws a connection line between that information display thumbnail and a thumbnail representing an available display screen. In a different embodiment, the user selects through a keypad apparatus a letter or numeral corresponding to a particular type of information display that the system can generate, and a corresponding letter or numeral corresponding to an available display screen to assign that information display type to. In an alternative embodiment, the letters or numerals may be replaced by directional arrows, or a forward/backward 'next'-type of selector. In another embodiment, the means to select comprises a 'next' and a 'previous' button, which allows the user for each available display screen to sequence through the types of information that can be displayed on that screen. The user continues to activate either the next or previous button until the desired display type appears on the available display screen. One variant of this embodiment replaces the 'previous' and 'next' buttons with a wheel control that the user rotates in order to sequence through the available types of information displays. The information that is being received by the system in the operator's vehicle can come from a number of different sources either individually or simultaneously. For example, it can receive information from a traffic emergency warning beacon placed on a roadside or wayside relaying information about the vicinity where the beacon is placed. This can be a permanent installation or a temporary installation. If it tends to be near an interchange where traffic congestion is very common it may be permanent. If it happens to be placed near where road construction is being performed, it might be a temporary kind of beacon.
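The 'next'/'previous' sequencing of information types on an available display screen, as described above, can be sketched as follows; the list of information types and the class name are illustrative assumptions of this sketch:

```python
# Illustrative catalogue of information display types the system can generate.
INFO_TYPES = ["forward view", "travel times", "congestion map", "weather"]

class DisplayScreen:
    """One physical display screen; the 'next' and 'previous' buttons
    cycle through the available information display types."""

    def __init__(self):
        self.index = 0  # start on the first information type

    @property
    def current(self):
        return INFO_TYPES[self.index]

    def next(self):
        # Wrap around past the end of the list.
        self.index = (self.index + 1) % len(INFO_TYPES)

    def previous(self):
        # Wrap around past the beginning of the list.
        self.index = (self.index - 1) % len(INFO_TYPES)
```

The user activates next or previous until the desired type appears; a wheel control variant would simply call `next` or `previous` per detent of rotation.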
Alternatively, the data and or image content can be relayed by a transmitter on an emergency vehicle. The emergency vehicle may be en route to a traffic incident or it may be on site. This feature of the invention allows the operator of the vehicle to respond to an emergency vehicle that may be in the nearby vicinity of the operator's vehicle. Such response may include, for example, getting out of the way of the emergency vehicle.
The image information and other data received by the system in the operator's vehicle need not come from a roadside source or from a sensor on board the user's vehicle or another vehicle that happens to be on the roadway or in the air. The information can also be provided from a central traffic monitoring location or a central data collection point. Data relayed from a central source may include both image and data content (for example, information about distances to particular exits, or travel times from exit to exit). The processing means can use this information to display a continuously updated estimated time of arrival to the user based on gross overall traffic conditions as reported from the central location. Similarly, this information can be derived from any localized beacons such as a fixed or temporary wayside beacon, or a beacon on an emergency vehicle, or from signals received from other vehicles or aircraft or satellites.
In one embodiment, a central station reports travel time from a reference point in a route to the location of each known beacon, and also to the location of each known exit. As the user travels along the route and passes each beacon, the system in the user's vehicle will detect a signal strength for each beacon that grows stronger as the user's vehicle comes within proximity of each beacon, and fades as the user's vehicle moves away from said beacon. In this embodiment, the user's vehicle can utilize information from a central station as a rough estimate of travel times and then receive more refined or updated information from roadside beacons as the user travels. As shown on the flow chart of FIG. 43C, in the first step 4381, the user enters a desired trip itinerary into the system which may include the start location, the destination location, and any way-points the user expects to pass through on the user's route. The next step 4382 resets an elapsed time counter to zero. The next step 4383 determines whether a signal is received from a central station. If such a signal is received from the central station, the next step 4384 is to store and retain the estimated travel time to each exit and to each beacon, as provided by the central station, within a memory of the system. Once this store-and-retain step has completed, and also in the case of there being no signal received from the central station, the next step 4385 is a determination of whether a signal has been received from any beacon. These beacons may be roadside beacons or beacons located on emergency vehicles or other vehicles in close proximity to the user's vehicle. If there is no signal received from any beacon, then the next step 4386 is to update the displays in the user's vehicle responsive to the current information that is known by the system and the elapsed time counter. Processing then continues at step 4383 with the determination of whether there was a signal received from the central station.
In the event that a signal is received from a beacon, the next step 4387 is a determination of whether the strongest signal being received from a beacon corresponds to the same beacon that last had the strongest signal. If this is not the case, this is an indication that the user has traveled out of the range of one beacon and into the range of a second beacon. At step 4388, the system in that situation makes note of the fact that the current closest beacon, that being the beacon with the strongest signal, is now considered the current beacon. After that note, and in the event that the strongest signal is from the same beacon as it was previously, the next step 4389 is the determination of whether this particular beacon has updated information. If it does have updated information, then at step 4390, the stored and retained estimated travel time per exit and beacon provided by the central station is updated using the information from the current beacon. The next step 4386 is then to update the displays in the user's vehicle responsive to the current stored information and the elapsed time. Processing resumes again at step 4383, looking for a signal from the central station. As a result, if the user's vehicle initially receives a report from a central station that gives it information on travel times but does not receive any other information en route, the elapsed time counter will provide the system a means to give the user a rough estimation of the remaining travel time while the user is en route. If there is no signal received from the central station but the user does receive information from roadside beacons along the way, the roadside beacons provide current, up-to-date information that permits the system to then provide updated information regarding the total travel time.
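One pass of the loop of FIG. 43C (steps 4383 through 4390) can be sketched as follows; the data structures (dictionaries keyed by exit and beacon identifiers, and signal strengths as numbers) are assumptions of this sketch, not part of the disclosure:

```python
def update_travel_estimates(stored, central_report, beacon_reports, state):
    """One pass of the update loop of FIG. 43C, sketched.

    stored:         dict mapping exit/beacon id -> estimated travel time
    central_report: dict of estimates from the central station, or None
    beacon_reports: dict mapping beacon id -> (signal_strength, updates dict)
    state:          holds the id of the beacon that last had the strongest signal
    """
    if central_report is not None:                      # steps 4383/4384
        stored.update(central_report)
    if beacon_reports:                                  # step 4385
        strongest = max(beacon_reports,
                        key=lambda b: beacon_reports[b][0])
        if strongest != state.get("current_beacon"):    # steps 4387/4388
            state["current_beacon"] = strongest
        updates = beacon_reports[strongest][1]
        if updates:                                     # steps 4389/4390
            stored.update(updates)
    return stored   # step 4386: the displays are refreshed from this data
```

Note how beacon-supplied estimates override the central station's coarser figures, matching the text's observation that nearby beacons carry more current information.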
Video cameras are commonly available from video camera manufacturers, such as Sony, JVC, and Ikegami. The transmitting means, or radio frequency transmitters, are commonly available through a variety of vendors, including Motorola and other companies that specialize in radio communications and two-way radios. It is not usually a requirement for the system to use a two-way radio link. A single direction transmitter on a beacon would be sufficient for most embodiments. In addition to the previously mentioned radio frequency and other methods of relay of information, information can also be conveyed to the operator's vehicle via an infrared receiver, or a visible light receiver, or a GPS receiver, or a cellular network receiver, or a satellite receiver. In accordance with the present invention, a plurality of simultaneous receptions via any of the above means can be received and processed. An example is simultaneously receiving remote information via a cellular network, and receiving local information via an infrared receiver. The processing means has a way to discriminate between signals that arrive simultaneously. The user selects which image content or data information is desired, and the processing means automatically generates and relays video signals to the display (or displays) without requiring the user to do anything special to receive it from one source or another.
There are several ways to discriminate between the various signal sources that may be transmitting data to the present invention. The first means to discriminate relies upon detecting which transceiver received the incoming signal. For example, if the system is receiving signals from both an infrared transceiver and a radio frequency transceiver, the system is aware, based on which input a signal arrives on, of the source of that particular signal. An additional means to determine or discriminate the source of the signal is an ID code that is transmitted along with the message or data or image from each transmitter, identifying the source and also the type of information being conveyed. For example, the ID code may indicate that the source is a helicopter hovering over the traffic area, a roof-top camera, a camera on a particular billboard, an image gathering source in a mobile or emergency vehicle, or a central station. The types of information that might be conveyed include, for example, video information, image information, congestion information, traffic pattern information, travel time information, road condition information, the sensor's location, weather condition information, construction alerts, and warnings of road hazards, of disabled vehicles, or of locations where vehicles such as a farm vehicle might be moving slowly. The sources and data types permit the system to intelligently select the best source and type of information for the user based on the desired image display. For example, traffic congestion information that is relayed from a mobile vehicle in the immediate proximity of the user is more likely to be accurate than the same type of information available from a central reporting center, and therefore, the system could intelligently permit on-site or near-by information to override information provided by a central information center.
In another example, the information provided from a mobile vehicle, such as a view of traffic congestion or information about road hazards, might be overridden by the similar type of information being transmitted from an emergency vehicle in the area. As yet another example of the intelligence of the system, the system may elect to override purely informational displays requested by the user, such as travel times or congestion information, and display road hazard information or adverse weather condition information that require the more immediate attention of the user.
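The override behavior described in the two preceding paragraphs can be sketched as follows; the numeric priority levels, field names, and report format are illustrative assumptions of this sketch, not part of the disclosure:

```python
# Higher numbers win. Nearer/more authoritative sources outrank a central
# center, and urgent message types outrank purely informational ones.
SOURCE_PRIORITY = {"central": 1, "mobile": 2, "emergency": 3}
TYPE_PRIORITY = {"informational": 1, "road_hazard": 2, "adverse_weather": 2}

def select_report(reports):
    """Pick the report to display: urgent types first, then nearer sources."""
    return max(reports, key=lambda r: (TYPE_PRIORITY[r["type"]],
                                       SOURCE_PRIORITY[r["source"]]))

reports = [
    {"source": "central", "type": "informational", "payload": "ETA 25 min"},
    {"source": "mobile", "type": "informational", "payload": "slow ahead"},
    {"source": "emergency", "type": "road_hazard", "payload": "lane closed"},
]
chosen = select_report(reports)   # the emergency road-hazard report wins
```

With only the two informational reports present, the mobile vehicle's report would override the central center's, matching the proximity rule stated above.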
Examples of how to automatically generate and relay signals is covered hereinabove, such as in the description of how the user selects a particular kind of information display to appear on an available physical display.
The system can have a number of video generation circuits, such as the video adapters that are commonly used for personal computing. Examples are VGA devices manufactured by ATI, Matrox, or S3. All of them provide graphic chips that are specifically designed to generate high resolution displays (examples of these are available on the World Wide Web). In addition, there are a number of older display generation devices which will work at lower resolutions that are probably more typical of what would be in a moving vehicle.
In a preferred embodiment, the display that the user observes is actually located in the user's vehicle. It may be located so that the operator of the vehicle can see it, or so that another occupant of the operator's vehicle is able to monitor the display. In jurisdictions where an operator-visible display is not permitted by law, having the display visible to a co-pilot is the advised method of operation. Alternatively, or additionally, audio or speech can be utilized instead of a visible display. In an alternate embodiment, the display may actually be external to the user's vehicle, such as a roadside billboard video display, or a system of indicator lights (or display elements (e.g., character, live)) either in the roadway, above the roadway, or to the side of the roadway giving information, instead of or in addition to a display within the user's vehicle. Roadside displays may also include textual displays that would relay information for the user to read as the user is travelling. In accordance with one embodiment of the present invention, the display is incorporated in a central monitoring location where that information is used by one or more users at the central monitoring location, such as to determine, for example, emergency response to traffic conditions, responses to temporary congestion resulting from construction, or normal and abnormal traffic flows. In a preferred embodiment, the invention includes an image or other kind of sensing means (that is supported in one of a plurality of positions with respect to the operator's vehicle) coupled to a processing means which provides a display output responsive to information received from the sensing means for display to the operator.
In one embodiment, a camera, or other image sensing apparatus, is supported on a vehicle providing a view of traffic around the vehicle, and relays that view via the processing means to a display for viewing by the operator or another occupant of that vehicle. This allows the operator to have a viewpoint of traffic conditions different from the operator's normal perspective of sitting within the vehicle. For example, the operator's normal perspective might be obstructed from seeing something that is ahead of the operator in the traffic flow because a large truck is in front of the operator's vehicle. Utilizing the present invention, the operator can see around or over the truck, thereby giving the operator a visual representation of what exists beyond the immediate obstruction.
The sensing means or camera can be on a fixed mount. Alternatively, the support can be a retractable system that allows the camera to be selectively elevated sufficiently high above the operator's vehicle to be able to see over large obstructions, and/or it can also be extended out from either side of the operator's vehicle thus allowing the operator to see around a traffic obstruction that is viewable from a different angle. The support can be a boom arm that allows a plurality of these motions to be accomplished by lifting the sensor above the top of the vehicle and independently or simultaneously translating or shifting off to one side.
In one embodiment, the retractable support comprises a hydraulically actuated means to lift the camera up. In an alternate embodiment, it can be an electric motor driving a mechanism similar to a retractable radio antenna for a car. In other embodiments, it can be implemented with a scissors lift device, telescoping rod arrangement, a single arm boom, or a multiple arm jointed boom similar in geometry to that found on a snorkel fire engine.
Any type of support mechanism that supports the sensors or cameras is compatible with the present invention. Retractable supports are commercially available from many sources. Examples include all major automobile manufacturers and parts suppliers that supply retractable antennas for major automobile manufacturers. On an industrial scale, there are also numerous commercial companies that supply the hydraulic lifts that lift up microwave dishes for mini-cam vans, and have a similar product on a smaller scale that can be used for smaller vehicles.
One such support comprises a plurality of nested cylindrical tubes of sequential decreasing diameter, having alternately swaged ends to prevent the tubes from separating from one another, said tubes configured in a telescoping arrangement, with an electric motor (such as a 12-volt motor commonly used to operate automobile accessories) that drives a take-up spool onto which a semi-rigid support is coiled with one end of said semi-rigid support affixed to the smallest cylindrical tube. As the motor operates in a forward direction, the semi-rigid support is unwound from the spool and thrust into the telescoping arrangement of tubes forcing said tubes to extend telescopically. When the motor operates in the reverse direction, said support is withdrawn from the tubes and wound around the spool, thus forcing said tubes to collapse and shorten telescopically into one another.
In a preferred embodiment, the support means also includes a means to allow the camera or sensing device to be further altered in position and orientation, such as the ability to pan the sensing means from left to right to allow it to see (e.g., sense, detect, perceive) information to one side or the other of the vehicle; to tilt the sensing means up or down, allowing it to sense information from close to the vehicle to far away from the vehicle, or perhaps even an aerial view from above the vehicle. The support also has the ability to roll the camera from side to side, either to present a different kind of image via the sensing means or to compensate for any motion that might be imparted by the support means. In an additional embodiment, a zoom mechanism allows the camera or sensing means to frame and focus on an area of interest responsive to the operator of the vehicle. The sensing means can relay data to the processing means via a wired or wireless technology or via fiber optic cable. Data can also be conveyed using the support as an electrical means to communicate. Alternatively, a transmitter in the sensing means relays data down to a receiving means actually on the vehicle. Data transmission can be infrared or RF or microwave or visible light or any of the transmission technologies that have been previously discussed herein. Specific examples of fiber-optic cables include cable produced by Nippon Sheet Glass of Japan, or Owens-Corning Corporation, which is a U.S. Company. Types of wired conveyance include cables of coaxial or other variants that are produced by Belden Corporation, a US Company. The most likely common variant is a standard 75 Ohm coaxial cable as used in video distribution. Infrared transmitters and receivers are available in discrete form through such vendors as
RadioShack, as well as from a number of other vendors or manufacturers, such as Jameco, Digikey, and other electronic parts suppliers. RF, radio-frequency, transmitters and receivers are as already discussed earlier herein, such as from Motorola, Qualcomm, Sony Corporation, Analog Devices, etc. These are all vendors that produce products that are designed for use in wireless transmission. In a preferred embodiment, the sensing means detects visible light images using, for example, a camera or CCD array or vidicon tube. All of these technologies are capable of producing signals representative of a visible light image. This visible light image may require some processing to ideally present the information to the user. For example, a selected portion of the image might be of interest, so the user can electronically zoom in on a section of the image. In a similar fashion the user can apply brightness or contrast correction to make out fine detail, or filtering to bring out edges or resolve differences between elements of the image.
Some image sensing means are more sensitive than others to noise induced by the operation of the vehicle itself or by sources of interference in or around the vehicle.
Performing noise reducing processing on the data from the sensing means can improve the displayed image.
Some image sensing means, particularly CCD arrays, tend to have sensitivity in the infrared regions of the spectrum. This is useful from the standpoint of a preferred embodiment of the invention in that it allows the system to see heat signatures of vehicles or persons or other elements representative of traffic or human beings around the user's vehicle. However, since the average person cannot see in infrared, the image received by the CCD array is processed and converted to provide a perceivable visual display representative of a visible light equivalent so that the user can actually observe the infrared data.
Multiple of these image processing operations can be performed simultaneously to meet the user's needs. In a preferred embodiment, the user is able to select any or all of them simultaneously.
Where visible and infrared sensing is desired, multiple-function sensors or multiple separate sensors can be utilized.
The sensing means is not limited to sensing just image content; there can also or alternatively be a data source within view of the sensing means. For example, if the sensing means comprises a visible light image sensor or camera, it can receive a coded transmission via visible light sent from a roadside beacon. Information content can be relayed via a flashing strobe on an emergency vehicle or via a roadside sign. The flashes are sensed by the camera and information is extracted by the processing means for further formatting and display for the user. This is analogous to using a signal light to communicate via Morse code between ships at sea in World War II. Obviously, with electronics and electronic control of both transmission and reception, one can transmit a significant amount of information. In one embodiment of the present invention, the information thus transmitted includes information about an obstruction in a road, detours, recommended alternate paths for travel, estimated time to a particular exit, travel time between points, etc.
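The extraction of information from coded light flashes can be sketched as follows, assuming a simple encoding (not specified in the disclosure) in which one bit is conveyed per camera frame and eight bits form one character:

```python
def decode_flash_frames(brightness_samples, threshold=128):
    """Turn a sequence of per-frame brightness readings into bits:
    a frame brighter than the threshold is a 1, otherwise a 0."""
    return [1 if s > threshold else 0 for s in brightness_samples]

def bits_to_text(bits):
    """Group bits into 8-bit characters (most significant bit first)."""
    chars = []
    for i in range(0, len(bits) - 7, 8):
        value = 0
        for b in bits[i:i + 8]:
            value = (value << 1) | b
        chars.append(chr(value))
    return "".join(chars)

# Eight frames of strobe brightness conveying one character.
bits = decode_flash_frames([0, 255, 0, 0, 255, 0, 0, 0])
message = bits_to_text(bits)
```

A practical scheme would add framing and error detection; this sketch only illustrates the principle that a camera already present as the sensing means can double as a data receiver.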
In a preferred embodiment, the processing means selectively chooses some of this information, under control of the user, to display selected portions of it on a screen. For example, the user might be interested in a detour or possible alternate paths, or might be particularly interested in arrival time at a destination.
The processing means does not need to convert all the data that is received or sensed into a form suitable for display. For example, in the case of an infrared sensed image representative of the traffic around the user's vehicle, the system might selectively choose to ignore extremely bright heat signatures from the exhausts of vehicles, thereby performing a threshold function where received infrared signals above a certain threshold are ignored, not processed, and not conveyed to the user, as they contain irrelevant information.
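The threshold function described above can be sketched as follows; the ceiling value and function name are illustrative assumptions of this sketch:

```python
def suppress_hot_spots(ir_frame, ceiling=240):
    """Ignore infrared returns above a ceiling (e.g. vehicle exhaust):
    values above the ceiling are zeroed rather than conveyed to the
    display, since they carry no useful traffic information."""
    return [[0 if p > ceiling else p for p in row] for row in ir_frame]
```

In practice the ceiling would be calibrated against the sensor's response, and the suppressed regions might be filled from neighboring pixels rather than zeroed.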
In another preferred embodiment, where the support means is mounted to a vehicle, its deployment is responsive to the vehicle's speed sensing input such that the support will not deploy unless the vehicle is moving below a preset speed threshold. The retractable support can also be set to deploy (to elevate) the camera only when the vehicle is moving slower than a reference speed or is stopped. This would function as an additional safety feature in that the operator would not be able to deploy the camera or be distracted by a display of traffic conditions around the operator's vehicle unless the vehicle is already slowed or stopped as a result of congestion and traffic.
In another embodiment, the support is sufficiently strong to allow deployment and operation of the camera even when the vehicle is moving at higher speeds.
The position and elevation of the deployed camera can also be controlled as related to the speed of the vehicle. For example, it might be desirable to deploy a camera at a height of several feet above the vehicle if the vehicle is moving relatively slowly, where it may be desirable to deploy the camera at a height of ten or more feet above the vehicle if the vehicle is moving faster, or vice-versa, or both, thereby permitting the sensing means to see both closer and further ahead in the flow of traffic from the vehicle's position. In an alternate embodiment, the support means deployment is responsive to other factors such as relative wind speed. The processing determines the support means deployment based upon factors such as the vehicle's speed and the direction of the ambient wind. This would, for example, prevent damage to the apparatus as a result of deploying on an otherwise motionless vehicle in a heavy cross-wind. In one embodiment, the retractable support allows the camera to retract down to a rest position which is still an operable position for the system. For example, the sensing means can be retracted to a position that is still capable of looking forward or to one side of the vehicle. In another embodiment, the retracting means may actually retract the sensing means within the vehicle, thereby protecting it from theft or vandalism when not in operation.
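A deployment decision responsive to both vehicle speed and relative wind speed can be sketched as follows; all thresholds and heights are illustrative assumptions of this sketch, and the disclosure also contemplates the opposite height-versus-speed relationship:

```python
def deployment_height(vehicle_speed_mph, wind_speed_mph,
                      max_speed=25, max_wind=30):
    """Decide camera deployment height in feet above the vehicle.

    Do not deploy above a preset vehicle speed or in excessive wind;
    within the permitted range, raise the camera higher at lower speeds
    (one of the variants described in the text).
    """
    if wind_speed_mph > max_wind or vehicle_speed_mph > max_speed:
        return 0                       # remain retracted
    return 10 if vehicle_speed_mph < 10 else 4
```

An operator-override mode, as described below, would simply bypass this function.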
In another embodiment, the retractable support is under direct control of the operator thus allowing the operator to make decisions about whether or not to deploy the imaging sensor, and where to deploy the imaging sensor, relative to the vehicle. In a preferred embodiment the methods of controlling the deployment of the retractable support are combined to provide the system with multiple modes of operation, wherein the system is responsive to vehicle speed and relative wind speed in one mode, and is also responsive to a user control in another mode, and where the operator can override either or both the vehicle speed and wind speed sensing apparatus in yet another mode. In a preferred embodiment, the retractable support is affixed to the user's vehicle. In an alternate or additional embodiment, the retractable support that positions the sensing means is independent of the vehicle and is affixed with respect to a fixed structure such as a garage, a parking structure or a roadside fixture. The support for the sensing means can include any wayside structure such as a light house, a street light, a marker buoy, an exit or entrance sign, some other informational sign, a toll booth, a fixed structure adjacent to the right-of-way, a moveable platform (moveable with respect to the right-of-way), a crossing signal for railroad tracks, a power pole or other adjacent utility poles providing a vantage with sufficient altitude to be of use, with sufficient support structure such as attached to concrete (or other construction) of lane dividers, weighted down, aircraft based, underground based, or other means not expressly specified here. Regardless of where the actual support is located, the optional functions of being able to pan the sensing means from side to side, tilting the sensing means, rolling the sensing means, and/or zooming the sensing means would apply in all of the mounting variations. 
Similarly, the operations of pan, tilt, roll, and zoom can be applied to the sensing means simultaneously in order to orient the sensing means to be the most advantageous for the operator of the vehicle. In one embodiment, the sensing means is a camera with either a fixed focus lens, or a variable focus lens supporting different depths of field. In a preferred embodiment, the camera has an automatic exposure correction system such as commonly available on consumer camcorders and similar products, where the lens opening is automatically adjusted to accommodate ambient light and other lighting conditions. The same exposure technology can be applied in an analogous fashion for an infrared sensing apparatus to automatically adjust for the total amount of signal received in the infrared spectrum.
In another embodiment, the sensing means has a plurality of sensors, of homogeneous or heterogeneous types. For example, a sensor comprising both an optical sensing apparatus (such as a television camera) and a CCD array (for sensing infrared imaging) may be part of the same sensing means. Alternatively, one CCD array with functionality in both areas of the spectrum could be used.
In another embodiment a plurality of homogenous sensors such as an array of CCD arrays (or an array of other imaging sensors) looking in the same direction or in slightly different directions provide a wider field of view without requiring panning or tilting operations by the user.
In an alternate embodiment, multiple sensors are used to produce a stereoscopic image representative of traffic around the operator's vehicle, thus allowing the operator to use depth perception on suitable displays, such as heads up displays supporting depth perception (or with suitable display processing on a conventional display), to make assessments about the relative positions of vehicles thus displayed. The processing can combine the stereo images or multi-viewpoint images thus obtained by a plurality of sensors to create and modify the display presentation. For example, processing can be used to add highlighting or to assign color to objects that are closer and might be of more significance to the operator of the vehicle. The electronic visualization system in accordance with the present invention is used to present an electronic representation of traffic flow in and around the location of a vehicle. This is accomplished by having a sensing means designed to detect vehicles from an elevated perspective position relative to the operator's vehicle. This allows the sensing means to detect vehicles that would ordinarily not be visible to the operator of the vehicle from the typical operator's position. By elevating a sensing means above the operator's vehicle, a new perspective is gained in a bird's eye view of the traffic situation, and the content from the vehicle sensing means is communicated to a processing means in the user's vehicle, which creates for the user a visualization representative of traffic patterns around the vehicle. This allows the user, for example, to have a visualization of a traffic blockage not located immediately next to his or her vehicle and thus provides advance warning to that operator of the vehicle of the need to reposition within the traffic flow (e.g., to the left or right lane) in order to avoid that blockage.
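Where stereo sensors are used as described above, the relative distance assessment can rest on the standard stereo range relation Z = f·B/d (depth equals focal length times sensor baseline divided by pixel disparity). The disclosure does not specify this particular computation; it and the parameter values below are assumptions of this sketch:

```python
def distance_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Standard stereo range equation Z = f * B / d.

    focal_length_px: camera focal length expressed in pixels
    baseline_m:      separation between the two sensors, in meters
    disparity_px:    horizontal pixel offset of the object between views
    """
    if disparity_px <= 0:
        raise ValueError("object at infinity or unmatched between views")
    return focal_length_px * baseline_m / disparity_px

# A vehicle matched 20 pixels apart between the two views.
range_m = distance_from_disparity(800, 0.5, 20)
```

Objects with larger disparity (hence smaller computed range) are the nearer ones, and would be the candidates for highlighting or color assignment as described above.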
The data that is sensed by the vehicle sensing means can be representative of video data, individual still images, or other sensor data signals, conveyed to the processing means. The processing means can provide for display of the data directly as video, or it can perform processing such as feature detection and content recognition in order to generate the electronic visualization display representative of the traffic.
The data from the overhead perspective can come from a sensing means actually attached to the vehicle via some support (which can be a retractable support), and/or it can come from a sensing means attached to a different vehicle or to a vehicle that is not part of the traffic flow, such as a helicopter hovering over the traffic flow, or from a sensor attached to some fixed or movable structure in the vicinity of the traffic flow.
The data from the sensing means is communicated to the processing means of the user's vehicle. There can be multiple sensing means in use simultaneously, for example a sensing means present on a support affixed to the user's vehicle, another sensing means affixed to a roadside sign, another affixed to a building near the roadside, another in a helicopter hovering over the area, and another attached to a second vehicle in the vicinity of the operator's vehicle. Each sensing means can be stationary or mobile. The processor accepts image or data content from all of these different sources, or from a user's selection of these sources, and provides for combining them as required or otherwise integrating the information into a coherent display for the user. This display can involve overlays and/or picture-in-picture visual presentations, both well known in the art of video and image presentation. Alternatively, the images received from multiple sensors can be simultaneously displayed as a mosaic of smaller images, allowing the user to select any one of them for larger display.
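The mosaic-of-sources behavior described above can be sketched as follows. This is an illustrative sketch under assumed conventions: frames are modeled as 2-D lists of pixel values, and source names such as `mast_cam` are hypothetical identifiers, not taken from the disclosure.

```python
# Illustrative sketch: combine frames from several sensing sources into a
# mosaic of thumbnails, with the user able to promote any one source to the
# main display.

def downscale(frame, factor=2):
    """Naive thumbnail: keep every `factor`-th row and column."""
    return [row[::factor] for row in frame[::factor]]

def build_display(sources, selected=None):
    """Return the main frame plus thumbnails of every available source."""
    thumbnails = {name: downscale(frame) for name, frame in sources.items()}
    main = sources[selected] if selected in sources else None
    return {"main": main, "thumbnails": thumbnails}

# Hypothetical 8x8 frames from three simultaneous sensing means.
sources = {
    "mast_cam":     [[1] * 8 for _ in range(8)],
    "roadside_cam": [[2] * 8 for _ in range(8)],
    "helicopter":   [[3] * 8 for _ in range(8)],
}
display = build_display(sources, selected="roadside_cam")
```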
In a preferred embodiment, the processing means can be programmed or controlled by the user to sequence through a plurality of different modes of display. For example, the user can select cycling, at a user-controlled rate, between the different views available to the user's vehicle from other sensing means, or from multiple imaging sensors on the user's vehicle itself. The cycling operation continues until the user selects a particular view to stay with, and it periodically updates the available images as nearby vehicle sensing sources become available and other sources become unavailable. In an alternative embodiment, a system on another vehicle can transmit image content or data to other vehicles nearby in a traffic flow, including the operator's vehicle, which receives data from the other vehicle's sensing means via wireless transmission. This operation conveys image data from another vehicle to the user via one transmission "hop". In analogous fashion, the user's vehicle may also include a transmitter that relays information received and/or sensed (via a sensor local to the user's vehicle) to other vehicles in the traffic flow. The invention thus grants an operator of a vehicle a view of traffic significantly forward of his or her position, based on having images relayed through multiple "hops" via multiple sensing and relaying means from other vehicles or from other fixed locations. As a result, this invention implements a cooperative network wherein image data beyond the view of a user in the traffic flow can be relayed from quite some distance from the user's vehicle via a series of "hops", either from mobile platforms such as other vehicles in the traffic or through relatively stationary platforms such as buildings or roadside fixtures.
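The multi-"hop" relay described above can be sketched as follows. This is an illustrative sketch only: the packet field names and the hop-count limit are assumptions for illustration, not specified in the disclosure.

```python
# Illustrative sketch of the multi-"hop" relay: each vehicle rebroadcasts
# received image packets with an incremented hop count, up to a maximum, so
# that imagery propagates back through the traffic flow.

MAX_HOPS = 5  # assumed limit on how far a packet is relayed

def relay(packet):
    """Return the packet to rebroadcast, or None if the hop limit is reached."""
    if packet["hops"] >= MAX_HOPS:
        return None  # stop forwarding: packet has travelled far enough
    forwarded = dict(packet)  # copy so the received packet is left unchanged
    forwarded["hops"] += 1
    return forwarded

# A packet originated two vehicles ahead arrives with hops=2 and is relayed on.
incoming = {"origin": "vehicle_17", "image_id": 42, "hops": 2}
outgoing = relay(incoming)
```

A hop limit of this kind keeps the cooperative network bounded, so that imagery from far beyond any useful range is not rebroadcast indefinitely.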
In supporting the function of relaying data within the traffic flow to other compatible systems, the processing means provides additional functions, such as data compression, data encryption, and/or data origination tagging. Data compression is used to reduce the volume of data transmitted to an amount reasonable to send via any of the previously mentioned transmission and reception means (e.g., radio frequency, infrared, and so forth). Data encryption can be used if the image content is being sensed for a purpose other than immediate supply to other operators of vehicles in the area (for example, relayed to a central monitoring area where the information is used and subsequently resold in some fashion).
Data origination tagging is simply a way of identifying the origin of the data being transmitted from a given vehicle. For example, if a vehicle operator is using the present invention to transmit image data that is being acquired by an image sensing means mounted to that operator's vehicle, the data would be tagged with an identification indicating which vehicle is sending it, together with that vehicle's absolute location as determined by position relative to local waypoints, fixtures, or landmarks, or by the Global Positioning System. Data origination tagging allows other receivers in the traffic flow to use the tag information to identify where in the traffic flow the transmission originated, and whether it is relevant for their own respective use.
Other data origination information that can be transmitted includes the relative and absolute direction of flow. For example, if a given user's vehicle is travelling in a northbound lane, other northbound vehicles might be very interested in receiving transmissions from a northbound vehicle ahead of them, while they would obviously have no interest in receiving information from northbound vehicles that are somewhere south of them and therefore behind them. Southbound vehicles would likewise probably have little interest in information transmitted from northbound vehicles, regardless of their location relative to the southbound vehicle. Data origination tagging is a powerful feature because it allows the user to select from the plurality of images and data that arrive at the operator's vehicle from a variety of different sources, and to collapse those data sources down into a manageable number for subsequent processing and display.
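The relevance filtering enabled by origination tags can be sketched as follows. This is an illustrative sketch under simplifying assumptions: northbound travel is modeled as increasing latitude, and the tag field names are hypothetical.

```python
# Illustrative sketch: filter received transmissions by their origination
# tags so that only relevant sources (same direction of travel, located
# ahead of the receiver) are kept for processing and display.

def is_relevant(tag, own_lat, own_heading):
    """Keep transmissions from vehicles ahead and travelling the same way."""
    same_direction = tag["heading"] == own_heading
    # "Ahead" for a northbound receiver means a greater latitude; for a
    # southbound receiver, a lesser latitude.
    ahead = tag["lat"] > own_lat if own_heading == "N" else tag["lat"] < own_lat
    return same_direction and ahead

received = [
    {"vehicle_id": "A", "heading": "N", "lat": 41.95},  # northbound, ahead
    {"vehicle_id": "B", "heading": "N", "lat": 41.80},  # northbound, behind
    {"vehicle_id": "C", "heading": "S", "lat": 41.99},  # southbound: ignore
]
relevant = [t for t in received
            if is_relevant(t, own_lat=41.90, own_heading="N")]
```

Only the northbound vehicle ahead of the receiver survives the filter, collapsing the incoming sources to a manageable set as described above.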
In a preferred embodiment of this aspect of the present invention, information is transmitted from an operator's vehicle in a broadcast fashion, thus allowing other vehicles to selectively discriminate among the totality of signals being received to determine whether any particular transmission is relevant. Alternatively, the transmission from a visualization or early warning traffic system present in the vehicle can be made highly directional, such as a hooded light source or directional communication beacon directed in a particular direction from the user's vehicle, or to particular reception sites located with respect to the operator's vehicle (for example, a particular roadside collection point or central monitoring location). In the case of a sailboat incorporating this traffic visualization system, the image data can be communicated in a downwind direction to other craft, such as those that would be required by the rules of navigation on the water to yield to the sailboat, if they are equipped with similar systems, thus allowing the operators of the other vehicles to take action to avoid collision or otherwise operate their vehicles in a safer manner.
In yet another embodiment, data is communicated to the user's vehicle and processed to provide for the display to the user of a schematic view of traffic conditions surrounding the operator's vehicle. This can be accomplished by using (1) a number of data or image content sensors located at different positions with respect to the right-of-way and (2) a central (or distributed) collection apparatus comprising processing and transmission means that use the data collected by the various sensors and present it in a broadcast fashion such that vehicles within the traffic flow can receive it and format it into a display suitable for the user. For example, a roadway containing sensors detecting positions of vehicles can relay that data to a central location where it is assembled into an instantaneous mapping of traffic on that roadway. That mapping is then transmitted or broadcast to the vehicle safety early warning systems, which determine the position of the operator's vehicle relative to roadside beacons, landmarks, the Global Positioning System or another positioning system, or by sensing the IDs of the roadside sensors, resulting in a comprehensive display of nearby traffic at whatever resolution the user might require, integrated with a methodology to allow the user to select a destination for a given roadway, such as a particular highway exit.
In accordance with the present invention, the user is given a comprehensive display indication (representative of the traffic flow around the vehicle) of the amount of traffic that would have to be navigated in order to reach the destination, and of any road hazards or other obstructions that might be sensed by the network of sensors in and around the highway. The system can even instruct the user, either via display or audio, for example, that the middle lane of three lanes of traffic happens to be moving the fastest in the area where the user is located, and that when safe to do so, the user should effect a lane change into the middle lane to make the best forward progress.
This type of system can also be implemented without a separate network of road sensors, such as by using the previously disclosed imaging sensors on roadsides, structures, or mobile platforms to collect the same data representative of the overall traffic patterns. Similarly, the processing means on board the user's vehicle can distill, from the data being broadcast regarding the instantaneous traffic flow, those details that are relevant to the particular user.
In another alternate embodiment, information is relayed, locally or from a central location, about non-traffic factors that may affect traffic but are not directly caused by traffic. For example, local flooding conditions, local icing conditions on bridges, or other very local weather patterns might be relayed to the user's vehicle from a sensing apparatus located roadside or within the right-of-way, thus giving the user advance warning of those hazardous conditions. For example, sensors embedded in an overpass may detect whether or not the road surface has sufficient moisture on it and is sufficiently cold to have frozen over. This information can be broadcast from a localized emergency beacon, thereby warning travellers about to arrive at that overpass to slow down. Additionally, information can be relayed about toll plaza booth availability for toll roads where given lanes of traffic are spread out over a plurality of toll booths. There may be an indication that one or more booths are inoperative at a given time or have an unusual traffic pattern associated with them. That information allows the user to select a more expedient toll booth lane. Having that knowledge in advance, before arriving at the toll plaza, makes the transit of the toll plaza more efficient for the user.
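The overpass icing beacon described above can be sketched as follows. This is an illustrative sketch only: the freezing and moisture thresholds and the warning wording are assumed values, not taken from the disclosure.

```python
# Illustrative sketch of a localized emergency beacon for an instrumented
# overpass: embedded sensors report surface temperature and moisture, and a
# warning is broadcast only when the surface is both wet and at or below
# freezing.

def icing_warning(surface_temp_c, moisture_fraction,
                  freeze_c=0.0, wet_threshold=0.3):
    """Return a warning message if icing is likely, else None."""
    if surface_temp_c <= freeze_c and moisture_fraction >= wet_threshold:
        return "CAUTION: possible ice on overpass ahead - reduce speed"
    return None

# A wet surface at -2 C triggers the beacon; a dry cold surface does not.
warning = icing_warning(surface_temp_c=-2.0, moisture_fraction=0.6)
```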
In accordance with the present invention, the display need not be limited to a display actually present in an operator's vehicle. The display can also or alternatively be located separate from any user's vehicle, such as housed in a structure, for example, for study of local traffic conditions by police or news media, or to analyze whether a municipality needs to effect a change in a local right-of-way. It might be used in an alternate embodiment as part of a central monitoring station where overall traffic patterns are analyzed and reported. In the example of a roadside display such as a billboard-type display (or other array of light display), the sensing means can be local to the display (such as one mounted on or affixed to that structure), thereby providing a fixed local view. Alternatively, the sensing means for such a fixed roadside display can be located some distance from the actual display itself, giving users advance warning of an area significantly distant. The case of a fixed structure blocking direct view of the path of the road is commonly evidenced by expressways winding through urban areas. Those blocking buildings prove to be ideal mounting points for fixed cameras to observe traffic patterns, in that they allow a direct view (for users approaching the buildings) of obstructions that may lie beyond them. In a preferred embodiment, the structure supporting an image sensing means is isolated from shock or vibration. If the sensing means is actually affixed via a fixed or retractable support on a vehicle, the preferred embodiment is to have the assembly shock-mounted in a fashion that reduces the amount of vehicle vibration coupled to the sensor. Alternatively, using techniques that are well known in the art of consumer camcorders, jitter reduction may be performed by the processing means to reduce the appearance of vibration in a sensed image.
For an installation such as at the top of a mast of a sailboat, the mounting for the sensing means may also include gyroscopic stabilization. This might also be used if the sensing means is located in a mobile platform such as a helicopter. There is no technical reason other than cost that would prohibit this from being used in a user's vehicle, which would also allow a very stable image to be provided to the user. The ability to use a retractable support structure located on the operator's vehicle, which allows an image sensing means to be positioned at a height above the operator's vehicle, enables (for example) a safer traffic passing operation, where the vehicle operator would be able to look beyond the vehicle directly in front of his or her vehicle to determine whether or not it is safe to pass.
In a preferred embodiment, the vehicle ahead that is blocking the operator's vision would have its own sensing apparatus and transmitter and would relay directly an image of what is located in front of it back to the operator wishing to pass. This method of relay might be via a broadcast transmitter, or, in an alternate embodiment, via a modulated light source such as an LED or an infrared emitter transmitting data backwards from the blocking vehicle to the operator's vehicle.
This process of relaying image or data content backwards through the traffic flow from one vehicle to the next is another important aspect of the present invention, because it allows vehicles further back in the traffic flow to have the benefit of image and data content representative of traffic conditions in front of them, thereby giving them advance warning of road blockages and/or hazards that would otherwise be invisible to the operators of those vehicles.
In another networked vehicle embodiment, a plurality of vehicles (each equipped with image sensors, processing means, and transceivers) have the ability to communicate images and data via the transceivers between one another. The vehicles of the group each have a relative orientation within the traffic flow and also a relative position with respect to one another. The processing means further comprises a selection system that allows each vehicle operator to select, for display presentation, data comprising either images or data content from other vehicles in a selected direction from the user's vehicle, or from the imaging sensor that is part of the operator's own vehicle.
While the description herein has concentrated on terrestrial applications, the present invention is also applicable to aircraft and spacecraft. Some applications of retractable remote sensing apparatus also apply to spacecraft, such as judging docking distances or clearances. In accordance with the present invention, in a generic context, a plurality of cameras or sensors with transceiving means send information back and forth to each other and receive data from the closest beacon or camera position. In accordance with another embodiment of the present invention, another application of this technology is using it as a black box to capture telemetry of a vehicle. For example, there are situations where police squad cars or cruisers have cameras mounted to view (out the front windshield) the activities involved in a chase or an apprehension or an arrest. These signals can be processed and used in accordance with the present invention. In a similar vein, the present invention can be used with other types of public or commercial vehicles; for example, cameras, sensors, and transmitters could be put on taxicabs or on the cabs of trucks of a commercial hauler to allow them to capture video information about the operation of the vehicle. The captured video can be transmitted and/or recorded to a memory of some sort continuously while the vehicle is in operation. The memory can record over the same sections over and over, except in the event of an accident, thereby providing 10 to 90 minutes (a design option) of context to any accident investigators that might happen upon the scene. So even if the operator of the vehicle is incapacitated by the accident, if the device survives, then the company or next of kin might have an indication of what happened.
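The "record over the same sections over and over, except in the event of an accident" behavior described above can be sketched as a ring buffer that freezes on an accident signal. This is an illustrative sketch only; the class and field names are hypothetical.

```python
# Illustrative sketch of the "black box" recording loop: a fixed-size ring
# buffer continuously overwrites the oldest frames while the vehicle
# operates, and overwriting stops once an accident is detected, preserving
# the context leading up to the event.

class BlackBoxRecorder:
    def __init__(self, capacity):
        self.capacity = capacity  # frames retained (e.g. minutes of video)
        self.frames = []
        self.frozen = False       # set True on accident: stop overwriting

    def record(self, frame):
        if self.frozen:
            return                # buffer preserved for investigators
        self.frames.append(frame)
        if len(self.frames) > self.capacity:
            self.frames.pop(0)    # discard the oldest frame

    def on_accident(self):
        self.frozen = True

box = BlackBoxRecorder(capacity=3)
for f in ["f1", "f2", "f3", "f4"]:
    box.record(f)                 # f1 is overwritten once capacity is exceeded
box.on_accident()
box.record("f5")                  # ignored: recording is frozen
```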
This can be combined with other signals and telemetry from the vehicle, such as pressure in the brake lines, manifold pressure in the engine, whether or not the engine was actually running at the time of impact, whether a steering input was being applied, whether the vehicle was in panic braking, or other such factual information. From a manufacturing, safety, public interest, and insurance standpoint, this type of technology is very useful and can be employed as a logical extension of what already exists in on-board computers that monitor engines and systems in cars, adding telemetry and external inputs, such as video and/or sensing apparatus.
From the foregoing, it will be observed that numerous variations and modifications may be effected without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific apparatus illustrated herein is intended or should be inferred. It is, of course, intended to cover by the appended claims all such modifications as fall within the scope of the claims.

Claims

WHAT IS CLAIMED IS:
1. An electronic traffic visualization system comprising: a receiver for receiving content signals representative of surrounding traffic conditions from at least one location; a processing subsystem for processing the received content signals to format the content signals for display to a user; and a display apparatus for providing a presentation of the surrounding conditions representative of the traffic to the user responsive to the received and processed content signals.
2. The system as in claim 1, wherein the traffic conditions comprise traffic conditions located forward of the position of a vehicle.
3. The system as in claim 1, wherein the content signals are comprised of both data content and image content signals representative of traffic, wherein the processing additionally comprises combining both the data and the image content signals into a combined format suitable for a corresponding combined display to the user.
4. The system as in claim 1, wherein the processing selectively translates image content outside the range of humanly perceptible wavelengths into image content that can be displayed entirely within the range of humanly perceptible wavelengths.
5. The system as in claim 1, wherein the content signals representative of traffic are both received from an external source and locally sensed by at least one sensor; and wherein the processing additionally comprises processing both the received external and locally sensed content signals to provide for a display presentation to the user corresponding to both the content signals.
6. The system as in claim 1, further comprising a transmitter for transmitting at least one of the received content signals and the formatted content signals to a location remote from the vehicle.
7. The system as in claim 6, wherein the transmitting further comprises broadcasting content signals.
8. The system as in claim 1, wherein the processing comprises at least one of the operations of content combining, content discrimination, content integration, content layering, edge detection, brightness correction, contrast correction, noise reduction, integration over time, filtering, artifact removal, spectrum conversion, data compression, data encryption, and data origination tagging.
9. The system as in claim 8, wherein the processing comprises a plurality of the operations.
10. The system as in claim 1, further comprising: a sensor apparatus for sensing content signals representative of surrounding traffic conditions; positioning means for positioning said sensor apparatus in a selectable position relative to a vehicle; and wherein the processor subsystem is additionally responsive to the sensed content signals.
11. The system as in claim 10, further comprising at least one of vehicle speed sensing means and relative speed sensing means, and wherein the positioning means is responsive to at least one of the vehicle speed sensing means and the relative speed sensing means.
12. The system as in claim 1, wherein the receiver further couples and decodes transmitted signals from at least one sensor.
13. The system as in claim 12, wherein at least one sensor is external to and not attached to a user's vehicle.
14. The system as in claim 12, wherein there are a plurality of sensors, wherein each of the sensors outputs locally sensed content signals.
15. The system as in claim 14, wherein at least one of the plurality of sensors are mounted to at least one fixed structure.
16. The system as in claim 14, wherein signals from the plurality of sensors are broadcast as a subscription service, wherein each subscriber has an enabled receiver, a processor, and a display subsystem; and wherein the content signals are broadcast to the subscribers.
17. The system as in claim 1, further comprising: control logic for selecting which of the content signals will be selected and displayed.
18. The system as in claim 17, wherein the control logic is responsive to position data associated with the user viewing the display presentation.
19. The system as in claim 14, wherein each of the plurality of sensors provides content signals representative of surrounding sensed conditions, and associated origination tagging data representative of the position of the respective sensor.
20. The system as in claim 19, wherein the position is at least one of sensor location identification, absolute location, vehicle identification, relative direction of flow, and absolute direction of flow.
21. The system as in claim 1, wherein the receiver, processing subsystem, and display apparatus are all within a vehicle.
22. The system as in claim 1, wherein there are a plurality of vehicles each comprising a respective receiver, processing subsystem, and display apparatus.
23. The system as in claim 6, further comprising the vehicle, wherein the vehicle is comprised of the receiver, processing subsystem, display apparatus, and the transmitter.
24. The system as in claim 21, wherein there are a plurality of said vehicles, wherein each of said vehicles is selectively responsive to origination tagging data for selecting which of the respective content signals to process and display.
25. The system as in claim 24, wherein each of said vehicles has associated position data; wherein each of said vehicles is further responsive to the associated position data for each of said respective vehicles in selecting which of the respective content signals to process and display.
26. The system as in claim 21, wherein the vehicle has associated position data, wherein there are a plurality of broadcast content signals, wherein the selection of which of the content signals to process and display is selected for traffic conditions located forward of the respective vehicle.
27. The system in claim 26, wherein there are a plurality of said vehicles each moving in a respective direction within a traffic flow, wherein at least one of the vehicles provide for relaying content signals in a direction back through traffic flow relative to the respective vehicles' direction.
28. The system as in claim 1, wherein the display apparatus is a roadside display device alongside a road, viewable from along the road.
29. The system as in claim 1, further comprising: sensing means for sensing content data representative of traffic conditions; wherein the processing means processes the content data to convert the sensed content data into a format suitable for at least one of transmission and display; and communication means for communicating the content data, responsive to the processing means, for providing at least one of a transmission of and a reception of processed content data.
30. The system as in claim 10, further comprising support means having a first end and a second end, wherein the first end is coupled to the vehicle and the second end is positionable to extend beyond the vehicle, for supporting the sensing apparatus in a plurality of positions relative to the vehicle.
31. The system as in claim 29, wherein the communications is accomplished via at least one of wires, optical fiber, optical conduit, radio frequency waves, microwaves, infrared light waves, satellite communications, cellular communications, and visible light waves.
32. The system as in claim 1, wherein said display presentation is comprised of at least a visual presentation, wherein the visual presentation operates in one of a plurality of modes of visualization and view, comprising at least one of an actual overhead view, an actual perspective view, an actual side view, a schematic overhead view, a schematic perspective view, a schematic side view, a virtual overhead view, a virtual perspective view, a virtual side view, a virtual right-of-way flyover view, a virtual radar view, an informational view, a navigational view, and a warning view.
33. The system as in claim 32, wherein the processing additionally auto-cycles between selected ones of the plurality of modes of visualization.
34. The system as in claim 30, wherein the support means additionally functions as an antenna.
35. The system as in claim 14, wherein selected ones of the plurality of additional sensors are affixed to other vehicles.
36. The system as in claim 30, additionally comprising a user deployment actuator, and wherein the support means provides a retractable support that selectively deploys responsive to the user deployment actuator.
37. The system as in claim 1, wherein the display presentation is one of an audio presentation, a visual presentation, and a combined audio and visual presentation.
38. The system as in claim 1, further comprising: means for sensing signal strength of the broadcast content signals; means for computing the distance between the transmitter and the receiver for identifying the distance from the vehicle to an origination position of the images, responsive to the means for sensing signal strength; and means for integrating the computed distance into the display presentation responsive to the computing.
39. The system as in claim 1, further comprising: means for providing traffic flow direction for the user; and means for relaying selected ones of the respective local content signals in a backwards direction that is opposite of the traffic flow direction of the user.
40. The system as in claim 14, wherein the system provides for display presentation on the display apparatus within at least one vehicle of surrounding traffic conditions for at least one of the plurality of sensors.
41. The system as in claim 10, wherein the sensor apparatus further comprises means to variably position the sensor in at least one axis of orientation of pan, tilt, and roll.
42. The system as in claim 41, wherein the variable positioning is further comprised of simultaneously variably positioning along a plurality of the axes.
43. The system as in claim 40, wherein there are a plurality of vehicles each having relative forward and backward directions orientation within traffic flow, and wherein the system provides for display presentation within at least one of the vehicles of respective surrounding traffic conditions for at least one of the plurality of sensors in at least one of the forward direction and the backward direction.
44. The system as in claim 22, wherein the system provides for visual communication.
45. The system as in claim 21, wherein defined content signals provide for emergency communications to a user in the vehicle.
46. The system as in claim 45, wherein the presentation is at least one of audio, visual, and audiovisual.
47. The system as in claim 45, wherein the defined content signals provide for an emergency vehicle to provide communications for presentation to a plurality of separate ones of the vehicle.
48. The system as in claim 1, further comprising an emergency vehicle communications subsystem providing for communications regarding the location of an emergency vehicle in proximity to the display apparatus for use by a user, to provide content signals to the receiver to provide notification to the user regarding the emergency vehicle.
49. The system as in claim 22, further comprising an emergency vehicle communications subsystem providing for priority communications from an emergency vehicle to provide notification regarding the emergency vehicle to the plurality of vehicles.
50. The system as in claim 22, wherein a uniform protocol is provided for compatibility in message structure between the plurality of vehicles.
51. The system as in claim 1, further comprising: means for providing a plurality of display modes; means for automatically cycling between selected ones of the plurality of display modes.
52. The system as in claim 1, wherein external content signals provide for presentation of emergency vehicle traffic conditions for presentation as at least one of an audio, a visual, and an audiovisual presentation.
53. The system as in claim 52, wherein there are two levels for the content signals, a normal level and a priority level which is preemptive over the normal level for local display on the display apparatus, wherein the emergency vehicle traffic conditions are communicated at the priority level.
54. The system as in claim 1, further comprising: a position detection subsystem providing position location data for a vehicle; wherein the processing subsystem is responsive to the position location data to selectively process and format the content signals to provide a display presentation relative to the location of the vehicle.
55. The system as in claim 1, wherein the display presentation is one of an audio, a video, and an audiovisual presentation.
56. The system as in claim 54, wherein the position detection subsystem utilizes a Global Positioning System.
57. The system as in claim 25, wherein the associated position data is global positioning data.
58. The system as in claim 22, further comprising: means for communicating emergency vehicle data as the content data to the plurality of vehicles which each provide means for generating a local presentation responsive thereto.
59. The system as in claim 58, wherein the local presentation is one of an audio, a video, and an audiovisual presentation.
60. The system as in claim 1, wherein the presentation is a display of a map indicating relative locations of the display apparatus of a user and known traffic conditions.
61. The system as in claim 60, wherein the presentation is a display of a directional indication superimposed on the display of the road map indicating a way for the user to bypass congestion otherwise indicated in surrounding traffic conditions.
62. The system as in claim 1, wherein the presentation is provided on a display internal to a vehicle.
63. The system as in claim 1, wherein the presentation is provided on a display external to a vehicle.
64. The system as in claim 1, wherein the content signals are further comprised of an associated data type, wherein the data type is at least one of video information, image information, congestion information, traffic pattern information, travel time information, road condition information, the vehicle's location, weather condition information, an emergency warning, construction alerts, a warning of locations of road hazards, a warning of disabled vehicles, and a warning of slow moving vehicle locations.
65. The system as in claim 64, wherein there are a plurality of images to select for display responsive to the content signals; wherein each of the content signals includes a source identification; wherein selection of selected ones of the plurality of images for display is responsive to the source identification and the data type for the respective ones of the content signals.
66. The system as in claim 65, wherein the content signals are processed for selection based on proximity to a defined reference point.
67. The system as in claim 66, wherein the display apparatus is within a vehicle having an associated location; wherein the defined reference point is selected by the processing subsystem responsive to the associated location.
68. The system as in claim 67, wherein the associated location is determined responsive to a Global Positioning System.
69. The system as in claim 1, wherein there are a plurality of images to select for display responsive to the broadcast content signals; wherein selection of selected ones of the plurality of images for display provides for display of at least one of traffic conditions for a forward direction, traffic conditions for a backward direction, traffic conditions along a planned route, computed travel times, congestion information, road hazard information, adverse weather condition information, and emergency vehicle information.
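Claims 53, 54, and 66-68 together describe filtering broadcast content signals by proximity to the vehicle's GPS-derived position, with priority-level signals (such as emergency vehicle warnings) preempting normal-level ones on the local display. A minimal sketch of that selection logic follows; all names (ContentSignal, select_for_display, the 5 km radius) are illustrative assumptions, not drawn from the patent:

```python
import math
from dataclasses import dataclass

# Two content levels per claim 53: priority preempts normal on the local display.
NORMAL, PRIORITY = 0, 1

@dataclass
class ContentSignal:
    lat: float      # position associated with the signal, in degrees
    lon: float
    level: int      # NORMAL or PRIORITY
    message: str

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_for_display(signals, vehicle_lat, vehicle_lon, radius_km=5.0):
    """Keep signals near the vehicle's position (claims 54, 66-68);
    if any priority-level signals are nearby, they preempt the
    normal-level ones for the local presentation (claim 53)."""
    nearby = [s for s in signals
              if haversine_km(s.lat, s.lon, vehicle_lat, vehicle_lon) <= radius_km]
    priority = [s for s in nearby if s.level == PRIORITY]
    return priority if priority else nearby
```

The preemption here is all-or-nothing: any nearby priority signal suppresses normal-level content entirely, which is one plausible reading of "preemptive over the normal level"; an implementation could instead interleave levels with priority items first.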
PCT/US2000/033433 1999-12-10 2000-12-07 Methodology, apparatus, and system for electronic visualization of traffic conditions WO2001043104A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US45918399A 1999-12-10 1999-12-10
US09/459,183 1999-12-10

Publications (1)

Publication Number Publication Date
WO2001043104A1 true WO2001043104A1 (en) 2001-06-14

Family

ID=23823747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/033433 WO2001043104A1 (en) 1999-12-10 2000-12-07 Methodology, apparatus, and system for electronic visualization of traffic conditions

Country Status (1)

Country Link
WO (1) WO2001043104A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3726065A1 (en) * 1987-08-06 1988-01-21 Friedhelm Fredrich BOX-TYPE MOTOR VEHICLE having an evaluation device in the driver's cab, which evaluation device allows an operator to assess an object by means of screens and sound transducers, the signals from a video camera being used especially for assessment of the recording angle
DE19636632A1 (en) * 1996-09-10 1998-03-12 Johannes Hanusch Identification generator system for warning signals between vehicles
EP0841648A2 (en) * 1992-09-30 1998-05-13 Hitachi, Ltd. Vehicle driving support system and vehicle therewith
EP0872710A1 (en) * 1996-11-01 1998-10-21 Seiko Epson Corporation Image/voice output apparatus and car navigation system
GB2330989A (en) * 1997-10-30 1999-05-05 Clive William Dunster Emergency vehicle having RDS transmitter for transmitting a warning signal to vehicles in the vicinity
DE19749750A1 (en) * 1997-11-11 1999-06-02 Hans E Dr Ing Speckter Self-supporting tubular mast antenna for navigation system transmitter
EP0921509A2 (en) * 1997-10-16 1999-06-09 Navigation Technologies Corporation System and method for updating, enhancing or refining a geographic database using feedback
EP0933747A2 (en) * 1998-01-29 1999-08-04 Adam Opel Ag Warning system for vehicles

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7177738B2 (en) * 2001-05-30 2007-02-13 Alpine Electronics, Inc. Vehicle management system
EP1324274A2 (en) * 2001-12-28 2003-07-02 Matsushita Electric Industrial Co., Ltd. Vehicle information recording system
EP1324274A3 (en) * 2001-12-28 2005-11-02 Matsushita Electric Industrial Co., Ltd. Vehicle information recording system
US7254482B2 (en) 2001-12-28 2007-08-07 Matsushita Electric Industrial Co., Ltd. Vehicle information recording system
EP1416458A1 (en) * 2002-10-30 2004-05-06 Dr. Bernard Monnier Device for control of vehicle speed
WO2004040532A1 (en) * 2002-10-30 2004-05-13 Bernard Monnier Vehicle speed control device
DE10356500B8 (en) * 2002-12-04 2012-06-14 Toyota Jidosha Kabushiki Kaisha Vehicle communication device
DE10356500B4 (en) * 2002-12-04 2011-06-22 Toyota Jidosha Kabushiki Kaisha, Aichi-ken Vehicle communication device
WO2004068433A1 (en) * 2003-01-27 2004-08-12 Energy Laser S.R.L. Modular surveillance system for monitoring critical environments
EP1492056A1 (en) * 2003-06-24 2004-12-29 Matsushita Electric Industrial Co., Ltd. Drive recorder comprising a camera photographing images inside and outside of a vehicle
EP1770669A1 (en) * 2004-07-16 2007-04-04 Fourie Road condition informing system and road condition informing method
EP1770669A4 (en) * 2004-07-16 2008-09-17 Fourie Road condition informing system and road condition informing method
WO2006008825A1 (en) 2004-07-16 2006-01-26 Fourie Road condition informing system and road condition informing method
US7817064B2 (en) 2004-07-16 2010-10-19 Fourie Road-condition informing apparatus and road-condition informing method
EP1647448A3 (en) * 2004-10-13 2007-01-31 Robert Bosch Gmbh Method and apparatus for improving visibility of a driver in a vehicle
EP1693816A3 (en) * 2005-02-16 2007-09-05 Aisin Seiki Kabushiki Kaisha Communication device for a movable body
US7443314B2 (en) 2005-02-16 2008-10-28 Aisin Seiki Kabushiki Kaisha Communication device for a movable body
EP1736361A1 (en) * 2005-06-21 2006-12-27 Robert Bosch Gmbh Night vision device for a motor vehicle
US8872941B2 (en) 2006-11-07 2014-10-28 Sony Corporation Imaging apparatus and imaging method
US7876374B2 (en) 2006-12-07 2011-01-25 Sony Corporation Image display system, display apparatus, and display method
US8009219B2 (en) 2006-12-07 2011-08-30 Sony Corporation Image display system, display apparatus, and display method
EP1965366A1 (en) * 2007-03-02 2008-09-03 Fujitsu Limited Driving assist system and vehicle-mounted apparatus
US8265861B2 (en) 2007-03-02 2012-09-11 Fujitsu Limited Driving assist system and vehicle-mounted apparatus
US8687925B2 (en) 2007-04-10 2014-04-01 Sony Corporation Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
US10529114B2 (en) 2007-08-06 2020-01-07 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US10262449B2 (en) 2007-08-06 2019-04-16 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US9972116B2 (en) 2007-08-06 2018-05-15 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US8797331B2 (en) 2007-08-06 2014-08-05 Sony Corporation Information processing apparatus, system, and method thereof
US10937221B2 (en) 2007-08-06 2021-03-02 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US9568998B2 (en) 2007-08-06 2017-02-14 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
EP2177878A3 (en) * 2008-10-20 2016-04-20 HERE Global B.V. Traffic display depicting view of traffic from within a vehicle
EP2208967A1 (en) 2009-01-20 2010-07-21 Alpine Electronics, Inc. Navigation system including route guidance function and method of route searching
CN102770893B (en) * 2009-11-13 2015-03-25 法雷奥开关和传感器有限责任公司 Method and system for generating and supplying traffic-relevant information
WO2011057715A1 (en) * 2009-11-13 2011-05-19 Valeo Schalter Und Sensoren Gmbh Method and system for generating and supplying traffic-relevant information
CN102770893A (en) * 2009-11-13 2012-11-07 法雷奥开关和传感器有限责任公司 Method and system for generating and supplying traffic-relevant information
US8711005B2 (en) 2010-12-27 2014-04-29 Nicholas R. Musachio Variable speed traffic control system
DE102012102693A1 (en) * 2012-03-29 2013-10-02 Continental Automotive Gmbh Method for providing traffic information in vehicle, involves receiving transmitted traffic information of secondary vehicle through primary communication device and outputting through optical output unit and acoustic output unit
US9975482B2 (en) * 2013-02-01 2018-05-22 Eric Sinclair Systems and methods for traffic event detection for vehicles using rolling averages
JP2015146174A (en) * 2013-02-01 2015-08-13 エリック シンクレアEric Sinclair Traffic event detection system for vehicles
US20160328629A1 (en) * 2013-02-01 2016-11-10 Eric Sinclair Systems and methods for traffic event detection for vehicles using rolling averages
US10133530B2 (en) 2014-05-19 2018-11-20 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US10805659B2 (en) 2014-05-19 2020-10-13 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US11748780B2 (en) 2014-05-19 2023-09-05 Allstate Insurance Company Content output systems using vehicle-based data
US11127042B2 (en) 2014-05-19 2021-09-21 Allstate Insurance Company Content output systems using vehicle-based data
US10838676B2 (en) 2014-05-19 2020-11-17 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US10341709B1 (en) 2014-05-19 2019-07-02 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US10380642B2 (en) 2014-05-19 2019-08-13 Allstate Insurance Company Content output systems using vehicle-based data
US10423982B2 (en) 2014-05-19 2019-09-24 Allstate Insurance Company Content output systems using vehicle-based data
US10051306B1 (en) 2014-05-19 2018-08-14 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US10545711B2 (en) 2014-05-19 2020-01-28 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
US10582248B2 (en) 2014-05-19 2020-03-03 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
EP2983152A1 (en) * 2014-08-04 2016-02-10 Eric Sinclair Traffic event detection system for vehicles
CN106981212A (en) * 2016-01-19 2017-07-25 霍尼韦尔国际公司 Traffic visualization system
EP3196859A3 (en) * 2016-01-19 2017-09-20 Honeywell International Inc. Traffic visualization system
US20200331496A1 (en) * 2016-04-08 2020-10-22 Faraday&Future Inc. Moveable-sensor for autonomous driving
WO2018053252A1 (en) * 2016-09-16 2018-03-22 Allstate Insurance Company Electronic display systems connected to vehicles and vehicle-based systems
IT201700043262A1 (en) * 2017-04-20 2018-10-20 Angelo Zizzari DEVICE SUPPORTING TECHNICAL SURVEYS, PREFERABLY APPLIED TO THE SCENARIO OF ROAD ACCIDENTS
US11170638B2 (en) 2018-12-19 2021-11-09 International Business Machines Corporation Look ahead auto dashcam (LADCAM) for improved GPS navigation
CN112558008A (en) * 2019-09-26 2021-03-26 北京外号信息技术有限公司 Navigation method, system, equipment and medium based on optical communication device
CN112558008B (en) * 2019-09-26 2024-03-12 北京外号信息技术有限公司 Navigation method, system, equipment and medium based on optical communication device
FR3106215A1 (en) * 2020-01-09 2021-07-16 Psa Automobiles Sa Vehicle environment data communication method and device

Similar Documents

Publication Publication Date Title
WO2001043104A1 (en) Methodology, apparatus, and system for electronic visualization of traffic conditions
US8577549B2 (en) Information display system for a vehicle
US9376061B2 (en) Accessory system of a vehicle
JP3473321B2 (en) Display device for vehicles
US20030016146A1 (en) Enhanced vehicle hazard warning and safety features integrated with an onboard navigation system
US11697425B1 (en) Method and system for assisting drivers in locating objects that may move into their vehicle path
JP4787196B2 (en) Car navigation system
JPH09180087A (en) Traffic information provision system
WO2021131201A1 (en) Driving assistance device, driving assistance method, and program
WO2008051730A2 (en) Systems and methods for monitoring and/or controlling traffic
JP2008213759A (en) On-vehicle display device
JP2001084492A (en) Vehicle driving support device
JPH1047975A (en) Method and system for communication and communication equipment and central station constituting the system
JP7349888B2 (en) Driving support method and in-vehicle device
JP2022104107A (en) Vehicle remote operation system and vehicle remote operation method
CN111660932A (en) Device, vehicle and system for reducing the field of view of a vehicle occupant at an accident site
CN111345035A (en) Information processing device, information processing method, and information processing program
KR20090112161A (en) Navigation System including Dual Display Apparatus and Display Method of the same
JP2006119949A (en) Traveling support system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP