US20010016797A1 - Danger deciding apparatus for motor vehicle and environment monitoring apparatus therefor - Google Patents
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
Definitions
- In order to attain the second object, there is provided an environment monitoring apparatus for a vehicle comprising:
- information collecting means for acquiring information which changes according to a road structure around a vehicle concerned;
- pick-up means for picking up different monitoring regions to acquire a plurality of images;
- selecting means for selecting, on the basis of the collected information, the image of the region which must be monitored from the plurality of images;
- detecting means for processing the selected image to detect the approaching degree of an approaching object; and
- deciding means for deciding danger of contact with the approaching object on the basis of the approaching degree.
- In this configuration, the approaching degree detecting means 22a-3, which is adapted to process a plurality of images, processes only the image of the region which must be monitored so that the accuracy of detecting the approaching object is not deteriorated.
- FIG. 1 is a block diagram showing the basic configuration of a danger deciding apparatus for a motor vehicle and an environment monitoring apparatus for a motor vehicle according to this invention
- FIG. 2 is a block diagram showing a first embodiment of an environment monitoring apparatus equipped with the danger deciding apparatus according to this invention
- FIGS. 3A and 3B are views for explaining the positions where a rear-side direction camera, a right-side direction camera and a left-side direction camera are attached;
- FIG. 5 is a flowchart for explaining collection/selection of information according to the first embodiment
- FIG. 2 shows an environment monitoring apparatus for a motor vehicle equipped with a danger deciding apparatus for a motor vehicle according to this invention.
- the environment monitoring apparatus includes an image pick-up section 1 picking up different monitoring regions to acquire a plurality of images; and a danger deciding apparatus for a motor vehicle for detecting the approaching degree of an approaching object such as an approaching vehicle residing within a monitoring region picked up by the image pick-up section 1 .
- The image pick-up means 1 includes a right-side direction camera 11R which is attached to a right-side position of the front of the vehicle and picks up the right-side direction region to acquire a right-side direction image; a right image plane 12R on which the right-side direction image is formed; a left-side direction camera 11L which is attached to a left-side position of the front and picks up the left-side direction region to acquire a left-side direction image; and a left image plane 12L on which the left-side direction image is formed. Therefore, as shown in FIG. 3A, when the vehicle concerned is located at a position A short of an intersection, the left- and right-side direction cameras 11L and 11R can monitor the side-direction road which intersects the lane on which the vehicle concerned runs.
- The image pick-up means 1 further includes a rear-side direction camera 13 which is attached to the top or rear of the trunk of the vehicle, faces the rear of the vehicle, and picks up the rear-side direction region to acquire a rear-side direction image; and a rear image plane 14 on which the rear-side direction image is formed. Therefore, as seen from FIG. 3B, the rear-side direction camera 13 can monitor the lane concerned behind the vehicle concerned and the lanes adjacent thereto.
- the danger deciding apparatus for a vehicle includes a storage unit 21 for storing the image acquired from the image pick-up unit 1 , a microcomputer A 22 (hereinafter referred to as “ ⁇ COM A 22 ”) which performs the processing based on the image acquired by the image pick-up unit 1 and processing of deciding danger of contact with another vehicle, a speed sensor 23 for detecting the speed of the vehicle concerned to supply speed information to the ⁇ COM A 22 , and a warning unit 24 which gives a warning when it is decided by the ⁇ COM A 22 that there is danger of contact with another vehicle.
- ⁇ COM A 22 microcomputer A 22
- the storage unit 21 includes a first frame memory 21 a , a second frame memory 21 b and an optical flow memory 21 c .
- The first and the second frame memories 21a and 21b temporarily store pixel data with m rows and n columns of e.g. 512×512 pixels and luminance of 0-255 levels, which have been converted from the image acquired by the image pick-up unit 1, and supply them to the μCOM A 22.
- the optical flow which is a movement quantity of an approaching object between two images acquired apart by a prescribed time ⁇ t from each other is temporarily stored in the optical flow memory 21 c.
- the ⁇ COM 22 includes a CPU 22 a for perfuming various computations according to a prescribed control program, a ROM 22 b for storing the control program and prescribed values and a RAM 22 c for temporarily storing data necessary for performing the computation.
- The CPU 22a performs: processing of collecting information which changes according to the road structure around the vehicle concerned (as information collecting means); processing of selecting, on the basis of the collected information, the image acquired by picking up the region which must be monitored from the plurality of images when the rear-side direction image and the left/right-side direction images are received (as selecting means); processing of detecting, as an optical flow, the movement quantity of an approaching object between the selected images acquired a prescribed time apart from each other (as approaching degree detecting means); and processing of deciding danger of contact with an approaching vehicle on the basis of the magnitude or position of the optical flow (as danger deciding means).
- The speed sensor 23 has a running sensor (not shown) which generates a pulse signal whenever the vehicle runs a unit distance, and detects the speed information on the basis of the pulse signal from the running sensor.
- The speed information changes according to the road structure around the vehicle concerned. This is because the vehicle speed falls, e.g., when there is an intersection difficult to see ahead of the vehicle concerned or a stop line ahead of the vehicle, or when the width of the lane concerned is narrow, and rises, e.g., when the lane concerned is on a speedway.
- the warning unit 24 includes a display 24 a and a speaker 24 b .
- The display 24a informs the driver of danger in such a manner that the image acquired by the image pick-up unit 1 is displayed, or a message is displayed, when the CPU 22a within the μCOM A 22 decides that there is danger of contact with another vehicle.
- the speaker 24 b informs the driver of danger by audio guidance or warning sound.
- FIG. 4 is a flowchart showing the procedure of processing performed by the CPU 22 a . Now referring to the flowchart of FIG. 4, an explanation will be given of the operation of the environment monitoring apparatus for a vehicle.
- The CPU 22a first executes an initial step (not shown) in which the left-side direction flag F1, the right-side direction flag F2 and the rear-side direction flag F3 are reset to 0, and thereafter proceeds to step S1.
- In step S1, the CPU 22a decides whether or not both the rear-side direction image and the left/right-side direction images have been received. If only one of them has been received (“NO” in step S1), the CPU 22a sets the flag corresponding to the received image at ‘1’ (step S2) and proceeds to step S4. Namely, if only the left/right-side direction images are received, the CPU 22a sets the flag F1 for selecting the left-side direction image at ‘1’, and if only the rear-side direction image is received, it sets the flag F3 for selecting the rear-side direction image at ‘1’.
- If both have been received (“YES” in step S1), the CPU 22a performs the processing of collecting information which changes according to the road structure around the vehicle concerned and of selecting one of the rear-side direction image and the left/right-side direction images on the basis of the collected information (step S3).
- The CPU 22a collects the speed information from the speed sensor 23 (step S301), and decides whether or not the speed of the vehicle concerned is not higher than e.g. 3 km/h (step S302).
- If the vehicle speed is not higher than 3 km/h (“YES” in step S302), the CPU 22a decides that the road structure has an intersection difficult to see ahead of the vehicle concerned, a stop line ahead of the vehicle, and/or a narrow lane. Where the environment around the vehicle concerned has such a road structure, it is necessary to detect an approaching object such as another vehicle which approaches from a side road intersecting the lane concerned. Therefore, the CPU 22a sets the left-side direction flag F1 at ‘1’ to select the left-side direction image (step S303) and proceeds to step S4.
- Otherwise, the CPU 22a decides whether or not the vehicle speed is not lower than e.g. 60 km/h (step S304). If so, the CPU 22a decides that the road structure is a speedway or a road of high priority. Where the environment around the vehicle concerned has such a road structure, it is necessary to detect an approaching object such as another vehicle which approaches from the rear of the vehicle concerned. Therefore, the CPU 22a sets the rear-side direction flag F3 at ‘1’ to select the rear-side direction image (step S305) and proceeds to step S4. If the vehicle speed is higher than 3 km/h but lower than 60 km/h, the CPU 22a decides that the road structure around the vehicle concerned includes neither the left/right-side direction monitoring regions nor the rear-side direction monitoring region, and returns to step S1.
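The speed-based selection of steps S301-S305 can be sketched as follows. The function name and the flag dictionary are hypothetical conveniences; the 3 km/h and 60 km/h thresholds are the illustrative values given in the text.

```python
# Sketch of the speed-based image selection (steps S301-S305).
# Flag names F1 (left), F2 (right), F3 (rear) follow the patent.

def select_monitoring_image(speed_kmh: float) -> dict:
    """Return the selection flags for a given vehicle speed."""
    flags = {"F1": 0, "F2": 0, "F3": 0}
    if speed_kmh <= 3:
        # Low speed suggests a blind intersection, stop line or narrow
        # lane: monitor the side directions, starting with the left image.
        flags["F1"] = 1
    elif speed_kmh >= 60:
        # High speed suggests a speedway or priority road: monitor the rear.
        flags["F3"] = 1
    # Between the thresholds no region needs monitoring; all flags stay 0.
    return flags
```

For example, a creeping vehicle (2 km/h) selects the left-side image first, while a vehicle cruising at 80 km/h selects the rear-side image.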
- In step S4 in FIG. 4, the CPU 22a takes in the image (hereinafter referred to as “selected image”) corresponding to the flag set at ‘1’ among the left-side direction flag F1, the right-side direction flag F2 and the rear-side direction flag F3, and stores it in the first frame memory 21a (step S5). After a prescribed time Δt, the CPU 22a takes in the selected image again and stores it in the second frame memory 21b.
- The CPU 22a performs the processing of detecting an optical flow indicative of the movement quantity of the approaching object between the selected images acquired the prescribed time apart, and stores the detected optical flow in the optical flow memory 21c (step S6).
- The approaching object moves in a direction diverging from the point where the road and other features vanish from the selected image, i.e. the FOE.
- A slender window W1 is set around a certain characteristic point P in the radial direction of the FOE set as described above (i.e. in the direction connecting the FOE to the characteristic point P) (FIG. 6A).
- A window W2 corresponding to the window W1 is shifted point by point in the radial direction from the FOE, and its correlation value with the window W1 is computed at each position (FIG. 6B). The point Q of the window W2 where the correlation value is maximal is taken as the point corresponding to the point P, i.e. the point on the same object.
- the movement PQ is detected as the optical flow which represents the movement quantity of the approaching object.
- the detected optical flow is stored in the optical flow memory 21 c.
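The radial window search of FIGS. 6A-6B can be sketched as below. This is a minimal illustration, not the patent's implementation: the function name and parameters are assumptions, and a sum-of-absolute-differences score (minimized) stands in for the correlation value (maximized) named in the text.

```python
import numpy as np

# Sketch of the radial window search: a small window W1 around
# characteristic point P in the first image is compared with windows W2
# shifted point by point along the ray from the FOE through P in the
# second image; the best-matching position Q gives the optical flow PQ.

def radial_flow(img_t, img_t2, foe, p, half=4, max_shift=20):
    """Return point Q in img_t2 matching the window around p in img_t."""
    py, px = p
    w1 = img_t[py - half:py + half + 1, px - half:px + half + 1].astype(float)
    # Unit direction of the ray from the FOE through P (diverging direction).
    d = np.array([py - foe[0], px - foe[1]], dtype=float)
    d /= np.linalg.norm(d)
    best_q, best_score = p, np.inf
    for s in range(max_shift + 1):
        qy, qx = (np.array(p) + s * d).round().astype(int)
        w2 = img_t2[qy - half:qy + half + 1, qx - half:qx + half + 1].astype(float)
        if w2.shape != w1.shape:
            break  # the ray left the image
        score = np.abs(w1 - w2).sum()  # sum of absolute differences
        if score < best_score:
            best_score, best_q = score, (qy, qx)
    return best_q  # the optical flow is the vector from p to best_q
```

A bright patch displaced along the ray from the FOE between the two frames is recovered as a diverging flow vector, as the text describes for an approaching object.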
- the characteristic point may be a pixel having a prescribed luminance difference from that of its adjacent pixel.
- the FOE may be a crossing point of the extended lines of the white lines located on both sides of the road picked up on the selected image.
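The white-line construction of the FOE amounts to intersecting the extensions of two image lines. A sketch using the standard two-point line-intersection formula follows; the input points are hypothetical, since the text does not specify how the white lines are represented.

```python
# FOE as the crossing point of the extended lane lines, each line given
# by two image points. Raises if the lines are parallel.

def foe_from_lane_lines(l1, l2):
    """Each line is ((x1, y1), (x2, y2)); returns the intersection (x, y)."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        raise ValueError("lane lines are parallel; no FOE")
    det1 = x1 * y2 - y1 * x2  # cross product terms of each line
    det2 = x3 * y4 - y3 * x4
    x = (det1 * (x3 - x4) - (x1 - x2) * det2) / denom
    y = (det1 * (y3 - y4) - (y1 - y2) * det2) / denom
    return x, y
```

Two converging lane edges in the image thus yield a single vanishing point, which serves as the FOE for the radial search above.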
- CPU 22 a performs the processing of deciding danger of the contact with another vehicle on the basis of the size of the optical flow stored in the optical flow memory 21 c (Step S 7 ).
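A hypothetical form of the danger test of step S7 is sketched below: a flow vector that diverges from the FOE and exceeds a length threshold is treated as an approaching object. The threshold value and the divergence test are assumptions; the text only says the decision is based on the size of the optical flow.

```python
import math

# Hedged sketch of step S7: decide danger from a single optical flow PQ.

def is_dangerous(p, q, foe, min_length=3.0):
    """True if the flow p->q diverges from the FOE and is long enough."""
    flow = (q[0] - p[0], q[1] - p[1])
    radial = (p[0] - foe[0], p[1] - foe[1])
    # Positive dot product: the flow points away from the FOE (approaching).
    diverging = flow[0] * radial[0] + flow[1] * radial[1] > 0
    return diverging and math.hypot(*flow) >= min_length
```

A vector converging toward the FOE (a receding object, per the FOE discussion above) is never flagged, regardless of its length.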
- If it is decided that there is danger of contact (“YES” in step S8), in order to inform the driver of this fact, the CPU 22a performs warning processing of issuing a warning sound signal and/or a warning image signal to the warning unit 24 (step S9), and proceeds to step S10.
- For example, a warning indication “there is an approaching vehicle” is displayed on the display 24a, or a warning guidance “there is an approaching vehicle” is issued by sound from the speaker 24b.
- In step S10, it is decided whether or not the left-side direction flag F1 is 1. If the left-side direction flag F1 has been set at 1 in the processing of step S2 or S3 (“YES” in step S10), in order to monitor the right-side direction region, the CPU 22a sets the left-side direction flag F1 at 0 and the right-side direction flag F2 at 1 (step S11), and returns to step S4. In this way, after the processing has been made on the left-side direction image, the processing is automatically made on the right-side direction image.
- On the other hand, if the rear-side direction flag F3 is 1, or if the right-side direction flag F2 is 1 after completion of the left/right monitoring by the processing of step S2 or step S3 (“NO” in step S10), the CPU 22a resets the left/right-side direction flags F1 and F2 and the rear-side direction flag F3 (step S12) and returns to step S1.
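The left/right alternation of steps S10-S12 can be sketched as a small flag-update function. The dictionary representation is an assumption; the patent stores F1-F3 as individual flags.

```python
# Sketch of steps S10-S12: after the left-side image has been processed,
# the flags are switched so the right-side image is processed next; after
# the right-side (or rear-side) image, all flags are reset.

def next_flags(flags: dict) -> dict:
    if flags["F1"] == 1:  # left just processed -> monitor right next
        return {"F1": 0, "F2": 1, "F3": 0}
    return {"F1": 0, "F2": 0, "F3": 0}  # right or rear done -> reset all
```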
- In the first embodiment, the speed information of the vehicle was collected as the information which changes according to the road structure. Meanwhile, in a conventional monitoring apparatus, the side-direction monitoring has been made while the driver watches the left/right-side direction images displayed on the display 24a.
- In a second embodiment, the left/right-side direction pick-up cameras 11 are automatically turned on to acquire the left/right-side direction images and display them on the display 24a. Therefore, for the purpose of image selection, the information that the left/right-side direction images have been acquired from the monitoring apparatus can be used as the information which changes according to the road structure in the environment of the vehicle concerned. In this case also, there is no need to provide additional means for producing the necessary information. This contributes to cost reduction of the monitoring apparatus.
- The car navigation system 25 includes a GPS receiver 25b (information acquiring means) for receiving present position information representative of the present position of the vehicle on the earth through a GPS antenna 25a; a map database 25c in which map information having road information containing the road structure is stored; and a microcomputer B 25d (hereinafter referred to as “μCOM B 25d”) for computing the passage of the vehicle concerned to a destination.
- the ⁇ COM A 22 receives the present position information from the GPS receiver 25 b , map information from the map data base 25 c and the passage information from the ⁇ COM B 25 d.
- The environment monitoring apparatus according to the third embodiment operates in the same manner as in the first embodiment except for the processing of information collection/selection. Now referring to the flowchart of FIG. 7 showing the information collection/selection by the CPU 22a, an explanation will be given of the operation of the third embodiment.
- The CPU 22a acquires the present position information from the GPS receiver 25b (step S306), and reads, from the map database 25c, the environmental road information of the vehicle concerned corresponding to the present position information thus acquired (step S307). In step S307, the CPU 22a operates as reading means.
- On the basis of the road information thus acquired and the passage information from the μCOM B 25d, it is decided in step S308 whether or not the environment of the vehicle concerned has a road structure for which the side direction regions must be monitored (e.g. there is an intersection difficult to see or a temporary stop line ahead of the vehicle concerned, the width of the road on which the vehicle concerned is running is narrow, or the present position is an exit from a car park or a facility onto a road). If YES, in order to select the left/right-side direction images, the left-side direction flag F1 is first set at 1 so that the left-side direction image is selected (step S309).
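The map-based decision of steps S306-S309 can be sketched as follows. The `RoadInfo` fields are assumptions: the patent names the conditions (blind intersection, stop line, narrow road, exit from a car park or facility) but not a data format for the road information read from the map database.

```python
from dataclasses import dataclass

# Hedged sketch of step S308: decide from map-derived road information
# whether the side-direction regions must be monitored.

@dataclass
class RoadInfo:
    blind_intersection_ahead: bool = False
    stop_line_ahead: bool = False
    narrow_road: bool = False
    exiting_car_park: bool = False

def needs_side_monitoring(info: RoadInfo) -> bool:
    return (info.blind_intersection_ahead or info.stop_line_ahead
            or info.narrow_road or info.exiting_car_park)
```

When this returns True, the flag F1 would be set as in step S309 so that the left-side image is processed first.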
- Alternatively, the CPU 22a need not read the road information itself; it suffices that the μCOM B 25d acquires the road information.
Abstract
In a danger deciding means for a vehicle, information which changes according to the road structure around a vehicle concerned is acquired. When a plurality of images obtained by picking up different monitoring regions are received, on the basis of the acquired information, the image of the region which must be monitored is selected from the plurality of images. The selected image is processed to detect the approaching degree of an approaching object. On the basis of the approaching degree, danger of contact or collision with the approaching object is decided.
Description
- 1. Field of the Invention
- This invention relates to a danger deciding apparatus for a motor vehicle and an environment monitoring apparatus for a motor vehicle. More specifically, this invention relates to a danger deciding apparatus for a motor vehicle for detecting the approaching degree of an approaching object in a picked-up image of an environment of the vehicle to decide the danger of contact or collision (hereinafter simply referred to contact) with the approaching object, and an environment monitoring apparatus for monitoring the environment of the vehicle to decide the danger of contact with the approaching object.
- 2. Description of the Related Art
- For example, when the driver of a vehicle running on a road having two or more lanes on each side, e.g. a speedway, intends to change lanes, if he changes the lane while missing a vehicle which is catching up with his own vehicle from the rear-side direction on another lane at a higher speed, there is a strong possibility of a serious accident.
- When a following vehicle on the same lane abruptly approaches his own vehicle from the rear-side direction, if the driver brakes abruptly, there is a possibility of a rear-end collision. Therefore, it is desired that the driver surely notices or recognizes other vehicles in the environment.
- A technique for avoiding the danger described above has been proposed as an environment monitoring system for a vehicle in JP-A-7-50769. Now referring to FIGS. 8A-8D, an explanation will be given of this environment monitoring system.
- FIGS. 8A-8C are views for explaining a change in a rear/rear-side direction (hereinafter referred to as rear-side direction) image acquired by a video camera 13. FIG. 8A shows a status inclusive of the vehicle concerned. FIG. 8B shows an image acquired by the video camera 13 at timing t in an environment of the vehicle concerned. FIG. 8C shows an image acquired at timing t+Δt.
- Now it is assumed that the vehicle concerned is running straight on a flat road. The road sign and building residing in the rear of the vehicle concerned in FIG. 8A are observed as the images shown in FIGS. 8B and 8C at timings t and t+Δt, respectively. Coupling the corresponding points in these two images provides speed vectors as shown in FIG. 8D. These are referred to as “optical flows”.
- It can be seen that the optical flows radially appear from an FOE (Focus of Expansion), the point where the road disappears. While the vehicle concerned runs, the optical flows of an object which moves away from the vehicle concerned are vectors in a direction converging toward the FOE. The optical flows on an object which approaches the vehicle concerned are vectors in a direction diverging from the FOE.
- Therefore, a conventional environment monitoring system detects the optical flows by processing the rear-side direction image acquired by the camera 13, and uses them to monitor the relationship between the vehicle concerned and the following vehicle or another vehicle running on an adjacent lane, to detect the other vehicle approaching the vehicle concerned, thereby deciding danger of contact of the vehicle concerned with the approaching object.
- In some prior arts, a technique of searching corresponding points using two cameras is adopted. Specifically, an edge point Pa (not shown) of an object is detected in a rear-side direction image acquired by one camera. A point Pb (not shown) of the image acquired by the other camera, corresponding to the detected edge point Pa, is detected. The position P of the object is acquired from the pixel coordinates of Pa and Pb. On the basis of the acquired position P, the existence of an approaching object such as another vehicle approaching the vehicle concerned is detected and danger of contact with the approaching object is decided.
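The two-camera correspondence technique recovers the position P from the pixel coordinates of Pa and Pb. The passage does not specify the camera geometry, so the sketch below assumes a rectified, parallel stereo pair with baseline `baseline` and focal length `f` (both hypothetical values), where depth follows from the horizontal disparity.

```python
# Minimal stereo triangulation sketch for a rectified, parallel camera
# pair: the depth z follows from the disparity between Pa and Pb, and the
# lateral/vertical offsets follow from back-projecting Pa.

def stereo_position(pa, pb, f=700.0, baseline=0.5):
    """pa, pb: (x, y) pixel coordinates in the left/right images."""
    disparity = pa[0] - pb[0]
    if disparity <= 0:
        raise ValueError("non-positive disparity")
    z = f * baseline / disparity  # depth along the optical axis
    x = pa[0] * z / f             # lateral offset
    y = pa[1] * z / f             # vertical offset
    return x, y, z
```

A shrinking depth z over successive frames would then indicate an approaching object, which is the basis of the danger decision described above.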
- As an alternative technique, cameras are installed to be oriented toward both the left-side and right-side directions in the front or rear portion of the vehicle, and the images in both directions acquired by the cameras are processed in a manner similar to the above rear-side direction image. When the vehicle concerned approaches a side road intersecting its lane, such as when approaching an intersection or exiting from a garage facing a road, the existence of an approaching object such as a man or another vehicle approaching from the side road is detected and danger of collision of the vehicle concerned with the approaching object is decided.
- In the environment monitoring apparatus for a vehicle, in order to recognize an approaching object in the environment, the camera(s) may be located at various points other than the points where the rear-side direction image or side direction images can be acquired.
- However, the above prior art environment monitoring apparatus individually makes image processing according to the installed position of the camera to detect the approaching object. Specifically, where both of rear-side monitoring and side monitoring are intended simultaneously, two CPUs therefor are prepared separately. The one is intended for processing the rear-side direction image to detect the approaching object from the rear-side direction. The other is intended for processing the side direction image to detect the approaching object from the side direction. Where extension of the monitoring range is intended by provision of cameras in various directions, plural CPUs for image processing must be prepared. This is problematic from the standpoint of cost.
- In order to solve the above problem, it can be proposed to make image processing for plural images sequentially using a single CPU. However, this lengthens the time required for image processing, thereby deteriorating the accuracy of detecting the approaching object. Thus, it is not possible to detect danger of contact with an environmental approaching object.
- A first object of this invention is to provide a danger deciding apparatus for a motor vehicle which can decide danger of contact with an environmental approaching object inexpensively and accurately by selecting an image of the region which must be monitored from a plurality of images in view of a road structure and processing the selected image to detect the approaching object.
- A second object of this invention is to provide an environment monitoring apparatus for a motor vehicle which can monitor the environment accurately and inexpensively.
- In order to attain the first object, in accordance with this invention, as shown in FIG. 1, there is provided a danger deciding apparatus for a vehicle comprising:
- information collecting means 22a-1 for collecting information which changes according to a road structure around a vehicle concerned;
- selecting means 22a-2 for selecting an image of the region which must be monitored from a plurality of images on the basis of the collected information when they are received, the plurality of images being obtained by picking up different monitoring regions;
- detecting means 22a-3 for processing the selected image to detect the approaching degree of an approaching object; and
- deciding means 22a-4 for deciding danger of contact with the approaching object on the basis of the approaching degree.
- In this configuration, noting that a region which must be monitored should be determined according to the road structure around the vehicle concerned, the image selected from a plurality of images on the basis of the information which changes according to the road structure is processed. In this case, the approaching degree detecting means 22a-3, which is adapted to process a plurality of images, processes only the image of the region which must be monitored so that the accuracy of detecting the approaching object is not deteriorated.
- Preferably, the plurality of images include a rear-side direction image obtained when a rear region is picked up and left/right-side direction images obtained when left/right-direction regions are picked up. For example, where the road structure has an intersection difficult to see ahead of the vehicle concerned, the left/right-side direction regions must be monitored. Where the road structure is a speedway, the rear-side direction region must be monitored. Noting this, the image selected from the rear-side direction image and left/right-side direction images on the basis of the information which changes according to the road structure is processed. In this way, the approaching degree detecting means 22a-3, which is adapted to process a rear-side direction image and left/right-side direction images, processes only the image of the region which must be monitored so that the accuracy of detecting the approaching object is not deteriorated.
- Preferably, the danger deciding apparatus for a vehicle comprises a speed sensor 23 for detecting a speed of the vehicle concerned, and the information collecting means collects the speed as the above information. In this configuration, the information collecting means 22 a-1 can use the speed of the vehicle concerned from the speed sensor as the required information.
- Preferably, the plurality of images include an image which is received only when the road structure around the vehicle concerned becomes a prescribed structure, and the information collecting means collects the received image as the required information. In this configuration, the information collecting means can use such an image as the required information.
- Preferably, the danger deciding apparatus for a vehicle further comprises map storage means 25 c for storing map information having road information including the road structure; information acquiring means 25 b for acquiring information indicative of the present position of the vehicle concerned; and reading means 22 a-5 for reading the road information around the present position of the vehicle concerned from the map storage means on the basis of the information acquired by the information acquiring means. The information collecting means collects the road information read by the reading means as the information. In this configuration, the information collecting means 22 a-1 collects the road information including the road structure around the present position read from the map storage means 25 c by the reading means 22 a-5. Therefore, the map storage means 25 c and information acquiring means 25 b of the car navigation system installed in the vehicle concerned can be used to collect the necessary information. In addition, the road information including the road structure can be collected as the required information.
- Preferably, the selecting means always selects the images of the regions which must be monitored simultaneously, or a single image, when it is received. In this configuration, the left/right-side direction images which must be processed simultaneously, or the single image which is not selected according to the road structure, can be processed all the time.
- In order to attain the second object, in accordance with this invention, there is provided an environment monitoring apparatus for a vehicle comprising:
- information collecting means for acquiring information which changes according to a road structure around a vehicle concerned;
- pick-up means for picking up different monitoring regions to acquire a plurality of images;
- selecting means for selecting an image of the region which must be monitored from a plurality of images on the basis of the collected information when they are received;
- detecting means for processing the selected image to detect the approaching degree of an approaching object; and
- deciding means for deciding danger of contact with the approaching object on the basis of the approaching degree.
- In this configuration, noting that a region which must be monitored should be determined according to the road structure around the vehicle concerned, the image selected from a plurality of images on the basis of the information which changes according to the road structure is processed. In this case, the approaching degree detecting means 22 a-3, which is adapted to process a plurality of images, processes only the image of the region which must be monitored so that the accuracy of detecting the approaching object is not deteriorated.
- The above and other objects and features of this invention will be more apparent from the following description taken in conjunction with the accompanying drawings.
- FIG. 1 is a block diagram showing the basic configuration of a danger deciding apparatus for a motor vehicle and an environment monitoring apparatus for a motor vehicle according to this invention;
- FIG. 2 is a block diagram showing a first embodiment of an environment monitoring apparatus equipped with the danger deciding apparatus according to this invention;
- FIGS. 3A and 3B are views for explaining the positions where a rear-side direction camera, a right-side direction camera and a left-side direction camera are attached;
- FIG. 4 is a flowchart for explaining the processing procedure of a CPU shown in FIG. 1;
- FIG. 5 is a flowchart for explaining collection/selection of information according to the first embodiment;
- FIGS. 6A and 6B are views for explaining the processing of detection of an optical flow in FIG. 4;
- FIG. 7 is a flowchart for explaining collection/selection of information according to the third embodiment;
- FIGS. 8A to 8D are views for explaining a change in the road image in a rear-side direction acquired by a camera.
- Embodiment 1
- An explanation will be given of the first embodiment of this invention.
- FIG. 2 shows an environment monitoring apparatus for a motor vehicle equipped with a danger deciding apparatus for a motor vehicle according to this invention. As seen from FIG. 1, the environment monitoring apparatus includes an image pick-up section 1 for picking up different monitoring regions to acquire a plurality of images; and a danger deciding apparatus for a motor vehicle for detecting the approaching degree of an approaching object such as an approaching vehicle residing within a monitoring region picked up by the image pick-up section 1.
- As shown in FIGS. 2 and 3, at the front of a vehicle, for example, the image pick-up means 1 includes a right-side direction camera 11R which is attached to the right-side position of the front and picks up the right-side direction region of the vehicle to acquire a right-side direction image, a right image plane 12R on which the right-side direction image is imaged, a left-side direction camera 11L which is attached to the left-side position of the front and picks up the left-side direction region of the vehicle to acquire a left-side direction image, and a left image plane 12L on which the left-side direction image is imaged. Therefore, as shown in FIG. 3A, when the vehicle concerned is located at a position A before the approaching position of an intersection, the left- and right-side direction cameras 11L and 11R can monitor the side direction road which intersects the lane concerned on which the vehicle concerned runs.
- The image pick-up means 1 further includes a rear-side direction camera 13 which is attached to the top or rear of a rear trunk of the vehicle toward the rear of the vehicle and picks up the rear-side direction region to acquire a rear-side direction image, and a rear image plane 14 on which the rear-side direction image is imaged. Therefore, as seen from FIG. 3B, the rear-side direction camera 13 can monitor the lane concerned behind the vehicle concerned and lanes adjacent to the vehicle concerned.
- The danger deciding apparatus for a vehicle includes a storage unit 21 for storing the image acquired from the image pick-up unit 1, a microcomputer A22 (hereinafter referred to as “μCOM A22”) which performs the processing based on the image acquired by the image pick-up unit 1 and the processing of deciding danger of contact with another vehicle, a speed sensor 23 for detecting the speed of the vehicle concerned to supply speed information to the μCOM A22, and a warning unit 24 which gives a warning when it is decided by the μCOM A22 that there is danger of contact with another vehicle.
- The storage unit 21 includes a first frame memory 21 a, a second frame memory 21 b and an optical flow memory 21 c. The first and second frame memories 21 a and 21 b temporarily store pixel data with m rows and n columns of e.g. 512×512 pixels and luminance of 0-255 levels, which have been converted from the image acquired by the image pick-up unit 1, and supply them to the μCOM A22.
- These first and second frame memories 21 a and 21 b successively store the images in such a fashion that the images are stored in the first frame memory 21 a at timing t, in the second frame memory 21 b at timing t+Δt, again in the first frame memory 21 a at timing t+2Δt, and so on.
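The alternating use of the two frame memories can be sketched as follows. This is an assumed illustration, not code from the patent; the class and method names are inventions of this sketch, and only the t, t+Δt, t+2Δt alternation comes from the text.

```python
# Sketch (assumed, not from the patent): double-buffering between the
# first and second frame memories 21a/21b so that the two most recent
# images, taken a prescribed interval apart, are always available.

class FrameMemories:
    """Alternates storage between two frame buffers, as described for
    the first and second frame memories 21 a and 21 b."""

    def __init__(self):
        self.buffers = [None, None]  # first / second frame memory
        self.count = 0

    def store(self, image):
        # timing t -> buffer 0, t+Δt -> buffer 1, t+2Δt -> buffer 0, ...
        self.buffers[self.count % 2] = image
        self.count += 1

    def latest_pair(self):
        """Return (earlier, later) images once both buffers are filled."""
        if self.count < 2:
            return None
        later = self.buffers[(self.count - 1) % 2]
        earlier = self.buffers[self.count % 2]
        return earlier, later
```

The pair returned by `latest_pair` corresponds to the two images, Δt apart, that the optical flow detection below compares.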
- The optical flow, which is a movement quantity of an approaching object between two images acquired a prescribed time Δt apart from each other, is temporarily stored in the optical flow memory 21 c.
- The μCOM A22 includes a CPU 22 a for performing various computations according to a prescribed control program, a ROM 22 b for storing the control program and prescribed values, and a RAM 22 c for temporarily storing data necessary for performing the computations.
- The CPU 22 a performs the processing of collecting information which changes according to the road structure around the vehicle concerned (as information collecting means); the processing of selecting, on the basis of the collected information, an image acquired by picking up a region which must be monitored from the plurality of images when a rear-side direction image and left-side and right-side direction images are received (as selecting means); the processing of detecting, as an optical flow, a movement quantity of an approaching object in the selected images apart by a prescribed time from each other (as means for detecting an approaching degree); and the processing of deciding danger of contact with an approaching vehicle on the basis of the magnitude or position of the optical flow (as danger deciding means).
- A speed sensor 23 has a running sensor (not shown) which generates a pulse signal whenever the vehicle runs by a unit distance, and detects the speed information on the basis of the pulse signal from the running sensor. The speed information changes according to the road structure around the vehicle concerned. This is because the vehicle speed falls, e.g., when there is an intersection difficult to see ahead of the vehicle concerned or a stopping line ahead of the vehicle, or when the width of the lane concerned is narrow, and rises, e.g., when the lane concerned is a speedway.
- The warning unit 24 includes a display 24 a and a speaker 24 b. The display 24 a informs the driver of danger in such a manner that the image acquired by the image pick-up unit 1 is displayed, or a message is displayed, when the CPU 22 a within the μCOM A22 decides that there is danger of contact with another vehicle. The speaker 24 b informs the driver of danger by audio guidance or a warning sound.
- FIG. 4 is a flowchart showing the procedure of processing performed by the CPU 22 a. Now referring to the flowchart of FIG. 4, an explanation will be given of the operation of the environment monitoring apparatus for a vehicle.
- First, in response to “ON” of an ignition switch, the CPU 22 a proceeds to an initial step (not shown) where the left-side direction flag F1, right-side direction flag F2 and rear-side direction flag F3 are reset to 0, and thereafter proceeds to the next step S1.
- In step S1, the CPU 22 a decides whether or not both the rear-side direction image and left/right-side direction image have been received. If only one of the rear-side direction image and the left/right-side direction image has been received (“NO” in step S1), the CPU 22 a sets the flag corresponding to the received image at ‘1’ (step S2) and proceeds to step S4. Namely, if only the left/right-side direction images are received, the CPU 22 a sets the flag F1 for selecting the left-side direction image at ‘1’, and if only the rear-side direction image is received, the CPU 22 a sets the flag F3 for selecting the rear-side direction image at ‘1’.
- In these steps S1 and S2, when only the rear-side direction image is received, i.e. only one image is received, or when only the left/right-side direction images are received, i.e. two or more images are received but they are only the images acquired by picking up the regions which must be monitored simultaneously, the input image(s) can be monitored all the time irrespective of the road structure.
- On the other hand, if both images are received (“YES” in step S1), the CPU 22 a performs the processing of collecting information which changes according to the road structure around the vehicle concerned and selecting one of the rear-side direction image and the left/right-side direction images on the basis of the collected information (step S3). Referring to the flowchart of FIG. 5, an explanation will be given of the details of the processing of information collection and selection. First, the CPU 22 a collects speed information from the speed sensor 23 (step S301), and decides whether or not the speed of the vehicle concerned is not higher than e.g. 3 km/h (step S302).
- If the vehicle speed is not higher than 3 km/h (“YES” in step S302), the CPU 22 a decides that the road structure has an intersection difficult to see ahead of the vehicle concerned, a stopping line ahead of the vehicle, and/or a narrow lane concerned. Where the environment around the vehicle concerned has such a road structure, it is necessary to detect an approaching object such as another vehicle which approaches from a side road intersecting the lane concerned. Therefore, the CPU 22 a sets the left-side direction flag F1 at ‘1’ to select the left-side direction image (step S303) and proceeds to step S4.
- On the other hand, if the vehicle speed is not lower than 60 km/h (“NO” in step S302 and “YES” in step S304), the CPU 22 a decides that the road structure is a speedway or a road of high priority. Where the environment around the vehicle concerned has such a road structure, it is necessary to detect an approaching object such as another vehicle which approaches from the rear of the vehicle concerned. Therefore, the CPU 22 a sets the rear-side direction flag F3 at ‘1’ to select the rear-side direction image (step S305) and proceeds to step S4. If the vehicle speed is higher than 3 km/h but lower than 60 km/h, the CPU 22 a decides that the road structure around the vehicle concerned includes neither the left/right-side direction monitoring region nor the rear-side direction monitoring region, and returns to step S11.
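The speed-based selection of steps S301-S305 can be summarized in a short sketch. The 3 km/h and 60 km/h thresholds come from the text; the function name and the flag labels are assumptions of this illustration.

```python
# Illustrative sketch of steps S301-S305 (names assumed; thresholds from
# the text). Returns the flag to set, or None when neither region is
# selected and processing returns for the next speed reading.

def select_image_by_speed(speed_kmh):
    """Choose the monitoring image from the vehicle speed."""
    if speed_kmh <= 3:
        # Low speed suggests an intersection difficult to see, a stopping
        # line, or a narrow lane ahead: monitor the side directions
        # (left-side image first, flag F1).
        return "F1_left"
    if speed_kmh >= 60:
        # High speed suggests a speedway or road of high priority:
        # monitor the rear-side direction (flag F3).
        return "F3_rear"
    # Between the thresholds neither monitoring region is selected.
    return None
```

Note that the left-side image is selected first; the right-side image follows automatically via flag F2 in steps S10 and S11.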
- In step S4 in FIG. 4, CPU 22 a takes in the image (hereinafter referred to as “selected image”) corresponding to the flag with ‘1’ among the left-side direction flag F1, right-side direction flag F2 and rear-side direction flag F3, and stores it in the first frame memory 21 a (step S5). After a prescribed time Δt, CPU 22 a takes in the selected image and stores it in the second frame memory 21 b.
- The CPU 22 a performs the processing of detecting an optical flow indicative of the movement quantity of the approaching object in the selected images apart by the prescribed time, and stores the detected optical flow in the optical flow memory 21 c (step S6).
- An explanation will be given of the details of the processing of detecting an optical flow.
- As described in connection with the prior art, the approaching object moves in a diverging direction of the optical flow from the point where the road and other features vanish from the selected image, i.e. the FOE (focus of expansion). Noting this fact, referring to FIGS. 6A and 6B, the procedure of detecting the optical flow will be explained.
- First, on the selected image picked up at timing t, a slender window W1 is set around a certain characteristic point P in a radial direction of the FOE set as described above (i.e. in the direction connecting the FOE to the characteristic point P) (FIG. 6A). Subsequently, on the selected image at timing t+Δt acquired from the second frame memory 21 b, while a window W2 corresponding to the window W1 is shifted point by point in the radial direction from the FOE, its correlated value with the window W1 is computed (FIG. 6B). The point Q of the window W2 where the correlated value is the maximum is taken as the point corresponding to the point P, i.e. the point on the same object. The movement PQ is detected as the optical flow which represents the movement quantity of the approaching object. The detected optical flow is stored in the optical flow memory 21 c.
- Incidentally, the characteristic point may be a pixel having a prescribed luminance difference from its adjacent pixel. The FOE may be the crossing point of the extended lines of the white lines located on both sides of the road picked up in the selected image.
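A minimal sketch of this window-matching step follows. It is an assumed implementation, not code from the patent: negative sum-of-squared-differences stands in for the correlated value, all names are illustrative, and it assumes P and the search path stay inside the image.

```python
# Assumed sketch of the radial window search: match the window W1 around
# feature point P (time t) against windows W2 shifted point by point along
# the radial direction from the FOE (time t+Δt); the best match Q gives
# the optical flow PQ.

import numpy as np

def match_along_radial(img_t, img_t_dt, p, foe, win=5, max_shift=20):
    """Return (Q, flow vector PQ) found by searching outward from the FOE."""
    direction = np.array(p, float) - np.array(foe, float)
    direction /= np.linalg.norm(direction)  # unit vector FOE -> P

    def window(img, center):
        r, c = int(round(center[0])), int(round(center[1]))
        return img[r - win:r + win + 1, c - win:c + win + 1]

    w1 = window(img_t, p).astype(float)
    best_q, best_score = np.array(p, float), -np.inf
    for s in range(max_shift + 1):
        q = np.array(p, float) + s * direction
        w2 = window(img_t_dt, q)
        if w2.shape != w1.shape:
            break  # window W2 has left the image
        # Negative SSD as the correlation value: larger means a better match.
        score = -np.sum((w1 - w2.astype(float)) ** 2)
        if score > best_score:
            best_score, best_q = score, q
    return tuple(best_q), best_q - np.array(p, float)
```

Because an approaching object diverges from the FOE, searching only outward along the radial direction keeps the correlation search one-dimensional and cheap.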
- The CPU 22 a performs the processing of deciding danger of contact with another vehicle on the basis of the size of the optical flow stored in the optical flow memory 21 c (step S7).
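The decision of step S7 can be sketched as a plain magnitude check. The patent only states that the magnitude or position of the optical flow is used; the threshold value and the function name here are assumptions for illustration.

```python
# Illustrative sketch of step S7 (threshold and names assumed): a large
# diverging flow means a rapidly approaching object, hence danger.

def is_danger(flows, threshold=8.0):
    """Decide danger of contact from a list of (dy, dx) optical flows."""
    return any((fy * fy + fx * fx) ** 0.5 > threshold for fy, fx in flows)
```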
- If it is decided that there is danger of contact (“YES” in step S8), in order to inform the driver of this fact, the CPU 22 a performs the warning processing of issuing a warning sound signal and/or a warning image signal to the warning unit 24 (step S9), and proceeds to step S10. In response to the warning signal, a warning indication “there is an approaching vehicle” is displayed on the display 24 a, or a warning guidance “there is an approaching vehicle” is issued by sound from the speaker 24 b.
- On the other hand, if it is decided in the danger deciding processing that there is no danger of contact with another vehicle (“NO” in step S8), the CPU 22 a directly proceeds to step S10. In step S10, it is decided whether or not the left-side direction flag F1 is 1. If the left-side direction flag F1 has been set at 1 in the processing of step S2 or S3 (“YES” in step S10), in order to monitor the right-side direction region, the CPU 22 a sets the left-side direction flag F1 at 0 and the right-side direction flag F2 at 1, and returns to step S2 (step S11). In this way, after the processing has been made on the left-side direction image, the processing will automatically be made on the right-side direction image.
- On the other hand, if the rear-side direction flag F3 is 1 or the right-direction flag F2 is 1 after completion of the left/right monitoring by the processing of step S2 or step S3 (“NO” in step S10), CPU 22 a resets the left/right-side direction flags F1, F2 and rear-side direction flag F3 (step S12) and returns to step S1.
- As described above, the processing is made on the image selected from the rear-side direction image and the left/right-side direction images on the basis of the speed information which changes according to the road structure. Therefore, where the CPU 22 a is adapted to process both the rear-side direction image and the left/right-side direction images, the processing is made only on the image acquired by picking up the region which must be monitored. Thus, it is possible to decide the danger of contact with an approaching object in the environment of the vehicle concerned inexpensively and accurately without deteriorating the detecting accuracy of the object.
- Since the speed information from the speed sensor 23 mounted on the vehicle is used as the information which changes according to the road structure, there is no need of providing additional means for producing the necessary information. This contributes to cost reduction of the monitoring apparatus.
- Embodiment 2
- In the first embodiment, the speed information of the vehicle was collected as the information which changes according to the road structure. Meanwhile, in a main conventional monitoring apparatus, the side-direction monitoring has been made while the driver is seeing the left/right-side direction image displayed on the display 24 a.
- In some monitoring apparatus, when the vehicle speed becomes 3 km/h or lower, on the basis of the decision that the road structure around the vehicle concerned has a prescribed structure, e.g. there is an intersection difficult to see ahead of the vehicle concerned, the left/right-side direction image pick-up cameras 11 are automatically turned on to acquire the left/right-side direction images and display them on the display 24 a. Therefore, for the purpose of image selection, the acquired information on the left/right-side direction images from the monitoring apparatus can be used as the information which changes according to the road structure in the environment of the vehicle concerned. In this case also, there is no need of providing additional means for producing the necessary information. This contributes to cost reduction of the monitoring apparatus.
- Embodiment 3
- In place of the speed information, the information taken from a car navigation system 25 which is mounted in the vehicle can be used for the purpose of image selection. An explanation will be given of an environment monitoring apparatus for a vehicle equipped with a danger deciding apparatus using the car navigation system.
- As shown in FIG. 2, the car navigation system 25 includes a GPS receiver 25 b (information acquiring means) for receiving the present position information representative of the present position of the vehicle on the earth through a GPS antenna 25 a, a map data base 25 c in which the map information having road information containing a road structure is stored, and a μCOM B 25 d for computing the passage of the vehicle concerned to a destination. The μCOM A22 receives the present position information from the GPS receiver 25 b, the map information from the map data base 25 c, and the passage information from the μCOM B 25 d.
- The environment monitoring apparatus according to the third embodiment operates in the same manner as the first embodiment except for the processing of information collection/selection. Now referring to the flowchart of FIG. 7 showing the information collection/selection by the CPU 22 a, an explanation will be given of the operation of the third embodiment.
- First, the CPU 22 a acquires the present position information from the GPS receiver 25 b (step S306), and reads, from the map data base 25 c, the environmental road information of the vehicle concerned corresponding to the present position information thus acquired (step S307). In step S307, the CPU 22 a operates as a reading means.
- On the basis of the road information thus acquired and the passage information from the μCOM B 25 d, it is decided in step S308 whether or not the environment of the vehicle concerned has a road structure for which the side direction region must be monitored (e.g. there is an intersection difficult to see or a temporary stopping line ahead of the vehicle concerned, the width of the road on which the vehicle concerned is running is narrow, or the present position is an exit from a car park or a facility to a road). If YES, in order to select the left/right-side direction images, the left-side direction flag F1 is first set at 1 so that the left-side direction image is selected (step S309).
- If NO in step S308, it is decided that the rear-side direction must be monitored. In this case, the rear-side direction image flag F3 is set at 1 to select the rear-side direction image (step S310).
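The map-based branch of steps S306-S310 can be sketched as a lookup over road-structure attributes. The structure names below are illustrative labels for the examples listed in the text, not fields of any real map database.

```python
# Illustrative sketch of steps S308-S310 (structure names assumed): the
# road structures that require side-direction monitoring come from the
# examples in the text; anything else selects rear-side monitoring.

SIDE_MONITOR_STRUCTURES = {
    "blind_intersection",  # intersection difficult to see ahead
    "stop_line",           # temporary stopping line ahead
    "narrow_road",         # narrow width of the road being travelled
    "car_park_exit",       # exit from a car park or facility to a road
}

def select_image_by_road_info(road_structures):
    """Select the left-side image (flag F1) when any listed side-monitoring
    structure is present, otherwise the rear-side image (flag F3)."""
    if any(s in SIDE_MONITOR_STRUCTURES for s in road_structures):
        return "F1_left"   # step S309: left first, right follows via F2
    return "F3_rear"       # step S310
```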
- As described above, since the environmental road information of the vehicle concerned acquired from the map data base 25 c is collected as the information which changes according to the road structure, the map data base 25 c and GPS receiver 25 b of the car navigation apparatus can be used to acquire the information. Therefore, there is no need of providing means for producing the information corresponding to the road structure. This simplifies the structure of the environment monitoring apparatus and further reduces the production cost thereof. Further, since the road information including the road structure is acquired, it is possible to decide accurately the danger of contact with an approaching object.
- Where the μCOM B 25 d reads the road information of the present position from the map data base 25 c and supplies the road information thus read, unlike in the third embodiment, the CPU 22 a does not read the road information; the μCOM B 25 d has only to acquire the road information.
- In the first to the third embodiments, the CPU 22 a operates as the means for detecting the approaching degree by way of detection of the optical flow. However, the CPU 22 a can also operate as the means for detecting the approaching degree by way of detection of the distance from the object using two cameras apart from each other by a prescribed distance in a stereoscopic system.
Claims (7)
1. A danger deciding apparatus for a vehicle comprising:
information collecting means for acquiring information which changes according to a road structure around a vehicle concerned;
selecting means for selecting an image of the region which must be monitored from a plurality of images on the basis of the collected information when they are received, the plurality of images being obtained by picking up different monitoring regions;
detecting means for processing the selected image to detect the approaching degree of an approaching object; and
deciding means for deciding danger of contact with the approaching object on the basis of the approaching degree.
2. A danger deciding apparatus for a vehicle according to claim 1, wherein the plurality of images include a rear-side direction image obtained when a rear region is picked up and left/right-side direction images obtained when left/right-direction regions are obtained.
3. A danger deciding apparatus for a vehicle according to claim 1, further comprising a speed sensor for detecting a speed of a vehicle concerned, wherein the information collecting means collects the speed as the above information.
4. A danger deciding apparatus for a vehicle according to claim 1, wherein the plurality of images includes an image which is received only when the road structure around the vehicle concerned becomes a prescribed structure, and the information collecting means collects the received image as the information.
5. A danger deciding apparatus for a vehicle according to claim 1, further comprising:
map storage means for storing map information having road information including the road structure;
information acquiring means for acquiring information indicative of the present position of the vehicle concerned; and
reading means for reading the road information around the present position of the vehicle concerned from the map storage means on the basis of the information acquired by the information acquiring means, wherein the information collecting means collects the road information read by the reading means as the information.
6. A danger deciding apparatus for a vehicle according to claim 1, wherein the selecting means always selects the images of the regions which must be monitored simultaneously or a single image when they or it is received.
7. An environment monitoring apparatus for a vehicle comprising:
information collecting means for acquiring information which changes according to a road structure around a vehicle concerned;
pick-up means for picking up different monitoring regions to acquire a plurality of images;
selecting means for selecting an image of the region which must be monitored from a plurality of images on the basis of the collected information when they are received;
detecting means for processing the selected image to detect the approaching degree of an approaching object; and
deciding means for deciding danger of contact with the approaching object on the basis of the approaching degree.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP12-044026 | 2000-02-22 | ||
JP2000044026A JP2001233150A (en) | 2000-02-22 | 2000-02-22 | Danger judging device for vehicle and periphery monitoring device for vehicle |
JP2000-44026 | 2000-02-22 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20010016797A1 true US20010016797A1 (en) | 2001-08-23 |
US6330511B2 US6330511B2 (en) | 2001-12-11 |
Family
ID=18566855
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/785,426 Expired - Fee Related US6330511B2 (en) | 2000-02-22 | 2001-02-20 | Danger deciding apparatus for motor vehicle and environment monitoring apparatus therefor |
Country Status (3)
Country | Link |
---|---|
US (1) | US6330511B2 (en) |
JP (1) | JP2001233150A (en) |
DE (1) | DE10108646A1 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030014162A1 (en) * | 2001-07-13 | 2003-01-16 | Nissan Motor Co., Ltd. | Lane-keep control system for vehicle |
US20030022681A1 (en) * | 2001-07-24 | 2003-01-30 | Ruppel Christopher D. | Adaptive dynamic technique for multi-path signal processing |
EP1362742A1 (en) * | 2002-05-17 | 2003-11-19 | Pioneer Corporation | Image pickup apparatus and method of controlling the apparatus |
US20050004743A1 (en) * | 2001-12-07 | 2005-01-06 | Hitachi, Ltd. | Vehicle running control apparatus and map information data recording medium |
US20050043879A1 (en) * | 2001-12-05 | 2005-02-24 | Jens Desens | System for automatically monitoring a motor vehicle |
EP1705623A1 (en) * | 2005-03-23 | 2006-09-27 | Aisin AW Co., Ltd. | Visual recognition system for vehicles |
US20080114513A1 (en) * | 2002-06-13 | 2008-05-15 | Oshkosh Truck Corporation | Steering control system and method |
US7711460B2 (en) | 2001-01-31 | 2010-05-04 | Oshkosh Corporation | Control system and method for electric vehicle |
US7835838B2 (en) | 1999-07-30 | 2010-11-16 | Oshkosh Corporation | Concrete placement vehicle control system and method |
US7848857B2 (en) | 2001-01-31 | 2010-12-07 | Oshkosh Corporation | System and method for braking in an electric vehicle |
US8095247B2 (en) | 1999-07-30 | 2012-01-10 | Oshkosh Corporation | Turret envelope control system and method for a vehicle |
US20150146930A1 (en) * | 2012-06-29 | 2015-05-28 | Denso Corporation | Image analysis apparatus mounted to vehicle |
US20150191119A1 (en) * | 2012-07-20 | 2015-07-09 | Toyota Jidosha Kabushiki Kaisha | Vehicle periphery monitoring device and vehicle periphery monitoring system |
US20150234045A1 (en) * | 2014-02-20 | 2015-08-20 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US20180336426A1 (en) * | 2017-05-22 | 2018-11-22 | Toyota Jidosha Kabushiki Kaisha | Image processing system, image processing method, information processing device and recording medium |
US10431088B2 (en) * | 2014-09-24 | 2019-10-01 | Denso Corporation | Object detection apparatus |
US10696228B2 (en) * | 2016-03-09 | 2020-06-30 | JVC Kenwood Corporation | On-vehicle display control device, on-vehicle display system, on-vehicle display control method, and program |
Families Citing this family (87)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5910854A (en) | 1993-02-26 | 1999-06-08 | Donnelly Corporation | Electrochromic polymeric solid films, manufacturing electrochromic devices using such solid films, and processes for making such solid films and devices |
US5668663A (en) | 1994-05-05 | 1997-09-16 | Donnelly Corporation | Electrochromic mirrors and devices |
US6891563B2 (en) | 1996-05-22 | 2005-05-10 | Donnelly Corporation | Vehicular vision system |
US6326613B1 (en) | 1998-01-07 | 2001-12-04 | Donnelly Corporation | Vehicle interior mirror assembly adapted for containing a rain sensor |
US8294975B2 (en) | 1997-08-25 | 2012-10-23 | Donnelly Corporation | Automotive rearview mirror assembly |
US6124886A (en) | 1997-08-25 | 2000-09-26 | Donnelly Corporation | Modular rearview mirror assembly |
US6172613B1 (en) | 1998-02-18 | 2001-01-09 | Donnelly Corporation | Rearview mirror assembly incorporating vehicle information display |
US8288711B2 (en) | 1998-01-07 | 2012-10-16 | Donnelly Corporation | Interior rearview mirror system with forwardly-viewing camera and a control |
US6445287B1 (en) | 2000-02-28 | 2002-09-03 | Donnelly Corporation | Tire inflation assistance monitoring system |
US6477464B2 (en) | 2000-03-09 | 2002-11-05 | Donnelly Corporation | Complete mirror-based global-positioning system (GPS) navigation solution |
US6329925B1 (en) | 1999-11-24 | 2001-12-11 | Donnelly Corporation | Rearview mirror assembly with added feature modular display |
US6693517B2 (en) | 2000-04-21 | 2004-02-17 | Donnelly Corporation | Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants |
JP2001195699A (en) * | 2000-01-14 | 2001-07-19 | Yazaki Corp | Vehicle surroundings monitoring device and recording medium storing vehicle collision danger judgment processing program |
JP2001213254A (en) * | 2000-01-31 | 2001-08-07 | Yazaki Corp | Side monitoring device for vehicle |
US7167796B2 (en) | 2000-03-09 | 2007-01-23 | Donnelly Corporation | Vehicle navigation system for use with a telematics system |
US7370983B2 (en) | 2000-03-02 | 2008-05-13 | Donnelly Corporation | Interior mirror assembly with display |
AU2001243285A1 (en) | 2000-03-02 | 2001-09-12 | Donnelly Corporation | Video mirror systems incorporating an accessory module |
WO2007053710A2 (en) | 2005-11-01 | 2007-05-10 | Donnelly Corporation | Interior rearview mirror with display |
US7852462B2 (en) * | 2000-05-08 | 2010-12-14 | Automotive Technologies International, Inc. | Vehicular component control methods based on blind spot monitoring |
US7581859B2 (en) | 2005-09-14 | 2009-09-01 | Donnelly Corp. | Display device for exterior rearview mirror |
US7255451B2 (en) | 2002-09-20 | 2007-08-14 | Donnelly Corporation | Electro-optic mirror cell |
ES2287266T3 (en) | 2001-01-23 | 2007-12-16 | Donnelly Corporation | IMPROVED VEHICLE LIGHTING SYSTEM. |
US6918674B2 (en) | 2002-05-03 | 2005-07-19 | Donnelly Corporation | Vehicle rearview mirror system |
EP1504276B1 (en) | 2002-05-03 | 2012-08-08 | Donnelly Corporation | Object detection system for vehicle |
DE10223269B4 (en) * | 2002-05-24 | 2017-11-09 | Volkswagen Ag | Driver assistance system |
WO2003105099A1 (en) | 2002-06-06 | 2003-12-18 | Donnelly Corporation | Interior rearview mirror system with compass |
US7329013B2 (en) | 2002-06-06 | 2008-02-12 | Donnelly Corporation | Interior rearview mirror system with compass |
US7433889B1 (en) * | 2002-08-07 | 2008-10-07 | Navteq North America, Llc | Method and system for obtaining traffic sign data using navigation systems |
AU2003278863A1 (en) | 2002-09-20 | 2004-04-08 | Donnelly Corporation | Mirror reflective element assembly |
WO2004103772A2 (en) | 2003-05-19 | 2004-12-02 | Donnelly Corporation | Mirror assembly for vehicle |
US7310177B2 (en) | 2002-09-20 | 2007-12-18 | Donnelly Corporation | Electro-optic reflective element assembly |
DE10312249A1 (en) * | 2003-03-19 | 2004-09-30 | Ibeo Automobile Sensor Gmbh | Process for the joint processing of deep-resolution images and video images |
JP4244684B2 (en) * | 2003-04-10 | 2009-03-25 | 三菱自動車工業株式会社 | Vehicle monitoring device |
JP3879696B2 (en) * | 2003-04-25 | 2007-02-14 | 日産自動車株式会社 | Driving assistance device |
US20040217851A1 (en) * | 2003-04-29 | 2004-11-04 | Reinhart James W. | Obstacle detection and alerting system |
US7446924B2 (en) | 2003-10-02 | 2008-11-04 | Donnelly Corporation | Mirror reflective element assembly including electronic component |
US7308341B2 (en) | 2003-10-14 | 2007-12-11 | Donnelly Corporation | Vehicle communication system |
US7190282B2 (en) | 2004-03-26 | 2007-03-13 | Mitsubishi Jidosha Kogyo Kabushiki Kaisha | Nose-view monitoring apparatus |
JP4134939B2 (en) | 2004-04-22 | 2008-08-20 | 株式会社デンソー | Vehicle periphery display control device |
JP4088288B2 (en) * | 2004-10-21 | 2008-05-21 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
US7720580B2 (en) | 2004-12-23 | 2010-05-18 | Donnelly Corporation | Object detection system for vehicle |
JP2006197034A (en) * | 2005-01-11 | 2006-07-27 | Sumitomo Electric Ind Ltd | Image recognition system, imaging apparatus, and image recognition method |
EP1883855B1 (en) | 2005-05-16 | 2011-07-20 | Donnelly Corporation | Vehicle mirror assembly with indicia at reflective element |
JP4426535B2 (en) * | 2006-01-17 | 2010-03-03 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
MX2008011219A (en) | 2006-03-09 | 2008-09-11 | Gentex Corp | Vehicle rearview assembly including a high intensity display. |
DE102006026370B4 (en) * | 2006-06-07 | 2020-03-19 | Volkswagen Ag | Vehicle with a driver assistance system |
JP4613906B2 (en) | 2006-12-14 | 2011-01-19 | トヨタ自動車株式会社 | Vehicle periphery monitoring device |
US8154418B2 (en) | 2008-03-31 | 2012-04-10 | Magna Mirrors Of America, Inc. | Interior rearview mirror system |
TW201008812A (en) * | 2008-08-22 | 2010-03-01 | shi-xiong Li | Auxiliary video warning device for vehicle |
US9487144B2 (en) | 2008-10-16 | 2016-11-08 | Magna Mirrors Of America, Inc. | Interior mirror assembly with display |
EP2179892A1 (en) | 2008-10-24 | 2010-04-28 | Magna Electronics Europe GmbH & Co. KG | Method for automatic calibration of a virtual camera |
US8964032B2 (en) | 2009-01-30 | 2015-02-24 | Magna Electronics Inc. | Rear illumination system |
JP5453048B2 (en) * | 2009-10-22 | 2014-03-26 | 富士重工業株式会社 | Vehicle driving support control device |
EP2523831B1 (en) | 2010-01-13 | 2015-12-16 | Magna Electronics Inc. | Vehicular camera and method for periodic calibration of vehicular camera |
US8879139B2 (en) | 2012-04-24 | 2014-11-04 | Gentex Corporation | Display mirror assembly |
DE102012223131A1 (en) | 2012-12-13 | 2014-06-18 | BSH Bosch und Siemens Hausgeräte GmbH | Cold apparatus i.e. refrigerator, for household application for storing food products at certain temperature, has container located in extension box, where container at extension box is pivotally supported around pivotal axis |
US9598018B2 (en) | 2013-03-15 | 2017-03-21 | Gentex Corporation | Display mirror assembly |
CN103358996B (en) * | 2013-08-13 | 2015-04-29 | 吉林大学 | Automobile A pillar perspective vehicle-mounted display device |
US9575315B2 (en) | 2013-09-24 | 2017-02-21 | Gentex Corporation | Display mirror assembly |
US9511715B2 (en) | 2014-01-31 | 2016-12-06 | Gentex Corporation | Backlighting assembly for display for reducing cross-hatching |
EP3119643B1 (en) | 2014-03-21 | 2018-05-23 | Gentex Corporation | Tri-modal display mirror assembly |
CN106163873B (en) | 2014-04-01 | 2019-04-26 | 金泰克斯公司 | Automatic display mirror assembly |
US9694751B2 (en) | 2014-09-19 | 2017-07-04 | Gentex Corporation | Rearview assembly |
US9694752B2 (en) | 2014-11-07 | 2017-07-04 | Gentex Corporation | Full display mirror actuator |
US10071689B2 (en) | 2014-11-13 | 2018-09-11 | Gentex Corporation | Rearview mirror system with a display |
KR101997815B1 (en) | 2014-12-03 | 2019-07-08 | 젠텍스 코포레이션 | Display mirror assembly |
USD746744S1 (en) | 2014-12-05 | 2016-01-05 | Gentex Corporation | Rearview device |
US9744907B2 (en) | 2014-12-29 | 2017-08-29 | Gentex Corporation | Vehicle vision system having adjustable displayed field of view |
US9720278B2 (en) | 2015-01-22 | 2017-08-01 | Gentex Corporation | Low cost optical film stack |
EP3286038A4 (en) | 2015-04-20 | 2018-04-25 | Gentex Corporation | Rearview assembly with applique |
CN107614324B (en) | 2015-05-18 | 2020-11-27 | 金泰克斯公司 | Complete display rearview device |
EP3310618A4 (en) | 2015-06-22 | 2018-07-04 | Gentex Corporation | System and method for processing streamed video images to correct for flicker of amplitude-modulated lights |
US9598076B1 (en) | 2015-10-22 | 2017-03-21 | Ford Global Technologies, Llc | Detection of lane-splitting motorcycles |
USD797627S1 (en) | 2015-10-30 | 2017-09-19 | Gentex Corporation | Rearview mirror device |
CN108349436B (en) | 2015-10-30 | 2019-12-20 | 金泰克斯公司 | Rear-view device |
WO2017075420A1 (en) | 2015-10-30 | 2017-05-04 | Gentex Corporation | Toggle paddle |
USD798207S1 (en) | 2015-10-30 | 2017-09-26 | Gentex Corporation | Rearview mirror assembly |
USD800618S1 (en) | 2015-11-02 | 2017-10-24 | Gentex Corporation | Toggle paddle for a rear view device |
USD845851S1 (en) | 2016-03-31 | 2019-04-16 | Gentex Corporation | Rearview device |
USD817238S1 (en) | 2016-04-29 | 2018-05-08 | Gentex Corporation | Rearview device |
US10025138B2 (en) | 2016-06-06 | 2018-07-17 | Gentex Corporation | Illuminating display with light gathering structure |
US10300859B2 (en) | 2016-06-10 | 2019-05-28 | Magna Electronics Inc. | Multi-sensor interior mirror device with image adjustment |
USD809984S1 (en) | 2016-12-07 | 2018-02-13 | Gentex Corporation | Rearview assembly |
USD854473S1 (en) | 2016-12-16 | 2019-07-23 | Gentex Corporation | Rearview assembly |
US20180191966A1 (en) | 2016-12-30 | 2018-07-05 | Gentex Corporation | Full display mirror with on-demand spotter view |
US10735638B2 (en) | 2017-03-17 | 2020-08-04 | Gentex Corporation | Dual display reverse camera system |
DE102018206062A1 (en) * | 2018-04-20 | 2019-10-24 | Robert Bosch Gmbh | Method for controlling a display system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ATE199517T1 (en) * | 1995-12-12 | 2001-03-15 | Alusuisse Tech & Man Ag | METHOD FOR PRODUCING BLISTER PACKAGES |
US6025797A (en) * | 1997-07-22 | 2000-02-15 | Denso Corporation | Angular shift determining apparatus for determining angular shift of central axis of radar used in automotive obstacle detection system |
JP3930110B2 (en) * | 1997-08-11 | 2007-06-13 | 富士重工業株式会社 | Vehicle cruise control device |
US6115651A (en) * | 1998-01-15 | 2000-09-05 | Cruz; Diogenes J. | Large vehicle blindspot monitor |
JPH11353565A (en) * | 1998-06-09 | 1999-12-24 | Yazaki Corp | Method and device for alarm of collision for vehicle |
US6269308B1 (en) * | 1998-08-20 | 2001-07-31 | Honda Giken Kogyo Kabushiki Kaisha | Safety running system for vehicle |
US6226592B1 (en) * | 1999-03-22 | 2001-05-01 | Veridian Erim International, Inc. | Method and apparatus for prompting a motor vehicle operator to remain within a lane |
- 2000
  - 2000-02-22 JP JP2000044026A patent/JP2001233150A/en not_active Withdrawn
- 2001
  - 2001-02-20 US US09/785,426 patent/US6330511B2/en not_active Expired - Fee Related
  - 2001-02-22 DE DE10108646A patent/DE10108646A1/en not_active Ceased
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7835838B2 (en) | 1999-07-30 | 2010-11-16 | Oshkosh Corporation | Concrete placement vehicle control system and method |
US8095247B2 (en) | 1999-07-30 | 2012-01-10 | Oshkosh Corporation | Turret envelope control system and method for a vehicle |
US7848857B2 (en) | 2001-01-31 | 2010-12-07 | Oshkosh Corporation | System and method for braking in an electric vehicle |
US7711460B2 (en) | 2001-01-31 | 2010-05-04 | Oshkosh Corporation | Control system and method for electric vehicle |
US20030014162A1 (en) * | 2001-07-13 | 2003-01-16 | Nissan Motor Co., Ltd. | Lane-keep control system for vehicle |
US6853884B2 (en) | 2001-07-13 | 2005-02-08 | Nissan Motor Co., Ltd. | Lane-keep control system for vehicle |
EP1275573A3 (en) * | 2001-07-13 | 2004-02-04 | Nissan Motor Company, Limited | Lane-keep control system for vehicle |
US20030022681A1 (en) * | 2001-07-24 | 2003-01-30 | Ruppel Christopher D. | Adaptive dynamic technique for multi-path signal processing |
US20050043879A1 (en) * | 2001-12-05 | 2005-02-24 | Jens Desens | System for automatically monitoring a motor vehicle |
US7617037B2 (en) * | 2001-12-05 | 2009-11-10 | Daimler Ag | System for automatically monitoring a motor vehicle |
US20050004743A1 (en) * | 2001-12-07 | 2005-01-06 | Hitachi, Ltd. | Vehicle running control apparatus and map information data recording medium |
US7684921B2 (en) * | 2001-12-07 | 2010-03-23 | Hitachi, Ltd. | Vehicle running control apparatus and map information data recording medium |
US20030214576A1 (en) * | 2002-05-17 | 2003-11-20 | Pioneer Corporation | Image pickup apparatus and method of controlling the apparatus |
EP1362742A1 (en) * | 2002-05-17 | 2003-11-19 | Pioneer Corporation | Image pickup apparatus and method of controlling the apparatus |
US20080114513A1 (en) * | 2002-06-13 | 2008-05-15 | Oshkosh Truck Corporation | Steering control system and method |
US7756621B2 (en) | 2002-06-13 | 2010-07-13 | Oshkosh Corporation | Steering control system and method |
EP1705623A1 (en) * | 2005-03-23 | 2006-09-27 | Aisin AW Co., Ltd. | Visual recognition system for vehicles |
US20060215020A1 (en) * | 2005-03-23 | 2006-09-28 | Aisin Aw Co., Ltd. | Visual recognition apparatus, methods, and programs for vehicles |
US8130269B2 (en) | 2005-03-23 | 2012-03-06 | Aisin Aw Co., Ltd. | Visual recognition apparatus, methods, and programs for vehicles |
US20150146930A1 (en) * | 2012-06-29 | 2015-05-28 | Denso Corporation | Image analysis apparatus mounted to vehicle |
US9330343B2 (en) * | 2012-06-29 | 2016-05-03 | Denso Corporation | Image analysis apparatus mounted to vehicle |
US20150191119A1 (en) * | 2012-07-20 | 2015-07-09 | Toyota Jidosha Kabushiki Kaisha | Vehicle periphery monitoring device and vehicle periphery monitoring system |
US10046701B2 (en) * | 2012-07-20 | 2018-08-14 | Toyota Jidosha Kabushiki Kaisha | Vehicle periphery monitoring device and vehicle periphery monitoring system |
US10556541B2 (en) * | 2012-07-20 | 2020-02-11 | Toyota Jidosha Kabushiki Kaisha | Vehicle periphery monitoring device and vehicle periphery monitoring system |
US20150234045A1 (en) * | 2014-02-20 | 2015-08-20 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US9664789B2 (en) * | 2014-02-20 | 2017-05-30 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US10690770B2 (en) * | 2014-02-20 | 2020-06-23 | Mobileye Vision Technologies Ltd | Navigation based on radar-cued visual imaging |
US10274598B2 (en) * | 2014-02-20 | 2019-04-30 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US20190235073A1 (en) * | 2014-02-20 | 2019-08-01 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US10431088B2 (en) * | 2014-09-24 | 2019-10-01 | Denso Corporation | Object detection apparatus |
US10696228B2 (en) * | 2016-03-09 | 2020-06-30 | JVC Kenwood Corporation | On-vehicle display control device, on-vehicle display system, on-vehicle display control method, and program |
CN108932470A (en) * | 2017-05-22 | 2018-12-04 | 丰田自动车株式会社 | Image processing system, image processing method, information processing unit and recording medium |
US20180336426A1 (en) * | 2017-05-22 | 2018-11-22 | Toyota Jidosha Kabushiki Kaisha | Image processing system, image processing method, information processing device and recording medium |
US10740629B2 (en) * | 2017-05-22 | 2020-08-11 | Toyota Jidosha Kabushiki Kaisha | Image processing system, image processing method, information processing device and recording medium |
Also Published As
Publication number | Publication date |
---|---|
US6330511B2 (en) | 2001-12-11 |
DE10108646A1 (en) | 2001-08-30 |
JP2001233150A (en) | 2001-08-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6330511B2 (en) | | Danger deciding apparatus for motor vehicle and environment monitoring apparatus therefor |
US6360170B1 (en) | | Rear monitoring system |
US5424952A (en) | | Vehicle-surroundings monitoring apparatus |
US8175331B2 (en) | | Vehicle surroundings monitoring apparatus, method, and program |
JP5022609B2 (en) | | Imaging environment recognition device |
CN102458964B (en) | | Camera system for use in vehicle parking |
JP2800531B2 (en) | | Obstacle detection device for vehicles |
US20010012982A1 (en) | | Side-monitoring apparatus for motor vehicle |
EP2993654B1 (en) | | Method and system for forward collision warning |
JP4321821B2 (en) | | Image recognition apparatus and image recognition method |
US6853908B2 (en) | | System and method for controlling an object detection system of a vehicle |
EP1033693A2 (en) | | Rear and side view monitor with camera for a vehicle |
JP2000295604A (en) | | Rear and side monitoring device for vehicle |
US10846546B2 (en) | | Traffic signal recognition device |
US6549124B1 (en) | | Environment monitoring system for a vehicle with an image pickup device |
JP2002314989A (en) | | Peripheral monitor for vehicle |
CN114492679B (en) | | Vehicle data processing method and device, electronic equipment and medium |
CN112124304B (en) | | Library position positioning method and device and vehicle-mounted equipment |
JP4479183B2 (en) | | Video presentation device |
JP5134608B2 (en) | | Vehicle periphery display device, vehicle periphery display method and program |
JP3916930B2 (en) | | Approach warning device |
CN116892949A (en) | | Ground object detection device, ground object detection method, and computer program for ground object detection |
JP7402753B2 (en) | | Safety support system and in-vehicle camera image analysis method |
JP2001180404A (en) | | Rear monitor for vehicle |
JP2000315255A (en) | | Back side direction monitoring device for vehicle and back side direction monitoring alarm device for vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: YAZAKI CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OGURA, HIROYUKI;ISHIKAWA, NAOTO;REEL/FRAME:011558/0476. Effective date: 20010202 |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 4 |
| REMI | Maintenance fee reminder mailed | |
| LAPS | Lapse for failure to pay maintenance fees | |
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| FP | Lapsed due to failure to pay maintenance fee | Effective date: 20091211 |