CA1116286A - Perimeter surveillance system - Google Patents
Info
- Publication number
- CA1116286A (application CA321,857A)
- Authority
- CA
- Canada
- Prior art keywords
- adder
- values
- array
- area
- cell
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19652—Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
Abstract
Abstract of the Disclosure Disclosed is an automatic video intrusion detection system. The system monitors a given area such as between two parallel fences. The image of the given area is divided into an array of cells each approximately equal in size to the image of a man. The video for each cell is integrated, digitized and stored in an array.
Changes in cell values are detected and tracked so that changes in, say, 3 adjacent cells in a given direction, indicate an intruder.
A microcomputer implements a tracking algorithm. Filtering discriminates against light level changes. The system is simpler than a known one which analyzes grey values of thousands of points in a video field.
Description
This invention relates to an automatic video intrusion detection system.
A video intrusion detection system enables a given area to be monitored to detect persons trying to enter a prohibited area, such as between two parallel spaced-apart fences. For example, it may be desired to detect persons trying to enter premises without authorization or to leave an area, such as a prison compound, without authorization. It will be understood that the term "intrusion"
includes escape attempts in which case the would-be escapee still "intrudes" a forbidden area.
Some existing video intrusion detection devices have proved inadequate in an outdoor environment. Apparently they are effective in a controlled indoor environment but when used outdoors suffer from a very high nuisance alarm rate. Such things as trees moving in the wind, birds, the shadows of birds, cloud shadows, blowing paper and even insects near the camera can at times trip the alarm.
In addition, considerable problems exist with lighting variation (both in time and space) and during rain and snow. Thus the problem
is very complex and no solution can be expected to be perfect.
United States Patent 3,988,533 of Mick et al, issued October 26, 1976, discloses a video motion and intrusion detection system which samples a large number of points in each video field, for example 16,384 points. The digitized value of the grey scale of each point is stored and then, on a subsequent field of the same type, new digitized values of the points are compared with the stored values from the previous scan. If the difference exceeds a predetermined limit, an alarm is generated. The digital information provides the
basis for an alarm map which may be displayed on a monitor. The system appears to be very complicated.
The present invention provides a simpler system which, instead of analyzing a huge number of "points", analyzes a relatively small number of "cells" which may be obtained by dividing a video frame in a grid pattern, e.g. 5 x 14 or 7 x 10 cells. Each cell is selected to be approximately equal in size to the area which a man's image would occupy. The video camera is located at one end of an area to be surveyed so that the image size of a man varies with distance from the camera; the cell sizes are similarly varied.
Two problems which can affect the operation of an automatic video intrusion detector are:
(1) achieving sufficient sensitivity to produce a high probability of detection, especially at night and during snow and rain;
(2) rejecting nuisance false alarms of the type discussed above.
The present invention goes a long way toward solving these problems.
The first problem is addressed by integrating the video signal in both time and space while the second problem is addressed by various means, such as:
(1) limiting the area of image processed, for example to that between two perimeter fences,
(2) filtering in time to ensure that only targets which move within a certain velocity region are detected,
(3) filtering in space to discriminate against targets
which are much smaller or much larger than humans,
(4) tracking of potential targets to discriminate against all targets which do not move in a direction from one fence to the other.
In some cases the area to be monitored may not be defined by actual fences but the principle can be the same; i.e., the area of image processed can be limited to that between two lines. In the preferred embodiment disclosed hereinafter the lines (fences) are parallel but it will be apparent to those skilled in the art that the lines could be non-parallel.
According to a broad aspect of the invention there is provided an automatic video intrusion detection system to detect targets comprising intruders moving more than a predetermined distance in a direction from one side to the other of an area to be monitored, said system comprising a television camera disposed to view a predetermined area which includes said area to be monitored, said television camera supplying a video image of said predetermined area to a preprocessor, said preprocessor having means for dividing into a predetermined number of cells a portion of said video image
substantially corresponding to said area to be monitored, means for deriving integrated video intensity values for said cells and means for storing said integrated intensity values in an output buffer, said system further comprising computer means for periodically analyzing the values stored in said buffer to detect changes in said values exceeding a predetermined amount and to track changes in values for consecutive cells indicating movement of a target, said computer activating an alarm if changes in more than a predetermined number of adjacent cells are tracked in said direction.
The invention will now be further discussed in conjunction with the accompanying drawings, in which:
Figure 1 is a diagram illustrating a video image of an area between two parallel fences to be monitored by the video detection system according to the invention, the image being divided into a plurality of regions or "cells".
Figure 2 illustrates an alternative way in which the area of the video image may be divided into regions.
Figure 3 is a block diagram of a video detection system according to the invention.
Figure 4 is a block diagram of a preprocessor according to the invention.
Figure 5 is a functional block diagram of the processes carried out according to the invention.
Figure 6 is a functional block diagram of the invention.
Figure 7 illustrates a tracking algorithm target array.
Figure 8 is a tracking algorithm according to the invention.
Referring to Figure 1, there is shown an area 10 between two fences 12 and 13 as seen by a television camera mounted at some distance above the area and some distance back from the near end. For example, the camera may be spaced about 52 feet back from the near end of the area 10, the area 10 being, for example, approximately 1,300 feet long. The video image is divided into a number of picture cells, each cell being of a size corresponding to the height of a man (6 feet) at the range occupied by the cell. Obviously, a man at the near end of the area 10 appears much larger than at the far end, and hence the cells of the image are bigger at the near end than at the far end. Figure 1 shows an array of 5 by 14 cells, purely as an example.
Other arrays, such as 7 by 10, may be used, as shown in Figure 2.
Figure 3 shows a system block diagram. The television camera 15, which scans an area 10 such as shown in Figure 1 or Figure 2, feeds a video signal to a preprocessor and AGC 16. Part of the composite video output of television camera 15 is fed back through an automatic light level compensation circuit 19 to adjust the television camera to changing light conditions. The remainder of the composite video signal feeds the preprocessor 16 which converts the video signal from the television camera into 70 digital words representing the average brightness in a 10 x 7 array of picture cells (assuming the cell array of Figure 2 is used). The information stored in the preprocessor 16 feeds a computer 18, preferably a microcomputer, via a suitable interface 17. The computer 18 analyzes the information from the preprocessor to detect an intruder.
Figure 4 is a block diagram of the preprocessor 16 of
Figure 3. An AGC amplifier 20 is used to match the video dynamic range to the digitized dynamic range. Two adders, 24 and 27, are used to perform integration, horizontally and vertically, respectively,
as will be explained. The video signal from AGC amplifier 20 is passed through a filter 21 and is converted into digital form in analog-to-digital (A/D) converter 22. The output of A/D converter 22 is fed via an AND gate 23 to one input of adder 24, gate 23 being enabled by an output of counter 25 on line 26. The output of counter 25 is controlled by clock 28 and ROM 29. ROM 29 stores the values of the horizontal cell boundaries and feeds these to counter 25 which compares them with the count of clock 28. When the clock value is between the values of the cell boundaries, counter 25 produces an output on its line 26 to enable gate 23. At the same time, it produces an output on line 30 to enable line memory 31.
The digitized information for each line of a cell from A/D converter 22 is added in adder 24 and fed back via line memory 31 to another input of adder 24. In this manner, the intensity values for each line of a cell are integrated. The integrated value is then fed to a second adder 27 which, in a manner to be described, adds up all the lines of a frame for each cell and feeds the resultant to frame memory 32.
Frame memory 32 is enabled by line counter 33 which compares a count of video horizontal sync pulses on line 34 with the vertical cell boundaries stored in ROM 35. Thus when line counter 33 counts a certain number of horizontal sync pulses equal to a number stored in ROM 35, the video scan has reached a vertical cell boundary and line counter 33 enables, via line 36, the frame memory 32 which feeds
its data to one input of adder 27. The other input of adder 27 is fed by the output of adder 24. As further horizontal sync pulses are counted by counter 33, a higher number will be reached equal to a number stored in ROM 35 indicating the end of one vertical cell boundary and the beginning of the next. Frame memory 32 switches to a new storage area for the next cell.
A frame counter 37 receives on its input line 38 vertical sync pulses and counts a number of these pulses equal to a predetermined number of frames, e.g. 4 to 6 frames, after which it causes frame memory 32 to be read out into an output buffer 39. Output buffer 39 stores the integrated intensity values of each cell of the array from which they may be read by the microcomputer 18 (Figure 3) for target detection and tracking.
The general purpose computer 18 (Figure 3) implements detection and tracking. Various algorithms can be used by the computer to perform various functions such as automatic gain control (of amplifier 20), digital filtering of cell data, detection, target tracking and alarm setting. These functions are repeated every preprocessor cycle, i.e. every 4th to 6th TV frame or every 133 to 200 ms.
The relative organization of these functions is illustrated in the flow diagram of Figure 5, which is self explanatory.
Automatic gain control is accomplished by averaging brightness and contrast over a period of time and using these values to set the offset and gain of the AGC amplifier 20 (Figure 5). Additional outputs for lens iris or camera control may be provided if required.
Constant contrast is preferably provided to compensate for differences between night and day, snow and bare ground, etc.
The data from the preprocessor 16 are filtered in time using a recursive digital low-pass filter to determine average brightness.
Choice of this parameter determines the minimum crossing speed at which targets will be detected. Too long a time constant results in an inability to adapt quickly to changing light conditions.
To detect the presence of a target, the preprocessor data for the current scan are compared with the filtered data; any significant changes are classified as targets. In order to discriminate against cloud shadows it is necessary to distinguish target size. To do this, the data are first examined to determine the number of significant changes in each row. If more than 3 targets are present it is assumed that lighting conditions have changed, and statistics are computed on the basis of all seven cells in the row. If, however, three or fewer cells have changed significantly then statistics are computed based only on those cells without significant changes. The statistics computed for each row are mean and standard deviation. In addition, the standard deviation is filtered in time to produce a sufficient statistic. Targets are then declared whenever the difference between a cell value and the mean for that row exceeds a set number times the standard deviation. This scheme has the following advantages:
Constant probability of false detection.
Discrimination against cloud shadows.
Detection of up to 3 targets in a single row (required to detect two or more people crossing together).
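This detection step can be illustrated for one row of the difference array with the Python sketch below. K and the pre-test threshold used for the initial change count are illustrative values the patent leaves to the designer, and the time-filtering of the standard deviation is omitted for brevity:

```python
# Hedged sketch of the row-statistics detection step.  K and PRETEST
# are assumed values, not figures from the disclosure.
import statistics

K = 3.0
PRETEST = 10.0   # assumed raw threshold for counting "significant" changes

def detect_row(diffs):
    """Return a 0/1 detection flag per cell for one row of the
    difference array, with statistics computed excluding target cells."""
    changed = [abs(d) > PRETEST for d in diffs]
    if sum(changed) > 3:
        # More than 3 "targets": assume lighting changed, use all cells.
        sample = diffs
    else:
        sample = [d for d, c in zip(diffs, changed) if not c] or diffs
    m = statistics.mean(sample)
    s = statistics.pstdev(sample) or 1.0   # guard against zero spread
    return [1 if d - m >= K * s else 0 for d in diffs]

print(detect_row([0.1, -0.2, 0.0, 95.0, 0.3, -0.1, 0.2]))
# [0, 0, 0, 1, 0, 0, 0]
```

Because the changed cell is excluded from the mean and standard deviation, a single large deviation stands out against the quiet cells in the same row, which is what gives the constant false-detection probability claimed above.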
Target tracking is used to discriminate against all targets that do not cross the boundaries or which cross the boundaries too quickly to be a human. To be declared a target, a person must be
tracked going across at least 3 columns. Note that a person can walk parallel to the fence without causing an alarm. Also objects which move too quickly across the field of view (such as birds, blowing paper, insects, etc.) will not be detected as they do not appear in adjacent cells. Thus tracking provides great immunity to nuisance false alarms.
Figure 6 is a functional configuration of a preprocessor 16 and microcomputer 18 for an embodiment in which the area to be surveyed is divided into 5 x 14 cells. This configuration averages the image over each of these zones to produce a 5 x 14 array It.
The number of picture elements averaged in each cell of It varies from about 6,000 at the near end to about 12 at the far end of the area being surveyed. Obviously, a much greater SNR (signal to noise ratio) will be available at close ranges than for long ranges with consequent effects on probability of detection and system (as opposed to nuisance) false alarm rates.
The integration function is performed by the preprocessor as discussed in connection with Figure 4. This may be accomplished either by digitizing the video directly and integrating digitally to produce a level for each cell or by gating sample and hold devices (one for each of the 70 elements) at the appropriate point in the horizontal scan and integrating by analogue means in the vertical direction. A low speed digitizer would then be used to digitize
the results.
It is considered adequate to sample the scene (It) between 8 and 5 times per second or every fourth to sixth television frame.
Depending on system requirements it may be desirable to integrate
over the four to six frames to provide a gain in SNR or it may be possible to time share the preprocessor among several different cameras.
(A typical installation would use eight cameras, two for each side of a perimeter to be guarded). Using this sampling rate effectively discriminates against fast moving targets. For example, a target moving faster than 1 cell per sample will skip at least 1 cell and not be tracked. One cell may equal, for example, 4 feet in width, which would mean a maximum detectable speed of about 20 feet per second.
This is the simplest means of limiting the maximum detectable target speed.
Referring to Figure 6, the output of the preprocessor 16 is sampled and stored in a store 40 of a microcomputer 18 as It.
This value is then fed to a high-pass filter 41 which discriminates against very slowly moving targets. This is accomplished by accumulating in a store 42 an averaged version of It,
I*t-1 = (1 - α) It-1 + α I*t-2,
which represents the average brightness in each cell over a time period determined by α. This time period could be, for example, one minute. Thus It is multiplied by (1 - α) in multiplier 43 and fed to one input of an adder 44, the output of which feeds the store 42. The output of store 42 is fed back through a multiplier 45 which multiplies the value from the store 42 by α, the output of multiplier 45 being fed to the other input of adder 44. The output of store 42 is I*t-1 and is fed as an input to a subtractor 50, the other input of which is derived from store 40. The output of subtractor 50 is the difference between It and I*t-1. By obtaining this difference, which is stored in store 51, a difference array is produced which is compensated for slowly changing conditions such
as variations in lighting. The difference array 51 consists of 5 x 14 = 70 members with from 8 to 12 bits dynamic range. Moving targets will correspond to elements of the array which deviate significantly from the average value.
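For a single cell, the recursive averaging filter and the subtractor 50 can be sketched as follows. ALPHA is an illustrative smoothing constant; the patent says only that the averaging period "could be, for example, one minute":

```python
# Hedged sketch of the recursive brightness filter of Figure 6 for one
# cell.  ALPHA is an assumed value setting the averaging time constant.
ALPHA = 0.99

def filter_step(i_avg: float, i_new: float) -> float:
    """One update of the running average: I* <- (1 - a)*I + a*I*."""
    return (1.0 - ALPHA) * i_new + ALPHA * i_avg

def difference(i_new: float, i_avg: float) -> float:
    """Subtractor 50: deviation of the current cell value from its
    long-term average; large values indicate a candidate target."""
    return i_new - i_avg

i_avg = 100.0
for _ in range(200):                 # steady scene: the average holds
    i_avg = filter_step(i_avg, 100.0)
print(round(difference(100.0, i_avg), 6))   # 0.0  -> no target
print(round(difference(160.0, i_avg), 6))   # 60.0 -> candidate target
```

A slow drift in lighting is absorbed into the running average and produces a small difference, while a person entering a cell changes its value faster than the filter can follow.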
The values d(x,y) stored in difference array 51 are fed to a computation unit 52 in the microcomputer which computes statistics comprising the means, m(x), and standard deviations, σ(x), for each row (x). These statistics are compared in detector 53 with the data from difference array 51 by computing d(x,y) - m(x) ≥ Kσ(x). Results greater than or equal to Kσ(x) are stored in detection array 54.
The means and standard deviations are computed so as not to include data from target cells. These cells are checked before the means and standard deviations are computed and up to three are eliminated if they show a large deviation from zero.
The last stage in target recognition is tracking which is implemented in the microcomputer 18 by a tracking algorithm indicated by block 55 in Figure 6 with the results being stored in target array T(x,y) which is referenced 56 in Figure 6. An alarm 57 is given if an element of array T exceeds a predetermined threshold. This particular feature produces an acceptable nuisance alarm rate by greatly narrowing the class of objects that produce alarms. Tracking is achieved by storing the target array 56 (5 x 14) and recording in that array target history. Figure 7 represents a target array T(x,y) and Figure 8 shows a flow diagram of the tracking algorithm.
Parameters used are:
x, y - position coordinates
F(x,y) - flag bit used to note old targets
D(x,y) - detection array. D is either zero (no target) or one (target) for each element.
It is the output of the detection.
T(x,y) - target array. This array is updated by the algorithm for each new detection array (D(x,y)). T may have values from 0 to 5.
An alarm is given whenever an element of T
exceeds a preset threshold from 3 to 5.
M - Value of previous column targets.
N - Value of same column targets.
At the start of each tracking cycle (every 4th to 6th frame) the coordinates of the tracking array are initialized (61) and the tracking loop is entered. For a particular coordinate (x,y) in the tracking array the tracking is as follows. A flag bit is initialized (62) and the detector array 54 [D(x,y)] is tested to see if any targets have been detected at that coordinate during the current preprocessor cycle (63). If D(x,y) = 0 (no detection) then the tracking array value is checked to see if a target was detected there during a previous
cycle (64). If no target history is indicated [T(x,y) = 0] then the coordinates are checked (65, 66) and a new coordinate is set up
(67, 68) or the tracking loop is terminated (69). If at 64 T(x,y) was not zero (i.e. a target has been detected previously at that coordinate) then the flag is set to one (70) and the tracking value [T(x,y)] is decreased by a constant typically equal to 0.4 (71).
Thus old targets are gradually erased from the tracking matrix.
If a target has been detected on the current cycle then D(x,y) = 1 (63) and a parameter M is calculated (72). This parameter establishes whether a target was previously detected in any of the adjoining cells in the previous column (x-1) and what its tracking matrix value was. The M value is augmented by one (73) to indicate an increase in target history. Next a value N is computed (74):
N = Max {T(x,y), T(x,y-1) + 0.4 F(x,y-1), T(x,y+1)}
Thus the largest of the test cell and the cells immediately above and below the test cell is determined. Note that the flag bit is used to modify the cell preceding the test cell as this may have been changed at 71. The target array is updated (75) to the largest of M, N or 1 and then the coordinates are tested and updated (65, 66, 68, 67) or the tracking cycle is completed (69). After each tracking cycle is complete the tracking array is tested against a threshold. Usually any tracking value between 3 and 5 would be considered to constitute an alarm.
Although the algorithm disclosed detects movement in one direction, it will be apparent to those skilled in the art that algorithms could be devised for tracking in both directions.
Although the invention has been described for use in detecting movement between two fences it will be apparent that it could be used for monitoring areas not clearly marked off by fences.
The system according to the invention is capable of testing its own performance. To do this, one sets up a target at the far end of the area being monitored, the target having a black area and a white area, each one cell wide. The data for these cells are taken from array 40 and a performance test done as indicated at 60 (Figure 6),
the test consisting of computing the contrast between the black and white areas. A decrease in contrast, as compared to a day with clear visibility, would then give a measure of atmospheric transmission which, of course, is influenced by fog, rain or snow.
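A minimal sketch of this self-test follows, assuming a Michelson-style contrast measure and illustrative cell values; the patent says only that the test consists of computing the contrast between the black and white areas:

```python
# Hedged sketch of the self-test at 60 (Figure 6).  The contrast
# formula and the numeric cell values are assumptions.
def contrast(white: float, black: float) -> float:
    """Michelson-style contrast between the two test cells."""
    return (white - black) / (white + black)

CLEAR_DAY = contrast(200.0, 20.0)    # baseline recorded in good visibility

def transmission_estimate(white: float, black: float) -> float:
    """Ratio of current to baseline contrast; falls in fog, rain or snow."""
    return contrast(white, black) / CLEAR_DAY

print(round(transmission_estimate(200.0, 20.0), 2))  # 1.0  (clear day)
print(round(transmission_estimate(130.0, 90.0), 2))  # 0.22 (poor visibility)
```

Because the two test cells sit at the far end of the area, the ratio reflects transmission over the full monitored range, warning the operator when detection probability is degraded.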
Although the system has been described as using a television camera, it is not contemplated that the system is restricted to cameras sensitive to visible radiation. The system would work equally well with cameras sensitive to other types of radiation, e.g. infra-red, provided they utilize a raster scan. Thus, the term "television camera" is intended to include cameras sensitive to other than visible radiation.
~ 3 The invention will now be further discussed ~n conjunct-ion with the accompanying drawings, in which:
Figure 1 is a diagram illustrating a video image of an area between two parallel fences to be monitored by the video detection system according to the invention, the image being divided into a plurality of regions or "cells".
Figure 2 illustrates an alternative way in which the area of the video image may be divided into regions.
Figure 3 is a block diagram of a video detection system according to the invention.
Figure 4 is a block diagram of a preprocessor according to the invention.
Figure 5 is a functional block diagram of the processes carried out according to the invention.
Figure 6 is a functional block diagram of the invention.
Figure 7 illustrates a tracking algorithm target array.
Figure 8 is a tracking algorithm according to the invention.
Referring to Figure 1, there is shown an area 10 between two fences 12 and 13 as seen by a television camera mounted at some distance above the area and some distance back from the near end. For example, the camera may be spaced about 52 feet back from the near end of the area 10, the area 10 being, for example, approximately 130 feet long. The video image is divided into a number of picture cells, each cell being of a size corresponding to the height of a man (6 feet) at the range occupied by the cell. Obviously, a man at the near end of the area 10 appears much larger than at the far end, and hence the cells of the image are bigger at the near end than at the far end. Figure 1 shows an array of 5 by 14 cells, purely as an example.
Other arrays, such as 7 by 10, may be used, as shown in Figure 2.
Figure 3 shows a system block diagram. The television camera 15, which scans an area 10 such as shown in Figure 1 or Figure 2, feeds a video signal to a preprocessor and AGC 16. Part of the composite video output of television camera 15 is fed back through an automatic light level compensation circuit 19 to adjust the television camera to changing light conditions. The remainder of the composite video signal feeds the preprocessor 16 which converts the video signal from the television camera into 70 digital words representing the average brightness in a 10 x 7 array of picture cells (assuming the cell array of Figure 2 is used). The information stored in the preprocessor 16 feeds a computer 18, preferably a microcomputer, via a suitable interface 17. The computer 18 analyzes the information from the preprocessor to detect an intruder.
Figure 4 is a block diagram of the preprocessor 16 of Figure 3. An AGC amplifier 20 is used to match the video dynamic range to the digitized dynamic range. Two adders, 24 and 27, are used to perform integration, horizontally and vertically, respectively,
as will be explained. The video signal from AGC amplifier 20 is passed through a filter 21 and is converted into digital form in analog to digital (A/D) converter 22. The output of A/D converter 22 is fed via an AND gate 23 to one input of adder 24, gate 23 being enabled by an output of counter 25 on line 26. The output of counter 25 is controlled by clock 28 and ROM 29. ROM 29 stores the values of the horizontal cell boundaries and feeds these to counter 25 which compares them with the count of clock 28. When the clock value is between the values of the cell boundaries, counter 25 produces an output on its line 26 to enable gate 23. At the same time, it produces an output on line 30 to enable line memory 31.
The digitized information for each line of a cell from A/D converter 22 is added in adder 24 and fed back via line memory 31 to another input of adder 24. In this manner, the intensity values for each line of a cell are integrated. The integrated value is then fed to a second adder 27 which, in a manner to be described, adds up all the lines of a frame for each cell and feeds the resultant to frame memory 32.
Frame memory 32 is enabled by line counter 33 which compares a count of video horizontal sync. pulses on line 34 with the vertical cell boundaries stored in ROM 35. Thus when line counter 33 counts a certain number of horizontal sync. pulses equal to a number stored in ROM 35, the video scan has reached a vertical cell boundary and line counter 33 enables, via line 36, the frame memory 32 which feeds its data to one input of adder 27. The other input of adder 27 is fed by the output of adder 24. As further horizontal sync. pulses are counted by counter 33, a higher number will be reached equal to a number stored in ROM 35 indicating the end of one vertical cell boundary and the beginning of the next. Frame memory 32 switches to a new storage area for the next cell.
A frame counter 37 receives on its input line 38 vertical sync. pulses and counts a number of these pulses equal to a predetermined number of frames, e.g. 4 to 6 frames, after which it causes frame memory 32 to be read out into an output buffer 39. Output buffer 39 stores the integrated intensity values of each cell of the array from which they may be read by the microcomputer 18 (Figure 3) for target detection and tracking.
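The two-adder integration performed by the preprocessor can be sketched in software. The sketch below is illustrative only: the frame is assumed to be available as a 2-D array of digitized pixel intensities, and the boundary tables stand in for the horizontal-boundary ROM 29 and vertical-boundary ROM 35.

```python
def integrate_cells(frame, col_bounds, row_bounds):
    """Sum pixel intensities over each cell of the grid.

    frame      -- digitized video frame as a list of scan lines (role of A/D 22)
    col_bounds -- horizontal cell boundaries in pixels (role of ROM 29)
    row_bounds -- vertical cell boundaries in scan lines (role of ROM 35)
    Returns one integrated intensity value per cell, as output buffer 39 holds.
    """
    cells = [[0] * (len(col_bounds) - 1) for _ in range(len(row_bounds) - 1)]
    for r in range(len(row_bounds) - 1):
        # adder 27 / frame memory 32: accumulate the lines of one cell row
        for line in frame[row_bounds[r]:row_bounds[r + 1]]:
            for c in range(len(col_bounds) - 1):
                # adder 24 / line memory 31: sum one line's pixels within a cell
                cells[r][c] += sum(line[col_bounds[c]:col_bounds[c + 1]])
    return cells
```

For example, a 4-line by 6-pixel frame of uniform brightness 1 with boundaries [0, 3, 6] and [0, 2, 4] yields four cells, each integrating 2 lines of 3 pixels.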
The general purpose computer 18 (Figure 3) implements detection and tracking. Various algorithms can be used by the computer to perform various functions such as automatic gain control (of amplifier 20), digital filtering of cell data, detection, target tracking and alarm setting. These functions are repeated every preprocessor cycle, i.e. every 4th to 6th TV frame or every 133 to 200 ms.
The relative organization of these functions is illustrated in the flow diagram of Figure 5, which is self explanatory.
Automatic gain control is accomplished by averaging brightness and contrast over a period of time and using these values to set the offset and gain of the AGC amplifier 20 (Figure 5). Additional outputs for lens iris or camera control may be provided if required.
Constant contrast is preferably provided to compensate for differences between night and day, snow and bare ground, etc.
,' . " ''' ';' " . ",~`. ;' The data from the preprocessor 16 are filtered in time using a recursive digital low yass filter to determine average brightness.
Choice of this parameter determines the minimum crossing speed at which targets will be detected. Too long a time constant results in an inability to adapt quickly to changing light conditions.
To detect the presence of a target, the preprocessor data for the current scan are compared with the filtered data; any significant changes are classified as targets. In order to discriminate against cloud shadows it is necessary to distinguish target size. To do this, the data are first examined to determine the number of significant changes in each row. If more than 3 targets are present it is assumed that lighting conditions have changed, and statistics are computed on the basis of all seven cells in the row. If, however, three or fewer cells have changed significantly then statistics are computed based only on those cells without significant changes. The statistics computed for each row are mean and standard deviation. In addition, the standard deviation is filtered in time to produce a sufficient statistic. Targets are then declared whenever the difference between a cell value and the mean for that row exceeds a set number times the standard deviation. This scheme has the following advantages:
Constant probability of false detection.
Discrimination against cloud shadows.
Detection of up to 3 targets in a single row (required to detect two or more people crossing together).
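The row-statistics detection rule described above can be sketched as follows. This is a sketch only: the multiplier `k` (the "set number" K) and the way cells are provisionally marked as changed are assumptions, as the text does not give numeric values for them.

```python
import statistics

def detect_row(diff_row, changed, k=3.0):
    """Flag target cells in one row of the difference data.

    diff_row -- difference values for the cells of one row
    changed  -- indices of cells provisionally showing significant change
                (how they are pre-selected is not detailed here)
    k        -- detection multiplier, the patent's "set number" (value assumed)
    """
    if len(changed) <= 3:
        # three or fewer changed cells: base statistics on the unchanged ones
        basis = [d for i, d in enumerate(diff_row) if i not in changed]
    else:
        # more than 3 "targets": assume lighting changed and use the whole row
        basis = list(diff_row)
    mean = statistics.fmean(basis)
    sigma = statistics.pstdev(basis)
    # declare a target where the deviation from the row mean exceeds k * sigma
    return [abs(d - mean) > k * sigma for d in diff_row]
```

With up to three excluded cells, a genuine intruder does not inflate the row statistics, which keeps the false-detection probability roughly constant as the scheme intends.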
Target tracking is used to discriminate against all targets that do not cross the boundaries or which cross the boundaries too quickly to be a human. To be declared a target, a person must be
tracked going across at least 3 columns. Note that a person can walk parallel to the fence without causing an alarm. Also objects which move too quickly across the field of view (such as birds, blowing paper, insects, etc.) will not be detected as they do not appear in adjacent cells. Thus tracking provides great immunity to nuisance false alarms.
Figure 6 is a functional configuration of a preprocessor 16 and microcomputer 18 for an embodiment in which the area to be surveyed is divided into 5 x 14 cells. This configuration averages the image over each of these zones to produce a 5 x 14 array It.
The number of picture elements averaged in each cell of It varies from about 6,000 at the near end to about 12 at the far end of the area being surveyed. Obviously, a much greater SNR (signal to noise ratio) will be available at close ranges than for long ranges with consequent effects on probability of detection and system (as opposed to nuisance) false alarm rates.
The integration function is performed by the preprocessor as discussed in connection with Figure 4. This may be accomplished either by digitizing the video directly and integrating digitally to produce a level for each cell or by gating sample and hold devices (one for each of the 70 elements) at the appropriate point in the horizontal scan and integrating by analogue means in the vertical direction. A low speed digitizer would then be used to digitize the results.
It is considered adequate to sample the scene (It) between 5 and 8 times per second or every fourth to sixth television frame.
Depending on system requirements it may be desirable to integrate
over the four to six frames to provide a gain in SNR or it may be possible to time share the preprocessor among several different cameras.
(A typical installation would use eight cameras, two for each side of a perimeter to be guarded). Using this sampling rate effectively discriminates against fast moving targets. For example, a target moving faster than 1 cell per sample will skip at least 1 cell and not be tracked. One cell may equal, for example, 4 feet in width, which would mean a maximum detectable speed of about 20 feet per second.
This is the simplest means of limiting the maximum detectable target speed.
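The speed limit implied by this sampling scheme follows from simple arithmetic, using the cell width given in the text and assuming a standard 30 frame-per-second camera:

```python
# Maximum detectable target speed implied by the sampling rate.
cell_width_ft = 4         # example cell width from the text
tv_frame_rate = 30        # standard television rate, frames/s (assumed)
frames_per_sample = 6     # scene sampled every 4th to 6th frame; 6 used here

samples_per_second = tv_frame_rate / frames_per_sample     # 5 samples/s
# a target moving faster than one cell per sample skips cells and is not tracked
max_speed_ft_per_s = cell_width_ft * samples_per_second    # 20 feet per second
```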
Referring to Figure 6, the output of the preprocessor 16 is sampled and stored in a store 40 of a microcomputer 18 as It.
This value is then fed to a high pass filter 41 which discriminates against very slowly moving targets. This is accomplished by accumulating in a store 42 an averaged version of It,

I*t-1 = (1 - α) It-1 + α I*t-2

which represents the average brightness in each cell over a time period determined by α. This time period could be, for example, one minute. Thus It is multiplied by (1 - α) in multiplier 43 and fed to one input of an adder 44, the output of which feeds the store 42. The output of store 42 is fed back through a multiplier 45 which multiplies the value from the store 42 by α, the output of multiplier 45 being fed to the other input of adder 44. The output of store 42 is I*t-1 and is fed as an input to a subtractor 50, the other input of which is derived from store 40. The output of subtractor 50 is the difference between It and I*t-1. By obtaining this difference, which is stored in store 51, a difference array is produced which is compensated for slowly changing conditions such
as variations in lighting. The difference array 51 consists of 5 x 14 = 70 members with from 8 to 12 bits dynamic range. Moving targets will correspond to elements of the array which deviate significantly from the average value.
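The recursive averaging of multipliers 43 and 45, adder 44 and store 42, followed by subtractor 50, can be sketched as one update step per preprocessor cycle; the smoothing constant α is written here as `alpha`, with its value chosen only for illustration:

```python
def high_pass_step(i_t, i_star_prev, alpha=0.99):
    """One preprocessor cycle of high pass filter 41 (Figure 6).

    i_t         -- current integrated cell intensities It (store 40)
    i_star_prev -- running average I*t-1 from the previous cycle (store 42)
    alpha       -- smoothing constant; its value is an assumption, chosen so
                   the average spans on the order of a minute of cycles
    Returns (difference array d, updated running average I*t).
    """
    # subtractor 50: current data minus the slowly varying average (store 51)
    diff = [x - avg for x, avg in zip(i_t, i_star_prev)]
    # multipliers 43, 45 and adder 44: I*t = (1 - alpha) It + alpha I*t-1
    i_star = [(1 - alpha) * x + alpha * avg for x, avg in zip(i_t, i_star_prev)]
    return diff, i_star
```

A cell that brightens suddenly shows a large difference, while gradual lighting changes are absorbed into the running average and produce near-zero differences.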
The values d(x,y) stored in difference array 51 are fed to a computation unit 52 in the microcomputer which computes statistics comprising the means, m(x), and standard deviations, σ(x), for each row (x). These statistics are compared in detector 53 with the data from difference array 51 by computing d(x,y) - m(x) ≥ Kσ(x). Results greater than or equal to Kσ(x) are stored in detection array 54.
The means and standard deviations are computed so as not to include data from target cells. These cells are checked before the means and standard deviations are computed and up to three are eliminated if they show a large deviation from zero.
The last stage in target recognition is tracking, which is implemented in the microcomputer 18 by a tracking algorithm indicated by block 55 in Figure 6, with the results being stored in target array T(x,y), which is referenced 56 in Figure 6. An alarm 57 is given if an element of array T exceeds a predetermined threshold. This particular feature produces an acceptable nuisance alarm rate by greatly narrowing the class of objects that produce alarms. Tracking is achieved by storing the target array 56 (5 x 14) and recording in that array target history. Figure 7 represents a target array T(x,y) and Figure 8 shows a flow diagram of the tracking algorithm.
Parameters used are:
x,y - position coordinates
F(x,y) - flag bit used to note old targets
D(x,y) - detection array. D is either zero (no target) or one (target) for each element.
It is the output of the detection.
T(x,y) - target array. This array is updated by the algorithm for each new detection array (D(x,y)). T may have values from 0 to 5.
An alarm is given whenever an element of T
exceeds a preset threshold from 3 to 5.
M - Value of previous column targets.
N - Value of same column targets.
At the start of each tracking cycle (every 4th to 6th frame) the coordinates of the tracking array are initialized (61) and the tracking loop is entered. For a particular coordinate (x,y) in the tracking array the tracking is as follows. A flag bit is initialized (62) and the detector array 54 [D(x,y)] is tested to see if any targets have been detected at that coordinate during the current preprocessor cycle (63). If D(x,y) = 0 (no detection) then the tracking array value is checked to see if a target was detected there during a previous cycle (64). If no target history is indicated [T(x,y) = 0] then the coordinates are checked (65, 66) and a new coordinate is set up (67, 68) or the tracking loop is terminated (69). If at 64 T(x,y) was not zero (i.e. a target has been detected previously at that coordinate) then the flag is set to one (70) and the tracking value [T(x,y)] is decreased by a constant typically equal to 0.4 (71).
Thus old targets are gradually erased from the tracking matrix.
If a target has been detected on the current cycle then D(x,y) = 1 (63) and a parameter M is calculated (72). This parameter establishes whether a target was previously detected in any of the adjoining cells in the previous column (x-1) and what its tracking matrix value was. The M value is augmented by one (73) to indicate an increase in target history. Next a value N is computed (74):

N = Max {T(x,y), T(x,y-1) + 0.4 F(x,y-1), T(x,y+1)}

Thus the largest of the test cell and the cells immediately above and below the test cell is determined. Note that the flag bit is used to modify the cell preceding the test cell as this may have been changed at 71. The target array is updated (75) to the largest of M, N or 1 and then the coordinates are tested and updated (65, 66, 68, 67) or the tracking cycle is completed (69). After each tracking cycle is complete the tracking array is tested against a threshold. Usually any tracking value between 3 and 5 would be considered to constitute an alarm.
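The tracking cycle of Figure 8 can be sketched as follows. This is an interpretation of the flow described above, not the patented implementation itself: edge cells are handled by treating out-of-range neighbours as zero, and values are clamped to the stated 0-to-5 range.

```python
DECAY = 0.4   # decrement applied to stale targets (71); value from the text
T_MAX = 5     # T may have values from 0 to 5

def track_cycle(T, D):
    """One tracking cycle: update target array T from detection array D.

    T[x][y] is the tracking value for column x, row y; D[x][y] is 0 or 1.
    Out-of-range neighbours are treated as zero (an assumption for edges).
    """
    cols, rows = len(T), len(T[0])

    def t(x, y):
        return T[x][y] if 0 <= x < cols and 0 <= y < rows else 0.0

    F = [[0] * rows for _ in range(cols)]              # flag bits (62)
    for x in range(cols):
        for y in range(rows):
            if D[x][y] == 0:
                if T[x][y] > 0:                        # old target: decay (70, 71)
                    F[x][y] = 1
                    T[x][y] = max(T[x][y] - DECAY, 0.0)
            else:
                # M: history from adjoining cells of the previous column (72, 73)
                M = max(t(x - 1, y - 1), t(x - 1, y), t(x - 1, y + 1)) + 1
                # N: test cell and its vertical neighbours (74); the flag bit
                # restores the decay already applied to the cell above at 71
                above = t(x, y - 1) + DECAY * F[x][y - 1] if y > 0 else 0.0
                N = max(T[x][y], above, t(x, y + 1))
                T[x][y] = min(max(M, N, 1), T_MAX)     # update (75)
    return T
```

Running three cycles with a detection stepping across columns 0, 1, 2 grows the tracked value toward the 3-to-5 alarm band, while the 0.4 decay gradually erases targets that stop appearing.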
Although the algorithm disclosed detects movement in one direction, it will be apparent to those skilled in the art that algorithms could be devised for tracking in both directions.
Although the invention has been described for use in detecting movement between two fences it will be apparent that it could be used for monitoring areas not clearly marked off by fences.
The system according to the invention is capable of testing its own performance. To do this, one sets up a target at the far end of the area being monitored, the target having a black area and a white area, each one cell wide. The data for these cells are taken from array 40 and a performance test done as indicated at 60 (Figure 6),
the test consisting of computing the contrast between the black and white areas. A decrease in contrast, as compared to a day with clear visibility, would then give a measure of atmospheric transmission which, of course, is influenced by fog, rain or snow.
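The self-test can be sketched as a contrast ratio against a clear-day reference. The text does not give the contrast formula, so the Michelson definition used here is an assumption:

```python
def transmission_estimate(black, white, clear_black, clear_white):
    """Relative atmospheric transmission from the black/white test target.

    black, white             -- current integrated intensities of the two
                                one-cell-wide test areas (taken from store 40)
    clear_black, clear_white -- reference values recorded on a clear day
    """
    # Michelson contrast of the target pair (this formula is an assumption)
    contrast = (white - black) / (white + black)
    reference = (clear_white - clear_black) / (clear_white + clear_black)
    return contrast / reference   # 1.0 on a clear day; lower in fog, rain, snow
```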
Although the system has been described as using a television camera, it is not contemplated that the system is restricted to cameras sensitive to visible radiation. The system would work equally well with cameras sensitive to other types of radiation, e.g. infra-red, provided they utilize a raster scan. Thus, the term "television camera" is intended to include cameras sensitive to other than visible radiation.
Claims (16)
THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. An automatic video intrusion detection system to detect targets comprising intruders moving more than a predetermined distance in a direction from one side to the other of an area to be monitored, said system comprising a television camera disposed to view a predetermined area which includes said area to be monitored, said television camera supplying a video image of said predetermined area to a preprocessor, said preprocessor having means for dividing into a predetermined number of cells a portion of said video image substantially corresponding to said area to be monitored, means for deriving integrated video intensity values for said cells and means for storing said integrated intensity values in an output buffer, said system further comprising computer means for periodically analyzing the values stored in said buffer to detect changes in said values exceeding a predetermined amount and to track changes in values for consecutive cells indicating movement of a target, said computer activating an alarm if changes in more than a predetermined number of adjacent cells are tracked in said direction.
2. A system as claimed in claim 1 wherein said computer means includes means for filtering in time the data from said output buffer to determine average brightness so that only targets moving at more than a minimum speed will be detected.
3. A system as claimed in claim 2 wherein said predetermined number of cells are arranged in a grid pattern comprising rows and columns.
4. A system as claimed in claim 3 wherein said computer means interprets more than a predetermined number of targets in a row as being a change in lighting conditions rather than an intruder.
5. A system as claimed in claim 3 wherein said computer means computes statistics for each row of cells in the grid pattern comprising mean and standard deviation and provides a target indication when the difference between a cell value and the mean for that row exceeds a set number times the standard deviation.
6. A system as claimed in claim 5 wherein said computer means analyzes cell intensity values corresponding to every fourth to sixth frame of the television camera whereby very fast moving targets are discriminated against.
7. A system as claimed in claim 3 wherein said grid pattern comprises a 5 x 14 cell array and the predetermined number of targets is three.
8. A system as claimed in claim 1 wherein said preprocessor comprises an AGC amplifier which feeds the video signals from the television camera to an analog-to-digital (A/D) converter via a filter, said A/D converter feeding, via a gate, one input of a first adder, said first adder having an output which is fed back via a line memory to a second input of said first adder, said gate being controlled by a horizontal cell boundary read only memory (ROM) so that only data from the A/D converter corresponding to the area to be monitored is fed to said first adder, said first adder producing at its output integrated values for each line of a cell which are fed to one input of a second adder, said second adder having an output fed back via a frame memory to a second input of said second adder, said frame memory being controlled by a vertical cell boundary read-only memory to integrate line values from said first adder corresponding to vertical cell boundaries, the output of said second adder being fed to an output buffer which stores integrated intensity values for each cell.
9. A system as claimed in claim 8 wherein said computer means includes means for storing an array It of intensity values, means for deriving averaged values of It, i.e. I*t-1, which represents the average brightness value over a period of time, means for obtaining the difference between values of It and I*t-1 and storing them as a difference array, means utilizing the values stored in the difference array for computing statistics comprising the mean and standard deviation, means for comparing the values stored in the difference array with the statistics to detect differences exceeding a predetermined amount which are stored in a detection array.
10. A system as claimed in claim 9 wherein said computer includes means for analyzing the data stored in the detection array to track changes which are then stored in a target array.
11. A system as claimed in claim 7 wherein said preprocessor comprises an AGC amplifier which feeds the video signals from the television camera to an analog-to-digital (A/D) converter via a filter, said A/D converter feeding, via a gate, one input of a first adder, said first adder having an output which is fed back via a line memory to a second input of said first adder, said gate being controlled by a horizontal cell boundary read only memory (ROM) so that only data from the A/D converter corresponding to the area to be monitored is fed to said first adder, said first adder producing at its output integrated values for each line of a cell which are fed to one input of a second adder, said second adder having an output fed back via a frame memory to a second input of said second adder, said frame memory being controlled by a vertical cell boundary read-only memory to integrate line values from said first adder corresponding to vertical cell boundaries, the output of said second adder being fed to an output buffer which stores integrated intensity values for each cell.
12. A system as claimed in claim 11 wherein said computer means includes means for storing an array It of intensity values, means for deriving averaged values of It, i.e. I*t-1, which represents the average brightness value over a period of time, means for obtaining the difference between values of It and I*t-1 and storing them as a difference array, means utilizing the values stored in the difference array for computing statistics comprising the mean and standard deviation, means for comparing the values stored in the difference array with the statistics to detect differences exceeding a predetermined amount which are stored in a detection array.
13. A system as claimed in claim 12 wherein said computer includes means for analyzing the data stored in the detection array to track changes which are then stored in a target array.
14. A system as claimed in claim 1, 6 or 10 wherein said area to be monitored is defined by two parallel fences.
15. A system as claimed in claim 1, 3 or 6 including means for testing system performance comprising a fixed test target at the far end of the area to be monitored, said target being divided into a black area and a white area, and means for computing contrast of the images of said black and white areas whereby contrast variations with time provide an indication of changes in atmospheric transmission.
16. A system as claimed in claim 8, 9 or 10 including means for testing system performance comprising a fixed test target at the far end of the area to be monitored, said target being divided into a black area and a white area, and means for computing contrast of the images of said black and white areas whereby contrast variations with time provide an indication of changes in atmospheric transmission.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA321,857A CA1116286A (en) | 1979-02-20 | 1979-02-20 | Perimeter surveillance system |
US06/097,551 US4249207A (en) | 1979-02-20 | 1979-11-26 | Perimeter surveillance system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA321,857A CA1116286A (en) | 1979-02-20 | 1979-02-20 | Perimeter surveillance system |
Publications (1)
Publication Number | Publication Date |
---|---|
CA1116286A true CA1116286A (en) | 1982-01-12 |
Family
ID=4113582
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA321,857A Expired CA1116286A (en) | 1979-02-20 | 1979-02-20 | Perimeter surveillance system |
Country Status (2)
Country | Link |
---|---|
US (1) | US4249207A (en) |
CA (1) | CA1116286A (en) |
Families Citing this family (110)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA1172746A (en) * | 1980-10-22 | 1984-08-14 | Trevor W. Mahoney | Video movement detector |
DE3316122A1 (en) * | 1983-05-03 | 1984-11-08 | Kraftwerk Union AG, 4330 Mülheim | OUTDOOR AREA MONITORING SYSTEM |
AU599469B2 (en) * | 1984-03-06 | 1990-07-19 | Practel Pty Ltd | Vision system |
AU571674B2 (en) * | 1984-03-06 | 1988-04-21 | Practel Pty Ltd | Vision/reaction system |
JPS61198893A (en) * | 1985-02-27 | 1986-09-03 | Mitsubishi Electric Corp | Method for supervising station platform |
JPS61260391A (en) * | 1985-05-14 | 1986-11-18 | 三菱電機株式会社 | Monitor/controller |
JP2603215B2 (en) * | 1985-07-04 | 1997-04-23 | キヤノン株式会社 | Image recognition method and apparatus |
DE3523872C1 (en) * | 1985-07-04 | 1986-09-25 | KTV-Systemtechnik GmbH, 8752 Kleinostheim | Fence with security wires attached to posts via sensors |
GB2183878B (en) * | 1985-10-11 | 1989-09-20 | Matsushita Electric Works Ltd | Abnormality supervising system |
DE3543515A1 (en) * | 1985-12-10 | 1987-06-11 | Strahlen Umweltforsch Gmbh | METHOD FOR MEASURING THE MOTIONS AND CONFIGURATIONS OF BIOLOGICAL AND NON-BIOLOGICAL OBJECTS |
US4827412A (en) * | 1986-01-29 | 1989-05-02 | Computer Sports Systems, Inc. | Pinfall detector using video camera |
FR2609566B1 (en) * | 1987-01-14 | 1990-04-13 | Armine | METHOD FOR DETERMINING THE TRAJECTORY OF A BODY CAPABLE OF MOVING ON A TRACK AND DEVICE FOR IMPLEMENTING THE METHOD |
US4847772A (en) * | 1987-02-17 | 1989-07-11 | Regents Of The University Of Minnesota | Vehicle detection through image processing for traffic surveillance and control |
JPH0695008B2 (en) * | 1987-12-11 | 1994-11-24 | 株式会社東芝 | Monitoring device |
GB8814822D0 (en) * | 1988-06-22 | 1988-07-27 | British Broadcasting Corp | Bandwidth reduction system for television |
US5619264A (en) * | 1988-02-09 | 1997-04-08 | Canon Kabushiki Kaisha | Automatic focusing device |
US4916435A (en) * | 1988-05-10 | 1990-04-10 | Guardian Technologies, Inc. | Remote confinement monitoring station and system incorporating same |
GB8826550D0 (en) * | 1988-11-14 | 1989-05-17 | Smiths Industries Plc | Image processing apparatus and methods |
JP2828324B2 (en) * | 1990-06-21 | 1998-11-25 | 富士通株式会社 | Remote monitoring system |
US5097328A (en) * | 1990-10-16 | 1992-03-17 | Boyette Robert B | Apparatus and a method for sensing events from a remote location |
CH681574A5 (en) * | 1991-03-01 | 1993-04-15 | Cerberus Ag | |
US5182641A (en) * | 1991-06-17 | 1993-01-26 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Composite video and graphics display for camera viewing systems in robotics and teleoperation |
DE69229831T2 (en) * | 1991-09-12 | 2001-04-12 | Electronic Data Syst Corp | IMAGE ANALYZER |
AU726071B2 (en) * | 1991-09-12 | 2000-10-26 | Electronic Data Systems Corporation | Image analyser |
US5999634A (en) * | 1991-09-12 | 1999-12-07 | Electronic Data Systems Corporation | Device and method for analyzing an electronic image signal |
AU699218B2 (en) * | 1991-09-12 | 1998-11-26 | Electronic Data Systems Corporation | Image analyser |
DE4138254C1 (en) * | 1991-11-21 | 1993-06-24 | Grundig E.M.V. Elektro-Mechanische Versuchsanstalt Max Grundig Hollaend. Stiftung & Co Kg, 8510 Fuerth, De | |
US5210604A (en) * | 1991-12-10 | 1993-05-11 | Carpenter Loren C | Method and apparatus for audience participation by electronic imaging |
US5365266A (en) * | 1991-12-10 | 1994-11-15 | Carpenter Loren C | Video imaging method and apparatus for audience participation |
US5283551A (en) * | 1991-12-31 | 1994-02-01 | Aritech Corporation | Intrusion alarm system |
CZ41994A3 (en) * | 1993-03-05 | 1994-11-16 | Elta Electronics Ind Ltd | Motion detection system |
JPH078735U (en) * | 1993-07-09 | 1995-02-07 | 株式会社村田製作所 | Infrared sensor device |
US5880775A (en) * | 1993-08-16 | 1999-03-09 | Videofaxx, Inc. | Method and apparatus for detecting changes in a video display |
EP0686932A3 (en) * | 1994-03-17 | 1997-06-25 | Texas Instruments Inc | A computer vision system to detect 3-D rectangular objects |
ATE202215T1 (en) * | 1994-08-24 | 2001-06-15 | Seisma Ag | SYSTEM AND METHOD FOR IMAGE EVALUATION |
JP3642591B2 (en) * | 1994-11-29 | 2005-04-27 | 株式会社日立メディコ | Image processing device |
AUPN374495A0 (en) * | 1995-06-23 | 1995-07-13 | Vision Systems Limited | Security sensor arrangement |
FR2736166B1 (en) * | 1995-06-27 | 1997-08-08 | Thomson Csf | METHOD AND DEVICE FOR OPTICAL PROCESSING OF TWO-DIMENSIONAL IMAGES ALLOWING EXTRACTION OF THE SPEED FIELD |
US5778108A (en) * | 1996-06-07 | 1998-07-07 | Electronic Data Systems Corporation | Method and system for detecting transitional markers such as uniform fields in a video signal |
US6061471A (en) * | 1996-06-07 | 2000-05-09 | Electronic Data Systems Corporation | Method and system for detecting uniform images in video signal |
US5734735A (en) * | 1996-06-07 | 1998-03-31 | Electronic Data Systems Corporation | Method and system for detecting the type of production media used to produce a video signal |
US5920360A (en) * | 1996-06-07 | 1999-07-06 | Electronic Data Systems Corporation | Method and system for detecting fade transitions in a video signal |
US5767923A (en) * | 1996-06-07 | 1998-06-16 | Electronic Data Systems Corporation | Method and system for detecting cuts in a video signal |
US5959697A (en) * | 1996-06-07 | 1999-09-28 | Electronic Data Systems Corporation | Method and system for detecting dissolve transitions in a video signal |
US5953055A (en) * | 1996-08-08 | 1999-09-14 | Ncr Corporation | System and method for detecting and analyzing a queue |
US6035341A (en) * | 1996-10-31 | 2000-03-07 | Sensormatic Electronics Corporation | Multimedia data analysis in intelligent video information management system |
US6031573A (en) * | 1996-10-31 | 2000-02-29 | Sensormatic Electronics Corporation | Intelligent video information management system performing multiple functions in parallel |
US5875305A (en) * | 1996-10-31 | 1999-02-23 | Sensormatic Electronics Corporation | Video information management system which provides intelligent responses to video data content features |
BR9713279A (en) | 1996-10-31 | 2000-01-18 | Sensormatic Eletrionics Corp | Intelligent video information management system. |
US5974235A (en) * | 1996-10-31 | 1999-10-26 | Sensormatic Electronics Corporation | Apparatus having flexible capabilities for analysis of video information |
DE19700811A1 (en) * | 1997-01-13 | 1998-07-16 | Heinrich Landert | Method and device for controlling door systems depending on the presence of people |
US6727938B1 (en) * | 1997-04-14 | 2004-04-27 | Robert Bosch Gmbh | Security system with maskable motion detection and camera with an adjustable field of view |
US6456320B2 (en) * | 1997-05-27 | 2002-09-24 | Sanyo Electric Co., Ltd. | Monitoring system and imaging system |
US6097429A (en) * | 1997-08-01 | 2000-08-01 | Esco Electronics Corporation | Site control unit for video security system |
FR2770724A1 (en) * | 1997-11-03 | 1999-04-30 | Telediffusion Fse | Video image analysis method for fixed camera |
US6175382B1 (en) * | 1997-11-24 | 2001-01-16 | Shell Oil Company | Unmanned fueling facility |
US7023469B1 (en) * | 1998-04-30 | 2006-04-04 | Texas Instruments Incorporated | Automatic video monitoring system which selectively saves information |
GB2337146B (en) * | 1998-05-08 | 2000-07-19 | Primary Image Limited | Method and apparatus for detecting motion across a surveillance area |
US6636257B1 (en) * | 1998-08-11 | 2003-10-21 | Honda Giken Kogyo Kabushiki Kaisha | Mobile body recognizing apparatus and motor vehicle monitoring apparatus |
JP3729660B2 (en) * | 1998-09-04 | 2005-12-21 | 松下電器産業株式会社 | Network camera monitoring system |
GB2344167B (en) * | 1998-11-26 | 2000-09-06 | Infrared Integrated Syst Ltd | Use of detector arrays to detect cessation of motion |
US6335976B1 (en) | 1999-02-26 | 2002-01-01 | Bomarc Surveillance, Inc. | System and method for monitoring visible changes |
US6493022B1 (en) * | 1999-03-05 | 2002-12-10 | Biscom, Inc. | Security system for notification of an undesired condition at a monitored area with minimized false alarms |
US7124427B1 (en) * | 1999-04-30 | 2006-10-17 | Touch Technologies, Inc. | Method and apparatus for surveillance using an image server |
US7310111B2 (en) * | 1999-08-12 | 2007-12-18 | Innovation Institute | Video monitoring and security system |
US6476858B1 (en) * | 1999-08-12 | 2002-11-05 | Innovation Institute | Video monitoring and security system |
US6803945B1 (en) * | 1999-09-21 | 2004-10-12 | Intel Corporation | Motion detecting web camera system |
US6297844B1 (en) * | 1999-11-24 | 2001-10-02 | Cognex Corporation | Video safety curtain |
US7042492B2 (en) | 1999-12-10 | 2006-05-09 | The Stanley Works | Automatic door assembly with video imaging device |
US6707486B1 (en) * | 1999-12-15 | 2004-03-16 | Advanced Technology Video, Inc. | Directional motion estimator |
JP3389548B2 (en) * | 2000-01-13 | 2003-03-24 | 三洋電機株式会社 | Room abnormality detection device and room abnormality detection method |
US7167575B1 (en) | 2000-04-29 | 2007-01-23 | Cognex Corporation | Video safety detector with projected pattern |
JP2002010240A (en) * | 2000-06-21 | 2002-01-11 | Matsushita Electric Ind Co Ltd | Monitoring system |
DE10039142B4 (en) * | 2000-08-07 | 2006-12-28 | Leuze Lumiflex Gmbh + Co. Kg | Method for monitoring access to a hazardous area |
US7522745B2 (en) * | 2000-08-31 | 2009-04-21 | Grasso Donald P | Sensor and imaging system |
US6714237B2 (en) * | 2000-09-09 | 2004-03-30 | Menix Engineering Co., Ltd. | Apparatus and method for automatically storing an intrusion scene |
US8564661B2 (en) | 2000-10-24 | 2013-10-22 | Objectvideo, Inc. | Video analytic rule detection system and method |
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US9892606B2 (en) * | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US7868912B2 (en) * | 2000-10-24 | 2011-01-11 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US20050146605A1 (en) * | 2000-10-24 | 2005-07-07 | Lipton Alan J. | Video surveillance system employing video primitives |
US7200246B2 (en) * | 2000-11-17 | 2007-04-03 | Honeywell International Inc. | Object detection |
US6441734B1 (en) * | 2000-12-12 | 2002-08-27 | Koninklijke Philips Electronics N.V. | Intruder detection through trajectory analysis in monitoring and surveillance systems |
US7424175B2 (en) | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
US6825769B2 (en) | 2001-09-14 | 2004-11-30 | Koninklijke Philips Electronics N.V. | Automatic shut-off light system when user sleeps |
US6970083B2 (en) * | 2001-10-09 | 2005-11-29 | Objectvideo, Inc. | Video tripwire |
US6696945B1 (en) * | 2001-10-09 | 2004-02-24 | Diamondback Vision, Inc. | Video tripwire |
AUPR899401A0 (en) | 2001-11-21 | 2001-12-13 | Cea Technologies Pty Limited | Method and apparatus for non-motion detection |
US20030107650A1 (en) * | 2001-12-11 | 2003-06-12 | Koninklijke Philips Electronics N.V. | Surveillance system with suspicious behavior detection |
AU2003230491A1 (en) * | 2002-04-25 | 2003-11-10 | Wespot Ab | Sensor arrangement and method for calibrating the same |
GB0210887D0 (en) * | 2002-05-13 | 2002-06-19 | Central Research Lab Ltd | Verified alarms |
US20040028137A1 (en) * | 2002-06-19 | 2004-02-12 | Jeremy Wyn-Harris | Motion detection camera |
WO2004023787A2 (en) * | 2002-09-06 | 2004-03-18 | Rytec Corporation | Signal intensity range transformation apparatus and method |
US7746379B2 (en) * | 2002-12-31 | 2010-06-29 | Asset Intelligence, Llc | Sensing cargo using an imaging device |
US7421112B2 (en) * | 2004-03-12 | 2008-09-02 | General Electric Company | Cargo sensing system |
WO2006011804A1 (en) * | 2004-07-30 | 2006-02-02 | Eagle Vision Systems B.V. | System and method for the detection of persons |
US8248226B2 (en) * | 2004-11-16 | 2012-08-21 | Black & Decker Inc. | System and method for monitoring security at a premises |
US9077882B2 (en) | 2005-04-05 | 2015-07-07 | Honeywell International Inc. | Relevant image detection in a camera, recorder, or video streaming device |
FR2913795B1 (en) * | 2007-03-13 | 2009-06-05 | Gint Soc Par Actions Simplifie | Surveillance system |
US7986228B2 (en) * | 2007-09-05 | 2011-07-26 | Stanley Convergent Security Solutions, Inc. | System and method for monitoring security at a premises using line card |
US8390685B2 (en) | 2008-02-06 | 2013-03-05 | International Business Machines Corporation | Virtual fence |
DE102009000173A1 (en) * | 2009-01-13 | 2010-07-15 | Robert Bosch Gmbh | Device for counting objects, methods and computer program |
US20110221606A1 (en) * | 2010-03-11 | 2011-09-15 | Laser Technology , Inc. | System and method for detecting a moving object in an image zone |
WO2014039050A1 (en) | 2012-09-07 | 2014-03-13 | Siemens Aktiengesellschaft | Methods and apparatus for establishing exit/entry criteria for a secure location |
JP6141437B2 (en) * | 2013-09-26 | 2017-06-07 | 三菱電機株式会社 | Surveillance camera, surveillance system, and motion determination method |
US20180176512A1 (en) * | 2016-10-26 | 2018-06-21 | Ring Inc. | Customizable intrusion zones associated with security systems |
US10891839B2 (en) | 2016-10-26 | 2021-01-12 | Amazon Technologies, Inc. | Customizable intrusion zones associated with security systems |
WO2018081328A1 (en) * | 2016-10-26 | 2018-05-03 | Ring Inc. | Customizable intrusion zones for audio/video recording and communication devices |
JP7122556B2 (en) * | 2017-10-27 | 2022-08-22 | パナソニックIpマネジメント株式会社 | Imaging device and imaging method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3743768A (en) * | 1971-04-02 | 1973-07-03 | Halliburton Co | Method and apparatus for electronically monitoring a field of view |
US3988533A (en) * | 1974-09-30 | 1976-10-26 | Video Tek, Inc. | Video-type universal motion and intrusion detection system |
DE2617111C3 (en) * | 1976-04-17 | 1986-02-20 | Robert Bosch Gmbh, 7000 Stuttgart | Method for detecting movement in the surveillance area of a television camera |
DE2715083C3 (en) * | 1977-04-04 | 1983-02-24 | Robert Bosch Gmbh, 7000 Stuttgart | System for the discrimination of a video signal |
- 1979-02-20 CA CA321,857A patent/CA1116286A/en not_active Expired
- 1979-11-26 US US06/097,551 patent/US4249207A/en not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
US4249207A (en) | 1981-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA1116286A (en) | Perimeter surveillance system | |
US6707486B1 (en) | Directional motion estimator | |
US6104831A (en) | Method for rejection of flickering lights in an imaging system | |
DE3634628C2 (en) | ||
US6130707A (en) | Video motion detector with global insensitivity | |
CA1181163A (en) | Motion and intrusion detecting system | |
JPH0337354B2 (en) | ||
US7646329B2 (en) | Method for detecting a target | |
CA2275893C (en) | Low false alarm rate video security system using object classification | |
US5999634A (en) | Device and method for analyzing an electronic image signal | |
WO1988006326A1 (en) | Vehicle detection through image processing for traffic surveillance and control | |
WO1998028706B1 (en) | Low false alarm rate video security system using object classification | |
JPH0766048B2 (en) | Forced correlation / mixed mode tracking system | |
US20040114054A1 (en) | Method of detecting a significant change of scene | |
JPH0620049A (en) | Intruder identification system | |
EP0603276B1 (en) | Image analyser | |
JPH0737064A (en) | Method and device for detecting intruding object | |
JP3986643B2 (en) | Image processing device for monitoring | |
CN113593161A (en) | Perimeter intrusion detection method | |
JPH0731248B2 (en) | Moving object detector | |
Rodger et al. | Video motion detection systems: a review for the nineties | |
EP0749619A1 (en) | System and process for evaluating images | |
JP2828345B2 (en) | Intruder monitoring system | |
AU726071B2 (en) | Image analyser | |
Hansen et al. | Adaptive threshold adjustment and control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MKEX | Expiry |