WO1996038820A1 - Security control system - Google Patents

Security control system

Info

Publication number
WO1996038820A1
WO1996038820A1 (PCT/GB1996/001249)
Authority
WO
WIPO (PCT)
Prior art keywords
door
image
parameter
image data
control system
Prior art date
Application number
PCT/GB1996/001249
Other languages
French (fr)
Inventor
Michael Christopher Fairhurst
Stephen William Kelly
Martin Golding
Original Assignee
Mayor Limited
Priority date
Filing date
Publication date
Application filed by Mayor Limited filed Critical Mayor Limited
Priority to EP96919896A priority Critical patent/EP0832472A1/en
Priority to AU58264/96A priority patent/AU5826496A/en
Publication of WO1996038820A1 publication Critical patent/WO1996038820A1/en

Classifications

    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05G SAFES OR STRONG-ROOMS FOR VALUABLES; BANK PROTECTION DEVICES; SAFETY TRANSACTION PARTITIONS
    • E05G5/00 Bank protection devices
    • E05G5/003 Entrance control
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • E FIXED CONSTRUCTIONS
    • E05 LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05Y INDEXING SCHEME RELATING TO HINGES OR OTHER SUSPENSION DEVICES FOR DOORS, WINDOWS OR WINGS AND DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION, CHECKS FOR WINGS AND WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05Y2900/00 Application of doors, windows, wings or fittings thereof
    • E05Y2900/10 Application of doors, windows, wings or fittings thereof for buildings or parts thereof
    • E05Y2900/13 Application of doors, windows, wings or fittings thereof for buildings or parts thereof characterised by the type of wing
    • E05Y2900/132 Doors

Abstract

Control system for a security access device, in particular for revolving doors (1), including video imaging and processing to determine the number of persons within a secured region of the security access device, and controlling the operability of the security access device in dependence on the number of persons.

Description

SECURITY CONTROL SYSTEM
TECHNICAL FIELD
The present invention relates to a control system for use with a security door, with interlocking or synchronised doors, or with other means, such as turnstiles, flaps or other obstacles, for controlling access to a secured area.
A typical security door may comprise a revolving door divided into, say, four compartments by radially extending wings. The wings are coupled centrally at their upper or lower end to an interlock operated by a control system and are typically motor driven, but may alternatively be pushed manually. Turnstile systems are generally free to be pushed manually.
The control system may operate, for example, in response to a card reader. An authorised person wishing to pass through the door will then insert their pass card in the reader and, provided that their card is recognised, the control system then operates the interlock to free the revolving door so the user can pass through it. If the card is not recognised, or if an unauthorised person attempts to gain access without use of the pass card reader, then the interlock holds the wings of the revolving door against movement and so prevents passage through the door.
BACKGROUND ART
Known security doors suffer from a number of potential forms of misuse. In particular they are vulnerable to "piggy-backing" in which two individuals attempt to pass through the door in one compartment, or "tail-gating" in which an unauthorised person enters the compartment immediately following the one containing the authorised person, or passes through the door in the opposite direction. It has previously been proposed to use pressure-sensitive door mats in the security door, or to use ultra-sonic sensors to detect the presence of more than one person in the door. However these measures have not been wholly successful and there remains a need for a security system capable of detecting reliably piggy-backing or tail-gating, whilst providing ease of use and only a minimum number of false alarms.
SUMMARY OF THE PRESENT INVENTION
According to a first aspect of the present invention a control system suitable for use with a security access device such as a security door comprises a video input device which in use captures visual image data from the secured region of the device, an image processor for processing said image data and deriving at least one parameter for discriminating the number of persons present in the secured area of the device, and a comparator for comparing the said parameter derived from the original data with a predetermined threshold and producing a discriminatory output for use in controlling the interlock on the security device depending upon the result of the comparison.
The present inventors have adopted an entirely new approach to the detection of tail-gating and piggy-backing, based on the use of video data. While the use of video data has previously been proposed for the recognition of authorised persons, as an alternative, e.g., to the use of card readers, the use of video data for piggy-backing detection (APB) has not previously been thought desirable or possible. On the face of it the complexity of the visual data which would be gathered from e.g., a revolving transparent door, and the processing overheads involved in determining from such data the number of persons present provide a major disincentive to the adoption of such techniques. By contrast with recognition techniques which take place outside the door, APB detection generally has to be carried out in the short interval of time during which the user is passing through the door and so techniques involving large processing overheads tend to be avoided. However, the present inventors have realised that with appropriate processing, the raw video data can be used to yield a simple parameter for comparison with a predetermined threshold, and that the use of video data in this manner allows effective APB detection in real time using a relatively low-powered processor. The threshold may be for a statistic derived from a number of parameters in combination.
Preferably the image processor and discriminator comprise a cascaded series of modules, each module being arranged to process image data to determine a respective parameter, and to compare the parameter with a corresponding threshold.
Preferably the modules are arranged so that when one module is able to make a decision at a predetermined confidence level, then that module produces the said discriminatory output signal, otherwise the said module passing image data on to a subsequent module for further processing.
Preferably the modules are arranged generally in order of their discriminatory power, with the most powerful module receiving the image data first.
The discriminatory power of a module is a measure of its ability to make a discriminatory decision with a minimum processing overhead, and so at maximum speed. The efficiency of the whole system is maximised by using the fastest tests first, and only passing on to tests with a greater processing overhead when the preceding tests fail to meet a predetermined confidence level.
Preferably the image processor is arranged to capture a background image of the door with no person present, and to capture a subsequent image of the door with a person present and to discard from the second said image video data which is unchanged from the background image.
According to a second aspect of the present invention there is provided a method of controlling a security access device, such as a security door, including capturing video image data from a secured area of the access device, processing said image data and thereby deriving at least one parameter for discriminating the number of persons present in the secured area, and
comparing the said parameter with a predetermined threshold and producing a discriminatory output for use in controlling an interlock on the security device to lock the device when more than a predetermined number of persons are present in the secured area.
The present invention also encompasses a security access device when fitted with a control system in accordance with the first aspect of the present invention.
According to a further aspect of the present invention, there is provided a security access device including a secured region bounded by one or more wholly or partially transparent walls, characterised by a control system including a video input device arranged to view the secured region, and in that the transparent walls include filter means arranged to block or reduce the transmission of light in part of the visible/near-visible optical spectrum, and in that the video input device has a sensitivity/wavelength characteristic generally complementary to the transmission characteristic of the said filter means associated with the transparent walls, the visibility to the video input device of objects outside the transparent walls thereby being reduced or eliminated.
Revolving security doors, for example, are commonly built with glass doors. Objects outside the secured area are therefore potentially visible to any video security control system and may "confuse" any judgement made by the control system. This aspect of the present invention overcomes this problem by using walls which are transparent in one part of the visible/near-visible spectrum and a video input device which is sensitive in another part of the spectrum. The walls may, for example, be covered by a film which blocks transmission in the infra-red. The video device may either be selected to be inherently sensitive in the infra-red range, and insensitive outside that range, or, may be provided with an input filter giving this desired sensitivity/wavelength characteristic.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of a system embodying the present invention;
Figures 2a and 2b are schematic representations of captured video data at different stages of processing;
Figure 3 shows a set of pixel masks used in processing the image data to detect the head edges shown in Figure 2b;
Figure 4 is a schematic illustrating the processing of the image data to derive the image height and width;
Figure 5 is a flow diagram for the image processing in the APB control circuit;
Figure 6 is a schematic of hardware for implementing the system of Figure 1; and
Figures 7a - 7d are graphs showing experimental data obtained using the system of Figure 1.
DETAILED DESCRIPTION
An entry control system comprises a revolving door 1 which, in this example, has four compartments defined by radially extending wings 2a-2d. The movement of the wings is controlled by an interlock 3 mounted centrally above the wings. This interlock operates in a fashion conventional in revolving security doors to lock the door against movement until an appropriate control signal is received from a control unit.
A card reader 5 is connected to the door controller 4 and stands outside the door 1. In use, when someone wishes to pass through the door, that person inserts their card in the reader 5. Provided that the card is recognised, the reader 5 outputs an authorization signal to the door controller 4 which then operates the interlock 3 to free the door which is then driven by a motor through an angle sufficient for the authorised user to pass from the entrance to the exit of the door.
Such security door systems are vulnerable to misuse if there is piggy-backing, that is to say if a second unauthorised person enters the compartment together with the authorised person, or if there is tail-gating, in which the unauthorised person passes through the door in another of the compartments at the same time as the authorised person goes through. In the present example, the control system for the door further comprises a video camera 6 which is mounted above the door looking down into an underlying compartment. The video data from the camera is processed in an APB control circuit 7 to produce a discriminatory output which goes to the controller 4 for the door to operate the interlock 3 to lock the door when piggy-backing or tail-gating is detected.
As illustrated in Figure 5, the APB control circuit has a cascaded hierarchical structure. This comprises a plurality of stages arranged generally in order of the discriminatory power of the particular parameter which is derived and used. In use, the image data is first subject to processing by the first of this hierarchy of stages. If that results in the production of a parameter having a value such that a decision on the presence or absence of a second person can be made with adequate confidence, then the processing is terminated at that stage and an appropriate output signal produced and fed to the external control unit. If, however, a decision cannot be made based on the first parameter, then the image data is passed on to the next stage in the hierarchy, for further processing and the derivation of a further parameter and so on. In this way, the system is structured to maximise the processing speed and to minimise the average processing overhead for the image data. Just a single parameter may be used for the discrimination in the majority of cases, with further parameters being derived and tested only in those marginal cases where a single parameter does not enable a decision to be made with sufficient confidence.
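The cascade can be pictured as a simple dispatcher that stops at the first stage whose parameter falls clearly on one side of its threshold. The sketch below is a minimal illustration of that structure only; the stage functions, thresholds and confidence bands are hypothetical and are not taken from Appendix A.

```python
# Minimal sketch of the cascaded decision structure described above.
# Stage functions, thresholds and confidence bands are illustrative only.

def run_cascade(image, stages):
    """Run the stages in order of discriminatory power and stop at the
    first stage whose parameter falls clearly on one side of its
    threshold; marginal results fall through to the next, slower stage."""
    for measure, threshold, band in stages:
        value = measure(image)
        if value > threshold + band:   # confidently two or more people
            return "refuse"
        if value < threshold - band:   # confidently a single person
            return "allow"
        # otherwise the result is marginal: try the next stage
    return "refuse"                    # fail safe if no stage is confident

# Example wiring (hypothetical values; a stage such as the height/width
# ratio, where a LOW value suggests two people, would invert the tests):
# stages = [(difference_count, 5000, 4000), (head_area, 2500, 1750)]
# decision = run_cascade(present_image, stages)
```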
In the present example, the first stage of the video control circuit is a background-subtraction stage. This firstly captures a background image of the compartment of the door prior to the entry of the user into the field of view of the camera. This background image is stored in order to provide a means of identifying changes occurring when entry into the door compartment is effected. Subsequently, as the user passes into the field of view, the system captures a second image, the "present" image. This is compared with the stored background image and those pixels which are unchanged are discarded, thereby eliminating them from the subsequent processing stages. In this example, the comparison is carried out by taking the grey scale values in the range 1-255 of each pixel, subtracting one from the other and writing a value of 0 for each pixel where the result of the subtraction is 0 or is less than a predetermined small value, say 10. Where the difference is greater than this predetermined threshold, the value written is simply the grey scale value of the present image for that pixel. In the course of this process a count is maintained of the number of non-zero pixels produced, and this is used as the first test parameter for APB detection. It is found that when two or more persons are present the difference count is characteristically higher than when a single person is present in the door. In some cases, however, the result of the difference count will be too close to the relevant predetermined threshold for a reliable determination to be made. In this case the video data is passed on for subsequent stages of processing and discrimination.
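As a rough illustration of this background-subtraction stage, the following sketch assumes 8-bit greyscale frames held as NumPy arrays; the threshold of 10 follows the text, while the function and variable names are illustrative.

```python
import numpy as np

def background_subtract(background, present, small=10):
    """Return the difference image and the non-zero pixel count.

    Pixels whose grey level differs from the background by no more than
    `small` are written as 0; the remaining pixels keep the grey level of
    the present image, as described in the text."""
    diff = np.abs(present.astype(np.int16) - background.astype(np.int16))
    changed = diff > small
    result = np.where(changed, present, 0).astype(np.uint8)
    difference_count = int(changed.sum())
    return result, difference_count

# Usage sketch with synthetic frames:
# background = np.zeros((240, 320), dtype=np.uint8)
# present = background.copy(); present[100:180, 120:200] = 130  # a "person"
# fg, count = background_subtract(background, present)
```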
Referring again to Figure 5, in this example the next stage carries out edge detection using the known Sobel edge detection algorithm. This algorithm is applied only to the non-zero pixels of the image output from the first stage and sets those pixels to one of two binary values, producing a binary image with, for example, edges shown in black on a white ground as in Figure 2a. A determination is then made of the area inside the edges. This is done by stepping a window of dimensions, e.g., 10 x 10 through the image and recording a count for each pixel which is inside the edges and for which none of the pixels of the window cross an edge. This count is then a measure of the area inside the edges and is related to the area, in the view of the camera, of the shoulders and head of the person passing through the door. This measure of the area provides a second discriminatory parameter which, as in the first stage, is compared with a respective discriminatory threshold. As before, if this comparison produces a result with a sufficient degree of certainty, then the processing can be terminated at that stage and the discriminatory output passed on to the control system 7. If this is not the case, then the system passes on to the next stage of processing. As illustrated in Figure 4, this calculates the height and width of the image defined by the edges, calculates the ratio of the two, and compares this with a respective predetermined threshold.
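A rough sketch of the edge-detection, area and ratio measurements described above follows. It assumes NumPy greyscale and boolean arrays; the Sobel binarisation threshold, the handling of the 10 x 10 window and the expression of the height/width ratio as a percentage are assumptions, since the patent does not spell them out.

```python
import numpy as np

def sobel_edges(img, threshold=100):
    """Binary edge map (True = edge), computed only where the
    background-subtracted image is non-zero."""
    f = img.astype(np.float32)
    gx = (f[:-2, 2:] + 2 * f[1:-1, 2:] + f[2:, 2:]
          - f[:-2, :-2] - 2 * f[1:-1, :-2] - f[2:, :-2])
    gy = (f[2:, :-2] + 2 * f[2:, 1:-1] + f[2:, 2:]
          - f[:-2, :-2] - 2 * f[:-2, 1:-1] - f[:-2, 2:])
    mag = np.zeros_like(f)
    mag[1:-1, 1:-1] = np.hypot(gx, gy)
    return (mag > threshold) & (img > 0)

def area_inside_edges(foreground, edges, win=10):
    """Step a win x win window through the image and count positions whose
    centre pixel belongs to the changed (foreground) region while no pixel
    of the window crosses an edge: a crude measure of area inside edges."""
    count, half = 0, win // 2
    h, w = edges.shape
    for y in range(half, h - half):
        for x in range(half, w - half):
            if foreground[y, x] and not edges[y - half:y + half,
                                              x - half:x + half].any():
                count += 1
    return count

def height_width_ratio(edges):
    """Compactness of the edge image's bounding box, as a percentage
    (one interpretation of the ratio test; the exact normalisation used
    in Appendix A may differ)."""
    ys, xs = np.nonzero(edges)
    if xs.size == 0:
        return 0
    height = int(ys.max() - ys.min() + 1)
    width = int(xs.max() - xs.min() + 1)
    return 100 * min(height, width) // max(height, width)
```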
In this example the final processing module carries out what has been termed "head edge detection". This uses the aliasing characteristic of the "corners" of the detected edges. The image data is compared with a set of pixel masks of the form shown in Figure 3. A count is incremented, and a mark displayed (Figure 2b), for each pixel where a match is found with one of the masks. The count of the number of matches is then compared with a predetermined threshold for the count to provide a further discriminatory test.
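The head-edge test can be sketched as binary template matching against a small set of corner-like masks. The 3x3 masks below are illustrative stand-ins only; the actual masks are those of Figure 3, which are not reproduced in this text.

```python
import numpy as np

# Illustrative 3x3 corner masks standing in for the masks of Figure 3
# (True = edge pixel required, False = non-edge pixel required).
HEAD_MASKS = [
    np.array([[1, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=bool),  # top-left
    np.array([[1, 1, 1], [0, 0, 1], [0, 0, 1]], dtype=bool),  # top-right
    np.array([[1, 0, 0], [1, 0, 0], [1, 1, 1]], dtype=bool),  # bottom-left
    np.array([[0, 0, 1], [0, 0, 1], [1, 1, 1]], dtype=bool),  # bottom-right
]

def head_edge_count(edges):
    """Count pixels whose 3x3 neighbourhood exactly matches one of the
    corner masks; the caller compares the count with its threshold."""
    count = 0
    h, w = edges.shape
    for y in range(h - 2):
        for x in range(w - 2):
            patch = edges[y:y + 3, x:x + 3]
            if any(np.array_equal(patch, m) for m in HEAD_MASKS):
                count += 1
    return count
```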
When any of the tests produces an output indicating that two or more people are in the door, then the control system 7 may activate the interlock to lock the door against movement, thereby trapping the users, or may drive the door in the reverse direction to expel the users.
The inventors have found that the discrimination of the system can be improved by calculating an additional parameter to compensate for the position of an object in the door. The parameters discussed above are based on what is assumed to be a plan view from above of the person in the door. However, if the person is not directly under the camera, then the camera captures part of the side of the body too. This extra information would tend to distort the measurement and could lead to the mis-classification of a tall off-centre person as two people. To avoid this, a measurement is taken first of the position of the centre of the object. This is then compared with the actual centre of the camera and an adjustment parameter is set. The adjustment parameter takes the form of a ratio by which subsequently determined parameters are multiplied.
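A hedged sketch of this position compensation: the centroid of the changed region is compared with the image centre and a multiplicative ratio is derived from the offset. The linear mapping and the strength value below are assumptions; the patent states only that the adjustment takes the form of a ratio by which subsequently determined parameters are multiplied.

```python
import numpy as np

def normalising_factor(foreground, strength=0.5):
    """Return a ratio (at most 1) that shrinks as the centre of the
    changed region moves away from the image centre, discounting the
    extra body area seen when a person stands off the camera axis.

    The linear form and `strength` are illustrative assumptions."""
    ys, xs = np.nonzero(foreground)
    if xs.size == 0:
        return 1.0
    cy = (foreground.shape[0] - 1) / 2.0
    cx = (foreground.shape[1] - 1) / 2.0
    offset = float(np.hypot(ys.mean() - cy, xs.mean() - cx))
    max_offset = float(np.hypot(cy, cx))
    return 1.0 / (1.0 + strength * offset / max_offset)

# Subsequently derived parameters are multiplied by this ratio, e.g.:
# normalised_difference = difference_count * normalising_factor(fg)
```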
Figure 7a and Column 1 of Table 1 show the measurements of the position of the object used for determining the adjustment ratio for a sample of 72 test images. The higher the number, the further the object is from the centre of the camera. The images are a selection from the following categories: a single person; two people; people carrying large and small objects; a person carrying a large object over their head.
Figure 7b and Column 2 of the table show the difference count parameter, after correction by the adjustment ratio or "normalising factor" of Figure 7a. Note that images 68-70 are large objects carried over the top of the head and so result in a large difference count. Figure 7c and Column 3 show the (corrected) head area parameter. This measurement is a guide to the total area available to be a head space. Images 68-70 would cause concern as the values are extremely high. These images would fall above the alarm threshold, as 2 people could fit under such a large object. Figure 7d shows the height/width ratio parameter measuring the compactness of the image. The lower the value, the more likely that the imaged object is two people.
Table 1 below lists the data illustrated in these graphs. Columns 1 to 3 are identified above. The data contained in the other columns are as follows:
Column 4 Width
This is the width of the Head Area region.
Column 5 Height
This is the height of the Head Area region.
Column 6 Normalised Ratio
This is the ratio of the width and height. The Normalising Factor is used to adjust the ratio depending on the position of the object with respect to the centre of the camera.
Column 7 Head Edges
This measurement is the number of pixel patterns that match a head indicator mask. The greater the number, the larger the number of head mask matches.
Column 8 Head Edge Groups
This is the number of concentrated groups of Head Edges.
Column 9 Straight Edge Indicator
This measurement is the number of straight lines found in the image.
Column 10 Volume
This is the total volume that would encapsulate the object.
Column 11 Volume Width
This is the width of the volume.
Column 12 Volume Height
This is the height of the volume.
Column 13 Volume Ratio
This is the ratio of the volume height and volume width.
Not all of the parameters listed in the table need be used in any given implementation of the invention. The comparators may be arranged to make an "intelligent" decision based on a number of parameters, rather than simply applying a single threshold. The system may also include a knowledge base so that it can judge each situation on the measurements of the current image and knowledge gained about the environment in which the system is working.
Table 2 lists the subjects used for the different images listed in Table 1.
For the data of this example, the following thresholds may be used by the comparators for discrimination:
Normalised Difference
Below 750 - Allow entry
Above 15000 - Refuse entry. Suspect door entry
Area
Below 750 - Allow entry
Above 4500 - Refuse entry. Suspect door entry
Normalised Ratio
Above 95 - Allow entry

Not all measurements have a pass and fail level. For example, if the ratio is above 95 then we would allow entry. If it is below this level, we use other measurements to make a decision.
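Using the example thresholds quoted above, the decision logic for these first measurements might be sketched as follows; the parameter names are illustrative, and the values follow the text.

```python
def decide(normalised_difference, head_area, normalised_ratio):
    """Apply the example thresholds quoted above; return 'allow', 'refuse'
    or None when no single measurement is decisive."""
    if normalised_difference < 750:
        return "allow"
    if normalised_difference > 15000:
        return "refuse"          # suspect door entry
    if head_area < 750:
        return "allow"
    if head_area > 4500:
        return "refuse"          # suspect door entry
    if normalised_ratio > 95:
        return "allow"
    return None                  # marginal: further measurements needed
```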
The system described above may be modified through the use of moving images, that is to say the capturing of a series of video frames as the person moves through the door. This then produces distinctive "constants" which may be used as the basis for discriminatory parameters in an analogous fashion to the process discussed above.
In systems embodying the invention, some of the processing modules may use appropriately trained neural networks. These may be trained on the parameters produced by other modules.
Appropriate thresholds for the different parameters used by different modules may be obtained by inspection from real sample data such as that illustrated in Figures 7a-7d, and/or from computer modelling of the environment in which the system is to be used.
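As one possible illustration of deriving such thresholds by inspection of labelled sample data (such as that behind Figures 7a-7d), the simple heuristic below places the "allow" and "refuse" thresholds around the observed single-person and two-person values; it is not the procedure used by the inventors.

```python
def threshold_band(single_values, multiple_values):
    """Place an 'allow' threshold below every two-person sample and a
    'refuse' threshold above every one-person sample, for a parameter
    that is larger when two people are present; values between the two
    are treated as marginal and passed to further tests.  A simple
    heuristic for inspecting sample data, not the patent's procedure."""
    allow_below = min(multiple_values)
    refuse_above = max(single_values)
    if allow_below > refuse_above:        # classes fully separate:
        mid = (allow_below + refuse_above) / 2.0
        allow_below = refuse_above = mid  # collapse to a single boundary
    return allow_below, refuse_above

# e.g. thresholds for the difference count from sample measurements:
# allow_t, refuse_t = threshold_band(single_person_counts, two_person_counts)
```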
Figure 6 illustrates one example of hardware embodying the system. A camera 61 views the target scene and passes the acquired image to a frame grabber 62. In this preferred implementation, the camera views the target scene through an IR bandpass filter. An IR bandstop filter, which may be in the form of a film applied to the walls of the revolving door and which is transparent in the rest of the visual spectrum, is then used to screen the extraneous visual field from the field of view of the camera. A light source may be provided inside the door to provide illumination at levels appropriate to the sensitivity of the video camera 61.
The image from the frame grabber is passed to a single board PC 63 which runs image processing software implementing the hierarchical modular structure discussed above. Once a decision has been made by the control program, an output is passed to the door controller 64.
In the presently described example, the camera 61 is a 1/3 inch black and white El camera (RS845-184) having a 1/3 inch CS mount DD lens, 2.6mm (RS846-266). The frame grabber is that available commercially as VIDEOBLASTER SE. In this example the single board PC uses an Intel 486DX2/66 microprocessor and an ISA bus. The interface to the door controller 64 is a standard PC ISA bus serial card. The user interface which enables the user to set adjustable parameters for the control program may comprise a keyboard or keypad and a VDU or LCD display.
It will be understood that the above components are given by way of example only, and that a number of alternative implementations are possible.
The following appendices list, in Appendix A, the source code for the image processing/discrimination program run on the single board PC 63 and, in Appendix B, pseudocode for the different elements of the program.
[Appendices A and B are reproduced in the published application as image pages (pages 15 to 39) and are not available as text.]

Claims

1. A control system for use with a security access device such as a revolving door (1), the control system comprising:
a video input device (6) which in use captures image data from the secured region of the access device;
an image processor (7) for processing the said image data and deriving at least one parameter for discriminating the number of persons present in the secured region of the access device; and
a comparator for comparing the said parameter derived from the original data with a predetermined threshold and for producing a discriminatory output for use in controlling the interlock (3) on the security device depending upon the result of the comparison.
2. A system according to claim 1, in which the image processor (7) and discriminator comprise a cascaded series of modules, each module being arranged to process image data to determine a respective parameter, and to compare the parameter with a corresponding threshold.
3. A system according to claim 2, in which the modules are arranged so that when one module is able to make a decision at a predetermined confidence level then that module produces the said discriminatory output signal, otherwise the said module passing image data on to a subsequent module for further processing.
4. A system according to claim 2 or 3, in which the modules are arranged generally in order of their discriminatory power, with the most powerful module receiving the image data first.
5. A system according to any one of the preceding claims, in which the image processor is arranged to capture a background image of the door with no person present, and to capture a subsequent image of the door with a person present, and to discard from the second said image video data which is unchanged from the background image.
6. A method of controlling a security access device such as a security door including capturing video image data from a secured area of the access door;
processing said image data and thereby deriving at least one parameter for discriminating the number of persons present in the secured area; and
comparing the said parameter with a predetermined threshold and producing a discriminatory output for use in controlling an interlock on the security device to lock the device when more than a predetermined number of persons are present in the secured area.
7. A security access device fitted with a control system according to any of claims 1 to 5.
8. A security access device including a secured region bounded by one or more wholly or partially transparent walls, characterised by a control system including a video input device arranged to view the secured region, and in that the transparent walls include filter means arranged to block or reduce the transmission of light in part of the visible/near-visible optical spectrum, and in that the video input device has a sensitivity/wavelength characteristic generally complementary to the transmission characteristic of the said filter means associated with the transparent walls, the visibility to the video input device of objects outside the transparent walls thereby being reduced or eliminated.
9. A device according to claim 8, in which the filter means are arranged to block transmission in the infra-red, and the video input device is insensitive outside the infra-red.
10. A device according to claim 8 or 9, in which the video input device includes an input optical filter arranged to provide the said sensitivity/wavelength characteristic.
11. A device according to any one of claims 8 to 10 including a control system according to any one of claims 1 to 5.
PCT/GB1996/001249 1995-06-02 1996-05-24 Security control system WO1996038820A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP96919896A EP0832472A1 (en) 1995-06-02 1996-05-24 Security control system
AU58264/96A AU5826496A (en) 1995-06-02 1996-05-24 Security control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB9511140.7A GB9511140D0 (en) 1995-06-02 1995-06-02 Security control system
GB9511140.7 1995-06-02

Publications (1)

Publication Number Publication Date
WO1996038820A1

Family

ID=10775379

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1996/001249 WO1996038820A1 (en) 1995-06-02 1996-05-24 Security control system

Country Status (4)

Country Link
EP (1) EP0832472A1 (en)
AU (1) AU5826496A (en)
GB (1) GB9511140D0 (en)
WO (1) WO1996038820A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3564132A (en) * 1966-01-17 1971-02-16 Mardix Apparatus for controlling the passage of persons and objects between two areas utilizing closed circuit television
WO1981001213A1 (en) * 1979-10-16 1981-04-30 Otis Elevator Co Object and people counting system
DE3740115A1 (en) * 1986-11-26 1988-06-09 Matsushita Electric Works Ltd PERSONNEL DETECTION ARRANGEMENT
EP0431363A1 (en) * 1989-12-05 1991-06-12 GALLENSCHÜTZ METALLBAU GmbH Security entrance chamber for personnel
WO1994027408A1 (en) * 1993-05-14 1994-11-24 Rct Systems, Inc. Video traffic monitor for retail establishments and the like

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999055995A1 (en) * 1998-04-29 1999-11-04 Malcolm William Thomas An access control system
WO2003088157A1 (en) * 2002-04-08 2003-10-23 Newton Security Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
US7382895B2 (en) 2002-04-08 2008-06-03 Newton Security, Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
US7397929B2 (en) 2002-09-05 2008-07-08 Cognex Technology And Investment Corporation Method and apparatus for monitoring a passageway using 3D images
US7400744B2 (en) 2002-09-05 2008-07-15 Cognex Technology And Investment Corporation Stereo door sensor
US7623674B2 (en) 2003-11-05 2009-11-24 Cognex Technology And Investment Corporation Method and system for enhanced portal security through stereoscopy
US8326084B1 (en) 2003-11-05 2012-12-04 Cognex Technology And Investment Corporation System and method of auto-exposure control for image acquisition hardware using three dimensional information
US7536253B2 (en) 2005-01-31 2009-05-19 Optex Co., Ltd. Traffic monitoring apparatus
EP1686544A3 (en) * 2005-01-31 2006-12-06 Optex Co., Ltd. Traffic monitoring apparatus
ITMI20120686A1 (en) * 2012-04-24 2013-10-25 Cometa S P A PASSAGE DETECTION EQUIPMENT
EP2657885A1 (en) * 2012-04-24 2013-10-30 Cometa S.p.A. Detection of passage in a revolving door
EP3203447A1 (en) * 2016-02-04 2017-08-09 Holding Assessoria I Lideratge, S.L. (HAL SL) Detection of fraudulent access at control gates
WO2017133902A1 (en) * 2016-02-04 2017-08-10 Holding Assessoria I Lideratge, S.L. (Hal Sl) Detection of fraudulent access at control gates
US11315374B2 (en) 2016-02-04 2022-04-26 Holding Assessoria I Lideratge, S.L. Detection of fraudulent access at control gates

Also Published As

Publication number Publication date
AU5826496A (en) 1996-12-18
EP0832472A1 (en) 1998-04-01
GB9511140D0 (en) 1995-07-26

Similar Documents

Publication Publication Date Title
US5937092A (en) Rejection of light intrusion false alarms in a video security system
US7536037B2 (en) Apparatus and method for human distinction using infrared light
US4872203A (en) Image input device for processing a fingerprint prior to identification
EP1346327B1 (en) Apparatus and method for resolution of entry/exit conflicts for security monitoring systems
US7215798B2 (en) Method for forgery recognition in fingerprint recognition by using a texture classification of gray scale differential images
CN107679471B (en) Indoor personnel air post detection method based on video monitoring platform
CN109409315B (en) Method and system for detecting remnants in panel area of ATM (automatic Teller machine)
Ziliani et al. Image analysis for video surveillance based on spatial regularization of a statistical model-based change detection
KR20090086898A (en) Detection of smoke with a video camera
EP1010130A1 (en) Low false alarm rate video security system using object classification
EP2546807B1 (en) Traffic monitoring device
CN111127810A (en) Automatic alarming method and system for open fire of machine room
CN112396011A (en) Face recognition system based on video image heart rate detection and living body detection
WO1996038820A1 (en) Security control system
JPH07181012A (en) Feature amount detector for image data
JP2003051076A (en) Device for monitoring intrusion
JP2599701B2 (en) Elevator Standby Passenger Number Detection Method
CN108230607A (en) A kind of image fire detection method based on regional characteristics analysis
JP2002304651A (en) Device and method for managing entering/leaving room, program for executing the same method and recording medium with the same execution program recorded thereon
CN107221056A (en) The method stopped based on human bioequivalence
KR102211903B1 (en) Method And Apparatus for Photographing for Detecting Vehicle Occupancy
CN107221058A (en) Intelligent channel barrier system
JPH0869523A (en) Human body recognizing device
KR100338473B1 (en) Face detection method using multi-dimensional neural network and device for the same
EP1161081A2 (en) Automatic bright window detection

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1996919896

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1996919896

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWR Wipo information: refused in national office

Ref document number: 1996919896

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1996919896

Country of ref document: EP