WO2011078649A2 - Method of determining loitering event - Google Patents

Method of determining loitering event

Info

Publication number
WO2011078649A2
Authority
WO
WIPO (PCT)
Prior art keywords
time
entering
exiting
interest
properties
Prior art date
Application number
PCT/MY2010/000241
Other languages
French (fr)
Other versions
WO2011078649A3 (en)
Inventor
Kim Meng Liang
Sze Ling Tang
Mei Kuan Lim
Kadim Zulaikha
Weng Kin Lai
Samudin Norshuhuda
Chee Seng Chan
Kiran Maleeha
Ahmed Abd Baha'a Aldeen
Original Assignee
Mimos Berhad
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimos Berhad filed Critical Mimos Berhad
Publication of WO2011078649A2 publication Critical patent/WO2011078649A2/en
Publication of WO2011078649A3 publication Critical patent/WO2011078649A3/en


Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • G06T7/238Analysis of motion using block-matching using non-full search, e.g. three-step search
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance


Abstract

The present invention provides a method for identifying a loitering event occurring on an object within an area of interest from a video stream. The method comprises detecting one or more objects entering the area-of-interest; extracting (104) an entering time and properties of each of the objects; storing (104) the entering times and properties of the objects; computing (110) a time-stamp for each object based on a difference between a current time and the entering time of the corresponding object; and identifying (112) a loitering event when the time-stamp is longer than a predetermined period.

Description

METHOD OF DETERMINING LOITERING EVENT
Field of the Invention
The present invention relates to a surveillance method. In particular, the present invention relates to a method of determining a loitering event.
Background
Rising security concerns have led to an increase in the installation of imaging means, such as optical cameras, in rooms, buildings, airports, cities and the like for surveillance tasks. However, it is labour-intensive to monitor all the live events or captured videos. Although it is well recognized that monitoring these events manually would be most effective and accurate, the level of effectiveness decreases as attention span wanes and the number of monitor screens increases. Thus, automating the monitoring system would allow security personnel to carry out the video monitoring task more effectively.
One of the demanding monitoring tasks is to detect loitering events. Detection of loitering events is highly crucial, as loitering behavior is related to harmful activities such as drug dealing, scene investigation for robbery, and also the unhealthy social problem of teenagers wasting their time in public areas.
State-of-the-art systems and methods require tracking and tagging the object of interest or the area of interest to carry out loitering detection. The accuracy of the calculated time is highly dependent on the performance of the tracking and tagging methodology.
US 6,985,172 B1 (US'172), issued to Southwest Research Institute on 10 January 2006, discloses a surveillance apparatus and method for detecting incidents. A reference image is generated by removing objects from an image of a region of interest. Acquired images are then compared with the reference image to detect moving objects.
WO 00/73996 (WO'996) by John Reid et al., published on 7 December 2000, discloses a method of tracking an object moving relative to a scene. It generates a background by acquiring images. Potential objects are matched by comparing the spatial characteristics of potential objects across successive images, and the matched potential objects are taken as a moving object.
As in many other systems and methods, a loitering object can also be detected by tracking the motion of the detected objects in US'172 and WO'996.
Summary
In one aspect of the present invention, there is provided a method for identifying a loitering event occurring on an object within an area of interest from a video stream. The method comprises detecting one or more objects entering the area-of-interest; extracting (104) an entering time and properties of each of the objects; storing (104) the entering times and properties of the objects; computing (110) a time-stamp for each object based on a difference between a current time and the entering time of the corresponding object; and identifying (112) a loitering event when the time-stamp is longer than a predetermined period.
In one embodiment, the method further comprises detecting an exiting object exiting the area of interest; extracting (106) an exiting time and properties of the exiting object; and matching (108) the exited object against the stored objects based on the properties to identify a corresponding entering object. The current time for computing (110) the time-stamp is the exiting time. The area of interest may cover the entire area of the video stream. It is also possible that the area of interest includes one or more partial areas within the video stream.
In another embodiment, detection of the one or more objects, including the exiting object, may further comprise computing an overlapping value based on a plurality of motion blocks extracted from the area of interest and a plurality of motion blocks extracted from a previous frame of the video stream. The motion blocks may be computed by any foreground detection technique.
Further, the identification (108) of the corresponding entering object may comprise computing a matching value through matching the properties of the exited object with the properties of each object stored, wherein the object with the highest matching value is the corresponding entering object.
Brief Description of Drawings
Preferred embodiments according to the present invention and the existing art will now be described with reference to the accompanying figures, in which like reference numerals denote like elements.
FIG. 1 is a flowchart illustrating a method of determining loitering event from a video stream in accordance with one embodiment of the present invention;
FIG. 2 is a flow chart illustrating a process for extracting and storing entering time and properties of new entering object in accordance with one embodiment of the present invention;
FIG. 3 is a flow chart illustrating a process for extracting exiting time and properties of a detected exiting object in accordance with one embodiment of the present invention;
FIG. 4 is a flowchart illustrating a process of obtaining a matched object of an exited object from previously stored objects in accordance with another embodiment of the present invention; and
FIG. 5 is a flowchart illustrating a process that computes a time-stamp of an object in question, the exiting object, and remaining entering objects in accordance with one embodiment of the present invention.
Detailed Description
Embodiments of the present invention shall now be described in detail with reference to the attached drawings. It is to be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications of the illustrated device, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
In one embodiment of the present invention, a method for determining a loitering event is presented. The method includes a technique that calculates time-stamps of objects from a video stream to determine whether a loitering event occurs. The time-stamp is the duration between a particular object entering and exiting a region-of-interest or scene of the video stream. Each particular object is referenced with two specific timings: Entering Time, t_enter, and Exiting Time, t_exit. The entering time is the present time at which an object enters the region-of-interest or scene, while the exiting time is the present time at which the object in question exits the region-of-interest or scene. When the time-stamp exceeds a prescribed period limit, the object in question is considered loitering. It is understood that the present method does not require tracking and/or tagging the object in question throughout the video stream. Tracking/tagging as referred to herein means any motion detection method that requires labeling or the like.
It is understood that the present invention is applicable to detecting a loitering event based on features extracted/detected from a video stream, generally a live video stream. As mentioned, the method may be used to extract features from one specific area-of-interest or scene of the video stream. It is also possible that the features are detected from more than one area-of-interest or scene, or from the entire screen of the video stream. In an alternative embodiment, the area of interest includes the entire area of the video stream and one or more partial areas within the video stream. The entire area of the video stream may cover the one or more partial areas within the video stream. Further, when desired, the present invention is also applicable to one or more dynamic areas-of-interest within a video stream.
In this embodiment, when an entering time is recorded, i.e. an object is entering the area-of-interest or scene of the video stream, the properties and the current time of the object in question are extracted and stored together with a reference index. Without any limitation, the properties are visual features, including color, texture, shape and the like, which are extractable from a video stream by any suitable method. The reference index is attached to the captured time and properties for indexing. Similarly, when an exiting time is recorded, i.e. any object (whether or not it corresponds to an earlier detected object) is exiting the region-of-interest or scene, the properties and the current time of that object are also extracted. The properties of the object exiting the region-of-interest or scene are matched against the list of properties previously stored for objects that entered the region-of-interest or scene. When a match is found, the entering time of the best-matched object (based on the properties) is extracted and computed with the exiting time of the detected object exiting the region-of-interest or scene. For the matched reference index, the properties and present-time information are removed from the storage after the corresponding present time is extracted. The difference between these two present times represents the time-stamp between the object in question entering and exiting the region-of-interest or scene. For each remaining new object in the storage, the time-stamp is calculated by finding the difference between the stored current time of that object and the running time of the system. When the difference value, i.e. the time-stamp, is larger/longer than a predefined threshold, a loitering event is detected. Otherwise, no loitering event is detected.
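By way of illustration only (the patent does not prescribe any particular implementation), the bookkeeping described above can be sketched in Python as follows; the function and variable names, the use of wall-clock time, the match_fn callback and the 60-second threshold are all assumptions made for this example:

```python
import time

# Hypothetical loitering threshold in seconds (the Threshold_loiter of the text).
THRESHOLD_LOITER = 60.0

storage = {}      # reference index -> (entering time, properties)
next_index = 0    # running reference index for newly entered objects

def on_object_enter(properties):
    """Store the entering time and properties of a new entering object."""
    global next_index
    storage[next_index] = (time.time(), properties)
    next_index += 1

def on_object_exit(properties, match_fn):
    """Match the exiting object against the stored entries, compute the
    time-stamp (Exiting Time - Entering Time) and test it for loitering."""
    if not storage:
        return False
    exiting_time = time.time()
    # Pick the stored entry whose properties best match the exiting object.
    best_index = max(storage, key=lambda i: match_fn(properties, storage[i][1]))
    entering_time, _ = storage.pop(best_index)  # matched entry is removed
    time_stamp = exiting_time - entering_time   # Equation (1) below
    return time_stamp > THRESHOLD_LOITER        # True => loitering event
```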
FIG. 1 is a flowchart illustrating a method of determining loitering event from a video stream in accordance with one embodiment of the present invention. Briefly, the method comprises computing foreground motion blocks in the region-of-interest or scene at step 102; extracting and storing a present time and properties of new objects entering the region-of-interest or scene at step 104; extracting a present time and properties of an exiting object at step 106; computing a matched object by matching the exited object with stored entering objects at step 108; computing a time-stamp at step 110; and determining if the time-stamp is larger/longer than a predefined threshold at step 112.
Referring back to the step 102, foreground motion blocks are computed in the region-of-interest or scene to detect an object entering the area-of-interest or scene. In one non-limiting embodiment, the foreground motion blocks are determined by finding intensity value differences between a current frame and a previous frame of the video stream. Feature extraction from a digital video stream through intensity is widely known in the art and thus will not be described herein in detail. When a new motion block is determined, a new object is considered to be detected entering the area-of-interest or scene of the video stream. In the step 104, the present time and properties of a motion block identified as a new object are extracted and recorded in a storage 105 with a reference index. The storage 105 is provided to store the entering time and properties of each entering object. At the step 106, when an object is detected exiting the area-of-interest or scene, the present time and properties of the exiting object are extracted. The entering time, the exiting time, and the properties of the entering and exiting objects may be extracted with any techniques well known in the art. At the step 108, the properties of the exiting object are compared and matched with those of the entering object(s) stored in the storage to determine a corresponding entering object. Once a matched entering object is found, the corresponding entering time and properties are extracted from the storage. At the step 110, the exiting time of the exiting object and the entering time of the corresponding entering object are computed to obtain a time-stamp. In addition, the process may further calculate a time-stamp for the remaining stored objects (i.e. those that are not matched) based on their entering times against a present time. At the step 112, the time-stamp is then evaluated against the predefined threshold, Threshold_loiter; when the time-stamp is larger/longer than the predefined threshold, a loitering event is detected. When the time-stamp is smaller/shorter than the predefined threshold, no loitering event is detected with respect to the exiting object within the area-of-interest or scene.
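A minimal sketch of the foreground motion-block computation of step 102, assuming simple frame differencing; the intensity threshold, the minimum block size and the use of connected-component labelling are illustrative choices made here, not requirements of the patent:

```python
import numpy as np
from scipy import ndimage  # connected-component labelling

def motion_blocks(prev_frame, curr_frame, intensity_thresh=25, min_area=50):
    """Return bounding boxes (x0, y0, x1, y1) of foreground motion blocks.

    Both frames are greyscale uint8 arrays of equal shape.  A pixel is
    foreground when its intensity difference between the current and the
    previous frame exceeds intensity_thresh (step 102); each connected
    foreground region becomes one motion block.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    labels, _ = ndimage.label(diff > intensity_thresh)
    blocks = []
    for ys, xs in ndimage.find_objects(labels):
        # Discard tiny regions whose bounding box covers fewer than min_area pixels.
        if (ys.stop - ys.start) * (xs.stop - xs.start) >= min_area:
            blocks.append((xs.start, ys.start, xs.stop, ys.stop))
    return blocks
```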
FIG. 2 is a flow chart illustrating a process for extracting and storing the entering time and properties of a new entering object in accordance with one embodiment of the present invention. The process comprises computing overlapping values of a current motion block with all previously stored motion blocks at step 202; determining if the overlapping value is smaller than a predetermined threshold, Threshold_enter, at step 204; extracting an entering time of a new entering object at step 206; extracting properties of the new entering object at step 208; storing the entering time and properties of the new entering object in the storage 105 at step 210; and determining if any further new entering motion blocks exist at step 212.
Referring back to the step 202, the process computes the overlapping value of the currently detected motion block with all the previously stored motion blocks. The previously stored motion blocks may be stored in a separate storage (not shown in FIG. 2). The previously stored motion blocks are the motion blocks that were computed in the previous frame of the video stream. The overlapping value is calculated by determining the total number of overlapping pixels between two motion blocks, i.e. the current motion block and a previously stored motion block that overlap with each other, and then dividing the total number of overlapping pixels by the larger of the areas of the current motion block and the previous motion block. If the overlapping value is less than the predetermined threshold, Threshold_enter, the steps 206, 208 and 210 are executed. Otherwise, at the step 212, the process checks if further motion block(s) are to be checked against the previously stored motion blocks; if so, the process loops back to the step 202 to determine if the further motion block has an overlapping value smaller than the predetermined threshold, Threshold_enter. If not, the process for extracting and storing the time-stamp and properties ends. It is to be noted that when the overlapping value is larger than (i.e. not smaller than) the predetermined threshold value, there is no new object entering the area-of-interest or scene.
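The overlapping value of step 202 can be computed as follows; the rectangular representation of motion blocks is an assumption of this sketch (the patent speaks only of pixel counts), while the division by the larger of the two areas follows the text directly:

```python
def overlap_value(block_a, block_b):
    """Overlapping pixels of two motion blocks divided by the larger area.

    Blocks are axis-aligned rectangles (x0, y0, x1, y1).  A value near 1
    means the current block continues a previously stored block; a value
    below Threshold_enter signals a new entering object (step 204).
    """
    ax0, ay0, ax1, ay1 = block_a
    bx0, by0, bx1, by1 = block_b
    inter_w = max(0, min(ax1, bx1) - max(ax0, bx0))
    inter_h = max(0, min(ay1, by1) - max(ay0, by0))
    overlap_pixels = inter_w * inter_h
    larger_area = max((ax1 - ax0) * (ay1 - ay0), (bx1 - bx0) * (by1 - by0))
    return overlap_pixels / larger_area if larger_area else 0.0
```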
Referring back to the step 204, when the overlapping value is smaller than the predetermined threshold, Threshold_enter, a new entering object is detected. The process then extracts an Entering Time of the entering object at the step 206. In the following step 208, the process further extracts properties of the entering object, annotated as Entering Properties. In the step 210, the Entering Time and Entering Properties corresponding to the new motion block are stored in the storage 105 with a unique reference index. After the Entering Time and Properties are stored, the process proceeds to the step 212 to check if any further new motion block is to be processed.
FIG. 3 is a flow chart illustrating a process for extracting the exiting time and properties of a detected exiting object in accordance with one embodiment of the present invention. This process is carried out when a new exiting object is detected. The process comprises computing overlapping values of all previous motion blocks with all current motion blocks at step 302; determining if the overlapping value is smaller than a predetermined threshold, Threshold_exit, at step 304; extracting an exiting time of a detected exiting object at step 306; extracting properties of the detected exiting object at step 308; determining a matched object of the detected exiting object among the entering objects stored in the storage 105 at step 310; and determining if further previous motion blocks are to be compared at step 312.
Referring back to the step 302, the process computes the overlapping value of each previous motion block, which is stored separately, with all the detected motion blocks. When the computed overlapping value is less than the predetermined threshold, Threshold_exit, the steps 306, 308 and 310 are executed. Otherwise, at the step 312, the process checks whether further previous motion block(s) in the storage are to be processed; if so, the process loops back to the step 302 to determine if the further previous motion block has an overlapping value smaller than the predetermined threshold, Threshold_exit. If not, the process for determining a corresponding previous motion block ends.
Referring back to the step 304, when the overlapping value is smaller than the predetermined threshold, Threshold_exit, an exiting object is detected. An Exiting Time of the detected exiting object is then extracted at the step 306. In the following step 308, properties of the exiting object are also extracted, annotated as Exiting Properties. Once the Exiting Time and Exiting Properties corresponding to the detected exiting motion block are obtained, the process computes a matched object of the detected exiting object among the entering objects stored in the storage 105 at step 310. Then, the step 312 is executed to check if any further previous motion block is to be processed.
FIG. 4 is a flowchart illustrating a process of obtaining a matched object of an exited object from previously stored (entering) objects in accordance with another embodiment of the present invention. This process is meant to identify a corresponding entering object for the exited object, as sketched below. The process comprises computing a matching value of the properties of the exiting object with those stored in the storage 105 at step 402; storing the matching values with the corresponding properties in the storage 105 at step 404; determining if any property entries remain at step 406; extracting the entering time with the highest matching value from the storage at step 408; storing the corresponding entering time and exiting time in a temporary storage 450 at step 410; and removing the property entries with the highest matching value from the storage 105 at step 412.
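As an illustration of the matching at steps 402-408 — the patent leaves the choice of properties and matching measure open — the sketch below assumes each object's properties are an L1-normalised colour histogram and uses histogram intersection as the matching value:

```python
import numpy as np

def matching_value(props_a, props_b):
    """Histogram intersection between two L1-normalised colour histograms.

    Only one possible choice: the patent allows any visual features
    (colour, texture, shape, etc.) and any matching measure.
    """
    return float(np.minimum(props_a, props_b).sum())

def best_match(exit_props, storage):
    """Steps 402-408: score every stored entering object against the
    exiting object and return the reference index of the best match."""
    if not storage:
        return None
    scores = {idx: matching_value(exit_props, props)
              for idx, (_, props) in storage.items()}
    return max(scores, key=scores.get)
```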
Referring back to the step 402, the matching value is derived from the properties of the exiting object with each property entry of the entering objects previously stored in the storage 105. The corresponding matching values are then stored with their corresponding entries in the storage 105 at the step 404. In the step 406, when any of the property entries in the storage 105 still remain for processing, the step of computing the matching value in step 402 is repeated. The loop is carried out until all the property entries are processed. At the step 408, when all the property entries in the storage 105 are processed, the entering time corresponding to the highest matching value is extracted from the storage 105. At the step 410, the exiting time of the object in question and the corresponding entering time are stored temporarily in the memory storage 450, and subsequently, at the step 412, the property entry with the highest matching value is removed from the storage 105. This signifies that the information belonging to the object in question is removed, as the object in question has exited the region-of-interest or scene.
FIG. 5 is a flowchart illustrating a process that computes a time-stamp of an object in question, the exiting object, and remaining entering objects in accordance with one embodiment of the present invention. The process comprises computing a time-stamp at step 502; extracting the current time of the running system at step 504; and computing time-stamps for the remaining objects stored in the storage 105 at step 506. At the step 502, the time-stamp for the exiting object is computed from the Exiting Time and the corresponding Entering Time stored in the temporary memory storage 450. The time-stamp of each exited object is calculated with Equation (1) as follows:
Time-stamp = Exiting Time - Entering Time (1)
Subsequently, at the step 504, the current time of the running system is extracted and annotated as Running Time. At the step 506, the time-stamp of each of the remaining objects in the storage 105 is calculated with Equation (2) as follows:
Time-stamp_remaining = Running Time - Entering Time (2)
The time-stamp for the remaining entering objects can be used to determine whether the entering objects have been loitering even if the objects never exit the area-of-interest or scene of the video stream.
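Following Equation (2), still-present objects can be checked periodically; this sketch reuses the storage layout of the earlier snippets and again assumes a hypothetical threshold value:

```python
def sweep_remaining(storage, running_time, threshold_loiter=60.0):
    """Equation (2): Time-stamp_remaining = Running Time - Entering Time.

    Returns the reference indices of stored, still-present objects whose
    dwell time already exceeds the (hypothetical) loitering threshold.
    """
    return [idx for idx, (entering_time, _) in storage.items()
            if running_time - entering_time > threshold_loiter]
```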
In the present application, a method of identifying a loitering event is provided. The time-stamp of an object under surveillance, which represents the duration for which the object continuously appears in the region-of-interest or scene, is used as a factor to determine a loitering event. The computation of the time-stamp in this invention is determined by calculating the duration between the time the object appears in the region-of-interest or scene and the time the same object disappears from the region-of-interest or scene.
While specific embodiments have been described and illustrated, it is understood that many changes, modifications, variations and combinations thereof could be made to the present invention without departing from the scope of the invention.

Claims

1. A method for identifying a loitering event within an area of interest from a video stream, the method comprising:
detecting one or more objects entering the area-of-interest;
extracting (104) an entering time and properties of each of the objects;
storing (104) the entering times and properties of the objects;
computing (110) a time-stamp for each object based on a difference between a current time and the entering time of the corresponding object; and
identifying (112) a loitering event when the time-stamp is longer than a predetermined period,
wherein the method is carried out without tracking and/or tagging each of the objects that entered the area-of-interest.
2. The method according to Claim 1, further comprising:
detecting an exiting object exiting the area of interest;
extracting (106) an exiting time and properties of the exiting object; and matching (108) the exited object against the stored objects based on the properties to identify a corresponding entering object,
wherein the current time for computing (110) the time-stamp is the exiting time.
3. The method according to Claim 1, wherein the area of interest covers the entire area of the video stream.
4. The method according to Claim 1, wherein the area of interest includes one or more partial areas within the video stream.
5. The method according to Claim 2, wherein the detecting of the one or more objects, including the exiting object, further comprises computing an overlapping value based on a plurality of motion blocks extracted from the area of interest and a plurality of motion blocks extracted from a previous frame of the video stream.
6. The method according to Claim 5, wherein the motion blocks are computed by a foreground detection technique.
7. The method according to Claim 2, wherein identifying (108) the corresponding entering object comprises computing a matching value through matching the properties of the exited object with the properties of each object stored, wherein the object with the highest matching value is the corresponding entering object.
PCT/MY2010/000241 2009-12-21 2010-10-29 Method of determining loitering event WO2011078649A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20095475 MY150414A (en) 2009-12-21 2009-12-21 Method of determining loitering event
MYPI20095475 2009-12-21

Publications (2)

Publication Number Publication Date
WO2011078649A2 true WO2011078649A2 (en) 2011-06-30
WO2011078649A3 WO2011078649A3 (en) 2011-10-06

Family

ID=44196365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2010/000241 WO2011078649A2 (en) 2009-12-21 2010-10-29 Method of determining loitering event

Country Status (2)

Country Link
MY (1) MY150414A (en)
WO (1) WO2011078649A2 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5798787A (en) * 1995-08-11 1998-08-25 Kabushiki Kaisha Toshiba Method and apparatus for detecting an approaching object within a monitoring zone
US20050073585A1 (en) * 2003-09-19 2005-04-07 Alphatech, Inc. Tracking systems and methods
WO2009079809A1 (en) * 2007-12-07 2009-07-02 Multi Base Limited Video surveillance system with object tracking and retrieval

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109934217A (en) * 2017-12-19 2019-06-25 安讯士有限公司 Detect the method, apparatus and system for event of hovering
EP3502952A1 (en) * 2017-12-19 2019-06-26 Axis AB Method, device and system for detecting a loitering event
CN109934217B (en) * 2017-12-19 2021-07-30 安讯士有限公司 Method, apparatus and system for detecting a loitering event

Also Published As

Publication number Publication date
WO2011078649A3 (en) 2011-10-06
MY150414A (en) 2014-01-15

Similar Documents

Publication Publication Date Title
CN109166261B (en) Image processing method, device and equipment based on image recognition and storage medium
US11157778B2 (en) Image analysis system, image analysis method, and storage medium
CN110222640B (en) Method, device and method for identifying suspect in monitoring site and storage medium
CN106203458B (en) Crowd video analysis method and system
CN103270536B (en) Stopped object detection
CN102306304B (en) Face occluder identification method and device
CN105144705B (en) Object monitoring system, object monitoring method, and program for extracting object to be monitored
US20170004629A1 (en) Low-complexity motion detection based on image edges
CN107920223B (en) Object behavior detection method and device
WO2015040929A1 (en) Image processing system, image processing method, and program
Zin et al. A Markov random walk model for loitering people detection
Alzughaibi et al. Review of human motion detection based on background subtraction techniques
CN111091025A (en) Image processing method, device and equipment
CN113887445A (en) Method and system for identifying standing and loitering behaviors in video
CN102244769B (en) Object and key person monitoring system and method thereof
CN109934217B (en) Method, apparatus and system for detecting a loitering event
JP5758165B2 (en) Article detection device and stationary person detection device
KR102182660B1 (en) System and method for detecting violence using spatiotemporal features and computer readable recording media storing program for executing method thereof
CN102339465B (en) Method and system for detecting the mutual closing and/or contact of moving objects
WO2011078649A2 (en) Method of determining loitering event
KR101848367B1 (en) metadata-based video surveillance method using suspective video classification based on motion vector and DCT coefficients
WO2012074366A2 (en) A system and a method for detecting a loitering event
CN113378728A (en) Monitoring method, system, electronic equipment and computer readable storage medium
Joshi et al. Suspicious object detection
CN112449155A (en) Video monitoring method and system for protecting privacy of personnel

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10839836

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10839836

Country of ref document: EP

Kind code of ref document: A2