WO2011147454A1 - Trigger device - Google Patents

Trigger device Download PDF

Info

Publication number
WO2011147454A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
light source
time
sensor elements
sequence
Prior art date
Application number
PCT/EP2010/057324
Other languages
French (fr)
Inventor
Asbjørn BERGE
Jens Toivo Thielemann
Karl Haugholt
Original Assignee
Steinert Vision As
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Steinert Vision As filed Critical Steinert Vision As
Priority to PCT/EP2010/057324 priority Critical patent/WO2011147454A1/en
Publication of WO2011147454A1 publication Critical patent/WO2011147454A1/en

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C1/00 Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
    • G07C1/22 Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people in connection with sports or games
    • G07C1/24 Race time-recorders
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41J TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00 Target indicating systems; Target-hit or score detecting systems
    • F41J5/02 Photo-electric hit-detector systems
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41J TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00 Target indicating systems; Target-hit or score detecting systems
    • F41J5/12 Target indicating systems; Target-hit or score detecting systems for indicating the distance by which a bullet misses the target

Abstract

The present invention relates to a registering device and method for registering the time of a triggered event, the device including a camera with an image sensor comprising a number of sensor elements being sampled in a predetermined time sequence, the time of the sampling of the sensor elements thus being identifiable. The device also comprises a light source being activated by the triggered event and being adapted to illuminate the sensor, and analysing means coupled to said sensor elements for detecting the illumination from said light source in the sampled data from the sensor elements and thus the time in said time sequence of the triggered event activating the light source.

Description

TRIGGER DEVICE
This invention relates to a method and a device for registering the time of a triggered event, the device including a camera with an image sensor comprising a number of sensor elements being sampled in a predetermined time sequence, the time of the sampling of the sensor elements thus being identifiable.
In many situations there is a need to synchronize an external event with a video stream, for example for registering the exact time of the triggering of a gun in shooting simulators or games. In most cameras this is a complicated process involving extra electronics for reading the internal clock of the camera as well as coding metadata referenced to the same clock into the video stream so as to synchronize it with the external signal pulse. In addition to being complicated and expensive, the known systems have limited time resolution. An example of a shooting simulator using light projected at the target to identify whether the target was hit is described in US 6,942,486, which also illustrates the complexity of such a system. Moreover, the solution described in that patent does not provide the trigger time with sufficient accuracy. Other examples are shown in US5280363 and EP549436, where the time is recorded separately and, in the latter case, is projected onto the film or sensor.
It is an object of this invention to provide a system that finds the exact time of an event. This is obtained using a method and a device as stated above and characterized as stated in the accompanying claims.
The invention thus solves the problems related to the known art by simply adding a light source into the camera housing, where the light source is activated by the triggering event. By analysing the image data from the image sensor, the time when the light source was activated may be detected. An advantage of this solution is that all of the necessary information is stored in the standard signal from the image sensor but may be found by simple analysis.
In the preferred embodiment of the invention the image sensor is a CMOS sensor read line by line, and the differences between successive lines are compared in order to find at which line readout the light source was activated.
The invention will be described below with reference to the accompanying drawings, illustrating the invention by way of examples. Figure 1 illustrates schematically a camera according to the invention.
Figure 2 illustrates the readout from a CMOS image sensor.
Figure 3 illustrates the signal obtained from the readout in figure 2.
Figure 4 illustrates the images obtained with the device according to the invention. Figure 5 illustrates the difference signal obtained with the method according to the invention.
As mentioned above, the invention as illustrated in figure 1 relates to a simplified device essentially constituted by an ordinary camera 1 with a lens 6, with an additional light source 2, e.g. an LED, and a triggering device 3 coupled to it. The signal from the image sensor 5 is stored in a storage means 4 which, according to the invention, may also include processing means. The processing means may also be provided externally if it is unnecessary to obtain the time of the triggered event immediately. The triggering device 3 may be any device suitable for the specific use, such as a trigger on a gun or game console.
The device according to the invention may also include positioning and aiming information depending on the application in which the system is used, for example if positioned on a gun or game console.
The image sensor may be of any type where the signal may be read from a subset of the image sensor elements, including some types of CCD sensors, but according to the preferred embodiment of the invention the image sensor is a CMOS sensor, the readout functionality of which is exploited according to the invention as described below.
CMOS cameras can use a pixel readout method called rolling shutter, the principle of which is illustrated in figure 2, where the time sequence runs in the downward direction. The sensor is reset line by line according to a reset pointer 11, the light is measured for a specified time, and the amount of light is read line by line as indicated by the read pointer 12. Thus a rolling window 10 moves over the sensor 5 one line at a time. The distance between the reset pointer and the readout pointer is the integration time 13 in figure 3 of the sensor, related to the shutter time. Thus the time used on each line is much shorter than the total time used for a full frame. If a change in the illumination occurs, it may be detected as an increased light intensity in or from a certain line in the image. This usually creates challenges related to flash timing and to 50/60 Hz variations in the light intensity of indoor lighting. As is illustrated in figure 3, the next line 14 is not read before the readout of the previous line is finished, thus defining a readout time sequence. If the image sensor is illuminated, the illumination will start at a point 15 and cover the rest of the integration time before the readout 14 of each line, and one may observe this as a change in the light relative to the previous frame in all the lines that had not been read when the light was turned on. Figure 4 illustrates four consecutive frames in a video as well as a difference image between them, where the trigger signal is provided by activating an LED (Light Emitting Diode) in the camera. As may be seen in the lower difference image, there is an essentially linear increase in the light intensity that is not obvious from the images per se.
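As a minimal illustration of the rolling-shutter timing described above, the following Python sketch (not part of the patent; the function name and the constant-line-period assumption are illustrative) lists the reset and readout instants of each line, showing how the read pointer trails the reset pointer by the integration time:

```python
# Minimal sketch of a rolling-shutter readout schedule, assuming a constant
# line period and an integration time expressed as a whole number of lines.
def rolling_shutter_schedule(n_lines, line_period, integration_lines):
    """Yield (line, reset_time, read_time) for one frame.

    The window between reset_time and read_time is the integration time of
    that line; the read pointer follows the reset pointer line by line.
    """
    for line in range(n_lines):
        reset_time = line * line_period
        read_time = reset_time + integration_lines * line_period
        yield line, reset_time, read_time

# A light source switched on at some time t_on contributes to every line whose
# integration window [reset_time, read_time] has not yet closed at t_on.
```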
In figure 5 the line-by-line sum over the difference between the consecutive frames is illustrated. The linear flank is obvious in the plot and may be used to calculate the exact trigger time 16, as the rise in the illumination will depend on how large a part of the integration time 13 the light source 2 has been active. In figure 5 the illumination in the image is assumed to be relatively constant, and the curve 17 resulting from the light source is easily seen.
In order to find the exact frame and line where the external signal activates the light source each frame may be compared with a reference image, for example the previous frame. At an image rate of 100 images per second the chance of having large differences between consecutive images is small. A difference between two consecutive images gives a time derivative of the illumination for each pixel in the frame. By summing over the lines in the image a robust measure of the derivative is obtained. In addition, if the aim is to find the line wherein the light source is activated it is practical to use the illumination change per line as a measured value.
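A minimal sketch of this per-line difference measure, assuming the frames are available as 2-D numpy arrays of equal size (function and variable names are illustrative, not taken from the patent):

```python
import numpy as np

def line_difference_signal(frame, reference):
    """Per-line illumination change between a frame and a reference image
    (e.g. the previous frame): the difference image summed over each line."""
    diff = frame.astype(np.float64) - reference.astype(np.float64)
    return diff.sum(axis=1)  # one value per sensor line, a per-line derivative
```

Summing the pixel differences along each line averages out per-pixel noise, which is why the per-line sum is a robust measure of the derivative, as stated above.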
As mentioned above, the external trigger signal is preferably provided by an LED positioned so as to illuminate the complete image sensor and having an insignificant rise time compared to the integration time per line in the image sensor. This means that the precision of the external trigger signal is limited only by the integration time per line in the image sensor.
The LED is assumed to be on for a significant number of line readouts, i.e. several ms. This means that to find the time of the change in illumination on the sensor, i.e. the trigger time, the linear flank of the signal may be located, as illustrated in figure 5. This may be done using a gliding time window estimating the slope of the curve. A flank corresponding to the activation of the LED will always be represented by an increased slope, provided it does not start while the camera is in a horizontal blanking period, in this case 21 virtual lines of overhead in the camera circuitry for reading the image sensor. The latter case is detected in that the signal starts at a value very different from zero.
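One possible implementation of the gliding-window slope estimate is sketched below; the window length and the threshold are arbitrary assumptions, and the function operates on the per-line difference signal from the previous sketch:

```python
import numpy as np

def find_flank_start(line_signal, window=15, slope_threshold=None):
    """Return the index of the first line where the signal starts rising
    linearly, i.e. where the LED-induced flank begins, or None if not found."""
    x = np.arange(window)
    # Local slope of the signal in a gliding window, estimated by a linear fit.
    slopes = np.array([np.polyfit(x, line_signal[i:i + window], 1)[0]
                       for i in range(len(line_signal) - window + 1)])
    if slope_threshold is None:
        # Assumed heuristic: well above the slope fluctuations of the background.
        slope_threshold = np.median(slopes) + 5 * np.std(slopes)
    above = np.nonzero(slopes > slope_threshold)[0]
    return int(above[0]) if above.size else None
```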
The recordings shown in figures 4 and 5 were performed at 95.20 images/second in VGA resolution (640x480). At 95.20 images/s and 480 lines (plus 21 virtual lines of overhead in the camera electronics for each frame) this corresponds to a line period, and thus an upper precision limit, of less than 1/47695 s. A rough estimate of the uncertainty in locating where the linear flank of the signal starts may be as large as 5 lines. This means that it is possible to measure the time of the external trigger signal with an accuracy of 1/4000 s or better.
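These precision figures follow directly from the line period; a short check of the arithmetic, using only the numbers given in the text:

```python
frame_rate = 95.20                 # images per second
lines_per_frame = 480 + 21         # active lines plus virtual overhead lines

line_period = 1.0 / (frame_rate * lines_per_frame)
print(line_period)                 # about 2.10e-05 s, i.e. below 1/47695 s

flank_uncertainty_lines = 5        # rough estimate from the text
print(flank_uncertainty_lines * line_period)  # about 1.05e-04 s, better than 1/4000 s
```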
The flank calculation may be disturbed by other events, such as a triggered flash or a blinking light in the image. Thus more advanced methods may be used for analysing the signal from the sensor. For example, as the spectrum of the light source is known, the triggered light may be detected from its colour, or light having the same spectrum as the light source may be filtered out at the camera lens, for example by using a filter removing infrared light at the lens together with an infrared LED.
The line-by-line readout described above represents the preferred embodiment when using CMOS sensors, but other readout methods may also be used. The main idea is that one or more sensor elements are read in a sequence, the signal from the sensor elements or groups of sensor elements is analysed, and the position in the sequence at which the light at the image sensor changes is detected, thus estimating the time of the trigger signal.
In the case where all array elements are read out in parallel, as is the case with some types of CCD sensors and with CMOS sensors having an electronic global shutter, all array elements integrate within the same timeframe. Still, with a high frame rate, difference images will usually be close to zero. To detect external trigger signals in such a setting, the LED illumination must be modulated with a triangular wave of low frequency (sufficiently lower than the frame rate). By comparing a sequence of frames, differencing every frame and every second frame, the illumination increase can be detected, as well as the initial illumination of the frame following the frame in which triggering was initiated. Knowledge of the triangular wave rise time, the frame rate and the initial illumination of that following frame allows the trigger time to be estimated with sub-frame accuracy.
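The patent only outlines this global-shutter variant; as one possible reading, if the LED level rises linearly at a known rate once triggered, the integrated LED contribution of the first affected frame can be inverted to give a sub-frame offset. The sketch below rests entirely on that assumed linear-rise model and on illustrative names:

```python
def subframe_trigger_time(frame_start, frame_period, mean_led_level, rise_rate):
    """Estimate the trigger time within the first frame showing LED light.

    mean_led_level : mean LED contribution in that frame (from frame differencing)
    rise_rate      : assumed linear rise rate of the modulated LED (level / s)

    If the LED has been on for the last on_time seconds of the frame, its
    integrated contribution is rise_rate * on_time**2 / 2, so the mean level
    over the frame is rise_rate * on_time**2 / (2 * frame_period); solving
    for on_time gives the sub-frame offset of the trigger.
    """
    on_time = (2.0 * mean_led_level * frame_period / rise_rate) ** 0.5
    return frame_start + frame_period - on_time
```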
Thus, to summarize, the invention relates to a registering device for registering the time of a triggered event, the device including a camera with an image sensor comprising a number of sensor elements being sampled in a predetermined time sequence, the time of the sampling of the sensor elements thus being identifiable from their known positions in the time sequence. The device also comprises a light source being activated by the triggered event and being adapted to illuminate the sensor, and analysing means coupled to said sensor elements for detecting the illumination from said light source in the sampled data from the sensor elements and thus the time in said time sequence of the triggered event activating the light source. The light source may be positioned inside or outside the camera housing.
The method for using the device includes the steps of sampling the information from the sensor elements in the image sensor in a sequence and registering the time of each sampling. From the sampled information the illumination from the light source is detected, preferably by analysing the sampled data from the sensor elements and thus finding the time in said time sequence of the triggered event activating the light source. This may be performed by finding the curve representing the change in measured light intensity and calculating the origin of the curve.
Thus the device preferably includes analysing means adapted to analyse the sampled data from sequences containing chosen subsets, each including a number of sensor elements. The image sensor is preferably a CMOS sensor wherein each line of sensor elements is sampled in a sequence. The triggered event may then be recognized as the time where the illumination from the light source is first detected at one of said lines of sensor elements.
According to the preferred embodiment the general sensor elements or groups of sensor elements are used to detect the illumination from the light source, but it is also possible to use specialized sensor elements, e.g. being outside the picture frame or having filters transmitting mainly the wavelengths emitted by the light source.
The light from the light source may also be recognized by its spectrum or by known intensity variations detectable in the signal from the sensor elements or groups of sensor elements. The groups of sensor elements may include different or partially overlapping sets of sensor elements, but are preferably consecutive lines of sensor elements in the image sensor, and the calculation may include finding the difference between the illumination detected by the sensor elements in each consecutive group.
The device may thus be a video camera, e.g. including a filter preventing light within the wavelength range of the light source from reaching the sensor through the imaging system projecting the image onto the sensor, the analysing means being adapted to recognize light within said wavelength range in order to detect the triggering event. It may also comprise a clock, the analysing means then being adapted to register the time of the triggered event from the time of the sequence start and the position in said time sequence.
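Finally, combining the clock reading at the start of the readout sequence with the detected position in that sequence gives the absolute trigger time; the following sketch ties together the hypothetical helpers from the earlier examples:

```python
def trigger_time(frame, previous_frame, sequence_start_clock, frame_index,
                 line_period, lines_per_frame):
    """Absolute clock time of the trigger, or None if no flank is found.

    Uses line_difference_signal() and find_flank_start() as sketched above;
    lines_per_frame should include any virtual overhead lines."""
    signal = line_difference_signal(frame, previous_frame)
    line = find_flank_start(signal)
    if line is None:
        return None
    lines_elapsed = frame_index * lines_per_frame + line
    return sequence_start_clock + lines_elapsed * line_period
```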

Claims

1. Registering device for registering the time of a triggered event, the device including a camera with an image sensor comprising a number of sensor elements being sampled in a predetermined time sequence, the time of the sampling of the sensor elements thus being identifiable,
the device also comprising a light source being activated by the triggered event and being adapted to illuminate the sensor, and analysing means coupled to said sensor elements for detecting the illumination from said light source in the sampled data from the sensor elements and thus the time in said time sequence of the triggered event activating the light source.
2. Device according to claim 1, wherein the analysing means is adapted to analyse the sampled data from sequences containing chosen subsets each including a number of sensor elements.
3. Device according to claim 1, wherein the image sensor is a CMOS sensor wherein each line of sensor elements is sampled in a sequence and the triggered event is recognized as the time where the illumination from the light source is first detected at one of said lines of sensor elements.
4. Device according to claim 1, wherein a dedicated subset of sensor elements is sampled in the sequence, the sensor elements in said subset being specially adapted to recognize the illumination from said light source.
5. Device according to claim 1, wherein the device includes a filter avoiding light within the wavelength range of the light source to reach the sensor from the imaging system projecting the image to the sensor, and the analyzing means being adapted to recognize light within said wavelength range in order to detect the triggering event.
6. Device according to claim 1, comprising a clock and wherein the analyzing means is adapted to register the time of the triggered event as the time of the sequence start and the position in said time sequence.
7. Device according to claim 1, wherein the light source is positioned outside the camera housing preferably illuminating the sensor through the lens.
8. Device according to claim 1, wherein the light source is positioned inside the camera housing.
9. Method for registering the time of a triggered event in a recorded sequence of images registered by an image sensor, wherein the event triggers an activation of a light source aimed at the image sensor, the method comprising the steps of
- sampling information from sensor elements in the image sensor in a sequence and registering the time of each sampling,
- from the sampled information detecting the illumination from said light source in the sampled data from the sensor elements and thus the time in said time sequence of the triggered event activating the light source.
10. Method according to claim 9, wherein the illumination from the light source is detected by calculating the difference between signals from consecutively sampled sensor elements.
11. Method according to claim 10, wherein the difference is calculated between consecutive subsets each including a chosen number of sensor elements.
12. Method according to claim 11, wherein each subset is constituted by a line of sensor elements, the image sensor being sampled line by line and the difference being calculated between consecutive lines of sensors.
13. Method according to claim 9, wherein the image sensor is a CMOS sensor type.
14. Camera for registering triggered events including a device according to claim 1 and a lens, wherein the light source is positioned between the lens and the image sensor.
PCT/EP2010/057324 2010-05-27 2010-05-27 Trigger device WO2011147454A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/057324 WO2011147454A1 (en) 2010-05-27 2010-05-27 Trigger device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/057324 WO2011147454A1 (en) 2010-05-27 2010-05-27 Trigger device

Publications (1)

Publication Number Publication Date
WO2011147454A1 true WO2011147454A1 (en) 2011-12-01

Family

ID=43479912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/057324 WO2011147454A1 (en) 2010-05-27 2010-05-27 Trigger device

Country Status (1)

Country Link
WO (1) WO2011147454A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4835621A (en) * 1987-11-04 1989-05-30 Black John W Gun mounted video camera
US20020149679A1 (en) * 1994-06-28 2002-10-17 Deangelis Douglas J. Line object scene generation apparatus
US6433817B1 (en) * 2000-03-02 2002-08-13 Gavin Guerra Apparatus and method for determining the winner of a race
DE10336447A1 (en) * 2003-08-06 2005-03-10 Gerd Hansen Recording and classification of sport or race data, whereby each participant in an event triggers a time signal at a finishing line and is digitally photographed so that times and identities can be subsequently correlated
DE102005060048A1 (en) * 2005-12-15 2007-06-21 Siemens Ag Clocked image sequence recording and processing device for use in industrial image processing, has evaluation unit generating image from two images of image sequence, where initiation signal is provided for determining parts of image data
US20100091141A1 (en) * 2008-10-09 2010-04-15 Sony Corporation System and method for correcting artifacts in sequential imagers caused by transient light disturbances

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308747A (en) * 2018-08-26 2019-02-05 国网新疆电力有限公司和田供电公司 Substation's automatic inspection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10726024

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10726024

Country of ref document: EP

Kind code of ref document: A1