US20010008561A1 - Real-time object tracking system - Google Patents

Real-time object tracking system

Info

Publication number
US20010008561A1
Authority
US
United States
Prior art keywords
target
color
image
center
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/798,594
Inventor
George Paul
Glenn Beach
Charles Cohen
Charles Jacobus
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JOLLY SEVEN SERIES 70 OF ALLIED SECURITY TRUST I
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/371,460 external-priority patent/US6681031B2/en
Priority to US09/798,594 priority Critical patent/US20010008561A1/en
Application filed by Individual filed Critical Individual
Assigned to CYBERNET SYSTEMS CORPORATION reassignment CYBERNET SYSTEMS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEACH, GLENN J., COHEN, CHARLES J., JACOBUS, CHARLES J., PAUL, GEORGE V.
Priority to US09/896,150 priority patent/US7121946B2/en
Publication of US20010008561A1 publication Critical patent/US20010008561A1/en
Priority to US10/004,058 priority patent/US7050606B2/en
Priority to US11/440,228 priority patent/US20070195997A1/en
Priority to US11/550,138 priority patent/US20070066393A1/en
Priority to US12/013,717 priority patent/US7684592B2/en
Priority to US14/811,212 priority patent/US20160023100A1/en
Assigned to NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I reassignment NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CYBERNET SYSTEMS CORPORATION
Assigned to JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I reassignment JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 Features of games using an electronically generated display characterized by input arrangements comprising photodetecting means, e.g. a camera
    • A63F2300/1093 Features of games using an electronically generated display characterized by input arrangements comprising photodetecting means using visible light
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data by mapping control signals received from the input arrangement into game commands
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/69 Involving elements of the real world in the game world, e.g. measurement in live races, real video

Abstract

A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.

Description

    REFERENCE TO PRIOR APPLICATIONS
  • This application claims priority of U.S. provisional application Serial No. 60/186,474, filed Mar. 2, 2000, and is a continuation-in-part of U.S. patent application Serial No. 09/371,460, filed Aug. 10, 1999, the entire contents of each application being incorporated herein by reference. [0001]
  • FIELD OF THE INVENTION
  • This invention relates generally to computer vision systems and, in particular, to a real-time object tracking system and method involving a color matching technique requiring minimal computation. [0002]
  • BACKGROUND OF THE INVENTION
  • Current imaging systems can convert live scenes into a sequence of digital images which can be processed to track any object in the scene from frame to frame. The techniques used for tracking are numerous. Most of the currently available systems use some characteristic of the subset of the image containing the target to search for and locate the target in the following image. The quality and speed of the tracking system depend on the implementation of this search-and-locate idea. [0003]
  • Most tracking systems use correlation of a sample subimage representing the object with parts of the current image. The correlation values are computed in a search area around an estimated location of the object. The correlation operation is computationally expensive and usually is performed using specialized hardware. [0004]
  • Another set of tracking methods uses a 3D model of the object being tracked. In these methods, the model is mapped into the target location based on the location and illumination parameters. The disadvantage of such model-based tracking methods is the relatively high amount of computation for the mapping from the 3D model to the image. The tracking systems that avoid the correlation or model-matching approaches use characteristics of the object's appearance or motion in estimating the location of the object in the current image. These techniques are faster than correlation methods but are less robust to changing shape and temporary occlusion by similarly colored objects in the scene. [0005]
  • The work by Darell et al. in U.S. Pat. No. 6,188,777 uses stereo cameras and involves three modules which compute the range of the tracked object, segment the object based on color, and perform pattern classification. Each of the modules places a large computational load on the computer. The method of Peurach et al. in U.S. Pat. No. 6,173,066 uses a 3D object model database and projection geometry to find the pose of the object in the 2D camera image. The pose determination and tracking involve searching in a multi-dimensional object pose space. The computation involved is very high. [0006]
  • The method of Richards in U.S. Pat. No. 6,163,336 uses special cameras, infrared lighting and a specialized background. The method of Marques et al. in U.S. Pat. No. 6,130,964 involves a layered segmentation of the object in the scene based on a homogeneity measure, and also involves a high amount of computation. The template matching method proposed by Holliman et al. in U.S. Pat. No. 6,075,557, which tracks subimages in the larger camera image, involves search and correlation, meaning relatively large amounts of computation. The method of Ponticos in U.S. Pat. No. 6,035,067 uses segmentation of the image based on pixel color. The system of Wakitani in U.S. Pat. No. 6,031,568 uses hardware to do template matching of the target; the method is computationally expensive, and the correlation is done via hardware. [0007]
  • The tracking method proposed by Suito et al. in U.S. Pat. No. 6,014,167 relies mostly on the difference image between successive frames to detect motion and then tracks moving pixels using color. This work uses correlation and searches in a multi-dimensional space to compute the object's 3D position and orientation. The amount of computation involved is immense. [0008]
  • The method proposed by Matsumura et al. in U.S. Pat. No. 6,002,428 does color matching to track the target. The method of Guthrie in U.S. Pat. No. 5,973,732 uses differencing and blob analysis. The method of Hunke in U.S. Pat. No. 5,912,980 uses color matching as opposed to shape. The method of Tang et al. in U.S. Pat. No. 5,878,151 uses correlation to track subimages in the image. [0009]
  • SUMMARY OF THE INVENTION
  • This invention resides in a real-time computer vision system capable of tracking moving objects in a scene. Unlike current search-and-locate algorithms, the subject algorithm uses a target location technique which does not involve search. The system tracks objects based on the color, motion and shape of the object in the image. The tracking algorithm uses a unique color matching technique which uses minimal computation. This color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. It then computes the most probable location of the target using a weighting technique. These techniques make the invention very computationally efficient and also make it robust to noise, occlusion and rapid motion of the target. [0010]
  • The imaging hardware of the real-time object tracking system includes a color camera, a frame grabber, and a personal computer. The software includes low-level image grabbing software and the tracking algorithm. Once the application is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to click on the hand in the image to select a target for tracking. The system will then keep track of the moving target in the scene in real-time. [0011]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a simplified drawing of an imaging system and computer with tracking algorithm according to the invention; [0012]
  • FIG. 2 is a flow chart illustrating important steps of the tracking algorithm; [0013]
  • FIG. 3 is a drawing of a preferred graphical user interface for use with the system of the invention; [0014]
  • FIG. 4 is a series of drawings which show the use of color to track a target or feature; [0015]
  • FIG. 5 illustrates the use a truncated cone to account for slight variations in color; and [0016]
  • FIG. 6 illustrates steps of a method according to the invention written in pseudocode. [0017]
  • DETAILED DESCRIPTION OF THE INVENTION
  • A schematic of the system is shown in FIG. 1. The imaging hardware includes a color camera 102 and a digitizer. The sequence of images of the scene is then fed to a computer 104 which runs tracking software according to the invention. The tracking algorithm is independent of the imaging system hardware. The tracking system has a graphical user interface (GUI) to initialize the target and show the tracking result on the screen 106. [0018]
  • The GUI for the real-time object tracking system (ROTS) displays a live color image from the camera on the computer screen. The user can initialize the target manually or automatically. Once initialized, the ROTS will then track the target in real-time. [0019]
  • The flow chart of the tracking algorithm is shown in FIG. 2. The program captures live images from the camera and displays them on the screen. It then allows the user to select the target manually using the mouse or automatically by moving the target to a predetermined position in the scene. At the point of initialization, the color, the shape and location of the target are computed and stored. Once the target is initialized, we compute an estimate of the target location using target dynamics. We then compute the actual location using the color, shape and motion information with respect to a region centered at the estimated location. [0020]
  • The input to the ROTS is a sequence of color images, preferably in the standard RGB24 format. Hence, the hardware can be a camera with an image-grabbing board or a USB camera connected to the USB port of the computer. A preferred GUI is shown in FIG. 3. [0021]
  • Tracking using Color, Shape and Motion [0022]
  • Once the user clicks on the target in the image, we compute the median color of a small region around this point in the image. This will be the color of the target region being tracked in the scene until it is reinitialized. We also store the shape of the target by segmenting the object using its color. Once tracking begins, we compute the center of the target region in the image using a combination of three aspects of the target. The three aspects are the color, the shape and the motion. This results in a very robust tracking system which can withstand a variety of noise, occlusion and rapid motion. [0023]
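  • By way of illustration, here is a minimal Python/NumPy sketch of this initialization step. The function name, the patch half-width, and the frame layout (an H × W × 3 RGB array) are assumptions made for the example, not taken from the patent.

```python
import numpy as np

def initialize_target(frame, click_rc, half=5):
    """Initialize the target color from a user click: the reference
    color is the median RGB of a small patch around the click point.
    The patch size is illustrative; border clipping is ignored."""
    r, c = click_rc
    patch = frame[r - half:r + half + 1, c - half:c + half + 1]
    return np.median(patch.reshape(-1, 3), axis=0)  # target (R, G, B)
```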
  • Color Matching [0024]
  • The color of a pixel in a color image is determined by the values of the Red, Green and Blue bytes corresponding to the pixel in the image buffer. This color value forms a point in the three-dimensional RGB color space. When we compute the color of the target, we assume that the target is fairly evenly colored and the illumination stays relatively the same. The color of the target is then the median RGB value of a sample set of pixels constituting the target. When the target moves and the illumination changes, the color of the target is likely to change. We use a computationally efficient color matching function which allows us to compute whether a pixel color matches the target color within limits. [0025]
  • When the illumination on the target changes, the intensity of the color will change. This appears as a movement along the RGB color vector, as shown in FIG. 5. In order to account for slight variations in the color, we further allow the point in color space to lie within a small truncated cone, as shown in FIG. 5. Two thresholds decide the shape of the matching color cone: a threshold on the angle of the cone and another threshold on the minimum length of the color vector. Thus, any pixel whose color lies within the truncated cone in color space will be considered as having the same color as the target. [0026]
  • Given a colored pixel, we quantitatively define the match between it and a reference color pixel as follows. Let $(R, G, B)$ be the values of the RGB vector of the first pixel. Let $(R_r, G_r, B_r)$ be the RGB vector for the reference color. [0027]

$$
\begin{aligned}
d &= R R_r + G G_r + B B_r \\
m_r &= R_r^2 + G_r^2 + B_r^2 \\
m &= R^2 + G^2 + B^2 \\
d_m &= \frac{d}{m_r} \\
d_a &= \frac{d}{\sqrt{m_r\, m}} \\
\mathrm{ColorMatch}(R, G, B) &=
\begin{cases}
d_m\, d_a & \text{if } \left(d_m^l < d_m < d_m^h\right) \text{ and } \left(d_a^l < d_a < d_a^h\right) \\
0 & \text{otherwise}
\end{cases}
\end{aligned}
$$
  • The value of $d_m$ is related to the length of the projection of the given color vector onto the reference vector. The value of $d_a$ is related to the angle between the two vectors. If we set two threshold bands for $d_m$ and $d_a$, we can isolate those pixels which lie within the truncated cone around the reference vector. Their product will indicate the goodness of the match. The parameters $d_m$ and $d_a$ are chosen to be computationally simple to implement, which becomes important when all the pixels in a region have to be compared to the reference color in each new image. [0028]
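  • A minimal Python/NumPy sketch of this color matching function follows. It assumes $d_a$ is the normalized dot product (the cosine of the angle) implied by the definitions above, and the threshold bands are hypothetical placeholders, not values from the patent.

```python
import numpy as np

def color_match(pixel, ref, dm_band=(0.7, 1.3), da_band=(0.95, 1.01)):
    """Match a pixel's RGB vector against a reference color.

    Returns dm * da when the pixel lies inside the truncated cone
    around the reference color vector, else 0. Threshold bands are
    illustrative only."""
    R, G, B = (float(v) for v in pixel)
    Rr, Gr, Br = (float(v) for v in ref)
    d = R * Rr + G * Gr + B * Br           # dot product
    m_r = Rr**2 + Gr**2 + Br**2            # squared length of reference
    m = R**2 + G**2 + B**2                 # squared length of pixel color
    if m_r == 0 or m == 0:
        return 0.0
    d_m = d / m_r                          # projection (length) measure
    d_a = d / np.sqrt(m_r * m)             # angle (cosine) measure
    if dm_band[0] < d_m < dm_band[1] and da_band[0] < d_a < da_band[1]:
        return d_m * d_a
    return 0.0
```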
  • Position Using Color [0029]
  • Once we have the target color and a color matching algorithm, we can find all the pixels in any given region of the image which match the target color. We use the quantitative measure of the match to find a weighted average of these pixel positions. This gives us the most likely center of the target based on color alone. If $(i, j)$ are the row and column coordinates of the pixel $P_c(i, j)$, then for a given rectangular region the most likely target center based on color alone will be given as follows. [0030]

$$
P_c(i, j, t) = \mathrm{ColorMatch}\big(R(i, j, t),\, G(i, j, t),\, B(i, j, t)\big)
$$
$$
\mathrm{Center}_{\mathrm{color}} = \begin{bmatrix} r_c \\ c_c \end{bmatrix}
= \begin{bmatrix}
\dfrac{\sum_{1}^{I \cdot J} P_c(i, j, t)\, i}{\sum_{1}^{I \cdot J} P_c(i, j, t)} \\[2ex]
\dfrac{\sum_{1}^{I \cdot J} P_c(i, j, t)\, j}{\sum_{1}^{I \cdot J} P_c(i, j, t)}
\end{bmatrix}
$$
  • Note that the centroid of the target is computed as a weighted sum. The weights are the color matching measures of the pixels. This weighting of the pixels contrasts with the usual practice of weighting all matching pixels equally, and makes our algorithm less prone to creep. We also keep track of the sum of the matched pixel weights. If this sum is less than a threshold, we assume that the target is not in the region. [0031]
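  • A vectorized sketch of this weighted centroid, assuming a precomputed 2D weight map (one color_match value per pixel of the region); the minimum-weight threshold below is a placeholder.

```python
import numpy as np

def center_from_color(P_c, min_weight=1.0):
    """Weighted centroid of a color-match weight map.

    P_c is a 2D array of per-pixel match weights for the region.
    Returns (row, col), or None when the summed weights fall below
    the threshold, i.e. the target is assumed absent."""
    total = P_c.sum()
    if total < min_weight:
        return None                        # target not in this region
    rows, cols = np.indices(P_c.shape)
    r_c = (P_c * rows).sum() / total
    c_c = (P_c * cols).sum() / total
    return r_c, c_c
```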
  • Shape Matching [0032]
  • Once the target is initialized, we compute a two-dimensional template of the target. We use this dynamic template, which is updated every frame, to measure the closeness of pixels at the estimated location to the target shape. Given the color of the object being tracked and the color matching function, we segment all the pixels in a region around the estimated location. The resulting segmented image is the shape of the object and forms the template. With each new image of the scene, the template of the target in the previous frame is used to compute the new center of the target in the new image. The advantage of using templates instead of any assumed shape such as an ellipse is that the tracking and localization of the target is much more robust to shape change and hence more accurate. [0033]

$$
P(i, j, t) = \mathrm{ColorMatch}\big(R(i, j, t),\, G(i, j, t),\, B(i, j, t)\big) \quad \text{for time } t
$$
$$
M(i, j, t-1) = \begin{cases} 1 & \text{if } P(i, j, t-1) > 0 \\ 0 & \text{otherwise} \end{cases}
$$
$$
S(i, j, t) = P(i, j, t)\, M(i, j, t-1)
$$
$$
\mathrm{Center}_{\mathrm{shape}} = \begin{bmatrix} r_s \\ c_s \end{bmatrix}
= \begin{bmatrix}
\dfrac{\sum_{1}^{I \cdot J} S(i, j, t)\, i}{\sum_{1}^{I \cdot J} S(i, j, t)} \\[2ex]
\dfrac{\sum_{1}^{I \cdot J} S(i, j, t)\, j}{\sum_{1}^{I \cdot J} S(i, j, t)}
\end{bmatrix}
$$
  • The closeness of the shape is a summation of the product of the pixel color match P(i, j) with the target template M(i, j). Note again that the color matching measure is used to weight the shape measure. This makes our algorithm robust to creep. Once the region S is obtained, we can compute the centroid of S. This is the probable location of the target based solely on the shape of the target. [0034]
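  • A brief sketch of this shape measure under the same assumptions, reusing the center_from_color helper from the color sketch above: the previous frame's binary template gates the current match map, and the centroid of the gated region gives the shape-based center.

```python
def center_from_shape(P_now, P_prev, min_weight=1.0):
    """Shape-based center: gate the current color-match map by the
    previous frame's segmented template M, then take the weighted
    centroid of the resulting region S."""
    M_prev = (P_prev > 0).astype(float)       # binary template from t-1
    S = P_now * M_prev                        # color-weighted shape region
    return center_from_color(S, min_weight)   # helper sketched earlier
```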
  • Motion Detection [0035]
  • The algorithm checks for motion in a region near the estimated target position using a motion detecting function. This function computes the difference between the current image and the previous image, which is stored in memory. If motion has occurred, there will be sufficient change in the intensities in the region. The motion detection function will trigger if a sufficient number of pixels change intensity by a certain threshold value. This detection phase eliminates unnecessary computation when the object is stationary. [0036]
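  • A sketch of such a trigger follows; both threshold values are assumptions for illustration, as the patent specifies neither.

```python
import numpy as np

def motion_detected(frame_now, frame_prev, intensity_thresh=15, count_thresh=50):
    """Trigger when enough pixels change intensity between frames.

    Frames are grayscale arrays covering the region near the
    estimated target position. Thresholds are illustrative."""
    diff = np.abs(frame_now.astype(int) - frame_prev.astype(int))
    return np.count_nonzero(diff > intensity_thresh) > count_thresh
```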
  • Position Using Motion [0037]
  • If the motion detection function detects motion, the next step is to locate the target. This is done using the difference image and the target color. When an object moves between frames in a relatively stationary background, the color of the pixels changes between frames near the target (unless the target and the background are of the same color). We compute the color change between frames for pixels near the target location. The pixels whose color changes beyond a threshold make up the difference image. Note that the difference image will have areas which are complementary. The pixels where the object used to be will complement those pixels where the object is now. If we separate these pixels using the color of the target, we can compute the new location of the target. The set of pixels in the difference image which have the color of the target in the new image will correspond to the leading edge of the target in the new image. If we assume that the shape of the target changes negligibly between frames, we can use the shape of the target from the previous image to compute the position of the center of the target from this difference image. [0038]
  • Let D be the difference sub-image between the previous target and the estimated target location in the new image. If we threshold the difference image, we end up with a binary image. If we intersect this binary image D with the shape of the target in the new image M we get the moving edge of the target as the region V. We then weight this region by the color matching measure P. [0039] D ( i , j , t ) = { 1 if ( P ( i , j , t - 1 ) - P ( i , j , t - 1 ) > τ m 0 otherwise M ( i , j , t ) = { 1 if ( P ( i , j , t ) > τ c ) 0 otherwise V ( i , j , t ) = ( D ( i , j , t ) M ( i , j , t ) ) * P ( i , j , t ) Center motion = [ r m c m ] = [ 1 I * J V ( i , j , t ) * i 1 I * J V ( i , j , t ) 1 I * J V ( i , j , t ) * j 1 I * J V ( i , j , t ) ]
    Figure US20010008561A1-20010719-M00004
  • The centroid of the region V is then computed as the probable location of the target based on motion alone. This weighting of the intesection region by the color matching measure makes out tracking less prone to jitter. [0040]
  • In a physically implemented system, the image capture board is capable of providing us with a 480×640-pixel color image at 30 frames per second. Processing such a large image will slow down the program. Fortunately, the nature of the tracking task is such that, only a fraction of the image is of interest. This region called the window of interest lies around the estimated position of the target in the new image. We can compute the location of the target in t he new image from the location of the target in the previous image and its dynamics. We have used prediction based on velocity computation between frames. This technique is able to keep track of the target even when the target moves rapidly. We have found that the window of interest is typically one one-hundredth the area of the original image. This speeds up the computation of the new target location considerably. [0041]
  • Tracking Algorithm [0042]
  • If we are given an estimated target location as $(r_c, c_c)$ in the new image and the size of the area to be searched is given by $(r_s, c_s)$, then the algorithm can be written in pseudocode as shown in FIG. 6. [0043]
  • Note that the color matching weight c is being used to weight all three centers. This weighting makes this algorithm smoother and more robust. The velocity computed at the end of the tracking algorithm is used to compute the estimated position of the target in the next frame. [0044]
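  • The patent defers the exact combination rule to the pseudocode of FIG. 6, which is not reproduced here. The sketch below therefore shows only one plausible weighting of the three centers, using the summed color-match weight of each region; this particular scheme is an assumption, not the patent's formula.

```python
def combine_centers(centers, weights):
    """Combine the color-, shape- and motion-based centers into a
    single most probable target location. The weighting scheme is
    hypothetical; the patent states only that the color matching
    weight is used to weight all three centers."""
    pairs = [(c, w) for c, w in zip(centers, weights)
             if c is not None and w > 0]   # drop centers with no support
    if not pairs:
        return None
    total = sum(w for _, w in pairs)
    row = sum(c[0] * w for c, w in pairs) / total
    col = sum(c[1] * w for c, w in pairs) / total
    return row, col

# Example usage with the helpers sketched above (names are assumptions):
# centers = [center_from_color(P), center_from_shape(P, P_prev),
#            center_from_motion(P, P_prev)]
# weights = [P.sum(), (P * (P_prev > 0)).sum(), V_sum]
```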
  • Extensions of the system are possible in accordance with the algorithm described herein. One is a tracking system which can track multiple targets in the same image. Another uses the tracking in two stereo images to track the target in 3D. [0045]

Claims (14)

We claim:
1. A method of tracking a target, comprising the steps of:
inputting a sequence of images of a scene;
selecting a target in the scene;
computing the center of the target in an initial one of the images using one or more visual characteristics of the target region;
computing the center of the target in a subsequent one of the images using the visual characteristics; and
comparing the center of the target in the subsequent image to the center of the target in the initial image to determine movement of the target within the scene.
2. The method of claim 1, wherein the visual characteristics include the color, shape or the location of the target.
3. The method of claim 1, wherein the visual characteristics include a combination of static and dynamic characteristics.
4. The method of claim 3, further including the step of modeling the dynamic characteristics to yield an estimate of the location of the target in the current image based on the location of the target in previous images.
5. The method of claim 1, wherein the step of selecting a target in the scene includes the step of user-selecting the target on a computer screen through a graphical user interface.
6. The method of claim 5, wherein the graphical user interface provides a bounding box surrounding the target superimposed on each image as it is displayed on the screen.
7. The method of claim 2, wherein the step of computing the center of the target with respect to color further includes the step of:
enabling a match between the color of the target in the subsequent image and the color of the target in a previous image despite differences arising from target lighting and shadows.
8. The method of claim 2, wherein the step of computing the center of the target with respect to color further includes the step of:
enabling a match between the color of the target in the subsequent image and the color of the target in a previous image within a threshold of hue.
9. The method of claim 2, wherein the step of comparing the center of the target in the subsequent image to the center of the target in the initial image includes a comparison of pixels in an RGB format.
10. The method of claim 1, further including the step of determining if the target has moved outside of the scene.
11. The method of claim 1, wherein:
the visual characteristic is color; and
further including the step of finding a weighted average of color to compute the center of the target based upon color alone.
12. The method of claim 1, further including the step of segmenting a region defined by a predetermined closeness of color as an estimate of target shape.
13. The method of claim 1, further including the step of continuing to track the target when the target moves in front of or behind a similarly colored object in the scene.
14. The method of claim 1, further including the step of continuing to track the target when the target and input image move in relation to one another.
US09/798,594 1998-08-10 2001-03-02 Real-time object tracking system Abandoned US20010008561A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US09/798,594 US20010008561A1 (en) 1999-08-10 2001-03-02 Real-time object tracking system
US09/896,150 US7121946B2 (en) 1998-08-10 2001-06-29 Real-time head tracking system for computer games and other applications
US10/004,058 US7050606B2 (en) 1999-08-10 2001-11-01 Tracking and gesture recognition system particularly suited to vehicular control applications
US11/440,228 US20070195997A1 (en) 1999-08-10 2006-05-23 Tracking and gesture recognition system particularly suited to vehicular control applications
US11/550,138 US20070066393A1 (en) 1998-08-10 2006-10-17 Real-time head tracking system for computer games and other applications
US12/013,717 US7684592B2 (en) 1998-08-10 2008-01-14 Realtime object tracking system
US14/811,212 US20160023100A1 (en) 1998-08-10 2015-07-28 Real-time head tracking system for computer games and other applications

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/371,460 US6681031B2 (en) 1998-08-10 1999-08-10 Gesture-controlled interfaces for self-service machines and other applications
US18647400P 2000-03-02 2000-03-02
US09/798,594 US20010008561A1 (en) 1999-08-10 2001-03-02 Real-time object tracking system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/371,460 Continuation-In-Part US6681031B2 (en) 1998-08-10 1999-08-10 Gesture-controlled interfaces for self-service machines and other applications

Related Child Applications (3)

Application Number Title Priority Date Filing Date
US09/896,150 Continuation-In-Part US7121946B2 (en) 1998-08-10 2001-06-29 Real-time head tracking system for computer games and other applications
US10/004,058 Continuation-In-Part US7050606B2 (en) 1999-08-10 2001-11-01 Tracking and gesture recognition system particularly suited to vehicular control applications
US12/013,717 Continuation US7684592B2 (en) 1998-08-10 2008-01-14 Realtime object tracking system

Publications (1)

Publication Number Publication Date
US20010008561A1 true US20010008561A1 (en) 2001-07-19

Family

ID=46257568

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/798,594 Abandoned US20010008561A1 (en) 1998-08-10 2001-03-02 Real-time object tracking system
US12/013,717 Expired - Fee Related US7684592B2 (en) 1998-08-10 2008-01-14 Realtime object tracking system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/013,717 Expired - Fee Related US7684592B2 (en) 1998-08-10 2008-01-14 Realtime object tracking system

Country Status (1)

Country Link
US (2) US20010008561A1 (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020037770A1 (en) * 1998-08-10 2002-03-28 Paul George V. Real-time head tracking system for computer games and other applications
US20020126877A1 (en) * 2001-03-08 2002-09-12 Yukihiro Sugiyama Light transmission type image recognition device and image recognition sensor
DE10225077A1 (en) * 2002-06-05 2003-12-24 Vr Magic Gmbh Operating theater object tracking system has moveable optical sensors with position measured in fixed reference system
US20060018516A1 (en) * 2004-07-22 2006-01-26 Masoud Osama T Monitoring activity using video information
US20060088196A1 (en) * 2004-10-25 2006-04-27 Popovich Joseph Jr Embedded imaging and control system
US20060176174A1 (en) * 2005-02-10 2006-08-10 Pinc Solutions Position-tracking device for position-tracking system
US20060187028A1 (en) * 2005-02-10 2006-08-24 Pinc Solutions Position-tracing system
US20060210159A1 (en) * 2005-03-15 2006-09-21 Yea-Shuan Huang Foreground extraction approach by using color and local structure information
FR2885719A1 (en) * 2005-05-10 2006-11-17 Thomson Licensing Sa METHOD AND DEVICE FOR TRACKING OBJECTS IN AN IMAGE SEQUENCE
US20070018811A1 (en) * 2005-07-05 2007-01-25 Pinc Solutions Systems and methods for determining a location of an object
US20070286458A1 (en) * 2006-06-12 2007-12-13 D&S Consultants, Inc. Method and System for Tracking a Target
US7317812B1 (en) * 2002-11-15 2008-01-08 Videomining Corporation Method and apparatus for robustly tracking objects
US20080048930A1 (en) * 1999-12-15 2008-02-28 Automotive Technologies International, Inc. Vehicular Heads-Up Display System
US20080065291A1 (en) * 2002-11-04 2008-03-13 Automotive Technologies International, Inc. Gesture-Based Control of Vehicular Components
US20080085048A1 (en) * 2006-10-05 2008-04-10 Department Of The Navy Robotic gesture recognition system
US20080158096A1 (en) * 1999-12-15 2008-07-03 Automotive Technologies International, Inc. Eye-Location Dependent Vehicular Heads-Up Display System
US20080276191A1 (en) * 1999-12-15 2008-11-06 Automotive Technologies International, Inc. Vehicular Heads-Up Display System
US20090116692A1 (en) * 1998-08-10 2009-05-07 Paul George V Realtime object tracking system
WO2009112575A1 (en) * 2008-03-14 2009-09-17 Bausch & Lomb, Inc. Fast algorithm for streaming wavefront
US20090302030A1 (en) * 2006-03-30 2009-12-10 Advanced Composite Materials Corporation Composite materials and devices comprising single crystal silicon carbide heated by electromagnetic radiation
US20090310829A1 (en) * 2007-04-16 2009-12-17 Fujitsu Limited Image processing method, image processing apparatus, image processing system and computer program
US7660439B1 (en) 2003-12-16 2010-02-09 Verificon Corporation Method and system for flow detection and motion analysis
CN101826157A (en) * 2010-04-28 2010-09-08 华中科技大学 Ground static target real-time identifying and tracking method
WO2010141378A1 (en) * 2009-05-30 2010-12-09 Sony Computer Entertainment Inc. Color calibration for object tracking
US20110123067A1 (en) * 2006-06-12 2011-05-26 D & S Consultants, Inc. Method And System for Tracking a Target
WO2012178202A1 (en) * 2011-06-23 2012-12-27 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
CN103896025A (en) * 2014-04-02 2014-07-02 东南大学 Intelligent camera based flexible vibration transmission system and operating method thereof
CN105069408A (en) * 2015-07-24 2015-11-18 上海依图网络科技有限公司 Video portrait tracking method based on human face identification in complex scenario
US9288449B2 (en) 2008-08-05 2016-03-15 University Of Florida Research Foundation, Inc. Systems and methods for maintaining multiple objects within a camera field-of-view
US9304593B2 (en) 1998-08-10 2016-04-05 Cybernet Systems Corporation Behavior recognition system
US20170282044A1 (en) * 2016-03-30 2017-10-05 Apqs, Llc Ball Return Device and Method of Using
EP2220588B1 (en) * 2007-12-07 2020-03-25 Robert Bosch GmbH Configuration module for a surveillance system, surveillance system, method for configuring the surveillance system, and computer program
CN112991485A (en) * 2019-12-13 2021-06-18 浙江宇视科技有限公司 Track drawing method and device, readable storage medium and electronic equipment
WO2021129491A1 (en) * 2019-12-25 2021-07-01 中兴通讯股份有限公司 Pedestrian search method, server, and storage medium
CN114359265A (en) * 2022-03-04 2022-04-15 广东顺德富意德智能包装科技有限公司 Screw counting method and system based on target tracking

Families Citing this family (263)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US6990639B2 (en) * 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US7665041B2 (en) 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
JP4140567B2 (en) * 2004-07-14 2008-08-27 松下電器産業株式会社 Object tracking device and object tracking method
JP4386006B2 (en) * 2005-06-28 2009-12-16 ソニー株式会社 Imaging apparatus and method, program, and recording medium
US7697827B2 (en) 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US8018579B1 (en) 2005-10-21 2011-09-13 Apple Inc. Three-dimensional imaging and display system
US8005238B2 (en) 2007-03-22 2011-08-23 Microsoft Corporation Robust adaptive beamforming with enhanced noise suppression
US8005237B2 (en) 2007-05-17 2011-08-23 Microsoft Corp. Sensor array beamformer post-processor
US8629976B2 (en) 2007-10-02 2014-01-14 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (TOF) systems
US20090097704A1 (en) * 2007-10-10 2009-04-16 Micron Technology, Inc. On-chip camera system for multiple object tracking and identification
JP4507129B2 (en) * 2008-06-06 2010-07-21 ソニー株式会社 Tracking point detection apparatus and method, program, and recording medium
US8385557B2 (en) 2008-06-19 2013-02-26 Microsoft Corporation Multichannel acoustic echo reduction
US8325909B2 (en) 2008-06-25 2012-12-04 Microsoft Corporation Acoustic echo suppression
US8203699B2 (en) 2008-06-30 2012-06-19 Microsoft Corporation System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US8448094B2 (en) * 2009-01-30 2013-05-21 Microsoft Corporation Mapping a natural input device to a legacy system
US8577085B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US8565476B2 (en) 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
US8577084B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US8588465B2 (en) 2009-01-30 2013-11-19 Microsoft Corporation Visual target tracking
US8487938B2 (en) 2009-01-30 2013-07-16 Microsoft Corporation Standard Gestures
US20100199231A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Predictive determination
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US8565477B2 (en) 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US8267781B2 (en) 2009-01-30 2012-09-18 Microsoft Corporation Visual target tracking
US8294767B2 (en) * 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US8682028B2 (en) 2009-01-30 2014-03-25 Microsoft Corporation Visual target tracking
US8773355B2 (en) 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US9313376B1 (en) 2009-04-01 2016-04-12 Microsoft Technology Licensing, Llc Dynamic depth power equalization
US8660303B2 (en) 2009-05-01 2014-02-25 Microsoft Corporation Detection of body and props
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8638985B2 (en) * 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8542252B2 (en) * 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US8856691B2 (en) 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8693724B2 (en) 2009-05-29 2014-04-08 Microsoft Corporation Method and system implementing user-centric gesture control
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US20100302365A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Depth Image Noise Reduction
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US8487871B2 (en) 2009-06-01 2013-07-16 Microsoft Corporation Virtual desktop coordinate transformation
US8390680B2 (en) 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US8264536B2 (en) 2009-08-25 2012-09-11 Microsoft Corporation Depth-sensitive imaging via polarization-state mapping
US9141193B2 (en) 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US8508919B2 (en) 2009-09-14 2013-08-13 Microsoft Corporation Separation of electrical and optical components
US8330134B2 (en) 2009-09-14 2012-12-11 Microsoft Corporation Optical fault monitoring
US8428340B2 (en) 2009-09-21 2013-04-23 Microsoft Corporation Screen space plane identification
US8976986B2 (en) 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
US8760571B2 (en) 2009-09-21 2014-06-24 Microsoft Corporation Alignment of lens and image sensor
US9014546B2 (en) 2009-09-23 2015-04-21 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
US8452087B2 (en) * 2009-09-30 2013-05-28 Microsoft Corporation Image selection techniques
US8723118B2 (en) 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
US8867820B2 (en) 2009-10-07 2014-10-21 Microsoft Corporation Systems and methods for removing a background of an image
US8963829B2 (en) 2009-10-07 2015-02-24 Microsoft Corporation Methods and systems for determining and tracking extremities of a target
US7961910B2 (en) 2009-10-07 2011-06-14 Microsoft Corporation Systems and methods for tracking a model
US8564534B2 (en) 2009-10-07 2013-10-22 Microsoft Corporation Human tracking system
NL2003662C2 (en) * 2009-10-16 2011-04-19 Sara Lee De Nv METHOD, CONTROL UNIT FOR AN APPARATUS AND APPARATUS PROVIDED WITH A CONTROL UNIT.
US9400548B2 (en) 2009-10-19 2016-07-26 Microsoft Technology Licensing, Llc Gesture personalization and profile roaming
JP2011090488A (en) * 2009-10-22 2011-05-06 Nikon Corp Object tracking device, object tracking program, and camera
US8988432B2 (en) 2009-11-05 2015-03-24 Microsoft Technology Licensing, Llc Systems and methods for processing an image for target tracking
US8843857B2 (en) 2009-11-19 2014-09-23 Microsoft Corporation Distance scalable no touch computing
US9244533B2 (en) 2009-12-17 2016-01-26 Microsoft Technology Licensing, Llc Camera navigation for presentations
US20110150271A1 (en) 2009-12-18 2011-06-23 Microsoft Corporation Motion detection using depth images
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US9268404B2 (en) 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
US8631355B2 (en) 2010-01-08 2014-01-14 Microsoft Corporation Assigning gesture dictionaries
US9019201B2 (en) 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US8933884B2 (en) 2010-01-15 2015-01-13 Microsoft Corporation Tracking groups of users in motion capture system
US8334842B2 (en) 2010-01-15 2012-12-18 Microsoft Corporation Recognizing user intent in motion capture system
US8676581B2 (en) 2010-01-22 2014-03-18 Microsoft Corporation Speech recognition analysis via identification information
US8265341B2 (en) 2010-01-25 2012-09-11 Microsoft Corporation Voice-body identity correlation
US8864581B2 (en) 2010-01-29 2014-10-21 Microsoft Corporation Visual based identity tracking
US8891067B2 (en) 2010-02-01 2014-11-18 Microsoft Corporation Multiple synchronized optical sources for time-of-flight range finding systems
US8619122B2 (en) 2010-02-02 2013-12-31 Microsoft Corporation Depth camera compatibility
US8687044B2 (en) 2010-02-02 2014-04-01 Microsoft Corporation Depth camera compatibility
US8717469B2 (en) 2010-02-03 2014-05-06 Microsoft Corporation Fast gating photosurface
US8659658B2 (en) 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US8499257B2 (en) 2010-02-09 2013-07-30 Microsoft Corporation Handles interactions for human-computer interface
US8633890B2 (en) 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
US20110199302A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Capturing screen objects using a collision volume
US8928579B2 (en) 2010-02-22 2015-01-06 Andrew David Wilson Interacting with an omni-directionally projected display
US8411948B2 (en) 2010-03-05 2013-04-02 Microsoft Corporation Up-sampling binary images for segmentation
US8655069B2 (en) 2010-03-05 2014-02-18 Microsoft Corporation Updating image segmentation following user input
US8422769B2 (en) 2010-03-05 2013-04-16 Microsoft Corporation Image segmentation using reduced foreground training data
US20110223995A1 (en) 2010-03-12 2011-09-15 Kevin Geisner Interacting with a computer based application
US8279418B2 (en) 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
US8213680B2 (en) 2010-03-19 2012-07-03 Microsoft Corporation Proxy training data for human body tracking
US8514269B2 (en) 2010-03-26 2013-08-20 Microsoft Corporation De-aliasing depth images
US8523667B2 (en) 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US8605763B2 (en) 2010-03-31 2013-12-10 Microsoft Corporation Temperature measurement and control for laser and light-emitting diodes
US9098873B2 (en) 2010-04-01 2015-08-04 Microsoft Technology Licensing, Llc Motion-based interactive shopping environment
US9646340B2 (en) 2010-04-01 2017-05-09 Microsoft Technology Licensing, Llc Avatar-based virtual dressing room
US8351651B2 (en) 2010-04-26 2013-01-08 Microsoft Corporation Hand-location post-process refinement in a tracking system
US8379919B2 (en) 2010-04-29 2013-02-19 Microsoft Corporation Multiple centroid condensation of probability distribution clouds
US8284847B2 (en) 2010-05-03 2012-10-09 Microsoft Corporation Detecting motion for a multifunction sensor device
US9110557B2 (en) 2010-05-04 2015-08-18 Timocco Ltd. System and method for tracking and mapping an object to a target
US8498481B2 (en) 2010-05-07 2013-07-30 Microsoft Corporation Image segmentation using star-convexity constraints
US8885890B2 (en) 2010-05-07 2014-11-11 Microsoft Corporation Depth map confidence filtering
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US8803888B2 (en) 2010-06-02 2014-08-12 Microsoft Corporation Recognition system for sharing information
US8751215B2 (en) 2010-06-04 2014-06-10 Microsoft Corporation Machine based sign language interpreter
US9008355B2 (en) 2010-06-04 2015-04-14 Microsoft Technology Licensing, Llc Automatic depth camera aiming
US9557574B2 (en) 2010-06-08 2017-01-31 Microsoft Technology Licensing, Llc Depth illumination and detection optics
US8330822B2 (en) 2010-06-09 2012-12-11 Microsoft Corporation Thermally-tuned depth camera light source
US8749557B2 (en) 2010-06-11 2014-06-10 Microsoft Corporation Interacting with user interface via avatar
US8675981B2 (en) 2010-06-11 2014-03-18 Microsoft Corporation Multi-modal gender recognition including depth data
US9384329B2 (en) 2010-06-11 2016-07-05 Microsoft Technology Licensing, Llc Caloric burn determination from body movement
US8982151B2 (en) 2010-06-14 2015-03-17 Microsoft Technology Licensing, Llc Independently processing planes of display data
US8670029B2 (en) 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US8558873B2 (en) 2010-06-16 2013-10-15 Microsoft Corporation Use of wavefront coding to create a depth image
US8296151B2 (en) 2010-06-18 2012-10-23 Microsoft Corporation Compound gesture-speech commands
US8381108B2 (en) 2010-06-21 2013-02-19 Microsoft Corporation Natural user input for driving interactive stories
US8416187B2 (en) 2010-06-22 2013-04-09 Microsoft Corporation Item navigation using motion-capture data
US9118832B2 (en) * 2010-08-17 2015-08-25 Nokia Technologies Oy Input method
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US8613666B2 (en) 2010-08-31 2013-12-24 Microsoft Corporation User selection and navigation based on looped motions
US20120058824A1 (en) 2010-09-07 2012-03-08 Microsoft Corporation Scalable real-time motion recognition
US8437506B2 (en) 2010-09-07 2013-05-07 Microsoft Corporation System for fast, probabilistic skeletal tracking
US8988508B2 (en) 2010-09-24 2015-03-24 Microsoft Technology Licensing, Llc. Wide angle field of view active illumination imaging system
US8681255B2 (en) 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
US8548270B2 (en) 2010-10-04 2013-10-01 Microsoft Corporation Time-of-flight depth imaging
US9484065B2 (en) 2010-10-15 2016-11-01 Microsoft Technology Licensing, Llc Intelligent determination of replays based on event identification
US8592739B2 (en) 2010-11-02 2013-11-26 Microsoft Corporation Detection of configuration changes of an optical element in an illumination system
US8866889B2 (en) 2010-11-03 2014-10-21 Microsoft Corporation In-home depth camera calibration
US8667519B2 (en) 2010-11-12 2014-03-04 Microsoft Corporation Automatic passive and anonymous feedback system
US10726861B2 (en) 2010-11-15 2020-07-28 Microsoft Technology Licensing, Llc Semi-private communication in open environments
US9349040B2 (en) * 2010-11-19 2016-05-24 Microsoft Technology Licensing, Llc Bi-modal depth-image analysis
TWI514324B (en) * 2010-11-30 2015-12-21 Ind Tech Res Inst Tracking system and method for image object region and computer program product thereof
US10234545B2 (en) 2010-12-01 2019-03-19 Microsoft Technology Licensing, Llc Light source module
MY162243A (en) * 2010-12-02 2017-05-31 Mimos Berhad Method and system for tracking object using adaptive attention regions
US8553934B2 (en) 2010-12-08 2013-10-08 Microsoft Corporation Orienting the position of a sensor
US8618405B2 (en) 2010-12-09 2013-12-31 Microsoft Corp. Free-space gesture musical instrument digital interface (MIDI) controller
US8408706B2 (en) 2010-12-13 2013-04-02 Microsoft Corporation 3D gaze tracker
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
US8884968B2 (en) 2010-12-15 2014-11-11 Microsoft Corporation Modeling an object from image data
US8920241B2 (en) 2010-12-15 2014-12-30 Microsoft Corporation Gesture controlled persistent handles for interface guides
US8448056B2 (en) 2010-12-17 2013-05-21 Microsoft Corporation Validation analysis of human target
US8803952B2 (en) 2010-12-20 2014-08-12 Microsoft Corporation Plural detector time-of-flight depth mapping
US8385596B2 (en) 2010-12-21 2013-02-26 Microsoft Corporation First person shooter control with virtual skeleton
US9848106B2 (en) 2010-12-21 2017-12-19 Microsoft Technology Licensing, Llc Intelligent gameplay photo capture
US8994718B2 (en) 2010-12-21 2015-03-31 Microsoft Technology Licensing, Llc Skeletal control of three-dimensional virtual world
US9823339B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Plural anode time-of-flight sensor
US9821224B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
US9123316B2 (en) 2010-12-27 2015-09-01 Microsoft Technology Licensing, Llc Interactive content creation
US8488888B2 (en) 2010-12-28 2013-07-16 Microsoft Corporation Classification of posture states
CA2824330C (en) * 2011-01-12 2018-05-01 Videonetics Technology Private Limited An integrated intelligent server based system and method/systems adapted to facilitate fail-safe integration and/or optimized utilization of various sensory inputs
US9247238B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
US8711206B2 (en) 2011-01-31 2014-04-29 Microsoft Corporation Mobile camera localization using depth maps
US8570320B2 (en) 2011-01-31 2013-10-29 Microsoft Corporation Using a three-dimensional environment model in gameplay
US8401242B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Real-time camera tracking using depth maps
US8401225B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Moving object segmentation using depth images
US8587583B2 (en) 2011-01-31 2013-11-19 Microsoft Corporation Three-dimensional environment reconstruction
US8724887B2 (en) 2011-02-03 2014-05-13 Microsoft Corporation Environmental modifications to mitigate environmental factors
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8497838B2 (en) 2011-02-16 2013-07-30 Microsoft Corporation Push actuation of interface controls
US9551914B2 (en) 2011-03-07 2017-01-24 Microsoft Technology Licensing, Llc Illuminator with refractive optical element
US9067136B2 (en) 2011-03-10 2015-06-30 Microsoft Technology Licensing, Llc Push personalization of interface controls
WO2012123033A1 (en) * 2011-03-17 2012-09-20 Ssi Schaefer Noell Gmbh Lager Und Systemtechnik Controlling and monitoring a storage and order-picking system by means of movement and speech
US8571263B2 (en) 2011-03-17 2013-10-29 Microsoft Corporation Predicting joint positions
US9610506B2 (en) 2011-03-28 2017-04-04 Brian M. Dugan Systems and methods for fitness and video games
US9533228B2 (en) * 2011-03-28 2017-01-03 Brian M. Dugan Systems and methods for fitness and video games
US9470778B2 (en) 2011-03-29 2016-10-18 Microsoft Technology Licensing, Llc Learning from high quality depth measurements
US9760566B2 (en) 2011-03-31 2017-09-12 Microsoft Technology Licensing, Llc Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof
US9842168B2 (en) 2011-03-31 2017-12-12 Microsoft Technology Licensing, Llc Task driven user intents
US10642934B2 (en) 2011-03-31 2020-05-05 Microsoft Technology Licensing, Llc Augmented conversational understanding architecture
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US8503494B2 (en) 2011-04-05 2013-08-06 Microsoft Corporation Thermal management system
US8824749B2 (en) 2011-04-05 2014-09-02 Microsoft Corporation Biometric recognition
RU2589395C2 (en) 2011-04-22 2016-07-10 Пепсико, Инк. Dispensing system for beverages with social services capabilities
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US9259643B2 (en) 2011-04-28 2016-02-16 Microsoft Technology Licensing, Llc Control of separate computer game elements
US10671841B2 (en) 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
US8888331B2 (en) 2011-05-09 2014-11-18 Microsoft Corporation Low inductance light source module
US9064006B2 (en) 2012-08-23 2015-06-23 Microsoft Technology Licensing, Llc Translating natural language utterances to keyword search queries
US9137463B2 (en) 2011-05-12 2015-09-15 Microsoft Technology Licensing, Llc Adaptive high dynamic range camera
US8788973B2 (en) 2011-05-23 2014-07-22 Microsoft Corporation Three-dimensional gesture controlled avatar configuration interface
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US8526734B2 (en) 2011-06-01 2013-09-03 Microsoft Corporation Three-dimensional background removal for vision system
US9594430B2 (en) 2011-06-01 2017-03-14 Microsoft Technology Licensing, Llc Three-dimensional foreground selection for vision system
US9013489B2 (en) 2011-06-06 2015-04-21 Microsoft Technology Licensing, Llc Generation of avatar reflecting player appearance
US8897491B2 (en) 2011-06-06 2014-11-25 Microsoft Corporation System for finger recognition and tracking
US8597142B2 (en) 2011-06-06 2013-12-03 Microsoft Corporation Dynamic camera based practice mode
US9098110B2 (en) 2011-06-06 2015-08-04 Microsoft Technology Licensing, Llc Head rotation tracking from depth-based center of mass
US9724600B2 (en) 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US8929612B2 (en) 2011-06-06 2015-01-06 Microsoft Corporation System for recognizing an open or closed hand
US10796494B2 (en) 2011-06-06 2020-10-06 Microsoft Technology Licensing, Llc Adding attributes to virtual representations of real-world objects
US9208571B2 (en) 2011-06-06 2015-12-08 Microsoft Technology Licensing, Llc Object digitization
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US8786730B2 (en) 2011-08-18 2014-07-22 Microsoft Corporation Image exposure using exclusion regions
US9218704B2 (en) 2011-11-01 2015-12-22 Pepsico, Inc. Dispensing system and user interface
US9557836B2 (en) 2011-11-01 2017-01-31 Microsoft Technology Licensing, Llc Depth image compression
US9117281B2 (en) 2011-11-02 2015-08-25 Microsoft Corporation Surface segmentation from RGB and depth images
US8854426B2 (en) 2011-11-07 2014-10-07 Microsoft Corporation Time-of-flight camera with guided light
US8724906B2 (en) 2011-11-18 2014-05-13 Microsoft Corporation Computing pose and/or shape of modifiable entities
US8509545B2 (en) 2011-11-29 2013-08-13 Microsoft Corporation Foreground subject detection
US8803800B2 (en) 2011-12-02 2014-08-12 Microsoft Corporation User interface control based on head orientation
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8879831B2 (en) 2011-12-15 2014-11-04 Microsoft Corporation Using high-level attributes to guide image processing
US8630457B2 (en) 2011-12-15 2014-01-14 Microsoft Corporation Problem states for pose tracking pipeline
US8971612B2 (en) 2011-12-15 2015-03-03 Microsoft Corporation Learning image processing tasks from scene reconstructions
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9342139B2 (en) 2011-12-19 2016-05-17 Microsoft Technology Licensing, Llc Pairing a computing device to a user
US9720089B2 (en) 2012-01-23 2017-08-01 Microsoft Technology Licensing, Llc 3D zoom imager
US8942881B2 (en) 2012-04-02 2015-01-27 Google Inc. Gesture-based automotive controls
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
US9210401B2 (en) 2012-05-03 2015-12-08 Microsoft Technology Licensing, Llc Projected visual cues for guiding physical movement
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
CN104395929B (en) 2012-06-21 2017-10-03 Microsoft Technology Licensing, Llc Avatar construction using depth camera
US9836590B2 (en) 2012-06-22 2017-12-05 Microsoft Technology Licensing, Llc Enhanced accuracy of user presence status determination
US9696427B2 (en) 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US8882310B2 (en) 2012-12-10 2014-11-11 Microsoft Corporation Laser die light source module with low inductance
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9251590B2 (en) 2013-01-24 2016-02-02 Microsoft Technology Licensing, Llc Camera pose estimation for 3D reconstruction
US9052746B2 (en) 2013-02-15 2015-06-09 Microsoft Technology Licensing, Llc User center-of-mass and mass distribution extraction using depth images
US20140240212A1 (en) * 2013-02-22 2014-08-28 Corel Corporation Tracking device tilt calibration using a vision system
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US20140240227A1 (en) * 2013-02-26 2014-08-28 Corel Corporation System and method for calibrating a tracking object in a vision system
US9135516B2 (en) 2013-03-08 2015-09-15 Microsoft Technology Licensing, Llc User body angle, curvature and average extremity positions extraction using depth images
US9092657B2 (en) 2013-03-13 2015-07-28 Microsoft Technology Licensing, Llc Depth image processing
US9274606B2 (en) 2013-03-14 2016-03-01 Microsoft Technology Licensing, Llc NUI video conference controls
US9953213B2 (en) 2013-03-27 2018-04-24 Microsoft Technology Licensing, Llc Self discovery of autonomous NUI devices
US9442186B2 (en) 2013-05-13 2016-09-13 Microsoft Technology Licensing, Llc Interference reduction for TOF systems
US9829984B2 (en) * 2013-05-23 2017-11-28 Fastvdo Llc Motion-assisted visual language for human computer interfaces
US9462253B2 (en) 2013-09-23 2016-10-04 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9674563B2 (en) 2013-11-04 2017-06-06 Rovi Guides, Inc. Systems and methods for recommending content
US9769459B2 (en) 2013-11-12 2017-09-19 Microsoft Technology Licensing, Llc Power efficient laser diode driver circuit and method
US9508385B2 (en) 2013-11-21 2016-11-29 Microsoft Technology Licensing, Llc Audio-visual project generator
US9971491B2 (en) 2014-01-09 2018-05-15 Microsoft Technology Licensing, Llc Gesture library for natural user input
WO2015106114A1 (en) 2014-01-13 2015-07-16 T1visions, Inc. Display capable of object recognition
US20150254235A1 (en) * 2014-03-06 2015-09-10 Boyd Whitley Sign Language Translation
KR102356599B1 (en) * 2014-12-05 2022-01-28 삼성전자주식회사 Method for determining region of interest of image and device for determining region of interest of image
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
JP6144738B2 (en) * 2015-09-18 2017-06-07 株式会社スクウェア・エニックス Video game processing program, video game processing system, and video game processing method
US10412280B2 (en) 2016-02-10 2019-09-10 Microsoft Technology Licensing, Llc Camera with light valve over sensor array
US10257932B2 (en) 2016-02-16 2019-04-09 Microsoft Technology Licensing, Llc. Laser diode chip on printed circuit board
US10462452B2 (en) 2016-03-16 2019-10-29 Microsoft Technology Licensing, Llc Synchronizing active illumination cameras
US10529075B2 (en) 2018-01-26 2020-01-07 Wipro Limited Method and system for tracking objects within a video
EP3795940A1 (en) 2019-09-19 2021-03-24 sentronics metrology GmbH Device and method for inspecting flat objects and for detecting boundary layers of said objects
US11200458B1 (en) 2020-06-15 2021-12-14 Bank Of America Corporation System for integration of a hexagonal image processing framework within a technical environment

Family Cites Families (97)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6442465B2 (en) * 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
US7415126B2 (en) * 1992-05-05 2008-08-19 Automotive Technologies International Inc. Occupant sensing system
US6772057B2 (en) * 1995-06-07 2004-08-03 Automotive Technologies International, Inc. Vehicular monitoring systems using image processing
US6507779B2 (en) * 1995-06-07 2003-01-14 Automotive Technologies International, Inc. Vehicle rear seat monitor
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US5047952A (en) * 1988-10-14 1991-09-10 The Board of Trustees of the Leland Stanford Junior University Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove
US5252951A (en) * 1989-04-28 1993-10-12 International Business Machines Corporation Graphical user interface with gesture recognition in a multiapplication environment
US5759044A (en) * 1990-02-22 1998-06-02 Redmond Productions Methods and apparatus for generating and processing synthetic and absolute real time environments
US5025314A (en) * 1990-07-30 1991-06-18 Xerox Corporation Apparatus allowing remote interactive use of a plurality of writing surfaces
US5898434A (en) * 1991-05-15 1999-04-27 Apple Computer, Inc. User interface system having programmable user interface elements
US5684701A (en) * 1995-06-07 1997-11-04 Automotive Technologies International, Inc. Method and apparatus for sensing a vehicle crash
US5901246A (en) * 1995-06-06 1999-05-04 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
EP0554492B1 (en) * 1992-02-07 1995-08-09 International Business Machines Corporation Method and device for optical input of commands or data
US5699441A (en) * 1992-03-10 1997-12-16 Hitachi, Ltd. Continuous sign-language recognition apparatus and input apparatus
US5887069A (en) * 1992-03-10 1999-03-23 Hitachi, Ltd. Sign recognition apparatus and method and sign translation system using same
US6529809B1 (en) * 1997-02-06 2003-03-04 Automotive Technologies International, Inc. Method of developing a system for identifying the presence and orientation of an object in a vehicle
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5889236A (en) * 1992-06-08 1999-03-30 Synaptics Incorporated Pressure sensitive scrollbar feature
US5270820A (en) * 1992-06-25 1993-12-14 Ultimatte Corporation Method and apparatus for tracking a pointing device in a video field
JPH07325934A (en) * 1992-07-10 1995-12-12 The Walt Disney Co Method and apparatus for providing enhanced graphics in a virtual world
JP3435175B2 (en) * 1992-09-03 2003-08-11 株式会社日立製作所 Sign language learning device
JP3244798B2 (en) * 1992-09-08 2002-01-07 株式会社東芝 Moving image processing device
FR2696258B1 (en) * 1992-09-25 1994-10-28 Sextant Avionique Device for managing a human-machine interaction system.
JP3338992B2 (en) * 1992-10-29 2002-10-28 株式会社日立製作所 Sign language / word conversion system
US5612719A (en) * 1992-12-03 1997-03-18 Apple Computer, Inc. Gesture sensitive buttons for graphical user interfaces
US5659764A (en) * 1993-02-25 1997-08-19 Hitachi, Ltd. Sign language generation apparatus and sign language translation apparatus
US5446661A (en) * 1993-04-15 1995-08-29 Automotive Systems Laboratory, Inc. Adjustable crash discrimination system with occupant position detection
GB9308952D0 (en) * 1993-04-30 1993-06-16 Philips Electronics Uk Ltd Tracking objects in video sequences
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5670987A (en) * 1993-09-21 1997-09-23 Kabushiki Kaisha Toshiba Virtual manipulating apparatus and method
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5914610A (en) * 1994-02-03 1999-06-22 Massachusetts Institute Of Technology Apparatus and method for characterizing movement of a mass within a defined space
JP3630712B2 (en) * 1994-02-03 2005-03-23 キヤノン株式会社 Gesture input method and apparatus
US5732227A (en) * 1994-07-05 1998-03-24 Hitachi, Ltd. Interactive information processing system responsive to user manipulation of physical objects and displayed images
JP3267047B2 (en) * 1994-04-25 2002-03-18 株式会社日立製作所 Information processing device by voice
US6137908A (en) 1994-06-29 2000-10-24 Microsoft Corporation Handwriting recognition system simultaneously considering shape and context information
US5570301A (en) * 1994-07-15 1996-10-29 Mitsubishi Electric Information Technology Center America, Inc. System for unencumbered measurement and reporting of body posture
US5563988A (en) * 1994-08-01 1996-10-08 Massachusetts Institute Of Technology Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment
JPH0863326A (en) * 1994-08-22 1996-03-08 Hitachi Ltd Image processing device/method
JPH08147477A (en) * 1994-09-20 1996-06-07 Fujitsu Ltd Local area image tracking device
US6002428A (en) * 1994-10-21 1999-12-14 Sanyo Electric Co., Ltd. Motion vector detection circuit and object tracking camera device utilizing the same
AUPN003894A0 (en) 1994-12-13 1995-01-12 Xenotech Research Pty Ltd Head tracking system for stereoscopic display apparatus
JP2817646B2 (en) * 1995-02-01 1998-10-30 日本電気株式会社 Document editing device
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5652849A (en) * 1995-03-16 1997-07-29 Regents Of The University Of Michigan Apparatus and method for remote control using a visual information stream
JPH08286831A (en) * 1995-04-14 1996-11-01 Canon Inc Pen input type electronic device and its control method
US5710833A (en) * 1995-04-20 1998-01-20 Massachusetts Institute Of Technology Detection, recognition and coding of complex objects using probabilistic eigenspace analysis
US5757360A (en) * 1995-05-03 1998-05-26 Mitsubishi Electric Information Technology Center America, Inc. Hand held computer control device
DE19516664C1 (en) * 1995-05-05 1996-08-29 Siemens Ag Processor-supported detection of selective target object in image
US5912980A (en) * 1995-07-13 1999-06-15 Hunke; H. Martin Target acquisition and tracking
US6005549A (en) 1995-07-24 1999-12-21 Forest; Donald K. User interface method and apparatus
JP3745802B2 (en) * 1995-10-13 2006-02-15 株式会社日立製作所 Image generation / display device
AU1328597A (en) * 1995-11-30 1997-06-19 Virtual Technologies, Inc. Tactile feedback man-machine interface device
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US5774591A (en) * 1995-12-15 1998-06-30 Xerox Corporation Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images
US5802220A (en) * 1995-12-15 1998-09-01 Xerox Corporation Apparatus and method for tracking facial motion through a sequence of images
JP4079463B2 (en) * 1996-01-26 2008-04-23 ソニー株式会社 Subject detection apparatus and subject detection method
JP3280559B2 (en) * 1996-02-20 2002-05-13 シャープ株式会社 Jog dial simulation input device
US6173066B1 (en) * 1996-05-21 2001-01-09 Cybernet Systems Corporation Pose determination and tracking by matching 3D objects to a 2D sensor
JP3434979B2 (en) * 1996-07-23 2003-08-11 富士通株式会社 Local area image tracking device
US6002808A (en) * 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
JP3843502B2 (en) * 1996-09-30 2006-11-08 マツダ株式会社 Vehicle motion recognition device
US6144366A (en) * 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
US5878151A (en) * 1996-10-31 1999-03-02 Combustion Engineering, Inc. Moving object tracking
US5990865A (en) * 1997-01-06 1999-11-23 Gard; Matthew Davis Computer interface device
US5864848A (en) * 1997-01-31 1999-01-26 Microsoft Corporation Goal-driven information interpretation and extraction system
WO1998035501A2 (en) * 1997-02-06 1998-08-13 Koninklijke Philips Electronics N.V. Image segmentation and object tracking method and corresponding system
US6236736B1 (en) * 1997-02-07 2001-05-22 Ncr Corporation Method and apparatus for detecting movement patterns at a self-service checkout terminal
US5973732A (en) * 1997-02-19 1999-10-26 Guthrie; Thomas C. Object tracking system for monitoring a controlled space
US6009210A (en) 1997-03-05 1999-12-28 Digital Equipment Corporation Hands-free interface to a virtual reality environment using head tracking
US5875257A (en) * 1997-03-07 1999-02-23 Massachusetts Institute Of Technology Apparatus for controlling continuous behavior through hand and arm gestures
GB2324428A (en) * 1997-04-17 1998-10-21 Sharp Kk Image tracking; observer tracking stereoscopic display
JPH10334270A (en) * 1997-05-28 1998-12-18 Mitsubishi Electric Corp Operation recognition device and recorded medium recording operation recognition program
US6185314B1 (en) * 1997-06-19 2001-02-06 Ncr Corporation System and method for matching image information to object model information
US6263088B1 (en) * 1997-06-19 2001-07-17 Ncr Corporation System and method for tracking movement of objects in a scene
US6075895A (en) * 1997-06-20 2000-06-13 Holoplex Methods and apparatus for gesture recognition based on templates
JPH1120606A (en) * 1997-07-01 1999-01-26 Mitsubishi Electric Corp Occupant restraint device
US6188777B1 (en) * 1997-08-01 2001-02-13 Interval Research Corporation Method and apparatus for personnel detection and tracking
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US5907328A (en) * 1997-08-27 1999-05-25 International Business Machines Corporation Automatic and configurable viewpoint switching in a 3D scene
JP3481430B2 (en) * 1997-09-11 2003-12-22 富士通株式会社 Mobile tracking device
US6138908A (en) * 1997-09-19 2000-10-31 Ericsson Inc. Method for updating communications facilitation data
US6088019A (en) * 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US5889523A (en) * 1997-11-25 1999-03-30 Fuji Xerox Co., Ltd. Method and apparatus for dynamically grouping a plurality of graphic objects
JP3660492B2 (en) * 1998-01-27 2005-06-15 株式会社東芝 Object detection device
US6104383A (en) * 1998-02-20 2000-08-15 Shipman; Dale Howard Thumb-actuated computer pointing-input device
US6301370B1 (en) * 1998-04-13 2001-10-09 Eyematic Interfaces, Inc. Face recognition from video images
US6272231B1 (en) * 1998-11-06 2001-08-07 Eyematic Interfaces, Inc. Wavelet-based facial motion capture for avatar animation
US20010008561A1 (en) * 1999-08-10 2001-07-19 Paul George V. Real-time object tracking system
US6950534B2 (en) 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US6154559A (en) * 1998-10-01 2000-11-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) System for classifying an individual's gaze direction
US6545670B1 (en) * 1999-05-11 2003-04-08 Timothy R. Pryor Methods and apparatus for man machine interfaces and related activity
US6766036B1 (en) * 1999-07-08 2004-07-20 Timothy R. Pryor Camera based man machine interfaces
US7050606B2 (en) * 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
US6937744B1 (en) * 2000-06-13 2005-08-30 Microsoft Corporation System and process for bootstrap initialization of nonparametric color models
US6804396B2 (en) * 2001-03-28 2004-10-12 Honda Giken Kogyo Kabushiki Kaisha Gesture recognition system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5574498A (en) * 1993-09-25 1996-11-12 Sony Corporation Target tracking system
US6061055A (en) * 1997-03-21 2000-05-09 Autodesk, Inc. Method of tracking objects with an imaging device

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7121946B2 (en) * 1998-08-10 2006-10-17 Cybernet Systems Corporation Real-time head tracking system for computer games and other applications
US7684592B2 (en) 1998-08-10 2010-03-23 Cybernet Systems Corporation Realtime object tracking system
US20020037770A1 (en) * 1998-08-10 2002-03-28 Paul George V. Real-time head tracking system for computer games and other applications
US20090116692A1 (en) * 1998-08-10 2009-05-07 Paul George V Realtime object tracking system
US9304593B2 (en) 1998-08-10 2016-04-05 Cybernet Systems Corporation Behavior recognition system
US20070066393A1 (en) * 1998-08-10 2007-03-22 Cybernet Systems Corporation Real-time head tracking system for computer games and other applications
US8818647B2 (en) 1999-12-15 2014-08-26 American Vehicular Sciences Llc Vehicular heads-up display system
US20080048930A1 (en) * 1999-12-15 2008-02-28 Automotive Technologies International, Inc. Vehicular Heads-Up Display System
US8032264B2 (en) 1999-12-15 2011-10-04 Automotive Technologies International, Inc. Vehicular heads-up display system
US8686922B2 (en) 1999-12-15 2014-04-01 American Vehicular Sciences Llc Eye-location dependent vehicular heads-up display system
US20080276191A1 (en) * 1999-12-15 2008-11-06 Automotive Technologies International, Inc. Vehicular Heads-Up Display System
US20080158096A1 (en) * 1999-12-15 2008-07-03 Automotive Technologies International, Inc. Eye-Location Dependent Vehicular Heads-Up Display System
US20020126877A1 (en) * 2001-03-08 2002-09-12 Yukihiro Sugiyama Light transmission type image recognition device and image recognition sensor
DE10225077A1 (en) * 2002-06-05 2003-12-24 Vr Magic Gmbh Operating theater object tracking system has moveable optical sensors with position measured in fixed reference system
DE10225077B4 (en) * 2002-06-05 2007-11-15 Vr Magic Gmbh Object tracking device for medical operations
US20080065291A1 (en) * 2002-11-04 2008-03-13 Automotive Technologies International, Inc. Gesture-Based Control of Vehicular Components
US7317812B1 (en) * 2002-11-15 2008-01-08 Videomining Corporation Method and apparatus for robustly tracking objects
US7660439B1 (en) 2003-12-16 2010-02-09 Verificon Corporation Method and system for flow detection and motion analysis
US20060018516A1 (en) * 2004-07-22 2006-01-26 Masoud Osama T Monitoring activity using video information
US20060088196A1 (en) * 2004-10-25 2006-04-27 Popovich Joseph Jr Embedded imaging and control system
US8121392B2 (en) 2004-10-25 2012-02-21 Parata Systems, Llc Embedded imaging and control system
US7245215B2 (en) 2005-02-10 2007-07-17 Pinc Solutions Position-tracking device for position-tracking system
US7236091B2 (en) 2005-02-10 2007-06-26 Pinc Solutions Position-tracking system
US20060176174A1 (en) * 2005-02-10 2006-08-10 Pinc Solutions Position-tracking device for position-tracking system
US20060187028A1 (en) * 2005-02-10 2006-08-24 Pinc Solutions Position-tracking system
US20060210159A1 (en) * 2005-03-15 2006-09-21 Yea-Shuan Huang Foreground extraction approach by using color and local structure information
FR2885719A1 (en) * 2005-05-10 2006-11-17 Thomson Licensing Sa Method and device for tracking objects in an image sequence
US20070018811A1 (en) * 2005-07-05 2007-01-25 Pinc Solutions Systems and methods for determining a location of an object
US7321305B2 (en) 2005-07-05 2008-01-22 Pinc Solutions Systems and methods for determining a location of an object
US20090302030A1 (en) * 2006-03-30 2009-12-10 Advanced Composite Materials Corporation Composite materials and devices comprising single crystal silicon carbide heated by electromagnetic radiation
US20070286458A1 (en) * 2006-06-12 2007-12-13 D&S Consultants, Inc. Method and System for Tracking a Target
US20110123067A1 (en) * 2006-06-12 2011-05-26 D & S Consultants, Inc. Method And System for Tracking a Target
US7606411B2 (en) 2006-10-05 2009-10-20 The United States Of America As Represented By The Secretary Of The Navy Robotic gesture recognition system
US20080085048A1 (en) * 2006-10-05 2008-04-10 Department Of The Navy Robotic gesture recognition system
US8571322B2 (en) * 2007-04-16 2013-10-29 Fujitsu Limited Image processing method, image processing apparatus, image processing system and computer for recognizing shape change of a facial part
US20090310829A1 (en) * 2007-04-16 2009-12-17 Fujitsu Limited Image processing method, image processing apparatus, image processing system and computer program
EP2220588B1 (en) * 2007-12-07 2020-03-25 Robert Bosch GmbH Configuration module for a surveillance system, surveillance system, method for configuring the surveillance system, and computer program
WO2009112575A1 (en) * 2008-03-14 2009-09-17 Bausch & Lomb, Inc. Fast algorithm for streaming wavefront
US9288449B2 (en) 2008-08-05 2016-03-15 University Of Florida Research Foundation, Inc. Systems and methods for maintaining multiple objects within a camera field-of-view
WO2010141378A1 (en) * 2009-05-30 2010-12-09 Sony Computer Entertainment Inc. Color calibration for object tracking
CN101826157A (en) * 2010-04-28 2010-09-08 华中科技大学 Ground static target real-time identifying and tracking method
WO2012178202A1 (en) * 2011-06-23 2012-12-27 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
CN103930944A (en) * 2011-06-23 2014-07-16 奥布隆工业有限公司 Adaptive tracking system for spatial input devices
CN103896025A (en) * 2014-04-02 2014-07-02 东南大学 Intelligent camera based flexible vibration transmission system and operating method thereof
CN105069408A (en) * 2015-07-24 2015-11-18 上海依图网络科技有限公司 Video portrait tracking method based on human face identification in complex scenario
US10421001B2 (en) * 2016-03-30 2019-09-24 Apqs, Llc Ball return device and method of using
US20190388763A1 (en) * 2016-03-30 2019-12-26 Apqs, Llc Ball Return Device and Method of Using
US20170282044A1 (en) * 2016-03-30 2017-10-05 Apqs, Llc Ball Return Device and Method of Using
US10806986B2 (en) * 2016-03-30 2020-10-20 Apqs, Llc Ball return device and method of using
CN112991485A (en) * 2019-12-13 2021-06-18 浙江宇视科技有限公司 Track drawing method and device, readable storage medium and electronic equipment
WO2021129491A1 (en) * 2019-12-25 2021-07-01 中兴通讯股份有限公司 Pedestrian search method, server, and storage medium
CN114359265A (en) * 2022-03-04 2022-04-15 广东顺德富意德智能包装科技有限公司 Screw counting method and system based on target tracking

Also Published As

Publication number Publication date
US20090116692A1 (en) 2009-05-07
US7684592B2 (en) 2010-03-23

Similar Documents

Publication Publication Date Title
US7684592B2 (en) Realtime object tracking system
US7121946B2 (en) Real-time head tracking system for computer games and other applications
US9807365B2 (en) System and method for hybrid simultaneous localization and mapping of 2D and 3D data acquired by sensors from a 3D scene
Winlock et al. Toward real-time grocery detection for the visually impaired
JP2915894B2 (en) Target tracking method and device
US5210799A (en) System and method for ranking and extracting salient contours for target recognition
EP0389968B1 (en) Apparatus and method for extracting edges and lines
Palazzolo et al. Fast image-based geometric change detection given a 3d model
EP2339507B1 (en) Head detection and localisation method
Jiang et al. Multiple pedestrian tracking using colour and motion models
Laskar et al. Stereo vision-based hand gesture recognition under 3D environment
Zoidi et al. Stereo object tracking with fusion of texture, color and disparity information
Kolarow et al. Vision-based hyper-real-time object tracker for robotic applications
Haritaoglu et al. Ghost 3D: detecting body posture and parts using stereo
Zhao et al. Robust multiple object tracking in RGB-D camera networks
Tan et al. Fast vehicle localisation and recognition without line extraction and matching
Palazzolo et al. Change detection in 3d models based on camera images
WO2010114376A1 (en) Video sequence processing method and system
Dryanovski et al. Real-time pose estimation with RGB-D camera
Paul et al. A realtime object tracking system using a color camera
Sabeti et al. Visual tracking using color cameras and time-of-flight range imaging sensors
Yaakob et al. Moving object extraction in PTZ camera using the integration of background subtraction and local histogram processing
Malavika et al. Moving object detection and velocity estimation using MATLAB
Junior et al. Shape-based pedestrian segmentation in still images
Deb et al. A motion region detection and tracking method

Legal Events

Date Code Title Description
AS Assignment

Owner name: CYBERNET SYSTEMS CORPORATION, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PAUL, GEORGE V.;BEACH, GLENN J.;COHEN, CHARLES J.;AND OTHERS;REEL/FRAME:011586/0230

Effective date: 20010301

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment

Owner name: NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CYBERNET SYSTEMS CORPORATION;REEL/FRAME:042369/0414

Effective date: 20170505

AS Assignment

Owner name: JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I;REEL/FRAME:049416/0337

Effective date: 20190606