CN103413137A - Interaction gesture motion trail partition method based on multiple rules - Google Patents
Interaction gesture motion trail partition method based on multiple rules
- Publication number
- CN103413137A CN103413137A CN2013103358201A CN201310335820A CN103413137A CN 103413137 A CN103413137 A CN 103413137A CN 2013103358201 A CN2013103358201 A CN 2013103358201A CN 201310335820 A CN201310335820 A CN 201310335820A CN 103413137 A CN103413137 A CN 103413137A
- Authority
- CN
- China
- Prior art keywords
- point
- tracing
- tracing point
- angle
- vector angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention discloses a multi-rule method for partitioning the motion trajectory of an interactive gesture. A camera records video of the interactive gesture of a user holding a light pen; the motion-trajectory points of the gesture are then extracted with the CamShift method and represented parametrically; constraint conditions are set and partition rules defined; finally, a greedy strategy divides the trajectory points into intervals that satisfy the rules. Different partition rules can be provided for different kinds of interactive gesture motion, and the greedy strategy makes the result more accurate and more efficient.
Description
Technical field
The present invention relates to the field of computer vision, and in particular to a multi-rule method for partitioning the motion trajectory of an interactive gesture.
Background technology
At present, several methods for partitioning the motion trajectory of interactive gestures have been developed at home and abroad. Most, however, require a simple background or require the user to wear gloves of a special color, which restricts human-computer interaction. Kinect-based gesture partitioning methods avoid these restrictions and are comparatively mature, but Kinect hardware is costly and hard to deploy for ordinary users.
Summary of the invention
The object of the invention is to address these deficiencies of the prior art by providing a multi-rule method for partitioning the motion trajectory of an interactive gesture, overcoming the restrictions of existing approaches at reduced cost.
This object is achieved through the following technical solution: a multi-rule method for partitioning the motion trajectory of an interactive gesture, comprising the following steps.
(1) Record, with a camera, video of the interactive gesture of a user holding a light pen: the light pen consists of an ordinary pen body with a light-emitting device fixed to it.
(2) Extract the motion-trajectory points of the gesture with the CamShift method: the video is analyzed and the trajectory points of the user's interactive gesture are extracted, through the following sub-steps:
(2.1) Take the whole video frame as the search area.
(2.2) Choose the region around the light pen's luminous position in the first frame as the initial search window.
(2.3) Compute the color probability distribution in the HSV color space inside the search window.
(2.4) Run the MeanShift algorithm to obtain the position and size of the new search window.
(2.5) In the next video frame, initialize the search window's position and size with the values obtained in step 2.4.
(2.6) Jump back to step 2.3 and continue until the last frame of the video.
(2.7) Compute the position of the center point of the search window in every frame; this position is a trajectory point of the motion. Collecting all such points in the same space yields the complete set of motion-trajectory points.
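As a concrete illustration of step 2.7, the sketch below turns per-frame search windows into trajectory points by taking each window's centre. This is a minimal Python sketch, not code from the patent; the (x, y, w, h) window tuples are assumed outputs of a CamShift-style tracker.

```python
# Hypothetical sketch of step 2.7: one trajectory point per frame, taken as
# the centre of that frame's search window. The (x, y, w, h) tuples are
# assumed tracker outputs, not patent data.

def window_center(x, y, w, h):
    """Centre of an axis-aligned search window."""
    return (x + w / 2.0, y + h / 2.0)

def trajectory_points(windows):
    """Collect one motion-trajectory point per frame."""
    return [window_center(*win) for win in windows]

windows = [(10, 10, 4, 4), (12, 11, 4, 4), (15, 13, 4, 4)]
print(trajectory_points(windows))  # [(12.0, 12.0), (14.0, 13.0), (17.0, 15.0)]
```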
(3) Represent the motion-trajectory points parametrically, through the following sub-steps:
(3.1) Establish a spatial coordinate system. If the trajectory information captured by the camera is 3D, establish a corresponding three-dimensional coordinate system; if it is 2D, likewise establish a three-dimensional coordinate system, with the origin by default at the lower-right corner.
(3.2) Map each trajectory point into the coordinate system, so each point obtains a position coordinate.
(3.3) Suppose the trajectory contains N points (N a natural number), denoted in order P_1(x_1, y_1, z_1), P_2(x_2, y_2, z_2), P_3(x_3, y_3, z_3), ..., P_N(x_N, y_N, z_N), where P_1(x_1, y_1, z_1) is the start point and P_N(x_N, y_N, z_N) the end point. Any point P_i has coordinates P_i(x_i, y_i, z_i) in the three-dimensional coordinate system. The segment joining P_i to its neighbour P_{i+1} is P_iP_{i+1}; take the vector P_1P_2 as the reference vector a, denote the vector P_iP_{i+1} by b, and let their lengths be |a| and |b|. The angle between a and b is called the vector angle of P_i.
The angle is obtained from the cosine of the angle between the two vectors: for a = (a_x, a_y, a_z) and b = (b_x, b_y, b_z),
cos θ = (a_x b_x + a_y b_y + a_z b_z) / (|a| |b|).
The instantaneous speed of point P_i is defined as the average speed over the segment P_iP_{i+1}:
Speed_i = |P_iP_{i+1}| / 0.125;
(3.4) Starting from the start point, input the position coordinate of each trajectory point one by one.
(3.5) Set the trajectory-point color to white, display the points in the window's coordinate system, and compute the vector angle associated with each coordinate point.
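The parameterization of step 3.3 can be sketched in Python under the patent's definitions: the reference vector is P_1P_2, the vector at P_i is P_iP_{i+1}, and the instantaneous speed divides the segment length by the 0.125 s frame interval. Function names are illustrative, not from the patent.

```python
import math

def sub(p, q):
    """Vector from point p to point q."""
    return tuple(qi - pi for pi, qi in zip(p, q))

def vector_angle_deg(a, b):
    """Angle between vectors a and b, via the cosine formula."""
    dot = sum(ai * bi for ai, bi in zip(a, b))
    na = math.sqrt(sum(ai * ai for ai in a))
    nb = math.sqrt(sum(bi * bi for bi in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def parameterize(points, frame_dt=0.125):
    """(vector angle in degrees, instantaneous speed) for each interior P_i."""
    ref = sub(points[0], points[1])        # reference vector P1P2
    out = []
    for i in range(1, len(points) - 1):
        v = sub(points[i], points[i + 1])  # vector PiPi+1
        speed = math.sqrt(sum(c * c for c in v)) / frame_dt
        out.append((vector_angle_deg(ref, v), speed))
    return out

pts = [(0, 0, 0), (1, 0, 0), (2, 1, 0), (2, 2, 0)]
print(parameterize(pts))  # angles 45° and 90°; speeds ≈ 11.31 and 8.0
```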
(4) Set constraint conditions and define the partition rules: the constraint may be the instantaneous speed of the trajectory points, their vector angle, or the combination of vector angle and instantaneous speed. With instantaneous speed as the constraint, the speed of the first point in an interval serves as the reference, and subsequent consecutive points whose speed varies within a set amplitude threshold are assigned to the same segment. With vector angle as the constraint, the vector angle of the first and second points of the segment serves as the reference, and subsequent consecutive points whose vector angle varies within the set threshold and changes monotonically are assigned to the same segment. With the combination of vector angle and instantaneous speed, a segment must satisfy both the angle condition and the speed condition.
(5) Set the partition rules and use a greedy strategy to divide the trajectory points into valid intervals:
When step 4 uses instantaneous speed as the constraint, the partition proceeds as follows:
(A) Starting from the trajectory's start point, take the instantaneous speed of the first trajectory point as the reference and the point itself as the start of a segment; set i = 2.
(B) Check whether the speed of trajectory point P_i lies within the set amplitude threshold:
If the instantaneous speed of P_i satisfies the constraint, add it to the current segment, set i = i + 1, and jump to step B.
If the instantaneous speed is outside the threshold and so violates the constraint, take P_i as the start of a new segment, take its instantaneous speed as the new reference, and take the previous point P_{i-1} as the end of the previous segment. If the previous segment is drawn as a white trajectory, mark the new start point and all points after it as a dashed trajectory; if the previous segment is dashed, mark them as white. Set i = i + 1 and jump to step B.
(C) Repeat until the end of the trajectory is reached.
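The speed-based greedy loop of steps A-C can be sketched as below. This is an interpretation of the patent's procedure, not its code: the 30% default threshold is taken from the embodiment, and the alternating "white"/"dashed" labels stand in for the patent's white and sectional-line trajectory styles.

```python
def segment_by_speed(speeds, rel_threshold=0.30):
    """Greedy split: each segment's first speed is the reference; a point
    joins while its speed stays within rel_threshold of that reference."""
    segments, labels = [], []
    current, ref, label = [0], speeds[0], "white"
    for i in range(1, len(speeds)):
        if abs(speeds[i] - ref) <= rel_threshold * ref:
            current.append(i)                  # constraint met: same segment
        else:
            segments.append(current)           # close the previous segment
            labels.append(label)
            label = "dashed" if label == "white" else "white"
            current, ref = [i], speeds[i]      # P_i starts a new segment
    segments.append(current)
    labels.append(label)
    return segments, labels

print(segment_by_speed([10, 11, 12, 20, 21, 5]))
# ([[0, 1, 2], [3, 4], [5]], ['white', 'dashed', 'white'])
```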
When step 4 uses the vector angle as the constraint, the partition proceeds as follows:
(a) Starting from the trajectory's start point, take the vector P_1P_2 as the initial reference vector and the first trajectory point as the start of a segment; set i = 2.
(b) Compute the vector angle of trajectory point P_i, i.e. the angle between the reference vector and the vector P_iP_{i+1}; the angle is obtained from its cosine.
(c) Check whether P_i satisfies the partition rule: its vector angle must lie within the set threshold range, and the angles must change monotonically.
If the vector angle of P_i satisfies the constraint, and together with the previous angle in the current segment keeps increasing or keeps decreasing, add P_i to the current segment, set i = i + 1, and jump to step b.
If the vector angle exceeds the threshold range, or lies within it but breaks the monotonic change within the segment, the constraint is violated: take P_i as the start of a new segment, set P_iP_{i+1} as the new segment's reference vector, and take the previous point P_{i-1} as the end of the previous segment. If P_{i-1} belongs to a white trajectory, mark the new start point P_i and all points after it as a dashed trajectory; if P_{i-1} belongs to a dashed trajectory, mark them as white. Then set i = i + 1 and jump to step b.
(d) Repeat until the end of the trajectory is reached.
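The angle-based greedy loop of steps a-d can be sketched as follows. This is an interpretation, not the patent's code: the final trajectory point, which has no following vector, is left out of the sketch for brevity, and names are illustrative.

```python
import math

def sub(p, q):
    return tuple(qi - pi for pi, qi in zip(p, q))

def angle_deg(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def segment_by_angle(points, max_angle=90.0):
    """Greedy split on the vector angle: a point joins the current segment
    while its angle to the segment's reference vector stays within
    [0, max_angle] and the angles change monotonically."""
    segments = [[0]]
    ref = sub(points[0], points[1])       # reference vector P1P2
    prev_angle, direction = None, 0       # direction: +1 up, -1 down, 0 unset
    for i in range(1, len(points) - 1):
        a = angle_deg(ref, sub(points[i], points[i + 1]))
        ok = a <= max_angle
        if ok and prev_angle is not None:
            step = a - prev_angle
            if direction == 0 and step != 0:
                direction = 1 if step > 0 else -1
            elif direction * step < 0:
                ok = False                # monotonic change broken
        if ok:
            segments[-1].append(i)
            prev_angle = a
        else:                             # P_i starts a new segment
            segments.append([i])
            ref = sub(points[i], points[i + 1])
            prev_angle, direction = None, 0
    return segments

track = [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1)]
print(segment_by_angle(track))  # [[0, 1, 2], [3]]
```

The sharp backward turn at the fourth point produces an angle of 180°, outside the [0°, 90°] default, so a new segment begins there.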
The beneficial effect of the invention is that different partition rules can be provided for different types of interactive gesture motion, and the greedy strategy makes the result more accurate and more efficient.
Brief description of the drawings
Fig. 1 is a schematic profile of the light pen used;
Fig. 2 is a schematic of the motion-trajectory points obtained by CamShift analysis;
Fig. 3 is a schematic of the partition result after the rules and parameter thresholds are set;
Fig. 4 is a schematic of the steps of capturing, extracting, and identifying the motion trajectory.
Embodiment
The invention is further described below in conjunction with the accompanying drawings.
The multi-rule interactive-gesture trajectory partition method of the present invention is carried out in the following concrete steps.
1. Record, with a camera, video of the interactive gesture of a user holding a light pen.
The invention uses a simple, purpose-designed light pen, shown in Fig. 1: a light-emitting device added to an ordinary pen body. Users are accustomed to holding a pen, so this design lets them operate with a conventional pen grip, and the cost is low. Before the partition steps, the invention collects and analyzes video of the user's interactive gesture; the capture device may be a digital video PC camera connected to a computer, or a digital camera.
2. Extract the motion-trajectory points of the gesture with the CamShift method.
This part analyzes the video and extracts the motion-trajectory points of the user's interactive gesture; the flow is shown in Fig. 4.
2.1. Take the whole video frame as the search area.
2.2. Choose the region around the light pen's luminous position in the first frame as the initial search window.
2.3. Compute the color probability distribution in the HSV color space inside the search window.
2.4. Run the MeanShift algorithm to obtain the position and size of the new search window.
2.5. In the next video frame, initialize the search window's position and size with the values obtained in step 2.4.
2.6. Jump back to step 2.3 and continue until the last frame of the video.
2.7. Compute the position of the center point of the search window in every frame; this position is a trajectory point of the motion. Collecting all such points in the same space yields the complete set of motion-trajectory points, with the result shown in Fig. 2.
Steps 2.3-2.6 can be completed by directly calling the CamShift function in the OpenCV library, cvCamShift(): given the initial search window and the color probability distribution in the HSV color space inside the window, it returns the search window for every frame.
The flow of steps 1 and 2 is shown in Fig. 4.
3. Represent the motion-trajectory points parametrically.
The collected trajectory is a set of discrete points; if the points cannot be distinguished, the subsequent partition is severely hampered. Establishing a spatial coordinate system and attaching parameter information to the trajectory points therefore plays a vital role in the partition that follows. The information this method needs is the spatial position of each trajectory point and its temporal information. The parameterization proceeds in the following steps.
3.1. Establish a spatial coordinate system. If the trajectory information captured by the camera is 3D, establish a corresponding three-dimensional coordinate system; if it is 2D, likewise establish a three-dimensional coordinate system. By default the origin is placed at the lower-right corner.
3.2. Map each trajectory point into the coordinate system, so each point obtains a position coordinate.
3.3. Suppose the trajectory contains N points, denoted in order P_1(x_1, y_1, z_1), P_2(x_2, y_2, z_2), P_3(x_3, y_3, z_3), ..., P_N(x_N, y_N, z_N), where P_1 is the start point and P_N the end point. Any point P_i has coordinates P_i(x_i, y_i, z_i) in the three-dimensional coordinate system. The segment joining P_i to its neighbour P_{i+1} is P_iP_{i+1}; take the vector P_1P_2 as the reference vector a, denote the vector P_iP_{i+1} by b, and let their lengths be |a| and |b|. The angle between a and b is called the vector angle of P_i.
The angle is obtained from the cosine of the angle between the two vectors: for a = (a_x, a_y, a_z) and b = (b_x, b_y, b_z),
cos θ = (a_x b_x + a_y b_y + a_z b_z) / (|a| |b|).
The default PC camera captures 8 frames per second, so frames are 0.125 seconds apart, and so is the time difference between consecutive trajectory points. The instantaneous speed of point P_i is defined as the average speed over the segment P_iP_{i+1}:
Speed_i = |P_iP_{i+1}| / 0.125.
3.4. Starting from the start point, input the position coordinate of each trajectory point one by one.
3.5. Set the trajectory-point color to white, display the points in the window's coordinate system, and compute the vector angle associated with each coordinate point.
4. Set constraint conditions and define the partition rules.
Once the parameter information of the motion-trajectory points is available, the trajectory is partitioned into intervals according to it. Thresholds on the parameters give the constraint conditions, which determine the rule for each split and keep every segment within the parameter thresholds. Step 3 supplied the coordinate, the instantaneous speed, and the associated vector angle of each trajectory point. Different constraints are chosen according to the characteristics of the interactive gesture.
If the direction of motion changes greatly and the trajectory is strongly curved, use the instantaneous speed of the trajectory points as the constraint: the speed of the first point of an interval is the reference, and subsequent consecutive points whose speed varies within an amplitude threshold (30% by default, user-adjustable) are assigned to the same segment.
If the motion amplitude of the gesture is large but the direction changes little, use the vector angle, or the combination of vector angle and instantaneous speed, as the constraint: the vector angle of the first and second points of the segment is the reference, and subsequent points whose vector angle varies within an amplitude threshold (within [0°, 90°] per segment by default, user-adjustable) and changes monotonically are assigned to the same segment. If vector-angle change and instantaneous speed are used together as constraints, a segment must satisfy both the angle condition and the speed condition simultaneously.
An advantage of the method is that, within a single framework, different partition rules can be set according to the motion characteristics of different interactive gestures.
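For the combined rule above, a point must pass both checks before joining a segment. A minimal hedged sketch (names are illustrative; 30% and [0°, 90°] are the patent's stated defaults; the monotonicity check on angles is omitted for brevity):

```python
def satisfies_combined(speed, ref_speed, angle_deg,
                       rel_threshold=0.30, max_angle=90.0):
    """True only if both the speed condition and the angle condition hold."""
    speed_ok = abs(speed - ref_speed) <= rel_threshold * ref_speed
    angle_ok = 0.0 <= angle_deg <= max_angle
    return speed_ok and angle_ok

print(satisfies_combined(11.0, 10.0, 45.0))   # True
print(satisfies_combined(20.0, 10.0, 45.0))   # False: speed outside the 30% band
print(satisfies_combined(11.0, 10.0, 120.0))  # False: angle outside [0°, 90°]
```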
5. Set the partition rules and use a greedy strategy to divide the trajectory points into valid intervals.
Once the rules are set, the motion trajectory can be partitioned; this method uses a greedy strategy to realize the partition. The constraint is selected from the characteristics of the interactive gesture, as detailed in step 4.
If the direction of motion changes greatly and the trajectory is strongly curved, the instantaneous speed is used as the constraint, and the partition proceeds as follows:
A) Starting from the trajectory's start point, take the instantaneous speed of the first trajectory point as the reference and the point itself as the start of a segment; set i = 2.
B) Check whether the speed of trajectory point P_i lies within the set amplitude range (by default, within a 30% increase or decrease of the reference point's instantaneous speed):
If the instantaneous speed of P_i satisfies the constraint, add it to the current segment, set i = i + 1, and jump to step B.
If the instantaneous speed is outside the threshold and so violates the constraint, take P_i as the start of a new segment, take its instantaneous speed as the new reference, and take the previous point P_{i-1} as the end of the previous segment. If the previous segment is drawn as a white trajectory, mark the new start point and all points after it as a dashed trajectory; if the previous segment is dashed, mark them as white. Set i = i + 1 and jump to step B.
C) Repeat until the end of the trajectory is reached.
If the motion amplitude of the interactive gesture is large and the direction changes little, the vector angle of the trajectory points can be used as the constraint defining the partition rule, and the partition proceeds as follows:
a) Starting from the trajectory's start point, take the vector P_1P_2 as the initial reference vector and the first trajectory point as the start of a segment; set i = 2.
b) Compute the vector angle of trajectory point P_i, i.e. the angle between the reference vector and the vector P_iP_{i+1}; the angle is obtained from its cosine.
c) Check whether P_i satisfies the partition rule: its vector angle must stay within the set amplitude (by default, within the [0°, 90°] range per segment) and the angles must change monotonically.
If the vector angle of P_i satisfies the constraint, and together with the previous angle in the current segment keeps increasing or keeps decreasing, add P_i to the current segment, set i = i + 1, and jump to step b.
If the vector angle exceeds the threshold range, or lies within it but breaks the monotonic change within the segment, the constraint is violated: take P_i as the start of a new segment, set P_iP_{i+1} as the new reference vector, and take the previous point P_{i-1} as the end of the previous segment. If P_{i-1} belongs to a white trajectory, mark the new start point P_i and all points after it as a dashed trajectory; if P_{i-1} belongs to a dashed trajectory, mark them as white. Then set i = i + 1 and jump to step b.
d) Repeat until the end of the trajectory is reached.
Taking the motion-trajectory points of Fig. 2 as an example: the motion amplitude is large and the direction changes little, so the vector-angle constraint is used, with [0°, 90°] as the threshold amplitude and monotonic angle change as the partition rule. The corresponding flow of steps 4 and 5 is as follows:
a) The start point is P_1(x_1, y_1, z_1), the start of the first segment; denote the vector P_1P_2 as the reference vector; set i = 2.
b) Compute the cosine of the vector angle of trajectory point P_i.
c) From the cosine of the vector angle of P_i, convert to obtain the angle.
If the angle lies within [0°, 90°], and together with the vector angle of the previous point in the segment forms an increasing or decreasing sequence, P_i satisfies the partition rule: add it to the current segment, set i = i + 1, and jump to step b.
If the partition rule is not satisfied, take P_i as the start of a new segment, take P_iP_{i+1} as the new reference vector, and take P_{i-1} as the end of the previous segment. If the points up to P_{i-1} are a dashed trajectory, mark all points after P_{i-1} as white; if they are white, mark all points from P_i on as dashed. Set i = i + 1 and jump to step b.
d) The last trajectory point is the end point. The trajectory points are now divided into alternating white and dashed segments, with the result shown in Fig. 3.
Claims (1)
1. A multi-rule method for partitioning the motion trajectory of an interactive gesture, characterized in that the method comprises the following steps:
(1) Record, with a camera, video of the interactive gesture of a user holding a light pen: the light pen consists of an ordinary pen body with a light-emitting device fixed to it.
(2) Extract the motion-trajectory points of the gesture with the CamShift method: the video is analyzed and the trajectory points of the user's interactive gesture are extracted, through the following sub-steps:
(2.1) Take the whole video frame as the search area.
(2.2) Choose the region around the light pen's luminous position in the first frame as the initial search window.
(2.3) Compute the color probability distribution in the HSV color space inside the search window.
(2.4) Run the MeanShift algorithm to obtain the position and size of the new search window.
(2.5) In the next video frame, initialize the search window's position and size with the values obtained in step 2.4.
(2.6) Jump back to step 2.3 and continue until the last frame of the video.
(2.7) Compute the position of the center point of the search window in every frame; this position is a trajectory point of the motion. Collecting all such points in the same space yields the complete set of motion-trajectory points.
(3) Represent the motion-trajectory points parametrically, through the following sub-steps:
(3.1) Establish a spatial coordinate system. If the trajectory information captured by the camera is 3D, establish a corresponding three-dimensional coordinate system; if it is 2D, likewise establish a three-dimensional coordinate system, with the origin by default at the lower-right corner.
(3.2) Map each trajectory point into the coordinate system, so each point obtains a position coordinate.
(3.3) Suppose the trajectory contains N points (N a natural number), denoted in order P_1(x_1, y_1, z_1), P_2(x_2, y_2, z_2), P_3(x_3, y_3, z_3), ..., P_N(x_N, y_N, z_N), where P_1(x_1, y_1, z_1) is the start point and P_N(x_N, y_N, z_N) the end point. Any point P_i has coordinates P_i(x_i, y_i, z_i) in the three-dimensional coordinate system. The segment joining P_i to its neighbour P_{i+1} is P_iP_{i+1}; take the vector P_1P_2 as the reference vector a, denote the vector P_iP_{i+1} by b, and let their lengths be |a| and |b|. The angle between a and b is called the vector angle of P_i.
The angle is obtained from the cosine of the angle between the two vectors: for a = (a_x, a_y, a_z) and b = (b_x, b_y, b_z),
cos θ = (a_x b_x + a_y b_y + a_z b_z) / (|a| |b|).
The instantaneous speed of point P_i is defined as the average speed over the segment P_iP_{i+1}:
Speed_i = |P_iP_{i+1}| / 0.125;
(3.4) Starting from the start point, input the position coordinate of each trajectory point one by one.
(3.5) Set the trajectory-point color to white, display the points in the window's coordinate system, and compute the vector angle associated with each coordinate point.
(4) Set constraint conditions and define the partition rules: the constraint may be the instantaneous speed of the trajectory points, their vector angle, or the combination of vector angle and instantaneous speed. With instantaneous speed as the constraint, the speed of the first point in an interval serves as the reference, and subsequent consecutive points whose speed varies within a set amplitude threshold are assigned to the same segment. With vector angle as the constraint, the vector angle of the first and second points of the segment serves as the reference, and subsequent consecutive points whose vector angle varies within the set threshold and changes monotonically are assigned to the same segment. With the combination of vector angle and instantaneous speed, a segment must satisfy both the angle condition and the speed condition.
(5) Set the partition rules and use a greedy strategy to divide the trajectory points into valid intervals:
When step 4 uses instantaneous speed as the constraint, the partition proceeds as follows:
(A) Starting from the trajectory's start point, take the instantaneous speed of the first trajectory point as the reference and the point itself as the start of a segment; set i = 2.
(B) Check whether the speed of trajectory point P_i lies within the set amplitude threshold:
If the instantaneous speed of P_i satisfies the constraint, add it to the current segment, set i = i + 1, and jump to step B.
If the instantaneous speed is outside the threshold and so violates the constraint, take P_i as the start of a new segment, take its instantaneous speed as the new reference, and take the previous point P_{i-1} as the end of the previous segment. If the previous segment is drawn as a white trajectory, mark the new start point and all points after it as a dashed trajectory; if the previous segment is dashed, mark them as white. Set i = i + 1 and jump to step B.
(C) Repeat until the end of the trajectory is reached.
When step 4 uses the vector angle as the constraint, the partition proceeds as follows:
(a) Starting from the trajectory's start point, take the vector P_1P_2 as the initial reference vector and the first trajectory point as the start of a segment; set i = 2.
(b) Compute the vector angle of trajectory point P_i, i.e. the angle between the reference vector and the vector P_iP_{i+1}; the angle is obtained from its cosine.
(c) Check whether P_i satisfies the partition rule: its vector angle must lie within the set threshold range, and the angles must change monotonically.
If the vector angle of P_i satisfies the constraint, and together with the previous angle in the current segment keeps increasing or keeps decreasing, add P_i to the current segment, set i = i + 1, and jump to step b.
If the vector angle exceeds the threshold range, or lies within it but breaks the monotonic change within the segment, the constraint is violated: take P_i as the start of a new segment, set P_iP_{i+1} as the new segment's reference vector, and take the previous point P_{i-1} as the end of the previous segment. If P_{i-1} belongs to a white trajectory, mark the new start point P_i and all points after it as a dashed trajectory; if P_{i-1} belongs to a dashed trajectory, mark them as white. Then set i = i + 1 and jump to step b.
(d) Repeat until the end of the trajectory is reached.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310335820.1A CN103413137B (en) | 2013-08-05 | 2013-08-05 | Based on the interaction gesture movement locus dividing method of more rules |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103413137A true CN103413137A (en) | 2013-11-27 |
CN103413137B CN103413137B (en) | 2016-04-27 |
Family
ID=49606144
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310335820.1A Active CN103413137B (en) | 2013-08-05 | 2013-08-05 | Based on the interaction gesture movement locus dividing method of more rules |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103413137B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6487304B1 (en) * | 1999-06-16 | 2002-11-26 | Microsoft Corporation | Multi-view approach to motion and stereo |
CN101362511A (en) * | 2008-09-19 | 2009-02-11 | 浙江大学 | Synergetic control method of aircraft part pose alignment based on four locater |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104616317A (en) * | 2014-12-12 | 2015-05-13 | 宁波中国科学院信息技术应用研究院 | Video vehicle tracking validity checking method |
CN104616317B (en) * | 2014-12-12 | 2017-05-24 | 宁波中国科学院信息技术应用研究院 | Video vehicle tracking validity checking method |
CN107787497A (en) * | 2015-06-10 | 2018-03-09 | 维塔驰有限公司 | Method and apparatus for the detection gesture in the space coordinates based on user |
CN107787497B (en) * | 2015-06-10 | 2021-06-22 | 维塔驰有限公司 | Method and apparatus for detecting gestures in a user-based spatial coordinate system |
CN106326811A (en) * | 2015-06-26 | 2017-01-11 | 浙江大学 | Segmentation credibility-based motion track segmentation method |
CN106326811B (en) * | 2015-06-26 | 2019-05-31 | 浙江大学 | A kind of motion profile dividing method based on segmentation reliability |
CN105427346A (en) * | 2015-12-01 | 2016-03-23 | 中国农业大学 | Motion target tracking method and system |
CN108446032A (en) * | 2017-12-28 | 2018-08-24 | 安徽慧视金瞳科技有限公司 | A kind of mouse gestures implementation method in projection interactive system |
CN108446032B (en) * | 2017-12-28 | 2022-03-08 | 安徽慧视金瞳科技有限公司 | Mouse gesture implementation method in projection interaction system |
CN113160273A (en) * | 2021-03-25 | 2021-07-23 | 常州工学院 | Intelligent monitoring video segmentation method based on multi-target tracking |
Also Published As
Publication number | Publication date |
---|---|
CN103413137B (en) | 2016-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103413137B (en) | Interaction gesture motion trail partition method based on multiple rules | |
CN103353935B (en) | 3D dynamic gesture recognition method for a smart home system | |
CN106598227B (en) | Gesture identification method based on Leap Motion and Kinect | |
CN103135756B (en) | Method and system for generating control instructions | |
CN204463032U (en) | System and virtual reality headset for gesture input in a 3D scene | |
KR101700817B1 (en) | Apparatus and method for multiple arms and hands detection and tracking using 3D images | |
CN104517100B (en) | Gesture pre-judging method and system | |
CN106658023A (en) | End-to-end visual odometry method based on deep learning | |
CN105068748A (en) | User interface interaction method in the camera live view of an intelligent touch-screen device | |
CN104834887B (en) | Moving pedestrian representation method, recognition method and device | |
CN103530892A (en) | Kinect-sensor-based two-hand tracking method and device | |
CN103399637A (en) | Kinect-based human-computer interaction method for an intelligent skeleton-tracking-controlled robot | |
CN101593022A (en) | Fast human-computer interaction method based on fingertip tracking | |
CN104571511A (en) | System and method for reproducing objects in a 3D scene | |
CN105138990A (en) | Single-camera-based gesture convex hull detection and palm positioning method | |
CN102236414A (en) | Picture operation method and system in a three-dimensional display space | |
CN103237155B (en) | Tracking and localization method for a single-view occluded target | |
CN103995595A (en) | Game somatosensory control method based on hand gestures | |
CN109839827B (en) | Gesture recognition intelligent household control system based on full-space position information | |
TWI479430B (en) | Gesture identification with natural images | |
CN106251348A (en) | Adaptive multi-cue fusion background subtraction method for depth cameras | |
CN204463031U (en) | System and virtual reality headset for reproducing objects in a 3D scene | |
Inoue et al. | Tracking Robustness and Green View Index Estimation of Augmented and Diminished Reality for Environmental Design | |
CN102968615A (en) | Anti-interference three-dimensional somatic data identification method in dense people flow | |
CN202815864U (en) | Gesture identification system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |