CN103413137B - Interaction gesture motion trajectory segmentation method based on multiple rules - Google Patents
Interaction gesture motion trajectory segmentation method based on multiple rules
- Publication number
- CN103413137B CN103413137B CN201310335820.1A CN201310335820A CN103413137B CN 103413137 B CN103413137 B CN 103413137B CN 201310335820 A CN201310335820 A CN 201310335820A CN 103413137 B CN103413137 B CN 103413137B
- Authority
- CN
- China
- Prior art keywords
- point
- trajectory
- trajectory point
- angle
- vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses a multi-rule-based method for segmenting the motion trajectory of an interaction gesture. The method first captures video of the interaction gesture of a user holding a light pen with a camera, then extracts the motion trajectory points of the gesture with the CamShift method, represents the trajectory points parametrically, sets constraint conditions and defines segmentation rules, and finally uses a greedy strategy to split the trajectory points into valid segments. The present invention can provide different segmentation methods for different types of interaction gesture motion, and its greedy strategy makes the result more accurate and the computation more efficient.
Description
Technical field
The present invention relates to the field of computer vision, and in particular to a multi-rule-based method for segmenting the motion trajectory of an interaction gesture.
Background art
At present, several methods for segmenting the motion trajectory of an interaction gesture have been investigated both domestically and abroad. However, most of them are premised on a simple background or require the user to wear gloves of a special color, which places restrictions on the human-computer interaction. Kinect-based interaction gesture segmentation methods avoid these restrictions and are comparatively mature, but Kinect devices are relatively expensive and are therefore hard to deploy for ordinary users.
Summary of the invention
The object of the invention is to address the deficiencies of the prior art by providing a multi-rule-based method for segmenting the motion trajectory of an interaction gesture, thereby overcoming the complexity restrictions of the prior art and reducing cost.
This object is achieved through the following technical solution: a multi-rule-based interaction gesture motion trajectory segmentation method comprising the following steps.
(1) Capture video of the interaction gesture of a user holding a light pen with a camera: the light pen is formed by fixing a light-emitting device to the writing end of an ordinary pen.
(2) Extract the motion trajectory points of the gesture with the CamShift method: the video is analyzed and the motion trajectory points of the user's interaction gesture are extracted from it. This step comprises the following sub-steps:
(2.1) Take the whole video frame as the search area.
(2.2) Choose the region around the light pen's luminous position in the first frame as the initial search window.
(2.3) Compute the color probability distribution inside the search window in HSV color space.
(2.4) Run the mean-shift algorithm to obtain the position and size of a new search window.
(2.5) In the next video frame, initialize the search window's position and size with the values obtained in step (2.4).
(2.6) Jump back to step (2.3) and continue until the last frame of the video.
(2.7) Compute the position of the center point of the search window in every frame; this position is taken as a trajectory point of the motion. Collecting all the points in the same space yields the complete set of motion trajectory points.
(3) Represent the motion trajectory points parametrically; this step comprises the following sub-steps:
(3.1) Establish a spatial coordinate system. If the trajectory information captured by the camera is 3D, establish a corresponding three-dimensional coordinate system; if it is 2D, likewise establish a three-dimensional coordinate system, with the origin placed at the lower-right corner by default.
(3.2) Map the trajectory points into the coordinate system, so that each trajectory point obtains a positional coordinate.
(3.3) Suppose the trajectory contains N points, denoted in order: P1(x1, y1, z1), P2(x2, y2, z2), P3(x3, y3, z3), ..., PN(xN, yN, zN), where N is a natural number. P1(x1, y1, z1) is the start point and PN(xN, yN, zN) the end point. Any point Pi has coordinates Pi(xi, yi, zi) in the three-dimensional coordinate system. Pi and its neighbor Pi+1 are joined by the line segment PiPi+1; denote the vector P1P2 by V1 and the vector PiPi+1 by V2, with lengths |V1| and |V2| respectively. The angle between V1 and V2 is computed and recorded as the vector angle of Pi.
The angle is obtained from the cosine of the angle between the two vectors: for V1 = (x1, y1, z1) and V2 = (x2, y2, z2) (where (x1, y1, z1) and (x2, y2, z2) here denote the components of V1 and V2),
cos θ = (x1x2 + y1y2 + z1z2) / (|V1| |V2|).
The instantaneous speed of point Pi is defined as the average speed over segment PiPi+1:
Speed_i = |PiPi+1| / 0.125
(3.4) Starting from the start point, input the positional coordinates of each trajectory point one by one.
(3.5) Set the trajectory point color to white, display the trajectory points in the window's coordinate system, and compute the vector angle associated with each coordinate point.
(4) Set constraint conditions and define segmentation rules: the instantaneous speed of the trajectory points, the vector angle, or a combination of the vector angle and the instantaneous speed may be adopted as the constraint condition. When the instantaneous speed is used, the instantaneous speed of the first point of a segment serves as the reference, and the subsequent consecutive points whose speed variation stays within a set amplitude threshold are assigned to the same segment. When the vector angle is used, the vector angle formed by the first and second trajectory points of a segment serves as the reference, and the subsequent consecutive points whose vector angle variation stays within a set threshold and changes monotonically are assigned to the same segment. When the vector angle and the instantaneous speed are combined, a segment must simultaneously satisfy both the angle-variation condition and the speed-variation condition.
(5) With the segmentation rule set, use a greedy strategy to split the trajectory points into valid segments:
When step (4) adopts the instantaneous speed as the constraint condition, segmentation proceeds as follows:
(A) Starting from the beginning of the trajectory, take the instantaneous speed of the first trajectory point P1 as the reference, take P1 as the start of the first segment, and set i = 2.
(B) Check whether the speed variation of trajectory point Pi lies within the set amplitude threshold:
If the instantaneous speed of Pi satisfies the constraint condition, add Pi to the current segment, set i = i + 1, and jump to step (B).
If the instantaneous speed lies outside the set threshold range and thus violates the constraint condition, take Pi as the start of a new segment, take its instantaneous speed as the new reference, and take the previous point Pi-1 as the end of the previous segment. If the previous trajectory points were drawn as a white-style track, draw the new segment's start point and all subsequent points as a dashed-style track; if they were drawn as a dashed-style track, draw the new start point and all subsequent points as a white-style track. Set i = i + 1 and jump to step (B).
(C) Repeat the above steps until the end of the trajectory is reached.
When step (4) adopts the vector angle as the constraint condition, segmentation proceeds as follows:
(a) Starting from the beginning of the trajectory, take the vector P1P2 as the initial reference vector, take the first trajectory point as the start of the first segment, and set i = 2.
(b) Compute the vector angle of trajectory point Pi, i.e. the angle between the reference vector and the vector PiPi+1, by computing the cosine of the angle and converting it to the corresponding angle.
(c) Check whether Pi satisfies the segmentation rule: its vector angle must lie within the set threshold range and must vary monotonically.
If the vector angle of Pi satisfies the constraint condition, and together with the previous angles in the current segment it remains monotonically increasing or monotonically decreasing, add Pi to the current segment, set i = i + 1, and jump to step (b).
If the vector angle of Pi lies outside the threshold range, or lies within it but breaks the monotonic variation of the angles in the current segment, the constraint condition is violated: take Pi as the start of a new segment, set PiPi+1 as the new segment's reference vector, and take the previous point Pi-1 as the end of the previous segment. If Pi-1 belongs to a white-style track, draw the new start point Pi and all subsequent points as a dashed-style track; if Pi-1 belongs to a dashed-style track, draw Pi and all subsequent points as a white-style track. Then set i = i + 1 and jump to step (b).
(d) Repeat the above steps until the end of the trajectory is reached.
The beneficial effect of the invention is that it can provide different segmentation methods for different types of interaction gesture motion, and its greedy strategy makes the result more accurate and the computation more efficient.
Description of the drawings
Fig. 1 is a schematic diagram of the appearance of the light pen;
Fig. 2 is a schematic diagram of the motion trajectory points obtained by the CamShift analysis;
Fig. 3 is a schematic diagram of the segmentation result obtained after the rules and parameter thresholds are set;
Fig. 4 is a schematic diagram of the steps of capturing, extracting and recognizing the motion trajectory.
Embodiment
The invention is further described below with reference to the accompanying drawings.
The multi-rule-based interaction gesture motion trajectory segmentation method of the present invention is carried out in the following concrete steps.
1. Capture video of the interaction gesture of a user holding a light pen with a camera.
The present invention uses a light pen of simple design: as shown in Fig. 1, a light-emitting device is added to the writing end of an ordinary pen. Users are accustomed to holding a pen, so with this structure they can operate with their usual pen-holding posture, and the cost is low. Before the segmentation steps, the video of the user's interaction gesture must be collected and analyzed; the collection device can be a digital camera or a PC camera connected to a computer.
2. Extract the motion trajectory points of the gesture with the CamShift method.
This part analyzes the video and extracts the motion trajectory points of the user's interaction gesture from it; the flow of this step is shown in Fig. 4.
2.1. Take the whole video frame as the search area.
2.2. Choose the region around the light pen's luminous position in the first frame as the initial search window.
2.3. Compute the color probability distribution inside the search window in HSV color space.
2.4. Run the mean-shift algorithm to obtain the position and size of a new search window.
2.5. In the next video frame, initialize the search window's position and size with the values obtained in step 2.4.
2.6. Jump back to step 2.3 and continue until the last frame of the video.
2.7. Compute the position of the center point of the search window in every frame; this position is taken as a trajectory point of the motion. Collecting all the points in the same space yields the complete set of motion trajectory points; the result is shown in Fig. 2.
The process of steps 2.3-2.6 can be completed by directly calling the CamShift implementation in the OpenCV library. The function is cvCamShift(); given the initial search window and the color probability distribution inside it under HSV color space, it returns the search window for every frame.
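As an illustration of steps 2.3-2.6, the mean-shift iteration at the heart of CamShift can be sketched in pure Python. This is a hypothetical, minimal version: a real implementation would call OpenCV's cvCamShift() (cv2.CamShift in the Python bindings) on the HSV back-projection of each frame; here `prob` is a toy 2D color-probability grid, and the names mean_shift and track_centers are illustrative, not from the patent.

```python
# Hypothetical sketch of the mean-shift step inside CamShift (steps 2.3-2.6).
# `prob` stands in for the color probability distribution of one frame.

def mean_shift(prob, cx, cy, half, iters=20):
    """Shift a (2*half+1)-square window to the centroid of its probabilities."""
    h, w = len(prob), len(prob[0])
    for _ in range(iters):
        total = sx = sy = 0.0
        for y in range(max(0, cy - half), min(h, cy + half + 1)):
            for x in range(max(0, cx - half), min(w, cx + half + 1)):
                p = prob[y][x]
                total += p
                sx += p * x
                sy += p * y
        if total == 0.0:
            break                      # no probability mass in the window: stay put
        nx, ny = round(sx / total), round(sy / total)
        if (nx, ny) == (cx, cy):       # converged
            break
        cx, cy = nx, ny
    return cx, cy

def track_centers(frames, cx, cy, half):
    """Step 2.7: the window centre in every frame is one trajectory point."""
    points = []
    for prob in frames:
        cx, cy = mean_shift(prob, cx, cy, half)
        points.append((cx, cy))
    return points
```

Note that this sketch keeps the window size fixed; CamShift proper also adapts the window size between frames from the probability mass inside it.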
The flow of steps 1 and 2 is shown in Fig. 4.
3. Represent the motion trajectory points parametrically.
The collected motion trajectory is a set of discrete points; if the trajectory points could not be distinguished, the subsequent segmentation would be severely affected. Establishing a spatial coordinate system and attaching parameter information to the trajectory points is therefore vital for the subsequent segmentation. The information required by this segmentation method consists of the spatial position information and the temporal information of the trajectory points. The parametrization proceeds in the following steps.
3.1. Establish a spatial coordinate system. If the trajectory information captured by the camera is 3D, establish a corresponding three-dimensional coordinate system; if it is 2D, likewise establish a three-dimensional coordinate system. The origin is placed at the lower-right corner by default.
3.2. Map the trajectory points into the coordinate system, so that each trajectory point obtains a positional coordinate.
3.3. Suppose the trajectory contains N points, denoted in order: P1(x1, y1, z1), P2(x2, y2, z2), P3(x3, y3, z3), ..., PN(xN, yN, zN). P1(x1, y1, z1) is the start point and PN(xN, yN, zN) the end point. Any point Pi has coordinates Pi(xi, yi, zi) in the three-dimensional coordinate system. Pi and its neighbor Pi+1 are joined by the line segment PiPi+1; denote the vector P1P2 by V1 and the vector PiPi+1 by V2, with lengths |V1| and |V2| respectively. The angle between V1 and V2 is computed and recorded as the vector angle of Pi.
The angle is obtained from the cosine of the angle between the two vectors: for V1 = (x1, y1, z1) and V2 = (x2, y2, z2) (where (x1, y1, z1) and (x2, y2, z2) here denote the components of V1 and V2),
cos θ = (x1x2 + y1y2 + z1z2) / (|V1| |V2|).
By default the PC camera captures 8 frames per second, so the interval between frames, and hence the time difference between two consecutive trajectory points, is 0.125 s. The instantaneous speed of point Pi is defined as the average speed over segment PiPi+1:
Speed_i = |PiPi+1| / 0.125.
3.4. Starting from the start point, input the positional coordinates of each trajectory point one by one.
3.5. Set the trajectory point color to white, display the trajectory points in the window's coordinate system, and compute the vector angle associated with each coordinate point.
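The parametrization of step 3.3 — the vector angle of each point relative to the reference vector P1P2, and the instantaneous speed |PiPi+1| / 0.125 — can be sketched as follows. This is a minimal sketch: the function names are illustrative, and the 0.125 s interval is the patent's stated default of 8 frames per second.

```python
import math

def vector(p, q):
    """Vector from point p to point q (3D tuples)."""
    return tuple(qc - pc for pc, qc in zip(p, q))

def angle_deg(u, v):
    """Angle between two 3D vectors, from cos theta = (u.v) / (|u||v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # clamp against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def parametrize(points, dt=0.125):
    """For each Pi: (vector angle w.r.t. P1P2, speed |PiPi+1| / dt)."""
    ref = vector(points[0], points[1])          # reference vector P1P2
    params = []
    for i in range(len(points) - 1):
        seg = vector(points[i], points[i + 1])  # vector PiPi+1
        length = math.sqrt(sum(c * c for c in seg))
        params.append((angle_deg(ref, seg), length / dt))
    return params
```

For example, a trajectory moving one unit per frame yields a speed of 1 / 0.125 = 8 units per second, and a right-angle turn yields a vector angle of 90°.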
4. Set constraint conditions and define segmentation rules.
After the parameter information of the motion trajectory points has been obtained, the trajectory must be segmented into intervals according to this information.
Constraint conditions are formulated by setting thresholds on the parameters; these determine the rule of each segmentation, so that every segment stays within the parameter's threshold range.
Step 3 has already provided the coordinate information of the trajectory points, their instantaneous speeds and the associated vector angles.
Different constraint conditions are selected according to the characteristics of the interaction gesture.
If the direction of motion of the interaction gesture changes considerably and the trajectory is relatively curved, the instantaneous speed of the trajectory points is adopted as the constraint condition: the instantaneous speed of the first point of a segment serves as the reference, and subsequent consecutive points whose speed variation stays within a certain amplitude threshold (30% by default, also settable by the user) are assigned to the same segment.
If the motion amplitude of the interaction gesture is large but the direction changes little, the vector angle, or the vector angle combined with the instantaneous speed, can be used as the constraint condition: the vector angle formed by the first and second trajectory points of a segment serves as the reference, and subsequent points whose vector angle variation stays within a certain amplitude threshold (by default the vector angle within a segment varies within [0°, 90°]; the user can set this) and varies monotonically are assigned to the same segment. If the vector angle variation and the instantaneous speed are used simultaneously as constraint conditions, a segment must simultaneously satisfy both the angle-variation condition and the speed-variation condition.
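A minimal sketch of the combined constraint: a point joins the current segment only if both the angle condition and the speed condition hold. The [0°, 90°] angle band and the ±30% speed tolerance are the patent's stated defaults; the function names are illustrative.

```python
def in_speed_band(speed, ref_speed, tolerance=0.30):
    """Speed condition: within +/- tolerance of the segment's reference speed."""
    return abs(speed - ref_speed) <= tolerance * ref_speed

def in_angle_band(angle, lo=0.0, hi=90.0):
    """Angle condition: vector angle inside the default [0, 90] degree band."""
    return lo <= angle <= hi

def satisfies_combined(angle, speed, ref_speed):
    """Combined rule: BOTH conditions must hold simultaneously."""
    return in_angle_band(angle) and in_speed_band(speed, ref_speed)
```

The monotonicity requirement on the angle, which depends on the whole segment rather than a single point, is handled in the segmentation loop itself (step 5).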
An advantage of the method is that, within the same framework, different segmentation rules can be set according to the motion characteristics of different interaction gestures.
5. With the segmentation rule set, use a greedy strategy to split the trajectory points into valid segments.
Once the rule has been set, the motion trajectory can be segmented; this method uses a greedy strategy to carry out the segmentation.
The constraint condition is selected according to the characteristics of the interaction gesture, as described in detail in step 4.
If the direction of motion of the interaction gesture changes considerably and the trajectory is relatively curved, the instantaneous speed is adopted as the constraint condition, and segmentation proceeds as follows:
A) Starting from the beginning of the trajectory, take the instantaneous speed of the first trajectory point P1 as the reference, take P1 as the start of the first segment, and set i = 2.
B) Check whether the speed variation of trajectory point Pi lies within the set amplitude range (by default, within a 30% increase or decrease of the reference point's instantaneous speed):
If the instantaneous speed of Pi satisfies the constraint condition, add Pi to the current segment, set i = i + 1, and jump to step B.
If it lies outside the set threshold range and thus violates the constraint condition, take Pi as the start of a new segment, take its instantaneous speed as the new reference, and take the previous point Pi-1 as the end of the previous segment. If the previous trajectory points were drawn as a white-style track, draw the new segment's start point and all subsequent points as a dashed-style track; if they were drawn as a dashed-style track, draw them as a white-style track. Set i = i + 1 and jump to step B.
C) Repeat the above steps until the end of the trajectory is reached.
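The speed-based greedy procedure of steps A)-C) can be sketched as follows. This is a minimal illustration: the function name and the (start, end) index-pair output format are assumptions, not from the patent; the default tolerance is the patent's 30%.

```python
def segment_by_speed(speeds, tolerance=0.30):
    """Greedily group consecutive points whose speed stays within
    +/- tolerance of the first (reference) speed of the segment.
    speeds[i] is the instantaneous speed of trajectory point Pi."""
    if not speeds:
        return []
    segments, start, ref = [], 0, speeds[0]
    for i in range(1, len(speeds)):
        if abs(speeds[i] - ref) > tolerance * ref:
            segments.append((start, i - 1))  # Pi-1 closes the old segment
            start, ref = i, speeds[i]        # Pi opens a new segment
    segments.append((start, len(speeds) - 1))
    return segments
```

A single left-to-right pass suffices because the greedy rule only ever compares a point against the current segment's reference speed.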
If the motion amplitude of the interaction gesture is large but the direction changes little, the vector angle associated with the trajectory points can be adopted as the constraint condition, and segmentation proceeds as follows:
a) Starting from the beginning of the trajectory, take the vector P1P2 as the initial reference vector, take the first trajectory point as the start of the first segment, and set i = 2.
b) Compute the vector angle of trajectory point Pi, i.e. the angle between the reference vector and the vector PiPi+1, by computing the cosine of the angle and converting it to the corresponding angle.
c) Check whether Pi satisfies the segmentation rule: its vector angle must lie within the set amplitude range (by default the vector angle within a segment varies within [0°, 90°]) and must vary monotonically.
If the vector angle of Pi satisfies the constraint condition, and together with the previous angles in the current segment it remains monotonically increasing or monotonically decreasing, add Pi to the current segment, set i = i + 1, and jump to step b).
If the vector angle of Pi lies outside the threshold range, or lies within it but breaks the monotonic variation of the angles in the current segment, the constraint condition is violated: take Pi as the start of a new segment, set PiPi+1 as the new segment's reference vector, and take the previous point Pi-1 as the end of the previous segment. If Pi-1 belongs to a white-style track, draw the new start point Pi and all subsequent points as a dashed-style track; if Pi-1 belongs to a dashed-style track, draw them as a white-style track. Then set i = i + 1 and jump to step b).
d) Repeat the above steps until the end of the trajectory is reached.
Taking the motion trajectory points of Fig. 2 as an example: the motion amplitude of this trajectory is large and its direction changes little, so the vector angle constraint is used, with an angle variation of [0°, 90°] as the threshold amplitude and monotonic angle variation required by the segmentation rule. The corresponding flow of steps 4 and 5 is as follows:
a) The start point is P1(x1, y1, z1); take it as the start of the first segment, denote the vector P1P2 as the reference vector, and set i = 2.
b) Denote PiPi+1 as the current vector.
c) Compute the cosine of the vector angle of trajectory point Pi.
d) Convert the cosine to obtain the vector angle of Pi.
If this angle lies within [0°, 90°], and together with the vector angles of the preceding points of the current segment it forms an increasing or decreasing sequence, then Pi satisfies the segmentation rule: add Pi to the current segment, set i = i + 1, and jump to step b).
If the segmentation rule is not satisfied, take Pi as the start of a new segment, take PiPi+1 as the new reference vector, and take Pi-1 as the end of the previous segment. If the points up to Pi-1 form a dashed-style track, mark the points from Pi onward as white-style; if they form a white-style track, mark the points from Pi onward as dashed-style. Set i = i + 1 and jump to step b).
e) The last trajectory point is the end point; the trajectory points are now divided into segments alternating between white-style and dashed-style tracks, with the result shown in Fig. 3.
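The alternating white/dashed display of step e) can be sketched as a labeling pass over the computed segments. A minimal illustration; the function name and the style strings are assumptions, not from the patent.

```python
def label_styles(segments, n):
    """Alternate 'white' / 'dashed' styles over n trajectory points,
    given segments as inclusive (start, end) index pairs."""
    styles = ['white'] * n
    for k, (s, e) in enumerate(segments):
        if k % 2 == 1:                 # every second segment is dashed
            for j in range(s, e + 1):
                styles[j] = 'dashed'
    return styles
```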
Claims (1)
1. A multi-rule-based interaction gesture motion trajectory segmentation method, characterized in that the method comprises the following steps:
(1) capturing video of the interaction gesture of a user holding a light pen with a camera: the light pen is formed by fixing a light-emitting device to the writing end of an ordinary pen;
(2) extracting the motion trajectory points of the gesture with the CamShift method: analyzing the video and extracting the motion trajectory points of the user's interaction gesture from it, this step comprising the following sub-steps:
(2.1) taking the whole video frame as the search area;
(2.2) choosing the region around the light pen's luminous position in the first frame as the initial search window;
(2.3) computing the color probability distribution inside the search window in HSV color space;
(2.4) running the mean-shift algorithm to obtain the position and size of a new search window;
(2.5) in the next video frame, initializing the search window's position and size with the values obtained in step (2.4);
(2.6) jumping back to step (2.3) and continuing until the last frame of the video;
(2.7) computing the position of the center point of the search window in every frame, this position being taken as a trajectory point of the motion; collecting all the points in the same space yields the complete set of motion trajectory points;
(3) representing the motion trajectory points parametrically, this step comprising the following sub-steps:
(3.1) establishing a spatial coordinate system: if the trajectory information captured by the camera is 3D, establishing a corresponding three-dimensional coordinate system; if it is 2D, likewise establishing a three-dimensional coordinate system, with the origin placed at the lower-right corner by default;
(3.2) mapping the trajectory points into the coordinate system, so that each trajectory point obtains a positional coordinate;
(3.3) the trajectory points of step (3.2) number N, denoted in order: P1(x1, y1, z1), P2(x2, y2, z2), P3(x3, y3, z3), ..., PN(xN, yN, zN), where N is a natural number; P1(x1, y1, z1) is the start point and PN(xN, yN, zN) the end point; any point Pi has coordinates Pi(xi, yi, zi) in the three-dimensional coordinate system; Pi and its neighbor Pi+1 are joined by the line segment PiPi+1; the vector P1P2 is denoted V1 and the vector PiPi+1 is denoted V2, with lengths |V1| and |V2| respectively; the angle between V1 and V2 is computed and recorded as the vector angle of Pi;
the angle is obtained from the cosine of the angle between the two vectors: for V1 = (x1, y1, z1) and V2 = (x2, y2, z2) (where (x1, y1, z1) and (x2, y2, z2) here denote the components of V1 and V2),
cos θ = (x1x2 + y1y2 + z1z2) / (|V1| |V2|);
the instantaneous speed of point Pi is defined as the average speed over segment PiPi+1:
Speed_i = |PiPi+1| / 0.125;
(3.4) starting from the start point, inputting the positional coordinates of each trajectory point one by one;
(3.5) setting the trajectory point color to white, displaying the trajectory points in the window's coordinate system, and computing the vector angle associated with each coordinate point;
(4) setting constraint conditions and defining segmentation rules: the instantaneous speed of the trajectory points, the vector angle, or a combination of the vector angle and the instantaneous speed may be adopted as the constraint condition: when the instantaneous speed is used, the instantaneous speed of the first point of a segment serves as the reference, and the subsequent consecutive points whose speed variation stays within a set amplitude threshold are assigned to the same segment; when the vector angle is used, the vector angle formed by the first and second trajectory points of a segment serves as the reference, and the subsequent consecutive points whose vector angle variation stays within a set threshold and changes monotonically are assigned to the same segment; when the vector angle and the instantaneous speed are combined, a segment must simultaneously satisfy both the angle-variation condition and the speed-variation condition;
(5) with the segmentation rule set, using a greedy strategy to split the trajectory points into valid segments:
when step (4) adopts the instantaneous speed as the constraint condition, segmentation proceeds as follows:
(A) starting from the beginning of the trajectory, taking the instantaneous speed of the first trajectory point P1 as the reference, taking P1 as the start of the first segment, and setting i = 2;
(B) checking whether the speed variation of trajectory point Pi lies within the set amplitude threshold:
if the instantaneous speed of Pi satisfies the constraint condition, adding Pi to the current segment, setting i = i + 1, and jumping to step (B);
if the instantaneous speed of Pi lies outside the set threshold range and thus violates the constraint condition, taking Pi as the start of a new segment, taking its instantaneous speed as the new reference, and taking the previous point Pi-1 as the end of the previous segment; if the previous trajectory points were drawn as a white-style track, drawing the new segment's start point and all subsequent points as a dashed-style track; if they were drawn as a dashed-style track, drawing them as a white-style track; setting i = i + 1 and jumping to step (B);
(C) repeating the above steps until the end of the trajectory is reached;
when step (4) adopts the vector angle as the constraint condition, segmentation proceeds as follows:
(a) starting from the beginning of the trajectory, taking the vector P1P2 as the initial reference vector, taking the first trajectory point as the start of the first segment, and setting i = 2;
(b) computing the vector angle of trajectory point Pi, i.e. the angle between the reference vector and the vector PiPi+1, by computing the cosine of the angle and converting it to the corresponding angle;
(c) checking whether Pi satisfies the segmentation rule: its vector angle must lie within the set threshold range and must vary monotonically;
if the vector angle of Pi satisfies the constraint condition, and together with the previous angles in the current segment it remains monotonically increasing or monotonically decreasing, adding Pi to the current segment, setting i = i + 1, and jumping to step (b);
if the vector angle of Pi lies outside the threshold range, or lies within it but breaks the monotonic variation of the angles in the current segment, thus violating the constraint condition, taking Pi as the start of a new segment, setting PiPi+1 as the new segment's reference vector, and taking the previous point Pi-1 as the end of the previous segment; if Pi-1 belongs to a white-style track, drawing the new start point Pi and all subsequent points as a dashed-style track; if Pi-1 belongs to a dashed-style track, drawing them as a white-style track; then setting i = i + 1 and jumping to step (b);
(d) repeating the above steps until the end of the trajectory is reached.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310335820.1A CN103413137B (en) | 2013-08-05 | 2013-08-05 | Based on the interaction gesture movement locus dividing method of more rules |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103413137A CN103413137A (en) | 2013-11-27 |
CN103413137B true CN103413137B (en) | 2016-04-27 |
Family
ID=49606144
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310335820.1A Active CN103413137B (en) | 2013-08-05 | 2013-08-05 | Based on the interaction gesture movement locus dividing method of more rules |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103413137B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104616317B (en) * | 2014-12-12 | 2017-05-24 | 宁波中国科学院信息技术应用研究院 | Video vehicle tracking validity checking method |
KR101754126B1 (en) * | 2015-06-10 | 2017-07-19 | 주식회사 브이터치 | Gesture detection method and apparatus on user-oriented spatial coordinate system |
CN106326811B (en) * | 2015-06-26 | 2019-05-31 | 浙江大学 | A kind of motion profile dividing method based on segmentation reliability |
CN105427346B (en) * | 2015-12-01 | 2018-06-29 | 中国农业大学 | A kind of motion target tracking method and system |
CN108446032B (en) * | 2017-12-28 | 2022-03-08 | 安徽慧视金瞳科技有限公司 | Mouse gesture implementation method in projection interaction system |
CN113160273A (en) * | 2021-03-25 | 2021-07-23 | 常州工学院 | Intelligent monitoring video segmentation method based on multi-target tracking |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6487304B1 (en) * | 1999-06-16 | 2002-11-26 | Microsoft Corporation | Multi-view approach to motion and stereo |
CN101362511A (en) * | 2008-09-19 | 2009-02-11 | 浙江大学 | Synergetic control method of aircraft part pose alignment based on four locater |
- 2013-08-05: CN CN201310335820.1A patent/CN103413137B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6487304B1 (en) * | 1999-06-16 | 2002-11-26 | Microsoft Corporation | Multi-view approach to motion and stereo |
CN101362511A (en) * | 2008-09-19 | 2009-02-11 | 浙江大学 | Synergetic control method of aircraft part pose alignment based on four locater |
Also Published As
Publication number | Publication date |
---|---|
CN103413137A (en) | 2013-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103413137B (en) | Based on the interaction gesture movement locus dividing method of more rules | |
CN103353935B (en) | A kind of 3D dynamic gesture identification method for intelligent domestic system | |
CN103135756B (en) | Generate the method and system of control instruction | |
CN103399637B (en) | Based on the intelligent robot man-machine interaction method of kinect skeleton tracing control | |
JP6332281B2 (en) | Information processing apparatus, information processing method, and program | |
US20170192519A1 (en) | System and method for inputting gestures in 3d scene | |
CN102769802A (en) | Man-machine interactive system and man-machine interactive method of smart television | |
US20170116461A1 (en) | Image processing device and image processing method | |
US10127702B2 (en) | Image processing device and image processing method | |
CN101593022A (en) | A kind of quick human-computer interaction of following the tracks of based on finger tip | |
CN104517100B (en) | Gesture pre-judging method and system | |
CN102663364A (en) | Imitated 3D gesture recognition system and method | |
CN104571511A (en) | System and method for reproducing objects in 3D scene | |
CN113269158B (en) | Augmented reality gesture recognition method based on wide-angle camera and depth camera | |
CN102236414A (en) | Picture operation method and system in three-dimensional display space | |
WO2012046432A1 (en) | Information processing apparatus, information processing system and information processing method | |
CN107665507B (en) | Method and device for realizing augmented reality based on plane detection | |
CN109839827B (en) | Gesture recognition intelligent household control system based on full-space position information | |
CN107392098A (en) | A kind of action completeness recognition methods based on human skeleton information | |
CN106251348A (en) | A kind of self adaptation multi thread towards depth camera merges background subtraction method | |
CN113822251B (en) | Ground reconnaissance robot gesture control system and control method based on binocular vision | |
Inoue et al. | Tracking Robustness and Green View Index Estimation of Augmented and Diminished Reality for Environmental Design | |
CN204463031U (en) | System and the virtual reality helmet of object is reappeared in a kind of 3D scene | |
Lee et al. | A Hand gesture recognition system based on difference image entropy | |
CN110442242A (en) | A kind of smart mirror system and control method based on the interaction of binocular space gesture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |