US20110018822A1 - Gesture recognition method and touch system incorporating the same - Google Patents
- Publication number
- US20110018822A1 (U.S. application Ser. No. 12/775,838)
- Authority
- US
- United States
- Prior art keywords
- gesture
- plate surface
- touch system
- variation
- recognition method
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0428—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- This invention generally relates to a touch system and, more particularly, to a gesture recognition method and a touch system incorporating the same.
- FIGS. 1a and 1b show operational schematic diagrams of a conventional touch system 9, which includes a touch plate 90 and at least two cameras 91 and 92.
- Fields of view of the cameras 91 and 92 encompass the whole touch plate 90 for capturing images looking across a surface of the touch plate 90.
- When a user contacts the touch plate 90 with one finger 81, the cameras 91 and 92 respectively capture image windows W91 and W92 containing a shadow I81 associated with the tip of the finger 81.
- A processing unit calculates two dimensional coordinates of the finger 81 contacting the touch plate 90 according to one dimensional positions of the shadow I81 associated with the tip of the finger 81. In this manner, the position and displacement of the finger 81 relative to the touch plate 90 can be obtained, and the processing unit may accordingly control a display to execute corresponding operations according to the variation of the two dimensional coordinates of the finger 81.
- When the user contacts the touch plate 90 with two fingers 81 and 82, the image windows W91′ and W92′ respectively captured by the cameras 91 and 92 contain shadows I81 and I82 associated with the two fingers 81 and 82.
- The processing unit respectively calculates two dimensional coordinates of the two fingers 81 and 82 relative to the touch plate 90 according to one dimensional positions of the shadows I81 and I82 contained in the image windows W91′ and W92′, and recognizes the gesture according to a variation of the coordinates of the two fingers 81 and 82.
- However, because the touch system 9 calculates the two dimensional coordinates of a finger contacting the touch plate 90 from the one dimensional positions of the finger-tip shadow in each image window, when a user touches the touch plate 90 with a plurality of fingers, e.g. fingers 81 and 82, the finger 82 may block the finger 81 with respect to the camera 92 as shown in FIG. 1b, so that the image window W92′ captured by the camera 92 may not contain the shadows of all fingers. Accordingly, it is not possible to correctly calculate the two dimensional coordinates of every finger in some circumstances. Although this problem can be solved by installing additional cameras, doing so increases the system cost.
- Accordingly, the present invention provides a gesture recognition method and a touch system incorporating the same so as to solve the problems existing in the above-mentioned conventional touch system.
- the present invention provides a gesture recognition method and a touch system incorporating the same that may perform mode switching according to a contact state variation of a single finger on a plate.
- the present invention provides a gesture recognition method for a touch system including the steps of: capturing images looking across a plate surface with at least one image sensor; processing the images to determine a contact state variation of a single pointer on the plate surface; and recognizing whether a relative variation between the single pointer and the plate surface matches a predetermined gesture when the contact state variation is larger than a threshold.
- the present invention further provides a gesture recognition method for a touch system including the steps of: capturing images looking across a plate surface with at least one image sensor; processing the images to detect a contact point of a single pointer on the plate surface; and recognizing whether a contact of the single pointer on the plate surface matches a predetermined gesture according to a state variation and a position change of the contact point.
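The two method claims above describe the same pipeline: measure a contact-state variation of a single pointer, and only attempt gesture matching once that variation exceeds a threshold. The following is a minimal sketch of that flow; the function names, the use of shadow width as the contact-state measure, and the trajectory-based matcher are illustrative assumptions, not the patent's actual implementation:

```python
def recognize(frames, threshold, match_gesture):
    """Sketch of the claimed method for a single pointer.

    frames        -- iterable of (shadow_width, contact_xy) pairs, one per
                     captured image window
    threshold     -- contact state variation that triggers the second mode
    match_gesture -- callable mapping a trajectory to a gesture name or None
    """
    baseline = None
    trajectory = []
    for width, pos in frames:
        if baseline is None:
            baseline = width          # first contact fixes the reference state
            continue
        if abs(width - baseline) > threshold:
            trajectory.append(pos)    # state changed: record the relative motion
        elif trajectory:
            gesture = match_gesture(trajectory)
            if gesture is not None:
                return gesture
            trajectory = []
    return match_gesture(trajectory) if trajectory else None
```

A toy matcher that calls any three-point trajectory a "scroll" would return that label for a press-harder-then-slide sequence of frames.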
- the present invention further provides a touch system including a plate, at least one light source, at least one image sensor and a processing unit.
- the plate has a plate surface.
- the light source is disposed on the plate surface.
- the image sensor captures image windows, looking across the plate surface, containing a shadow of a single pointer blocking the light source.
- the processing unit recognizes whether a width variation or an area variation of the shadow in the image windows is larger than a threshold and recognizes whether a position change of the single pointer on the plate surface matches a predetermined gesture when the width variation or the area variation is larger than the threshold.
- the touch system in the first mode may control the motion of a cursor according to a coordinate variation (or a position change) of a pointer; in the second mode the touch system may update pictures presented on an image display according to the coordinate variation (or the position change) of the pointer, e.g. updating the pictures to present object select, screen scroll, object drag, object zoom in/out or object rotate, wherein the object may be an icon or a window.
- Since the gesture recognition method of the present invention performs gesture recognition according to a single pointer, miscalculation of the coordinates of a plurality of pointers caused by the pointers blocking one another can be avoided.
- FIG. 1a shows an operational schematic diagram of a conventional touch system.
- FIG. 1b shows another operational schematic diagram of the touch system shown in FIG. 1a.
- FIG. 2a shows a block diagram of the touch system in accordance with an embodiment of the present invention.
- FIG. 2b shows a schematic diagram of a partial field of view of the image sensor shown in FIG. 2a and an image window captured thereby.
- FIG. 3 shows an upper view of the touch system according to the first embodiment of the present invention.
- FIG. 4a shows an operational schematic diagram of the touch system according to the first embodiment of the present invention.
- FIG. 4b shows a schematic diagram of an image window captured by the image sensor shown in FIG. 4a.
- FIG. 5a shows a block diagram of the touch system according to the second embodiment of the present invention.
- FIG. 5b shows a schematic diagram of image windows respectively captured by the two image sensors shown in FIG. 5a.
- FIGS. 6a-6c show operational schematic diagrams of the first mode of the touch system according to the embodiments of the present invention.
- FIGS. 7a-7c show operational schematic diagrams of the second mode of the touch system according to the embodiments of the present invention.
- FIGS. 8a-8c show schematic diagrams of different gestures corresponding to the touch system according to the embodiments of the present invention.
- FIG. 2 a shows a block diagram of the touch system 10 in accordance with an embodiment of the present invention
- FIG. 2 b shows a schematic diagram of a partial field of view of the image sensor 13 and an image window 20 captured by the image sensor 13 shown in FIG. 2 a.
- the touch system 10 includes a plate 100 , an illumination unit 11 , a first light source 121 , a second light source 122 , an image sensor 13 , a processing unit 14 and an image display 15 .
- The plate 100 includes a first side 100a, a second side 100b, a third side 100c, a fourth side 100d and a plate surface 100s.
- Embodiments of the plate 100 include a white board and a touch screen.
- The plate surface 100s serves as the input area of the touch system 10.
- the illumination unit 11 is disposed at the first side 100 a on the plate surface 100 s .
- the illumination unit 11 may be an active light source or a passive light source. When the illumination unit 11 is an active light source, it is preferably a linear light source. When the illumination unit 11 is a passive light source, it is configured to reflect the light from other light sources, e.g. the first light source 121 and second light source 122 , and the illumination unit 11 further includes a reflecting surface 11 a facing the third side 100 c of the plate 100 , wherein the reflecting surface 11 a may be made of proper materials.
- the first light source 121 is disposed at the second side 100 b on the plate surface 100 s and preferably illuminates toward the fourth side 100 d of the plate 100 .
- The second light source 122 is disposed at the third side 100c on the plate surface 100s and preferably illuminates toward the first side 100a of the plate 100, wherein the first light source 121 and the second light source 122 are preferably active light sources, for example, but not limited to, linear light sources.
- the image sensor 13 is preferably disposed at one corner of the plate 100 , for example at the corner intersected by the second light source 122 and the fourth side 100 d of the plate 100 in this embodiment, and the illumination unit 11 is disposed at a side, which is not adjacent to the image sensor 13 , on the plate surface 100 s .
- the image sensor 13 captures images looking across the plate surface 100 s and encompassing a space defined by the illumination unit 11 , first light source 121 , second light source 122 and fourth side 100 d of the plate 100 .
- When a pointer, e.g. a finger 81, approaches the plate surface 100s, the tip of the finger 81 appears in the field of view of the image sensor 13 as shown in the upper part of FIG. 2b.
- The image sensor 13 may successively capture image windows 20 containing a shadow I81 that is formed by the tip of the finger 81 blocking the illumination unit 11 or the first light source 121, as shown in the lower part of FIG. 2b.
- Embodiments of the image sensor 13 include, but are not limited to, a CCD image sensor and a CMOS image sensor. It is appreciated that the pointer may be replaced by another suitable object and is not limited to a finger.
- the processing unit 14 is coupled to the image sensor 13 and configured to process the images captured by the image sensor 13 so as to recognize a width variation or an area variation of a shadow associated with a finger to accordingly control the touch system 10 to operate in a first mode or a second mode.
- When the processing unit 14 recognizes that a pointer contacts the plate surface 100s, it activates the touch system 10 to operate in a first mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100s according to the position of the shadow associated with the pointer in the image window 20, and controls the motion of a cursor shown on the image display 15 according to a variation of the two dimensional coordinates obtained from successive image windows; wherein the two dimensional coordinates of the plate surface 100s may correspond to the position coordinates of a display screen 150 of the image display 15.
- When the processing unit 14 recognizes that the width variation or the area variation, which may become larger or smaller, of the shadow associated with the pointer exceeds a threshold, it controls the touch system 10 to operate in a second mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100s according to the position of the shadow associated with the pointer in the image window 20, performs gesture recognition according to a variation of the two dimensional coordinates between successive image windows, and controls the update of pictures presented on the image display 15 according to the recognized gesture, e.g. controlling the image display to present object select, screen scroll, object drag, object zoom in/out or object rotate; details thereof will be illustrated hereinafter.
- The sensitivity of switching between the first mode and the second mode may be adjusted by dynamically adjusting the threshold, wherein a larger threshold corresponds to a lower sensitivity whereas a smaller threshold corresponds to a higher sensitivity.
- the plate 100 is separated from the image display 15 but it is not a limitation of the present invention.
- the plate 100 may be integrated on the screen 150 of the image display 15 .
- The screen 150 of the image display 15 may also serve as the plate 100, with the illumination unit 11, the first light source 121, the second light source 122 and the image sensor 13 disposed on the surface of the screen 150.
- Although the plate 100 is shown as a rectangle and the illumination unit 11, the first light source 121 and the second light source 122 are perpendicularly disposed at three sides of the plate 100 in FIG. 2a, these arrangements are only exemplary and not a limitation of the present invention.
- the plate 100 may be formed in other shapes; the illumination unit 11 , the first light source 121 , the second light source 122 and the image sensor 13 may be disposed in other spatial relationships on the plate surface 100 s.
- the illumination unit 11 is a passive light source (e.g. a reflecting component) and has a reflecting surface 11 a facing the third side 100 c of the plate 100 .
- the first light source 121 may map a second mirror image 121 ′ relative to the reflecting surface 11 a ; the second light source 122 may map a third mirror image 122 ′ relative to the reflecting surface 11 a ; and the fourth side 100 d of the plate 100 may map a fourth mirror image 100 d ′ relative to the reflecting surface 11 a , wherein the illumination unit 11 , the first light source 121 , the second light source 122 and the fourth side 100 d of the plate 100 together define a real space RS; and the illumination unit 11 , the second mirror image 121 ′, the third mirror image 122 ′ and the fourth mirror image 100 d ′ together define a virtual space IS.
- the image sensor 13 is disposed at the corner intersected by the second light source 122 and the fourth side 100 d of the plate 100 .
- a field of view VA of the image sensor 13 is looking across the plate surface 100 s and encompasses the real space RS and the virtual space IS, and the image sensor 13 is configured to capture image windows containing a shadow associated with a pointer, e.g. a finger 81 , inside the real space RS, wherein the shadow is formed by the pointer blocking the light source 121 and the illumination unit 11 .
- the image sensor 13 further includes a lens (or lens set) for adjusting the field of view VA of the image sensor 13 to allow the image sensor 13 to be able to capture a complete image encompassing the real space RS and the virtual space IS.
- FIG. 4 a shows an operational schematic diagram of the touch system 10 according to the first embodiment of the present invention
- FIG. 4 b shows a schematic diagram of an image window 20 captured by the image sensor 13 shown in FIG. 4 a .
- When a pointer, e.g. a finger 81, contacts the plate surface 100s, the pointer maps a first mirror image in the virtual space IS, shown by a contact point T81′ herein, relative to the reflecting surface 11a of the illumination unit 11 (i.e. a reflecting component in this embodiment).
- the image sensor 13 captures an image of the tip of the pointer through the first sensing route R 81 such that a shadow I 81 will exist in the image window 20 ; it also captures an image of the first mirror image through the second sensing route R 81 ′ such that a shadow I 81 ′ will exist in the image window 20 as shown in FIG. 4 b .
- relative relationships between a one dimensional position of a shadow in the image window 20 and an angle between a sensing route and the third side 100 c of the plate 100 are pre-stored in the processing unit 14 .
- the processing unit 14 may respectively obtain a first angle A 81 and a second angle A 81 ′ according to one dimensional positions of the shadows I 81 , I 81 ′ in the image window 20 .
- the processing unit 14 may obtain a two dimensional coordinate of the contact point T 81 that the pointer contacts with the plate surface 100 s.
- The plate surface 100s forms a Cartesian coordinate system, wherein the third side 100c serves as the X-axis, the fourth side 100d serves as the Y-axis, and the location of the image sensor 13 serves as the origin of the Cartesian coordinate system. Therefore, the coordinate of a contact point T81 in the Cartesian coordinate system may be represented as (a distance to the fourth side 100d, a distance to the third side 100c). In addition, the distance D1 between the first side 100a and the third side 100c of the plate 100 may be pre-stored in the processing unit 14.
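With the image sensor 13 at the origin, the third side 100c as the X-axis and the reflecting surface 11a at distance D1, the mirror image of a contact point (x, y) lies at (x, 2·D1 − y). The two angles A81 and A81′ therefore determine the contact point. The sketch below is one way to carry out this triangulation; the function name and degree-based interface are assumptions for illustration:

```python
import math

def contact_point(a_real_deg, a_mirror_deg, d1):
    """Triangulate a contact point from a single sensor at the origin.

    a_real_deg   -- angle (deg) between the third side and the sensing
                    route to the real pointer tip (A81)
    a_mirror_deg -- angle (deg) to the mirror image of the tip (A81')
    d1           -- distance D1 between the first and third sides

    Since the mirror image of (x, y) lies at (x, 2*d1 - y):
        y        = x * tan(A81)
        2*d1 - y = x * tan(A81')
    Adding the two equations gives x, then y follows.
    """
    t_real = math.tan(math.radians(a_real_deg))
    t_mirror = math.tan(math.radians(a_mirror_deg))
    x = 2.0 * d1 / (t_real + t_mirror)
    y = x * t_real
    return x, y
```

For example, with D1 = 100 and a tip at (50, 40), the mirror image sits at (50, 160), and feeding the corresponding angles back in recovers (50, 40).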
- FIG. 5 a shows a block diagram of the touch system 10 ′ according to the second embodiment of the present invention
- FIG. 5 b shows a schematic diagram of image windows captured by the two image sensors 13 , 13 ′ shown in FIG. 5 a .
- the illumination unit 11 ′ herein is an active light source and the touch system 10 ′ includes two image sensors 13 and 13 ′.
- the touch system 10 ′ includes a plate 100 , an illumination unit 11 ′, a first light source 121 , a second light source 122 , two image sensors 13 , 13 ′ and a processing unit 14 .
- the illumination unit 11 ′ is disposed at the first side 100 a on the plate surface 100 s and preferably illuminates toward the third side 100 c of the plate 100 .
- the first light source 121 is disposed at the second side 100 b on the plate surface 100 s and preferably illuminates toward the fourth side 100 d of the plate 100 .
- the second light source 122 is disposed at the fourth side 100 d on the plate surface 100 s and preferably illuminates toward the second side 100 b of the plate 100 .
- the image sensor 13 is disposed at the intersection of the third side 100 c and the fourth side 100 d of the plate 100 and the field of view thereof is looking across the plate surface 100 s .
- the image sensor 13 ′ is disposed at the intersection of the second side 100 b and the third side 100 c of the plate 100 and the field of view thereof is looking across the plate surface 100 s .
- When a pointer, e.g. a finger 81, contacts the plate surface 100s, the image sensor 13 captures an image window W13 containing a shadow I81 associated with the tip of the finger 81, and the image sensor 13′ captures an image window W13′ containing a shadow I81′ associated with the tip of the finger 81.
- the touch system 10 ′ may also include an image display (not shown) coupled to the processing unit 14 .
- the processing unit 14 is coupled to the image sensors 13 and 13 ′ for processing images captured by the image sensors 13 and 13 ′ to recognize a width variation or an area variation of the shadows I 81 , I 81 ′ associated with a pointer so as to accordingly control the touch system 10 ′ to operate in a first mode or a second mode.
- When the processing unit 14 recognizes that a pointer contacts the plate surface 100s, it activates the touch system 10′ to operate in the first mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100s according to the positions of the shadows I81, I81′ associated with the pointer in the image windows W13 and W13′, and controls the motion of a cursor shown on an image display according to a variation of the two dimensional coordinates obtained from successive image windows W13 and W13′.
- When the processing unit 14 recognizes that the width variation or the area variation of the shadows I81, I81′ associated with the pointer exceeds a threshold, it controls the touch system 10′ to operate in a second mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100s according to the positions of the shadows associated with the pointer in the image windows W13 and W13′, performs gesture recognition according to a variation of the two dimensional coordinates obtained from successive image windows W13 and W13′, and controls the update of pictures presented on an image display according to the recognized gesture, e.g. controlling the image display to present object select, screen scroll, object zoom in/out, object drag or object rotate.
- The calculation of the two dimensional coordinates may also be performed through triangulation; details of the calculation are similar to those illustrated in the first embodiment and thus will not be repeated.
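In the two-sensor arrangement, the sensors 13 and 13′ sit at the two ends of the third side 100c, so the two angles to the pointer's shadow triangulate the contact point against that baseline. A hypothetical sketch (names and degree-based interface are assumptions), placing sensor 13 at the origin and sensor 13′ at (baseline, 0), with both angles measured from the baseline:

```python
import math

def triangulate(alpha_deg, beta_deg, baseline):
    """Two-sensor triangulation along one side of the plate.

    From the two rays:
        y = x * tan(alpha)              (sensor 13 at the origin)
        y = (baseline - x) * tan(beta)  (sensor 13' at (baseline, 0))
    equating them yields x, then y.
    """
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    x = baseline * tb / (ta + tb)
    y = x * ta
    return x, y
```

Two 45-degree sightings over a baseline of 100 place the pointer at the midpoint (50, 50), as expected by symmetry.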
- Referring to FIGS. 2a and 6a-6c, when a user contacts the plate surface 100s with a pointer, e.g. a finger 81, the image sensor 13 captures the shadow I81 associated with the tip of the finger 81 to generate an image window 20, wherein a width of the shadow I81 in the image window 20 is assumed to be L.
- After the processing unit 14 recognizes a contact event, it activates the touch system 10 and controls the touch system 10 to enter a first mode.
- the processing unit 14 calculates two dimensional coordinates of the finger 81 contacting the plate surface 100 s according to the position of the shadow I 81 in the image window 20 , and controls the motion of a cursor shown on the image display 15 according to a variation of the two dimensional coordinates as shown in FIG. 6 b.
- the user may directly contact a position upon an object O with his/her finger so as to activate the touch system 10 as shown in FIG. 6 c .
- the processing unit 14 also calculates a two dimensional coordinate of the finger 81 relative to the plate surface 100 s according to the position of the shadow I 81 in the image window 20 .
- When the processing unit 14 recognizes that the width variation of the shadow exceeds a threshold, e.g. when the ratio L′/L or the absolute difference |L′−L| exceeds the threshold, it controls the touch system 10 to enter a second mode.
- the area variation of the shadow may be obtained according to the absolute value of a difference or the percentage of the contact areas of two contact states. That is, the threshold may be a variation percentage or a variation of the width or the area of the shadow associated with the pointer.
- In the second mode, the processing unit 14 also calculates a two dimensional coordinate of the finger 81 relative to the plate surface 100s according to a position of the shadow I81 in the image window 20, and then compares a variation of the two dimensional coordinates with gesture data pre-stored in the processing unit 14 to perform gesture recognition. That is, in the second mode the coordinate variation obtained by the processing unit 14 is not used to control the motion of the cursor 151; it is used to recognize the gesture a user performs so as to execute predetermined operations, e.g. object select, screen scroll, object drag, object zoom in/out and object rotate, but the present invention is not limited to these operations.
- the object mentioned herein may be an icon or a window.
- To switch between modes, a user may change the width or the area of the shadow I81 for a predetermined period of time, for example, but not limited to, one second, wherein during mode switching the finger 81 may be steady or moving on the plate surface 100s.
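The mode switch described above amounts to watching the shadow width over successive image windows and requiring the change (larger or smaller) to persist for the predetermined period. A minimal sketch under assumed names and a frame-count interpretation of the period:

```python
def detect_mode_switch(widths, threshold_ratio=1.5, hold_frames=30):
    """Detect a contact-state change held for a predetermined period.

    widths          -- shadow widths from successive image windows; the
                       first entry is taken as the baseline width L
    threshold_ratio -- ratio (L'/L or L/L') counted as a state change,
                       covering both widening and narrowing shadows
    hold_frames     -- consecutive frames the change must persist
                       (e.g. ~30 frames for one second at 30 fps)
    """
    if not widths:
        return False
    base = widths[0]
    run = 0
    for w in widths[1:]:
        if w / base >= threshold_ratio or base / w >= threshold_ratio:
            run += 1
            if run >= hold_frames:
                return True
        else:
            run = 0   # change not sustained: start counting again
    return False
```

A brief flicker in shadow width is thus ignored, while a press held for the full period triggers the second mode.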
- Referring to FIGS. 6a-8c, relationships between gestures performed by a user and operation functions will be illustrated hereinafter. It should be understood that the relationships between gestures and operation functions described below are exemplary and not a limitation of the present invention.
- When the plate 100 is a white board, a user first contacts the plate surface 100s with a pointer to activate the touch system 10 and controls the touch system 10 to enter a first mode. Then, the user moves a cursor 151 onto an object O to be selected by changing the relative position of the finger on the plate surface 100s as shown in FIG. 6b. Next, the user changes a contact state of the finger 81 on the plate surface 100s, as shown in FIG. 7a, so as to control the touch system 10 to enter a second mode. At this moment, the object may be shown with a characteristic change as shown in FIG. 7b, e.g. a color change or a line width change, representing that the object is selected.
- the user contacts the plate surface 100 s upon the object O to activate the touch system 10 as shown in FIG. 6 c . Then, the user changes a contact state of the finger 81 on the plate surface 100 s so as to have the touch system 10 enter a second mode to select the object O′ as shown in FIG. 7 c.
- a user first contacts the plate surface 100 s with his/her finger to activate the touch system 10 and to control the touch system 10 to enter a first mode as shown in FIG. 6 a or 7 a . Then, the user changes a contact state of the finger 81 on the plate surface 100 s , e.g. from the state shown in FIG. 6 a to FIG. 7 a or from the state shown in FIG. 7 a to FIG. 6 a , for a predetermined period of time to have the touch system 10 enter a second mode.
- When the processing unit 14 detects that the finger 81 moves upward, downward, leftward or rightward with respect to the plate surface 100s as shown in FIG. 8a, it recognizes that the user is performing a scroll gesture.
- the processing unit 14 controls the image display 15 to update its screen 150 to present corresponding pictures.
- A user first contacts the plate surface 100s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user moves a cursor 151 onto an object O to be selected by changing the relative position of the finger 81 and the plate surface 100s. Next, the user changes a contact state of the finger 81 on the plate surface 100s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 moves upward, downward, leftward or rightward with respect to the plate surface 100s as shown in FIG. 8a, it recognizes that the user is performing a drag gesture. The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures.
- A user first contacts the plate surface 100s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user moves a cursor 151 onto an object O to be selected by changing the relative position of the finger 81 and the plate surface 100s. Next, the user changes a contact state of the finger 81 on the plate surface 100s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 moves diagonally with respect to the plate surface 100s as shown in FIG. 8b, it recognizes that the user is performing a zoom gesture (zoom in or zoom out). The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures.
- A user first contacts the plate surface 100s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user moves a cursor 151 onto an object O to be selected by changing the relative position of the finger 81 and the plate surface 100s. Next, the user changes a contact state of the finger 81 on the plate surface 100s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 rotates with respect to the plate surface 100s as shown in FIG. 8c, it recognizes that the user is performing a rotate gesture. The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures.
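The gesture distinctions above (axis-aligned motion for scroll/drag as in FIG. 8a, diagonal motion for zoom as in FIG. 8b, rotation as in FIG. 8c) can be sketched as a classifier over the pointer's motion in the second mode. The function name, the angle tolerances, and the assumption that a rotation measure is available are all illustrative, not taken from the patent:

```python
import math

def classify_gesture(dx, dy, dtheta, move_eps=5.0, rot_eps=0.2):
    """Map a single pointer's second-mode motion to a gesture label.

    dx, dy -- displacement of the contact point on the plate surface
    dtheta -- accumulated rotation of the pointer (radians), if tracked
    """
    if abs(dtheta) >= rot_eps:
        return "rotate"
    if math.hypot(dx, dy) < move_eps:
        return "none"                    # motion too small to classify
    # fold the motion direction into [0, 90) degrees from the nearest axis
    angle = abs(math.degrees(math.atan2(dy, dx))) % 90
    if angle < 20 or angle > 70:
        return "scroll_or_drag"          # roughly axis-aligned (FIG. 8a)
    return "zoom"                        # roughly diagonal (FIG. 8b)
```

Whether "scroll_or_drag" resolves to a scroll or a drag would depend on whether an object was selected when the second mode was entered, per the scenarios above.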
- a conventional touch system may not be able to correctly calculate the coordinates of contact points of a plurality of pointers that block each other.
- In contrast, the present invention provides a touch system that can operate in two modes by using a single pointer (FIGS. 2a, 3 and 5a).
- The present invention may switch between the two operation modes simply by changing a contact state of a pointer on the plate surface, and the touch system of the present invention has a lower system cost.
Abstract
A gesture recognition method for a touch system includes the steps of: capturing images looking across a plate surface with at least one image sensor; processing the images to determine a contact state variation of a single pointer on the plate surface; and recognizing whether a relative variation between the single pointer and the plate surface matches a predetermined gesture when the contact state variation is larger than a threshold. The present invention further provides a touch system.
Description
- This application claims the priority benefit of Taiwan Patent Application Serial Number 098124545, filed on Jul. 21, 2009, the full disclosure of which is incorporated herein by reference.
- 1. Field of the Invention
- This invention generally relates to a touch system and, more particularly, to a gesture recognition method and a touch system incorporating the same.
- 2. Description of the Related Art
- Please refer to FIGS. 1 a and 1 b, which show operational schematic diagrams of a conventional touch system 9. The touch system 9 includes a touch plate 90 and at least two cameras 91 and 92 whose fields of view encompass the whole touch plate 90 for capturing images looking across a surface of the touch plate 90. When a user 8 contacts the touch plate 90 with one finger 81, the cameras 91 and 92 respectively capture image windows containing a shadow I81 associated with the tip of the finger 81. A processing unit calculates two dimensional coordinates of the finger 81 contacting the touch plate 90 according to one dimensional positions of the shadow I81 associated with the tip of the finger 81. In this manner, the position and displacement of the finger 81 relative to the touch plate 90 can be obtained, and the processing unit may accordingly control a display to execute corresponding operations according to the variation of the two dimensional coordinates of the finger 81.
- When the user 8 contacts the touch plate 90 with two fingers 81 and 82, the cameras 91 and 92 respectively capture image windows W91′ and W92′ containing shadows I81 and I82 associated with the tips of the fingers 81 and 82. The processing unit calculates two dimensional coordinates of the fingers 81 and 82 contacting the touch plate 90 according to one dimensional positions of the shadows I81 and I82 contained in the image windows W91′ and W92′, and recognizes the gesture according to a variation of the coordinates of the two fingers 81 and 82.
- However, because the operating method of the touch system 9 is to calculate the two dimensional coordinates of a finger contacting the touch plate 90 according to the one dimensional position of the shadow associated with the finger tip in each image window, when a user touches the touch plate 90 with a plurality of fingers, e.g. the fingers 81 and 82, and the finger 82 blocks the finger 81 with respect to the camera 92 as shown in FIG. 1 b, the image window W92′ captured by the camera 92 may not contain the shadows of all fingers. Accordingly, it is not possible to correctly calculate the two dimensional coordinates of every finger in some circumstances. Although this problem can be solved by installing additional cameras, the system cost will be increased at the same time.
- Accordingly, the present invention further provides a gesture recognition method and a touch system incorporating the same so as to solve the problems existing in the above mentioned conventional touch system.
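As a rough illustration (not taken from the patent itself), the conventional two-camera scheme amounts to intersecting two sensing rays, each defined by a camera position and the angle recovered from a one dimensional shadow position; the occlusion problem arises because a blocked finger contributes no ray. The function name and geometry conventions below are assumptions for illustration only.

```python
import math

def ray_intersection(cam1, ang1, cam2, ang2):
    """Intersect two sensing rays; cam1/cam2 are camera positions and
    ang1/ang2 the ray angles (radians) derived from one dimensional
    shadow positions. Returns the 2D contact point, or None when the
    rays are parallel and the position cannot be resolved."""
    d1 = (math.cos(ang1), math.sin(ang1))
    d2 = (math.cos(ang2), math.sin(ang2))
    det = d1[1] * d2[0] - d1[0] * d2[1]
    if abs(det) < 1e-9:
        return None
    bx, by = cam2[0] - cam1[0], cam2[1] - cam1[1]
    t = (by * d2[0] - bx * d2[1]) / det
    return (cam1[0] + t * d1[0], cam1[1] + t * d1[1])
```

For example, cameras at (0, 0) and (4, 0) seeing a fingertip at 45° and 135° respectively intersect at (2, 2).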
- The present invention provides a gesture recognition method and a touch system incorporating the same that may perform mode switching according to a contact state variation of a single finger on a plate.
- The present invention provides a gesture recognition method for a touch system including the steps of: capturing images looking across a plate surface with at least one image sensor; processing the images to determine a contact state variation of a single pointer on the plate surface; and recognizing whether a relative variation between the single pointer and the plate surface matches a predetermined gesture when the contact state variation is larger than a threshold.
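To make the claimed steps concrete, the contact state of the single pointer can be summarized by the width of its shadow in the one dimensional image window, with the variation compared against a threshold; the helper names, the dark-pixel level and the ratio threshold below are illustrative assumptions, not values from the patent.

```python
def shadow_width(image_row, dark_level=128):
    """Count pixels darker than dark_level in a one dimensional image
    window; this run of blocked pixels is the shadow of the pointer tip."""
    return sum(1 for px in image_row if px < dark_level)

def contact_state_changed(base_width, cur_width, threshold=1.5):
    """True when the width variation, expressed as a ratio so that it
    catches the shadow becoming either larger or smaller, exceeds the
    threshold."""
    if base_width == 0 or cur_width == 0:
        return False
    ratio = max(base_width, cur_width) / min(base_width, cur_width)
    return ratio > threshold
```

An absolute difference |L′−L| could be used in place of the ratio, mirroring the two threshold forms described later in the specification.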
- The present invention further provides a gesture recognition method for a touch system including the steps of: capturing images looking across a plate surface with at least one image sensor; processing the images to detect a contact point of a single pointer on the plate surface; and recognizing whether a contact of the single pointer on the plate surface matches a predetermined gesture according to a state variation and a position change of the contact point.
- The present invention further provides a touch system including a plate, at least one light source, at least one image sensor and a processing unit. The plate has a plate surface. The light source is disposed on the plate surface. The image sensor captures image windows, looking across the plate surface, containing a shadow of a single pointer blocking the light source. The processing unit recognizes whether a width variation or an area variation of the shadow in the image windows is larger than a threshold and recognizes whether a position change of the single pointer on the plate surface matches a predetermined gesture when the width variation or the area variation is larger than the threshold.
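The processing unit's behavior can be sketched as a small state machine: no mode until a contact activates the system, the first mode on contact, and a switch to the second mode once the shadow-width variation exceeds the threshold. The class name, method name and threshold are hypothetical, intended only to illustrate the described control flow.

```python
class ProcessingUnit:
    """Minimal sketch of the described mode switching (illustrative API)."""

    def __init__(self, threshold=1.5):
        self.threshold = threshold
        self.mode = None          # None until a contact activates the system
        self.base_width = None

    def on_frame(self, width):
        """Consume the shadow width measured in one image window and
        return the current mode."""
        if width == 0:            # no shadow: pointer lifted, deactivate
            self.mode, self.base_width = None, None
            return self.mode
        if self.mode is None:     # first contact activates the first mode
            self.mode, self.base_width = "first", width
            return self.mode
        ratio = max(width, self.base_width) / min(width, self.base_width)
        if ratio > self.threshold:
            self.mode = "second"  # contact state variation exceeds threshold
        return self.mode
```

Raising the threshold makes the switch less sensitive, matching the dynamically adjustable sensitivity described below.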
- According to the gesture recognition method of the present invention and a touch system incorporating the same, in the first mode the touch system may control the motion of a cursor according to a coordinate variation (or a position change) of a pointer; in the second mode the touch system may update pictures presented on an image display according to the coordinate variation (or the position change) of the pointer, e.g. updating the pictures to present object select, screen scroll, object drag, object zoom in/out or object rotate, wherein the object may be an icon or a window.
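In the second mode, the coordinate variation drives gesture recognition rather than the cursor. A hedged sketch of such a classifier is shown below; the gesture labels follow FIGS. 8 a-8 c, while the function name and the motion/rotation thresholds are assumptions for illustration.

```python
def classify_gesture(dx, dy, dtheta, move_eps=5.0, rot_eps=0.2):
    """Map a second-mode coordinate variation (displacement dx, dy and
    angular change dtheta, all illustrative units) onto one of the
    predetermined gestures."""
    if abs(dtheta) > rot_eps:
        return "rotate"            # rotation of the pointer (FIG. 8c)
    if abs(dx) > move_eps and abs(dy) > move_eps:
        return "zoom"              # diagonal motion (FIG. 8b)
    if abs(dx) > move_eps or abs(dy) > move_eps:
        return "scroll_or_drag"    # up/down/left/right motion (FIG. 8a)
    return "none"
```

Whether an axis-aligned motion means scroll or drag depends on whether an object is currently selected, as the embodiments below explain.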
- Since the gesture recognition method of the present invention and a touch system incorporating the same may perform gesture recognition according to a single pointer, miscalculation of the coordinates of a plurality of pointers caused by the pointers blocking each other can be avoided.
- Other objects, advantages, and novel features of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
-
FIG. 1 a shows an operational schematic diagram of a conventional touch system. -
FIG. 1 b shows another operational schematic diagram of the touch system shown in FIG. 1 a. -
FIG. 2 a shows a block diagram of the touch system in accordance with an embodiment of the present invention. -
FIG. 2 b shows a schematic diagram of a partial field of view of the image sensor shown in FIG. 2 a and an image window captured thereby. -
FIG. 3 shows an upper view of the touch system according to the first embodiment of the present invention. -
FIG. 4 a shows an operational schematic diagram of the touch system according to the first embodiment of the present invention. -
FIG. 4 b shows a schematic diagram of an image window captured by the image sensor shown in FIG. 4 a. -
FIG. 5 a shows a block diagram of the touch system according to the second embodiment of the present invention. -
FIG. 5 b shows a schematic diagram of image windows respectively captured by the two image sensors shown in FIG. 5 a. -
FIGS. 6 a-6 c show operational schematic diagrams of the first mode of the touch system according to the embodiments of the present invention. -
FIGS. 7 a-7 c show operational schematic diagrams of the second mode of the touch system according to the embodiments of the present invention. -
FIGS. 8 a-8 c show schematic diagrams of different gestures corresponding to the touch system according to the embodiments of the present invention. - It should be noticed that, wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
- Please refer to
FIGS. 2 a and 2 b. FIG. 2 a shows a block diagram of the touch system 10 in accordance with an embodiment of the present invention, and FIG. 2 b shows a schematic diagram of a partial field of view of the image sensor 13 and an image window 20 captured by the image sensor 13 shown in FIG. 2 a. The touch system 10 includes a plate 100, an illumination unit 11, a first light source 121, a second light source 122, an image sensor 13, a processing unit 14 and an image display 15. - The
plate 100 includes a first side 100 a, a second side 100 b, a third side 100 c, a fourth side 100 d and a plate surface 100 s. Embodiments of the plate 100 include a white board and a touch screen. The plate surface 100 s serves as the input area of the touch system 10. - In this embodiment, the
illumination unit 11 is disposed at the first side 100 a on the plate surface 100 s. The illumination unit 11 may be an active light source or a passive light source. When the illumination unit 11 is an active light source, it is preferably a linear light source. When the illumination unit 11 is a passive light source, it is configured to reflect the light from other light sources, e.g. the first light source 121 and the second light source 122, and the illumination unit 11 further includes a reflecting surface 11 a facing the third side 100 c of the plate 100, wherein the reflecting surface 11 a may be made of proper materials. The first light source 121 is disposed at the second side 100 b on the plate surface 100 s and preferably illuminates toward the fourth side 100 d of the plate 100. The second light source 122 is disposed at the third side 100 c on the plate surface 100 s and preferably illuminates toward the first side 100 a of the plate 100, wherein the first light source 121 and the second light source 122 are preferably active light sources, for example, but not limited to, linear light sources. - The
image sensor 13 is preferably disposed at one corner of the plate 100, for example at the corner intersected by the second light source 122 and the fourth side 100 d of the plate 100 in this embodiment, and the illumination unit 11 is disposed at a side, which is not adjacent to the image sensor 13, on the plate surface 100 s. The image sensor 13 captures images looking across the plate surface 100 s and encompassing a space defined by the illumination unit 11, the first light source 121, the second light source 122 and the fourth side 100 d of the plate 100. When a pointer, e.g. a finger 81, contacts the plate surface 100 s, the tip of the finger 81 appears in a field of view of the image sensor 13 as shown in the upper part of FIG. 2 b, wherein "BA" refers to a high intensity area whose height is generally determined by the size of the illumination unit 11 and the light sources 121, 122. The image sensor 13 may successively capture image windows 20 containing a shadow I81 that is formed by the tip of the finger 81 blocking the illumination unit 11 or the light source 121, as shown in the lower part of FIG. 2 b. Embodiments of the image sensor 13 include, but are not limited to, a CCD image sensor and a CMOS image sensor. It is appreciated that the pointer may be replaced by another proper object and is not limited to a finger. - The
processing unit 14 is coupled to the image sensor 13 and configured to process the images captured by the image sensor 13 so as to recognize a width variation or an area variation of a shadow associated with a finger and to accordingly control the touch system 10 to operate in a first mode or a second mode. When the processing unit 14 recognizes that a pointer contacts the plate surface 100 s, it activates the touch system 10 to operate in the first mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100 s according to the position of the shadow associated with the pointer in the image window 20, and controls the motion of a cursor shown on the image display 15 according to a variation of the two dimensional coordinates obtained from successive image windows; wherein the two dimensional coordinates of the plate surface 100 s may correspond to the position coordinates of a display screen 150 of the image display 15. - When the
processing unit 14 recognizes that the width variation or the area variation, which may become larger or smaller, of the shadow associated with the pointer exceeds a threshold, it controls the touch system 10 to operate in the second mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100 s according to the position of the shadow associated with the pointer in the image window 20, performs gesture recognition according to a variation of the two dimensional coordinates between successive image windows, and controls the update of pictures presented on the image display 15 according to the recognized gesture, e.g. controlling the image display to present object select, screen scroll, object drag, object zoom in/out or object rotate; details thereof will be illustrated hereinafter. In addition, in the present invention the sensitivity of switching between the first mode and the second mode may be adjusted by dynamically adjusting the threshold, wherein a larger threshold corresponds to a lower sensitivity whereas a smaller threshold corresponds to a higher sensitivity. - In
FIG. 2 a, to clearly show the touch system 10 of the present invention, the plate 100 is separated from the image display 15, but this is not a limitation of the present invention. In another embodiment, the plate 100 may be integrated on the screen 150 of the image display 15. In addition, when the plate 100 is a touch screen, the screen 150 of the image display 15 may also serve as the plate 100, and the illumination unit 11, the first light source 121, the second light source 122 and the image sensor 13 are disposed on the surface of the screen 150. - It is appreciated that although the
plate 100 is shown as a rectangle and the illumination unit 11, the first light source 121 and the second light source 122 are perpendicularly disposed at three sides of the plate 100 in FIG. 2 a, they are only exemplary and not a limitation of the present invention. In another embodiment, the plate 100 may be formed in other shapes; the illumination unit 11, the first light source 121, the second light source 122 and the image sensor 13 may be disposed in other spatial relationships on the plate surface 100 s. - Please refer to
FIG. 3 , which shows an upper view of the touch system 10 according to the first embodiment of the present invention. In this embodiment, the illumination unit 11 is a passive light source (e.g. a reflecting component) and has a reflecting surface 11 a facing the third side 100 c of the plate 100. Accordingly, the first light source 121 may map a second mirror image 121′ relative to the reflecting surface 11 a; the second light source 122 may map a third mirror image 122′ relative to the reflecting surface 11 a; and the fourth side 100 d of the plate 100 may map a fourth mirror image 100 d′ relative to the reflecting surface 11 a; wherein the illumination unit 11, the first light source 121, the second light source 122 and the fourth side 100 d of the plate 100 together define a real space RS, and the illumination unit 11, the second mirror image 121′, the third mirror image 122′ and the fourth mirror image 100 d′ together define a virtual space IS. - The
image sensor 13 is disposed at the corner intersected by the second light source 122 and the fourth side 100 d of the plate 100. A field of view VA of the image sensor 13 looks across the plate surface 100 s and encompasses the real space RS and the virtual space IS, and the image sensor 13 is configured to capture image windows containing a shadow associated with a pointer, e.g. a finger 81, inside the real space RS, wherein the shadow is formed by the pointer blocking the light source 121 and the illumination unit 11. In an embodiment, the image sensor 13 further includes a lens (or lens set) for adjusting the field of view VA of the image sensor 13 to allow the image sensor 13 to capture a complete image encompassing the real space RS and the virtual space IS. - Please refer to
FIGS. 4 a and 4 b. FIG. 4 a shows an operational schematic diagram of the touch system 10 according to the first embodiment of the present invention, and FIG. 4 b shows a schematic diagram of an image window 20 captured by the image sensor 13 shown in FIG. 4 a. As shown in FIG. 4 a, when a pointer, e.g. a finger 81, contacts the plate surface 100 s inside the real space RS, shown by a contact point T81, the pointer maps a first mirror image in the virtual space IS, shown by a contact point T81′ herein, relative to the reflecting surface 11 a of the illumination unit 11 (i.e. a reflecting component in this embodiment). The image sensor 13 captures an image of the tip of the pointer through a first sensing route R81 such that a shadow I81 exists in the image window 20; it also captures an image of the first mirror image through a second sensing route R81′ such that a shadow I81′ exists in the image window 20, as shown in FIG. 4 b. In this embodiment, relative relationships between a one dimensional position of a shadow in the image window 20 and an angle between a sensing route and the third side 100 c of the plate 100 are pre-stored in the processing unit 14. In this manner, when the image sensor 13 captures images of the pointer and the first mirror image thereof to generate the image window 20, the processing unit 14 may respectively obtain a first angle A81 and a second angle A81′ according to the one dimensional positions of the shadows I81, I81′ in the image window 20. Next, by using triangulation, the processing unit 14 may obtain a two dimensional coordinate of the contact point T81 at which the pointer contacts the plate surface 100 s. - For example, in an aspect, the
plate surface 100 s forms a Cartesian coordinate system, wherein the third side 100 c serves as an X-axis, the fourth side 100 d serves as a Y-axis and the location of the image sensor 13 serves as the origin of the Cartesian coordinate system. Therefore, the coordinate of a contact point T81 in the Cartesian coordinate system may be represented as (a distance to the fourth side 100 d, a distance to the third side 100 c). In addition, the distance D1 between the first side 100 a and the third side 100 c of the plate 100 may be pre-stored in the processing unit 14. In this manner, the processing unit 14 may obtain the two dimensional coordinate of the contact point T81 of the pointer 81 according to the following steps: (a) the processing unit 14 calculates a first angle A81 between the first sensing route R81 and the third side 100 c of the plate 100, and a second angle A81′ between the second sensing route R81′ and the third side 100 c of the plate 100; (b) the processing unit 14 calculates a distance D2 between the contact point T81 of the pointer 81 and the fourth side 100 d of the plate 100 according to the equation D2=2D1/(tan A81+tan A81′); (c) the processing unit 14 calculates a y-coordinate of the contact point T81 according to the equation D2×tan A81. Therefore, the two dimensional coordinate of the contact point T81 may be represented as (D2, D2×tan A81). - Please refer to
FIGS. 5 a and 5 b. FIG. 5 a shows a block diagram of the touch system 10′ according to the second embodiment of the present invention, and FIG. 5 b shows a schematic diagram of image windows captured by the two image sensors 13 and 13′ shown in FIG. 5 a. The differences between this embodiment and the first embodiment are that the illumination unit 11′ herein is an active light source and that the touch system 10′ includes two image sensors 13 and 13′. - In the second embodiment, the
touch system 10′ includes a plate 100, an illumination unit 11′, a first light source 121, a second light source 122, two image sensors 13 and 13′, and a processing unit 14. The illumination unit 11′ is disposed at the first side 100 a on the plate surface 100 s and preferably illuminates toward the third side 100 c of the plate 100. The first light source 121 is disposed at the second side 100 b on the plate surface 100 s and preferably illuminates toward the fourth side 100 d of the plate 100. The second light source 122 is disposed at the fourth side 100 d on the plate surface 100 s and preferably illuminates toward the second side 100 b of the plate 100. The image sensor 13 is disposed at the intersection of the third side 100 c and the fourth side 100 d of the plate 100 and its field of view looks across the plate surface 100 s. The image sensor 13′ is disposed at the intersection of the second side 100 b and the third side 100 c of the plate 100 and its field of view looks across the plate surface 100 s. When a pointer, e.g. a finger 81, contacts the plate surface 100 s, the image sensor 13 captures an image window W13 containing a shadow I81 associated with the tip of the finger 81 and the image sensor 13′ captures an image window W13′ containing a shadow I81′ associated with the tip of the finger 81. It is appreciated that the touch system 10′ may also include an image display (not shown) coupled to the processing unit 14. - The
processing unit 14 is coupled to the image sensors 13 and 13′ and configured to process the image windows W13 and W13′ captured by the image sensors 13 and 13′ so as to recognize a width variation or an area variation of the shadows associated with a finger and to accordingly control the touch system 10′ to operate in a first mode or a second mode. When the processing unit 14 recognizes that a pointer contacts the plate surface 100 s, it activates the touch system 10′ to operate in the first mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100 s according to the positions of the shadows I81, I81′ associated with the pointer in the image windows W13 and W13′, and controls the motion of a cursor shown on an image display according to a variation of the two dimensional coordinates obtained from successive image windows W13 and W13′. When the processing unit 14 recognizes that the width variation or the area variation of the shadows I81, I81′ associated with the pointer exceeds a threshold, it controls the touch system 10′ to operate in the second mode; at this moment, the processing unit 14 calculates a two dimensional coordinate of the pointer contacting the plate surface 100 s according to the positions of the shadows associated with the pointer in the image windows W13 and W13′, performs gesture recognition according to a variation of the two dimensional coordinates obtained from successive image windows W13 and W13′, and controls the update of pictures presented on an image display according to the recognized gesture, e.g. controlling the image display to present object select, screen scroll, object zoom in/out, object drag or object rotate. The calculation of the two dimensional coordinates may also be performed through triangulation; details of the calculation are similar to those illustrated in the first embodiment and thus will not be repeated again. - Details of the operating method of the touch system according to the embodiments of the present invention will be illustrated hereinafter. It should be noted that the gesture recognition method described below may be adapted to the
touch systems 10 and 10′ mentioned above. - Please refer to
FIGS. 2 a and 6 a-6 c. When a user contacts the plate surface 100 s with a pointer, e.g. a finger 81, the image sensor 13 captures the shadow I81 associated with the tip of the finger 81 to generate an image window 20, wherein a width of the shadow I81 in the image window 20 is assumed to be L. After the processing unit 14 recognizes a contact event, it activates the touch system 10 and controls the touch system 10 to enter a first mode. In the first mode, the processing unit 14 calculates two dimensional coordinates of the finger 81 contacting the plate surface 100 s according to the position of the shadow I81 in the image window 20, and controls the motion of a cursor shown on the image display 15 according to a variation of the two dimensional coordinates, as shown in FIG. 6 b. - When the
plate 100 is a touch screen, the user may directly contact a position upon an object O with his/her finger so as to activate the touch system 10, as shown in FIG. 6 c. The processing unit 14 also calculates a two dimensional coordinate of the finger 81 relative to the plate surface 100 s according to the position of the shadow I81 in the image window 20. - Please refer to
FIGS. 2 a and 7 a-7 c. When the user changes a contact state, e.g. a contact area, of the finger 81 on the plate surface 100 s, the width and the area of the shadow I81 in the image window 20 also change. For example, in FIG. 7 a the width of the shadow I81 in the image window 20 captured by the image sensor 13 is changed to L′. When the processing unit 14 recognizes that the width variation of the shadow exceeds a threshold, e.g. L′/L or |L′−L| exceeds a predetermined threshold, it controls the touch system 10 to enter a second mode. Similarly, the area variation of the shadow may be obtained according to the absolute value of a difference between, or the percentage of, the contact areas of two contact states. That is, the threshold may be a variation percentage or a variation of the width or the area of the shadow associated with the pointer. - In the second mode, the
processing unit 14 also calculates a two dimensional coordinate of the finger 81 relative to the plate surface 100 s according to a position of the shadow I81 in the image window 20, and then compares a variation of the two dimensional coordinates with gesture data pre-stored in the processing unit 14 to perform gesture recognition. That is, in the second mode the coordinate variation obtained by the processing unit 14 is not used to control the motion of the cursor 151; it is used to recognize the gesture the user performs so as to execute predetermined operations, e.g. object select, screen scroll, object drag, object zoom in/out and object rotate, but the present invention is not limited to these operations. In the present invention, the object mentioned herein may be an icon or a window. - In the present invention, if it is desired to switch the
touch system 10 between the first mode and the second mode, a user may change the width or the area of the shadow I81 for a predetermined period of time, for example, but not limited to, one second, wherein during mode switching the finger 81 may be steady or moving on the plate surface 100 s. - Please refer to
FIGS. 6 a-8 c; relationships between gestures performed by a user and operation functions will be illustrated hereinafter. It should be understood that the relationships between gestures and operation functions described below are exemplary and not a limitation of the present invention. - When the
plate 100 is a white board, a user first contacts the plate surface 100 s with a pointer to activate the touch system 10 and controls the touch system 10 to enter a first mode. Then, the user controls a cursor 151 to move upon an object O to be selected by changing a relative position of the finger on the plate surface 100 s, as shown in FIG. 6 b. Next, the user changes a contact state of the finger 81 on the plate surface 100 s, as shown in FIG. 7 a, so as to control the touch system 10 to enter a second mode. At this moment, the object may be shown with a characteristic change, as shown in FIG. 7 b, e.g. a color change or a line width change, representing that the object is selected. - When the
plate 100 is a touch screen, the user contacts the plate surface 100 s upon the object O to activate the touch system 10, as shown in FIG. 6 c. Then, the user changes a contact state of the finger 81 on the plate surface 100 s so as to have the touch system 10 enter a second mode to select the object O′, as shown in FIG. 7 c. - A user first contacts the
plate surface 100 s with his/her finger to activate the touch system 10 and to control the touch system 10 to enter a first mode, as shown in FIG. 6 a or 7 a. Then, the user changes a contact state of the finger 81 on the plate surface 100 s, e.g. from the state shown in FIG. 6 a to that shown in FIG. 7 a or from the state shown in FIG. 7 a to that shown in FIG. 6 a, for a predetermined period of time to have the touch system 10 enter a second mode. Next, when the processing unit 14 detects that the finger 81 moves upward, downward, leftward or rightward with respect to the plate surface 100 s, as shown in FIG. 8 a, it recognizes that the user is performing a scroll gesture. The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures. - A user first contacts the
plate surface 100 s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user controls a cursor 151 to move upon an object O to be selected by changing the relative position of the finger 81 and the plate surface 100 s. Next, the user changes a contact state of the finger 81 on the plate surface 100 s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 moves upward, downward, leftward or rightward with respect to the plate surface 100 s, as shown in FIG. 8 a, it recognizes that the user is performing a drag gesture. The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures. - A user first contacts the
plate surface 100 s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user controls a cursor 151 to move upon an object O to be selected by changing the relative position of the finger 81 and the plate surface 100 s. Next, the user changes a contact state of the finger 81 on the plate surface 100 s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 moves diagonally with respect to the plate surface 100 s, as shown in FIG. 8 b, it recognizes that the user is performing a zoom gesture (zoom in or zoom out). The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures. - A user first contacts the
plate surface 100 s with his/her finger 81 to activate the touch system 10 and to control the touch system 10 to enter a first mode. Then the user controls a cursor 151 to move upon an object O to be selected by changing the relative position of the finger 81 and the plate surface 100 s. Next, the user changes a contact state of the finger 81 on the plate surface 100 s so as to enter a second mode, and at this moment the object O′ may be shown to be selected. Next, when the processing unit 14 detects that the finger 81 rotates with respect to the plate surface 100 s, as shown in FIG. 8 c, it recognizes that the user is performing a rotate gesture. The processing unit 14 then controls the image display 15 to update its screen 150 to present corresponding pictures. - As mentioned above, a conventional touch system may not be able to correctly calculate the coordinates of contact points of a plurality of pointers that block each other. The present invention further provides a touch system that may perform two operation modes by using a single pointer (
FIGS. 2 a, 3 and 5 a). The present invention may switch between two operation modes simply by changing a contact state of a pointer on the plate surface, and the touch system of the present invention has lower system cost. - Although the invention has been explained in relation to its preferred embodiment, this description is not intended to limit the invention. It is to be understood that many other possible modifications and variations can be made by those skilled in the art without departing from the spirit and scope of the invention as hereinafter claimed.
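The coordinate computation of the first embodiment (steps (a)-(c) in the description above) can be checked numerically. The helper below is an illustrative sketch, not part of the patent; it directly applies the stated equations.

```python
import math

def contact_point(D1, A81, A81p):
    """Apply the first-embodiment equations: D2 = 2*D1/(tan A81 + tan A81')
    gives the distance to the fourth side (x-coordinate), and D2*tan A81
    the distance to the third side (y-coordinate). Angles are in radians;
    D1 is the distance between the first side (mirror) and the third side."""
    D2 = 2 * D1 / (math.tan(A81) + math.tan(A81p))
    return (D2, D2 * math.tan(A81))
```

For instance, with D1 = 4 and a contact point at (2, 1), the direct sensing route subtends atan(1/2) with the third side while the mirror route subtends atan((2·4 − 1)/2), and the equations recover (2, 1).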
Claims (20)
1. A gesture recognition method for a touch system, comprising the steps of:
capturing images looking across a plate surface with at least one image sensor;
processing the images to determine a contact state variation of a single pointer on the plate surface; and
recognizing whether a relative variation between the single pointer and the plate surface matches a predetermined gesture when the contact state variation is larger than a threshold.
2. The gesture recognition method as claimed in claim 1, wherein the contact state variation is determined according to a width variation or an area variation of a shadow associated with the single pointer in the images.
3. The gesture recognition method as claimed in claim 1, wherein when the contact state variation remains larger than the threshold for a predetermined period of time, the process of recognizing whether a relative variation between the single pointer and the plate surface matches a predetermined gesture is performed.
4. The gesture recognition method as claimed in claim 1, wherein the predetermined gesture is a scroll gesture, a drag gesture, a zoom gesture or a rotate gesture.
5. The gesture recognition method as claimed in claim 1, further comprising the step of: activating the touch system when a contact between the single pointer and the plate surface is detected according to the captured images.
6. The gesture recognition method as claimed in claim 1, further comprising the step of: controlling the motion of a cursor shown on an image display according to the relative variation between the single pointer and the plate surface when the contact state variation is smaller than the threshold.
7. The gesture recognition method as claimed in claim 6, wherein the touch system is in a first mode when the contact state variation is larger than the threshold and in a second mode when the contact state variation is smaller than the threshold, and the gesture recognition method further comprises the step of: dynamically adjusting the threshold, thereby adjusting the sensitivity of mode switching.
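The two operation modes of claims 1 through 7 can be illustrated with a minimal sketch, assuming (per claim 2) that the contact state variation is taken as the change in shadow width between captured images. The class and method names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the mode switching in claims 1-7.
class TouchController:
    def __init__(self, threshold=5.0):
        self.threshold = threshold  # dynamically adjustable (claim 7)

    def set_threshold(self, value):
        """Adjusting the threshold tunes the sensitivity of mode switching."""
        self.threshold = value

    def mode(self, prev_width, curr_width):
        """First mode (gesture recognition) when the contact state variation
        exceeds the threshold; second mode (cursor control) otherwise."""
        variation = abs(curr_width - prev_width)
        return "gesture" if variation > self.threshold else "cursor"
```

Lowering the threshold makes the system switch into the gesture-recognition mode on smaller contact-state changes, which is the sensitivity adjustment claim 7 refers to.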
8. A gesture recognition method for a touch system, comprising the steps of:
capturing images looking across a plate surface with at least one image sensor;
processing the images to detect a contact point of a single pointer on the plate surface; and
recognizing whether a contact of the single pointer on the plate surface matches a predetermined gesture according to a state variation and a position change of the contact point.
9. The gesture recognition method as claimed in claim 8, further comprising the step of: calculating the position change of the contact point according to a position of a shadow associated with the single pointer in the images.
10. The gesture recognition method as claimed in claim 8, wherein the state variation is determined according to a width variation or an area variation of a shadow associated with the single pointer in the images.
11. The gesture recognition method as claimed in claim 10, wherein when the width variation or the area variation of the shadow is larger than a threshold, the process of recognizing whether the contact of the single pointer on the plate surface matches a predetermined gesture according to a position change of the contact point is performed.
12. The gesture recognition method as claimed in claim 8, wherein the predetermined gesture is a scroll gesture, a drag gesture, a zoom gesture or a rotate gesture.
13. The gesture recognition method as claimed in claim 8, further comprising: updating pictures shown on an image display according to the gesture recognized.
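Claims 8 through 13 match a contact to a predetermined gesture from the state variation and the position change of the contact point. The sketch below is a hypothetical illustration of that flow; the threshold value and the simple direction rule mapping displacement to a gesture name are assumptions, not taken from the patent.

```python
# Hypothetical sketch of claims 8-13: gesture matching from the shadow's
# width variation (claim 11) and the contact point's position change.
WIDTH_THRESHOLD = 4  # arbitrary illustrative threshold

def match_gesture(width_variation, dx, dy):
    """Return a predetermined gesture name, or None when the shadow-width
    variation does not exceed the threshold."""
    if width_variation <= WIDTH_THRESHOLD:
        return None
    # Toy rule: predominantly horizontal motion -> scroll, vertical -> drag.
    if abs(dx) >= abs(dy):
        return "scroll"
    return "drag"
```

When a gesture is returned, the pictures shown on the image display would be updated accordingly (claim 13).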
14. A touch system, comprising:
a plate, having a plate surface;
at least one light source, disposed on the plate surface;
at least one image sensor, looking across the plate surface and capturing image windows containing a shadow of a single pointer blocking the light source; and
a processing unit, recognizing whether a width variation or an area variation of the shadow in the image windows is larger than a threshold and recognizing whether a position change of the single pointer on the plate surface matches a predetermined gesture when the width variation or the area variation is larger than the threshold.
15. The touch system as claimed in claim 14, wherein the plate is a white board or a touch screen.
16. The touch system as claimed in claim 14, wherein the image sensor is disposed at a corner where two sides of the plate surface intersect, and the touch system further comprises a reflecting component disposed at a side of the plate surface not adjacent to the image sensor.
17. The touch system as claimed in claim 16, wherein the image sensor captures image windows containing two shadows of the single pointer blocking the light source and the reflecting component, and the processing unit calculates the position change of the single pointer on the plate surface according to positions of the two shadows in the image windows.
18. The touch system as claimed in claim 14, wherein the touch system comprises two image sensors each capturing image windows containing a shadow of the single pointer blocking the light source, and the processing unit calculates the position change of the single pointer on the plate surface according to positions of the shadows in the image windows.
19. The touch system as claimed in claim 14, further comprising an image display coupled to the processing unit, wherein the processing unit controls the image display to update the pictures presented thereon when recognizing that the position change of the single pointer on the plate surface matches a predetermined gesture.
20. The touch system as claimed in claim 14, wherein the predetermined gesture is a scroll gesture, a drag gesture, a zoom gesture or a rotate gesture.
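Claim 18 calculates the pointer's position from the shadow positions seen by two image sensors. A common way to realize this (a sketch under assumptions, not the patent's stated method) is to map each shadow's position in the 1-D image window to a viewing angle and triangulate: with sensors at two adjacent corners of the plate, the two angles fix the contact point.

```python
import math

# Hedged sketch of two-sensor triangulation (claim 18). Sensors are assumed
# at (0, 0) and (width, 0); alpha and beta are the angles (radians) between
# the top edge and each sensor's line of sight to the pointer's shadow.
def triangulate(alpha, beta, width):
    """Return the (x, y) contact point on the plate surface."""
    ta, tb = math.tan(alpha), math.tan(beta)
    x = width * tb / (ta + tb)
    y = width * ta * tb / (ta + tb)
    return x, y

def position_change(p0, p1):
    """Displacement between two frames, which the processing unit would
    compare against a predetermined gesture (scroll, drag, zoom, rotate)."""
    return (p1[0] - p0[0], p1[1] - p0[1])
```

For example, if both sensors see the shadow at 45 degrees on a plate 6 units wide, the contact point triangulates to (3, 3): each sensor's line of sight satisfies tan(angle) = y/x and y/(width - x) respectively, and solving the pair gives the formulas above.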
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW098124545 | 2009-07-21 | ||
TW098124545A TWI501121B (en) | 2009-07-21 | 2009-07-21 | Gesture recognition method and touch system incorporating the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110018822A1 true US20110018822A1 (en) | 2011-01-27 |
Family
ID=43496864
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/775,838 Abandoned US20110018822A1 (en) | 2009-07-21 | 2010-05-07 | Gesture recognition method and touch system incorporating the same |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110018822A1 (en) |
JP (1) | JP5657293B2 (en) |
TW (1) | TWI501121B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103064548A (en) * | 2011-10-24 | 2013-04-24 | 联咏科技股份有限公司 | Gesture judgment method capable of filtering mistouched panel out |
TWI479393B (en) * | 2012-11-21 | 2015-04-01 | Wistron Corp | Switching methods, optical touch devices using the same, and computer products thereof |
US10649555B2 (en) * | 2017-09-28 | 2020-05-12 | Htc Corporation | Input interface device, control method and non-transitory computer-readable medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4553842A (en) * | 1983-05-09 | 1985-11-19 | Illinois Tool Works Inc. | Two dimensional optical position indicating apparatus |
US4746770A (en) * | 1987-02-17 | 1988-05-24 | Sensor Frame Incorporated | Method and apparatus for isolating and manipulating graphic objects on computer video monitor |
US4918262A (en) * | 1989-03-14 | 1990-04-17 | Ibm Corporation | Touch sensing display screen signal processing apparatus and method |
US20080246740A1 (en) * | 2007-04-04 | 2008-10-09 | Toshiba Matsushita Display Technology Co., Ltd. | Display device with optical input function, image manipulation method, and image manipulation program |
US20080277171A1 (en) * | 2007-05-07 | 2008-11-13 | Wright David G | Reducing sleep current in a capacitance sensing system |
US20090015555A1 (en) * | 2007-07-12 | 2009-01-15 | Sony Corporation | Input device, storage medium, information input method, and electronic apparatus |
US20090021489A1 (en) * | 1998-01-26 | 2009-01-22 | Wayne Westerman | Identifying contacts on a touch surface |
US7515141B2 (en) * | 2005-04-15 | 2009-04-07 | Canon Kabushiki Kaisha | Coordinate input apparatus, control method therefor, and program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6610917B2 (en) * | 1998-05-15 | 2003-08-26 | Lester F. Ludwig | Activity indication, external source, and processing loop provisions for driven vibrating-element environments |
JP2002351615A (en) * | 2001-05-24 | 2002-12-06 | Ricoh Co Ltd | Display device |
JP4429083B2 (en) * | 2004-06-03 | 2010-03-10 | キヤノン株式会社 | Shading type coordinate input device and coordinate input method thereof |
JP2006099468A (en) * | 2004-09-29 | 2006-04-13 | Toshiba Corp | Gesture input device, method, and program |
JP2008140182A (en) * | 2006-12-01 | 2008-06-19 | Sharp Corp | Input device, transmission/reception system, input processing method and control program |
JP2008191791A (en) * | 2007-02-01 | 2008-08-21 | Sharp Corp | Coordinate input device, coordinate input method, control program and computer-readable recording medium |
JP5282661B2 (en) * | 2009-05-26 | 2013-09-04 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
2009
- 2009-07-21 TW TW098124545A patent/TWI501121B/en not_active IP Right Cessation

2010
- 2010-05-07 US US12/775,838 patent/US20110018822A1/en not_active Abandoned
- 2010-07-12 JP JP2010157980A patent/JP5657293B2/en not_active Expired - Fee Related
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110265118A1 (en) * | 2010-04-21 | 2011-10-27 | Choi Hyunbo | Image display apparatus and method for operating the same |
US20120050224A1 (en) * | 2010-08-24 | 2012-03-01 | Quanta Computer Inc. | Optical touch system and method |
US8692804B2 (en) * | 2010-08-24 | 2014-04-08 | Quanta Computer Inc. | Optical touch system and method |
US20120133579A1 (en) * | 2010-11-30 | 2012-05-31 | Microsoft Corporation | Gesture recognition management |
US9965094B2 (en) | 2011-01-24 | 2018-05-08 | Microsoft Technology Licensing, Llc | Contact geometry tests |
US9710105B2 (en) | 2011-01-24 | 2017-07-18 | Microsoft Technology Licensing, Llc. | Touchscreen testing |
US9395845B2 (en) | 2011-01-24 | 2016-07-19 | Microsoft Technology Licensing, Llc | Probabilistic latency modeling |
US9030437B2 (en) | 2011-01-24 | 2015-05-12 | Microsoft Technology Licensing, Llc | Probabilistic latency modeling |
US8725443B2 (en) | 2011-01-24 | 2014-05-13 | Microsoft Corporation | Latency measurement |
US8988087B2 (en) | 2011-01-24 | 2015-03-24 | Microsoft Technology Licensing, Llc | Touchscreen testing |
US8982061B2 (en) * | 2011-02-12 | 2015-03-17 | Microsoft Technology Licensing, Llc | Angular contact geometry |
US20120206377A1 (en) * | 2011-02-12 | 2012-08-16 | Microsoft Corporation | Angular contact geometry |
US9542092B2 (en) | 2011-02-12 | 2017-01-10 | Microsoft Technology Licensing, Llc | Prediction-based touch contact tracking |
US8773377B2 (en) | 2011-03-04 | 2014-07-08 | Microsoft Corporation | Multi-pass touch contact tracking |
WO2012129649A1 (en) * | 2011-03-31 | 2012-10-04 | Smart Technologies Ulc | Gesture recognition by shadow processing |
US8773374B2 (en) * | 2011-05-13 | 2014-07-08 | Blackberry Limited | Identification of touch point on touch screen device |
US20120287056A1 (en) * | 2011-05-13 | 2012-11-15 | Abdallah Ibdah | Identification of touch point on touch screen device |
US8913019B2 (en) | 2011-07-14 | 2014-12-16 | Microsoft Corporation | Multi-finger detection and component resolution |
US9378389B2 (en) | 2011-09-09 | 2016-06-28 | Microsoft Technology Licensing, Llc | Shared item account selection |
US9935963B2 (en) | 2011-09-09 | 2018-04-03 | Microsoft Technology Licensing, Llc | Shared item account selection |
US20130117664A1 (en) * | 2011-11-07 | 2013-05-09 | Tzu-Pang Chiang | Screen display method applicable on a touch screen |
US9785281B2 (en) | 2011-11-09 | 2017-10-10 | Microsoft Technology Licensing, Llc. | Acoustic touch sensitive testing |
US9652132B2 (en) * | 2012-01-27 | 2017-05-16 | Google Inc. | Handling touch inputs based on user intention inference |
US20150212580A1 (en) * | 2012-01-27 | 2015-07-30 | Google Inc. | Handling touch inputs based on user intention inference |
US10521102B1 (en) | 2012-01-27 | 2019-12-31 | Google Llc | Handling touch inputs based on user intention inference |
US8914254B2 (en) | 2012-01-31 | 2014-12-16 | Microsoft Corporation | Latency measurement |
US9317147B2 (en) | 2012-10-24 | 2016-04-19 | Microsoft Technology Licensing, Llc. | Input testing tool |
US9213448B2 (en) | 2012-11-29 | 2015-12-15 | Pixart Imaging Inc. | Positioning module, optical touch system and method of calculating a coordinate of a touch medium |
US9134855B2 (en) * | 2012-11-29 | 2015-09-15 | Pixart Imaging Inc. | Positioning module, optical touch system and method of calculating a coordinate of a touch medium |
US20140146016A1 (en) * | 2012-11-29 | 2014-05-29 | Pixart Imaging Inc. | Positioning module, optical touch system and method of calculating a coordinate of a touch medium |
US20210096651A1 (en) * | 2013-03-14 | 2021-04-01 | Eyesight Mobile Technologies, LTD. | Vehicle systems and methods for interaction detection |
Also Published As
Publication number | Publication date |
---|---|
JP5657293B2 (en) | 2015-01-21 |
JP2011028746A (en) | 2011-02-10 |
TW201104519A (en) | 2011-02-01 |
TWI501121B (en) | 2015-09-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110018822A1 (en) | Gesture recognition method and touch system incorporating the same | |
US8867791B2 (en) | Gesture recognition method and interactive system using the same | |
JP5412227B2 (en) | Video display device and display control method thereof | |
US8659577B2 (en) | Touch system and pointer coordinate detection method therefor | |
TWI393037B (en) | Optical touch displaying device and operating method thereof | |
US20130241837A1 (en) | Input apparatus and a control method of an input apparatus | |
US20140267029A1 (en) | Method and system of enabling interaction between a user and an electronic device | |
US20140218300A1 (en) | Projection device | |
US20130044054A1 (en) | Method and apparatus for providing bare-hand interaction | |
JP2013061848A (en) | Noncontact input device | |
US9958965B2 (en) | Dual mode optical navigation device and mode switching method thereof | |
CN103744542A (en) | Hybrid pointing device | |
US20110193969A1 (en) | Object-detecting system and method by use of non-coincident fields of light | |
CN109964202B (en) | Display control apparatus, display control method, and computer-readable storage medium | |
US10884518B2 (en) | Gesture detection device for detecting hovering and click | |
CN102999158B (en) | The gesture identification of interaction systems and interaction systems | |
CN101989150A (en) | Gesture recognition method and touch system using same | |
US9489077B2 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
KR20120136719A (en) | The method of pointing and controlling objects on screen at long range using 3d positions of eyes and hands | |
US9189075B2 (en) | Portable computer having pointing functions and pointing system | |
TWI444875B (en) | Multi-touch input apparatus and its interface method using data fusion of a single touch sensor pad and imaging sensor | |
JP5118663B2 (en) | Information terminal equipment | |
US20150153904A1 (en) | Processing method of object image for optical touch system | |
US20240069647A1 (en) | Detecting method, detecting device, and recording medium | |
EP3059664A1 (en) | A method for controlling a device by gestures and a system for controlling a device by gestures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: PIXART IMAGING INC., TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIN, CHO YI;HSU, YAO CHING;REEL/FRAME:024354/0191; Effective date: 20100308 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |