CN101847063B - System and method for detecting object by using non-coincident fields of light - Google Patents


Info

Publication number
CN101847063B
CN101847063B (application CN 201010130510 A)
Authority
CN
China
Prior art keywords
image
edge
unit
indication
surrounding member
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201010130510
Other languages
Chinese (zh)
Other versions
CN101847063A (en)
Inventor
蔡华骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qisda Suzhou Co Ltd
Qisda Corp
Original Assignee
Qisda Suzhou Co Ltd
Qisda Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qisda Suzhou Co Ltd, Qisda Corp filed Critical Qisda Suzhou Co Ltd
Priority to CN 201010130510
Publication of CN101847063A
Application granted
Publication of CN101847063B


Abstract

The invention discloses a system and method for detecting information about an object in an indication space, such as the target position the object indicates on an indication plane, by capturing images of the indication space under non-coincident fields of light. This avoids the problems of prior-art approaches that rely on coincident fields of light and an expensive image sensor.

Description

Object-detecting system and method using non-coincident fields of light
Technical field
The present invention relates to an object-detecting system and method, and in particular to an object-detecting system and method that use non-coincident fields of light and a single line image sensor.
Background technology
Because a touch screen lets the operator intuitively enter coordinates relative to a display by touching it, touch screens have become a common input device for displays. Touch screens are widely used in all kinds of electronic products with displays, for example monitors, notebook computers, tablet computers, automatic teller machines, point-of-sale terminals, visitor guidance systems, and industrial control systems.
Besides traditional resistive and capacitive touch screens, which the operator must physically contact, coordinate-input schemes that use an image-capturing device, allowing the operator to avoid actually touching the display, are also in use. For the related art of non-contact touch screens (also called optical touch screens) that use image-capturing devices, please refer to U.S. Patent No. 4,507,557; details are not repeated here.
To resolve the position of an input point more accurately, and even to support multi-point input, the prior art on optical touch screens has proposed various designs of light sources, light-reflecting devices, and light-guiding devices, so as to provide more angular information about the input-point position and thereby resolve it accurately. For example, U.S. Patent No. 7,460,110 discloses a waveguide plate and mirrors mounted on two edges of the waveguide that cooperate with a light source to form upper and lower coincident fields of light simultaneously, so that an image-capturing unit can capture the two different upper and lower images at the same time.
However, capturing the two different upper and lower images at the same time requires a higher-cost area image sensor, a multiple-line image sensor, or two line image sensors. Moreover, with an area image sensor, a multiple-line image sensor, or two line image sensors, the optical touch screen must spend more computing resources to resolve the captured images, especially with an area image sensor. Furthermore, with such sensors, assembly errors in the optical touch screen can cause the image sensors to sense the wrong field of light, or to fail to sense a field of light at all, especially with two line image sensors.
Summary of the invention
Therefore, one object of the present invention is to provide an object-detecting system and method that likewise use an optical approach to detect the target position of an object on an indication plane. In particular, the object-detecting system and method according to the invention use non-coincident fields of light and a single line image sensor, to solve the above problems caused by prior art that uses coincident fields of light and an expensive image sensor.
In addition, another object of the present invention is to provide an object-detecting system and method for detecting object information such as the object's shape, area, three-dimensional shape, and volume within an indication space that includes the indication plane.
An object-detecting system according to one preferred embodiment of the present invention comprises a peripheral member, a light-reflecting device, a controlling/processing unit, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit, a fourth light-emitting unit, and a first image-capturing unit. The peripheral member defines an indication space and, within the indication space, an indication plane on which an object indicates a target position. The object is movable relative to the peripheral member. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-reflecting device is disposed on the peripheral member, at the first edge. The first light-emitting unit is electrically connected to the controlling/processing unit, is disposed on the peripheral member at the first edge, and is controlled by the controlling/processing unit to emit a first light. The first light passes through the indication space and thereby forms a first field of light. The second light-emitting unit is electrically connected to the controlling/processing unit and is disposed on the peripheral member at the second edge. The third light-emitting unit is electrically connected to the controlling/processing unit and is disposed on the peripheral member at the third edge. The fourth light-emitting unit is electrically connected to the controlling/processing unit and is disposed on the peripheral member at the fourth edge. The second, third, and fourth light-emitting units are controlled by the controlling/processing unit to emit a second light. The second light passes through the indication space and thereby forms a second field of light. The first image-capturing unit is electrically connected to the controlling/processing unit and is disposed near the first corner. The first image-capturing unit defines a first camera point. Controlled by the controlling/processing unit, the first image-capturing unit captures, while the first field of light is formed, a first image of the part of the peripheral member on the first edge as seen across the indication space. Also controlled by the controlling/processing unit, it captures, while the second field of light is formed, a second image of the part of the peripheral member on the second edge, together with a first reflected image, formed by the light-reflecting device, of the parts of the peripheral member on the second edge and the third edge. The controlling/processing unit processes the first image, the second image, and the first reflected image to determine object information about the object within the indication space.
In one embodiment, the light-reflecting device is a plane mirror or a prism.
In another embodiment, the light-reflecting device comprises a first reflecting surface and a second reflecting surface. The first reflecting surface and the second reflecting surface intersect approximately at a right angle and face the indication space. The indication plane defines a main extension plane; the first reflecting surface defines a first extension plane; the second reflecting surface defines a second extension plane. The first extension plane and the second extension plane each intersect the main extension plane at approximately 45 degrees.
In one embodiment, the first image-capturing unit is a line image sensor.
An object-detecting system according to another preferred embodiment of the present invention further comprises a second image-capturing unit. The second image-capturing unit is electrically connected to the controlling/processing unit and is disposed near the second corner. The second image-capturing unit defines a second camera point. Controlled by the controlling/processing unit, the second image-capturing unit captures, while the first field of light is formed, a third image of the part of the peripheral member on the first edge. Also controlled by the controlling/processing unit, it captures, while the second field of light is formed, a fourth image of the part of the peripheral member on the fourth edge, together with a second reflected image, formed by the light-reflecting device, of the parts of the peripheral member on the third edge and the fourth edge. The controlling/processing unit processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
In one embodiment, the second image-capturing unit is a line image sensor.
In one embodiment, the first, second, third, and fourth light-emitting units are each line light sources.
An object-detecting method according to a preferred embodiment of the present invention is implemented on a basis that comprises a peripheral member, a light-reflecting device, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit, and a fourth light-emitting unit. The peripheral member defines an indication space and, within the indication space, an indication plane on which an object indicates a target position. The object is movable relative to the peripheral member. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-reflecting device is disposed on the peripheral member, at the first edge. The first light-emitting unit is located at the first edge, the second light-emitting unit at the second edge, the third light-emitting unit at the third edge, and the fourth light-emitting unit at the fourth edge. The object-detecting method according to the invention comprises the following steps. First, (a) the first light-emitting unit is controlled to emit a first light, which passes through the indication space and thereby forms a first field of light. Next, (b) while the first field of light is formed, a first image of the part of the peripheral member on the first edge is captured at the first corner. Next, (c) the second, third, and fourth light-emitting units are controlled to emit a second light, which passes through the indication space and thereby forms a second field of light. Next, (d) while the second field of light is formed, a second image of the part of the peripheral member on the second edge is captured at the first corner, together with a first reflected image, formed by the light-reflecting device, of the parts of the peripheral member on the second edge and the third edge. Finally, (e) the first image, the second image, and the first reflected image are processed to determine object information about the object within the indication space.
In one embodiment, the light-reflecting device is a plane mirror or a prism.
In one embodiment, the object information is the relative position of the target position on the indication plane, the shape and/or area the object projects onto the indication plane, or the three-dimensional shape and/or volume of the object within the indication space.
In one embodiment, step (b) also captures, at the second corner, a third image of the part of the peripheral member on the first edge; step (d) also captures a fourth image of the part of the peripheral member on the fourth edge, together with a second reflected image, formed by the light-reflecting device, of the parts of the peripheral member on the third edge and the fourth edge; and step (e) processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
In one embodiment, the first image, the second image, and the first reflected image are captured by a first line image sensor, and the third image, the fourth image, and the second reflected image are captured by a second line image sensor.
Compared with the prior art, the object-detecting system and method of the present invention use non-coincident fields of light and a single line image sensor, so they can be implemented with a lower-cost image sensor and fewer computing resources, solving the problems caused in the prior art by simultaneously formed coincident fields of light and expensive image sensors.
The advantages and spirit of the present invention can be further understood from the following detailed description and the accompanying drawings.
Description of drawings
Figure 1A is a schematic diagram of the architecture of an object-detecting system according to a preferred embodiment of the present invention;
Figure 1B is a cross-sectional view, along line A-A, of the first light-emitting unit, the light-reflecting device, and the peripheral member in Figure 1A;
Figure 2A illustrates, for the first field of light and the second field of light respectively, the paths along which the two input points P1 and P2 block the light projected onto the first image-capturing unit and the second image-capturing unit;
Figure 2B illustrates the images of the first field of light and of the second field of light captured by the first image-capturing unit at the two time points T0 and T1, respectively;
Figure 2C illustrates the images of the first field of light and of the second field of light captured by the second image-capturing unit at the two time points T0 and T1, respectively;
Figure 3 is a flowchart of an object-detecting method according to a preferred embodiment of the present invention.
Embodiment
The present invention provides an object-detecting system and method that use an optical approach to detect the target position of an object on an indication plane. In addition, the object-detecting system and method according to the present invention can detect object information such as the object's shape, area, three-dimensional shape, and volume within an indication space that includes the indication plane. In particular, the object-detecting system and method according to the present invention use non-coincident fields of light, and can therefore be implemented with a lower-cost image sensor and fewer computing resources. The features, spirit, advantages, and feasibility of the present invention are fully explained below through a detailed description of preferred embodiments.
Please refer to Figures 1A and 1B. Figure 1A is a schematic diagram of the architecture of an object-detecting system 1 according to a preferred embodiment of the present invention. Figure 1B is a cross-sectional view, along line A-A, of the peripheral member 14, the first light-emitting unit 122, and the light-reflecting device 13 in Figure 1A. The object-detecting system 1 according to the invention detects the position of at least one object (for example, a finger or a stylus) on the indication plane 10 (for example, the two positions P1 and P2 shown in Figure 1A).
As shown in Figure 1A, the object-detecting system 1 according to the invention comprises the peripheral member 14 (not shown in Figure 1A; see Figure 1B), the light-reflecting device 13, the controlling/processing unit 11, the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, and the first image-capturing unit 16. The peripheral member 14 defines the indication space and, within the indication space, the indication plane 10 on which the object indicates the target positions (P1, P2). The object is movable relative to the peripheral member 14. The indication plane 10 has a first edge 102, a second edge 104 adjacent to the first edge 102, a third edge 106 adjacent to the second edge 104, and a fourth edge 108 adjacent to the third edge 106 and the first edge 102. The third edge 106 and the fourth edge 108 form a first corner C1. The second edge 104 and the third edge 106 form a second corner C2.
Also as shown in Figure 1A, the first light-emitting unit 122 is electrically connected to the controlling/processing unit 11 and is disposed on the peripheral member 14 at the first edge 102. The light-reflecting device 13 is disposed on the peripheral member 14 at the first edge 102. The second light-emitting unit 124 is electrically connected to the controlling/processing unit 11 and is disposed on the peripheral member 14 at the second edge 104. The third light-emitting unit 126 is electrically connected to the controlling/processing unit 11 and is disposed on the peripheral member 14 at the third edge 106. The fourth light-emitting unit 128 is electrically connected to the controlling/processing unit 11 and is disposed on the peripheral member 14 at the fourth edge 108. The first image-capturing unit 16 is electrically connected to the controlling/processing unit 11 and is disposed near the first corner C1. The first image-capturing unit 16 defines a first camera point.
As shown in Figure 1B, the object-detecting system 1 according to the invention further comprises the peripheral member 14, which protrudes above and surrounds the indication plane 10. The peripheral member 14 supports the first light-emitting unit 122, the light-reflecting device 13, the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, and the first image-capturing unit 16.
In one embodiment, the light-reflecting device 13 can be a plane mirror.
In another embodiment, as shown in Figure 1B, the light-reflecting device 13 can comprise a first reflecting surface 132 and a second reflecting surface 134. The first reflecting surface 132 and the second reflecting surface 134 intersect approximately at a right angle and face the indication space. The indication plane 10 defines a main extension plane; the first reflecting surface 132 defines a first extension plane; the second reflecting surface 134 defines a second extension plane. The first extension plane and the second extension plane each intersect the main extension plane at approximately 45 degrees. In practice, the light-reflecting device 13 described above can be a prism.
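The two-surface geometry above can be checked with a short sketch: two mirrors meeting at a right angle act as a retroreflector in the plane perpendicular to their common line, so light crossing the indication space is sent back antiparallel. The code below is a minimal illustration under assumed coordinates (the indication plane as the x-axis, 45-degree surface normals); it is not taken from the patent.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d across a mirror with unit normal n."""
    n = n / np.linalg.norm(n)
    return d - 2 * np.dot(d, n) * n

# Two reflecting surfaces meeting at ~90 degrees, each tilted ~45 degrees
# to the indication plane (the x-axis here), as in Figure 1B.
n1 = np.array([np.cos(np.radians(45)), np.sin(np.radians(45))])    # first surface normal
n2 = np.array([np.cos(np.radians(135)), np.sin(np.radians(135))])  # second surface normal

d_in = np.array([0.6, 0.8])             # arbitrary incoming direction
d_out = reflect(reflect(d_in, n1), n2)  # bounce off both surfaces

print(d_out)  # -> [-0.6 -0.8]: the ray returns antiparallel (retroreflection)
```

The same composition of two reflections explains why a plane mirror or a prism can substitute for the two-surface device: only the mapping between incoming and outgoing rays matters to the later path reconstruction.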
In practice, the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 can each be a line light source. A line light source (122, 124, 126, 128) can consist of a bar-shaped light guide and a light-emitting diode (for example, an infrared LED) mounted at one end of the bar-shaped light guide. The light emitted by the LED enters from one end of the bar-shaped light guide, whose structure guides the injected light toward the indication plane 10. A line light source (122, 124, 126, 128) can also be a row of light-emitting diodes.
The first light-emitting unit 122 is controlled by the controlling/processing unit 11 to emit the first light. The first light passes through the indication space and thereby forms the first field of light. Controlled by the controlling/processing unit 11, the first image-capturing unit 16 captures, while the first field of light is formed, the first image of the part of the peripheral member 14 on the first edge 102. The first image includes the obstruction of the first light caused by the object in the indication space, that is, the shadow projected onto the first image.
The second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 are controlled by the controlling/processing unit 11 to emit the second light. The second light passes through the indication space and thereby forms the second field of light. In particular, the controlling/processing unit 11 controls the first field of light and the second field of light so that they do not occur simultaneously. That is, when the controlling/processing unit 11 turns on the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128, it also turns off the first light-emitting unit 122. Controlled by the controlling/processing unit 11, the first image-capturing unit 16 captures, while the second field of light is formed, the second image of the part of the peripheral member 14 on the second edge 104, together with the first reflected image, formed by the light-reflecting device 13, of the parts of the peripheral member 14 on the second edge 104 and the third edge 106. The second image and the first reflected image include the obstruction of the second light caused by the object in the indication space, that is, the shadows projected onto the second image and the first reflected image.
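The time multiplexing described above can be sketched as a control loop in which exactly one field of light is active per capture slot. The toy model below is an illustration only; the class, names, and frame labels are invented for the sketch and do not appear in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Controller:
    """Toy model of the controlling/processing unit's time multiplexing:
    exactly one field of light is active during each capture."""
    log: list = field(default_factory=list)

    def set_lights(self, first_on: bool):
        # first field of light <-> unit 122; second <-> units 124/126/128
        self.first_field = first_on
        self.second_field = not first_on   # never both at once

    def capture(self):
        frame = "I_first" if self.first_field else "I_second+reflected"
        self.log.append(frame)
        return frame

ctrl = Controller()
for _ in range(2):                        # T0: first field, T1: second field, repeated
    ctrl.set_lights(True);  ctrl.capture()
    ctrl.set_lights(False); ctrl.capture()

print(ctrl.log)
# -> ['I_first', 'I_second+reflected', 'I_first', 'I_second+reflected']
```

Because the two fields never coexist, a single line image sensor suffices: it reads the direct view in one slot and the mirrored view in the next.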
In practice, the first image-capturing unit 16 can be a line image sensor.
Finally, the controlling/processing unit 11 processes the first image, the second image, and the first reflected image to determine the object information about the object within the indication space.
In one embodiment, the object information comprises the relative position of the target position on the indication plane 10. The controlling/processing unit 11 determines a first object point on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines a first reflecting object point on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first straight path according to the line connecting the first camera point and the first object point, determines a first reflected path according to the line connecting the first camera point and the first reflecting object point together with the light-reflecting device 13, and determines the relative position from the intersection of the first straight path and the first reflected path.
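The intersection step above amounts to standard triangulation: the reflected path can be "unfolded" by mirroring the camera point across the mirror edge, after which both paths are ordinary straight lines whose intersection is the target position. The sketch below uses assumed coordinates (a unit-square indication plane, the mirror along the edge y = 1, the camera point at a corner) and synthesizes the two directions from a known point to show that the intersection recovers it; it is an illustration, not the patent's algorithm verbatim.

```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Intersect the 2-D lines p1 + t*d1 and p2 + s*d2."""
    A = np.array([d1, -d2]).T
    t, s = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

# Assumed layout: indication plane = unit square, mirror along y = 1,
# first camera point C1 at corner (1, 0).
C1 = np.array([1.0, 0.0])
P = np.array([0.3, 0.4])       # true (to-be-recovered) object position

# First straight path: from C1 toward the shadow on the peripheral member.
d_direct = P - C1

# First reflected path, unfolded: mirror C1 across y = 1; the unfolded
# ray toward the shadow in the reflected image passes straight through P.
C1_mirror = np.array([1.0, 2.0])
d_reflect = P - C1_mirror

est = line_intersection(C1, d_direct, C1_mirror, d_reflect)
print(est)  # -> [0.3 0.4]
```

In the real system the two directions come from the shadow positions measured on the line image sensor rather than from P, but the intersection computation is the same.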
In one embodiment, the object information comprises the shape and/or area that the object projects onto the indication plane 10. The controlling/processing unit 11 determines a first object point and a second object point on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines a first reflecting object point and a second reflecting object point on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first straight planar path according to the lines connecting the first camera point with the first object point and the second object point respectively, determines a first reflected planar path according to the lines connecting the first camera point with the first reflecting object point and the second reflecting object point respectively together with the light-reflecting device 13, and determines the object shape and/or object area from the shape and/or area of the region where the first straight planar path and the first reflected planar path intersect. Further, the object information can comprise the three-dimensional shape and/or volume of the object within the indication space. The controlling/processing unit 11 divides the first image, the second image, and the first reflected image into a plurality of first sub-images, a plurality of second sub-images, and a plurality of first reflected sub-images, respectively. The controlling/processing unit 11 then determines a plurality of object shapes and/or object areas from the first sub-images, the second sub-images, and the first reflected sub-images, and stacks the object shapes and/or object areas in order along the normal direction of the indication plane 10 to determine the three-dimensional shape and/or volume of the object.
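The stacking step above approximates a volume as the sum of per-layer cross-sectional areas times the layer spacing. The numbers below are invented (a roughly hemispherical fingertip sampled every 1 mm along the plane normal) purely to illustrate the arithmetic.

```python
# Assumed example: cross-sectional areas recovered from the sub-image
# layers, one per layer, spaced dz apart along the indication-plane normal.
layer_areas_mm2 = [78.5, 70.7, 56.5, 37.7, 15.7]   # illustrative values
dz_mm = 1.0                                        # assumed layer spacing

# Stacking the per-layer areas along the normal approximates the volume.
volume_mm3 = sum(a * dz_mm for a in layer_areas_mm2)
print(round(volume_mm3, 1))  # -> 259.1
```

Finer sub-image slicing (smaller dz) refines the estimate, at the cost of processing more sub-images per frame.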
In one embodiment, the object information comprises the three-dimensional shape and/or volume of the object within the indication space. The controlling/processing unit 11 determines at least three object points on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines at least three reflecting object points on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first straight volumetric path according to the lines connecting the first camera point with the at least three object points respectively, determines a first reflected volumetric path according to the lines connecting the first camera point with the at least three reflecting object points respectively together with the light-reflecting device 13, and determines the three-dimensional shape and/or volume of the object from the shape and/or volume of the space where the first straight volumetric path and the first reflected volumetric path intersect.
Also as shown in Figure 1A, the object-detecting system 1 according to another preferred embodiment of the present invention further comprises a second image-capturing unit 18. The second image-capturing unit 18 is electrically connected to the controlling/processing unit 11 and is disposed near the second corner C2. The second image-capturing unit 18 defines a second camera point.
Controlled by the controlling/processing unit 11, the second image-capturing unit 18 captures, while the first field of light is formed, the third image of the part of the peripheral member 14 on the first edge 102. The third image includes the obstruction of the first light caused by the object in the indication space, that is, the shadow projected onto the third image. Also controlled by the controlling/processing unit 11, the second image-capturing unit 18 captures, while the second field of light is formed, the fourth image of the part of the peripheral member 14 on the fourth edge 108, together with the second reflected image, formed by the light-reflecting device 13, of the parts of the peripheral member 14 on the third edge 106 and the fourth edge 108. The fourth image and the second reflected image include the obstruction of the second light caused by the object in the indication space, that is, the shadows projected onto the fourth image and the second reflected image. In this preferred embodiment, the controlling/processing unit 11 processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
It should be emphasized that the controlling/processing unit 11 can also first turn on the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 to form the second field of light, and then turn on the first light-emitting unit 122 to form the first field of light.
In practice, the second image-capturing unit 18 can be a line image sensor.
In practice, the background level of the first reflected image and the second reflected image can be somewhat weak, which affects the interpretation of the shadows projected onto the first reflected image and the second reflected image (the mirror images). To solve this problem, the controlling/processing unit 11 can keep the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, the first image-capturing unit 16, and the second image-capturing unit 18 turned on for a longer time, or turn them on twice, so that the exposure time of the first reflected image and the second reflected image is longer than that of the first image and the third image. Alternatively, the total illumination of the second field of light can be made higher than that of the first field of light by controlling the gain of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128, the drive current of their light-emitting diodes, or the number of light-emitting diodes that are lit.
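The exposure compensation described above can be budgeted with simple arithmetic: if reflection attenuates the light by some factor, the capture slot for the mirrored view needs a proportionally longer exposure (or proportionally more illumination) to reach the same signal level. All values below are assumptions for illustration, not figures from the patent.

```python
# Illustrative exposure budgeting for the mirrored capture slot.
mirror_efficiency = 0.5      # assumed fraction of light surviving reflection
base_exposure_ms = 4.0       # assumed exposure for the direct (first/third) images

# Compensate so the reflected images receive the same effective illumination.
reflected_exposure_ms = base_exposure_ms / mirror_efficiency
print(reflected_exposure_ms)  # -> 8.0
```

The same ratio can instead be applied to LED drive current or to the number of lit LEDs, as the paragraph above notes.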
In the following, two input points (P1, P2) falling on the indication plane 10 of Figure 1A and captured by the first image unit 16 and the second image unit 18 are taken as an example to illustrate how the object detecting system 1 according to the present invention forms the light fields at different times and captures the corresponding images.
As shown in Fig. 2A, the solid lines represent the paths (expressed as angles) along which the two input points P1 and P2 block the light projected onto the first image unit 16 and the second image unit 18 when, at time point T0, the control/processing unit 11 controls the first light-emitting unit 122 to turn on to form the first light field. The dotted lines in Fig. 2A represent the paths along which the two input points P1 and P2 block the light projected onto the first image unit 16 and the second image unit 18 when, at time point T1, the control/processing unit 11 controls the second light-emitting unit 124, the third light-emitting unit 126 and the fourth light-emitting unit 128 to turn on to form the second light field.
As also shown in Fig. 2B, the paths along which the two input points P1 and P2 block the light projected onto the first image unit 16 at the two time points T0 and T1 form four angles φ1, φ2, φ3 and φ4, respectively. As shown in Fig. 2B, at time point T0, the image I1 of the first light field captured by the first image unit 16 carries the real-image shadow corresponding to angle φ3. At time point T1, the image I2 of the second light field captured by the first image unit 16 carries the real-image shadow corresponding to angle φ4 and the mirror-image shadows corresponding to angles φ1 and φ2.
As also shown in Fig. 2C, the paths along which the two input points P1 and P2 block the light projected onto the second image unit 18 at the two time points T0 and T1 form four angles θ1, θ2, θ3 and θ4, respectively. As shown in Fig. 2C, at time point T0, the image I3 of the first light field captured by the second image unit 18 carries the real-image shadow corresponding to angle θ3. At time point T1, the image I4 of the second light field captured by the second image unit 18 carries the real-image shadow corresponding to angle θ4 and the mirror-image shadows corresponding to angles θ1 and θ2.
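Each shadow's angle is read off from where the shadow falls on the linear image sensor. A minimal sketch of that mapping, assuming a uniform pixel-to-angle relation over the sensor's field of view (a real image unit would use a per-device calibration table rather than this linear approximation):

```python
def shadow_angle(pixel_index, num_pixels, fov_deg=90.0):
    """Convert the position of a shadow on a linear image sensor into
    the angle of the blocked light path.  Assumes the sensor's pixels
    sweep `fov_deg` degrees uniformly from one end to the other."""
    if not 0 <= pixel_index < num_pixels:
        raise ValueError("shadow falls outside the sensor")
    return pixel_index / (num_pixels - 1) * fov_deg
```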
Obviously, by resolving the angles indicated by the shadows on images I1, I2, I3 and I4, the object detecting system 1 according to the present invention can accurately calculate the positions of the two input points P1 and P2 shown in Fig. 2A. It should further be emphasized that the first image unit 16 and the second image unit 18 according to the present invention can each be a single linear image sensor. Thereby, the object detecting system according to the present invention need not adopt expensive image sensors, avoids the situations during assembly in which an image sensor senses the wrong light field or fails to sense a light field at all, and also dispenses with waveguide assemblies that would reduce the display resolution of a display.
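The position calculation implied above amounts to intersecting the two blocked-light rays. The sketch below assumes the two image units sit at the two ends of one edge of the indication plane, a distance `width` apart, with each angle measured from that edge toward the interior; coordinates are given relative to the first image unit. These placement conventions are illustrative assumptions, not figures from the patent.

```python
import math

def triangulate(phi_deg, theta_deg, width):
    """Locate an input point from the shadow angle phi seen by the
    first image unit at (0, 0) and the shadow angle theta seen by the
    second image unit at (width, 0), both measured from the edge
    joining the two units."""
    phi = math.radians(phi_deg)
    theta = math.radians(theta_deg)
    # Ray from unit 1: y = x * tan(phi)
    # Ray from unit 2: y = (width - x) * tan(theta)
    x = width * math.tan(theta) / (math.tan(phi) + math.tan(theta))
    return x, x * math.tan(phi)
```

A mirror-image shadow (such as φ1 or φ2) would first be reflected across the first edge to recover the virtual point behind the light reflection assembly, after which the same intersection applies.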
Please refer to Fig. 3, which illustrates a flowchart of the object detecting method 2 according to one preferred embodiment of the present invention. The basis for implementing the object detecting method 2 according to the present invention comprises a surrounding member, a light reflection assembly, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit and a fourth light-emitting unit. The surrounding member defines an indication space, and an indication plane in the indication space is for an object to indicate a target location on the indication plane. The surrounding member and the object are movable relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light reflection assembly is arranged on the surrounding member and located at the first edge. The first light-emitting unit is located at the first edge. The second light-emitting unit is located at the second edge. The third light-emitting unit is located at the third edge. The fourth light-emitting unit is located at the fourth edge. Specific embodiments of the first light-emitting unit, the second light-emitting unit, the third light-emitting unit, the fourth light-emitting unit and the light reflection assembly are shown in Figure 1A and Figure 1B and are not repeated here.
As shown in Figure 3, the object detecting method 2 according to the present invention first executes step S20: controlling the first light-emitting unit to emit first light, wherein the first light passes through the indication space and thereby forms a first light field.
Then, the object detecting method 2 according to the present invention executes step S22: when the first light field is formed, capturing, at the first corner, a first image of the part of the surrounding member presented on the first edge of the indication space.
Then, the object detecting method 2 according to the present invention executes step S24: controlling the second light-emitting unit, the third light-emitting unit and the fourth light-emitting unit to emit second light, wherein the second light passes through the indication space and thereby forms a second light field.
Then, the object detecting method 2 according to the present invention executes step S26: when the second light field is formed, capturing, at the first corner, a second image of the part of the surrounding member presented on the second edge of the indication space, together with a first reflected image of the part of the surrounding member on the second edge and the third edge presented through the light reflection assembly.
Finally, the object detecting method 2 according to the present invention executes step S28: processing the first image, the second image and the first reflected image to determine the object information of the object located in the indication space. The content of the object information and the manner in which it is determined are given in the detailed description above and are not repeated here.
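Steps S20 through S28 amount to a simple control sequence. The sketch below runs that sequence against a stand-in hardware class; every name here (`set_leds`, `capture`, `process`, and so on) is a hypothetical abstraction introduced for illustration, not an interface from the patent.

```python
class SimulatedHardware:
    """Stand-in for the control/processing unit's hardware interface;
    it records calls and returns placeholder images."""
    def __init__(self):
        self.log = []
    def set_leds(self, first, others):
        self.log.append(("leds", first, others))
    def capture(self, corner):
        self.log.append(("capture", corner))
        return "I1"
    def capture_with_reflection(self, corner):
        self.log.append(("capture+mirror", corner))
        return "I2", "R1"
    def process(self, *images):
        return {"images": images}

def detect_object(hw):
    """One pass of object detecting method 2, steps S20 through S28."""
    hw.set_leds(first=True, others=False)           # S20: first light field
    i1 = hw.capture(corner=1)                       # S22: first image
    hw.set_leds(first=False, others=True)           # S24: second light field
    i2, r1 = hw.capture_with_reflection(corner=1)   # S26: second + reflected
    return hw.process(i1, i2, r1)                   # S28: object information
```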
According to the object detecting method 2 of another preferred embodiment of the present invention, synchronously with step S22, a third image of the part of the surrounding member presented on the first edge of the indication space is captured at the second corner. Synchronously with step S26, a fourth image of the part of the surrounding member presented on the fourth edge of the indication space is captured, together with a second reflected image of the part of the surrounding member on the third edge and the fourth edge presented through the light reflection assembly. And in step S28, the object information is determined by processing at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image.
In a specific embodiment, the first image, the second image and the first reflected image can be captured by a single linear image sensor, and the third image, the fourth image and the second reflected image can be captured by another linear image sensor.
In practical applications, the total illumination of the second light field is higher than the total illumination of the first light field.
The above detailed description of preferred embodiments is intended to describe the features and spirit of the present invention more clearly, and not to limit the protection scope of the present invention to the preferred embodiments disclosed above. On the contrary, the intention is that various changes and arrangements of equal effect fall within the protection scope of the claims of the present invention. Therefore, the protection scope of the claims applied for by the present invention should be given the broadest interpretation according to the above description, so as to cover all possible changes and arrangements of equal effect.

Claims (10)

1. An object detecting system, characterized in that the object detecting system comprises:
a surrounding member, the surrounding member defining an indication space, an indication plane in the indication space being for an object to indicate a target location on the indication plane, the surrounding member and the object being movable relative to each other, the indication plane having a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge, the third edge and the fourth edge forming a first corner, and the second edge and the third edge forming a second corner;
a light reflection assembly, the light reflection assembly being arranged on the surrounding member and located at the first edge;
a control/processing unit;
a first light-emitting unit, the first light-emitting unit being electrically connected to the control/processing unit, the first light-emitting unit being arranged on the surrounding member and located at the first edge, the first light-emitting unit being controlled by the control/processing unit to emit first light, the first light passing through the indication space and thereby forming a first light field;
a second light-emitting unit, the second light-emitting unit being electrically connected to the control/processing unit, the second light-emitting unit being arranged on the surrounding member;
a third light-emitting unit, the third light-emitting unit being electrically connected to the control/processing unit, the third light-emitting unit being arranged on the surrounding member;
a fourth light-emitting unit, the fourth light-emitting unit being electrically connected to the control/processing unit, the fourth light-emitting unit being arranged on the surrounding member, the second light-emitting unit, the third light-emitting unit and the fourth light-emitting unit being controlled by the control/processing unit to emit second light, the second light passing through the indication space and thereby forming a second light field; and
a first image unit, the first image unit being electrically connected to the control/processing unit and arranged at the periphery of the first corner, the first image unit defining a first camera point, the first image unit being controlled by the control/processing unit to capture, when the first light field is formed, a first image of the part of the surrounding member presented on the first edge of the indication space, and being controlled by the control/processing unit to capture, when the second light field is formed, a second image of the part of the surrounding member presented on the second edge of the indication space and a first reflected image of the part of the surrounding member on the second edge and the third edge presented through the light reflection assembly;
wherein the control/processing unit processes the first image, the second image and the first reflected image to determine object information of the object located in the indication space.
2. The object detecting system as claimed in claim 1, characterized in that the light reflection assembly is a plane mirror or a prism.
3. The object detecting system as claimed in claim 1, characterized in that the object information is the target location, the relative position of the object with respect to the indication plane, the body shape and/or body area of the object projected onto the indication plane, or the three-dimensional shape and/or volume of the object located in the indication space.
4. The object detecting system as claimed in claim 1, characterized in that the object detecting system further comprises:
a second image unit, the second image unit being electrically connected to the control/processing unit and arranged at the periphery of the second corner, the second image unit defining a second camera point, the second image unit being controlled by the control/processing unit to capture, when the first light field is formed, a third image of the part of the surrounding member presented on the first edge of the indication space, and being controlled by the control/processing unit to capture, when the second light field is formed, a fourth image of the part of the surrounding member presented on the fourth edge of the indication space and a second reflected image of the part of the surrounding member on the third edge and the fourth edge presented through the light reflection assembly;
wherein the control/processing unit processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image to determine the object information.
5. The object detecting system as claimed in claim 4, characterized in that the first image unit and the second image unit are each a linear image sensor.
6. An object detecting method, characterized in that a surrounding member defines an indication space, an indication plane in the indication space is for an object to indicate a target location on the indication plane, the surrounding member and the object are movable relative to each other, the indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge, the third edge and the fourth edge form a first corner, the second edge and the third edge form a second corner, a light reflection assembly is arranged on the surrounding member and located at the first edge, a first light-emitting unit is located at the first edge, a second light-emitting unit is located at the second edge, a third light-emitting unit is located at the third edge, and a fourth light-emitting unit is located at the fourth edge, the object detecting method comprising the following steps:
(a) controlling the first light-emitting unit to emit first light, wherein the first light passes through the indication space and thereby forms a first light field;
(b) when the first light field is formed, capturing, at the first corner, a first image of the part of the surrounding member presented on the first edge of the indication space;
(c) controlling the second light-emitting unit, the third light-emitting unit and the fourth light-emitting unit to emit second light, wherein the second light passes through the indication space and thereby forms a second light field;
(d) when the second light field is formed, capturing, at the first corner, a second image of the part of the surrounding member presented on the second edge of the indication space and a first reflected image of the part of the surrounding member on the second edge and the third edge presented through the light reflection assembly; and
(e) processing the first image, the second image and the first reflected image to determine object information of the object located in the indication space.
7. The object detecting method as claimed in claim 6, characterized in that the light reflection assembly is a plane mirror or a prism.
8. The object detecting method as claimed in claim 6, characterized in that the object information is the target location, the relative position of the object with respect to the indication plane, the body shape and/or body area of the object projected onto the indication plane, or the three-dimensional shape and/or volume of the object located in the indication space.
9. The object detecting method as claimed in claim 6, characterized in that step (b) further captures, at the second corner, a third image of the part of the surrounding member presented on the first edge of the indication space, step (d) further captures a fourth image of the part of the surrounding member presented on the fourth edge of the indication space and a second reflected image of the part of the surrounding member on the third edge and the fourth edge presented through the light reflection assembly, and step (e) processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image to determine the object information.
10. The object detecting method as claimed in claim 9, characterized in that the first image, the second image and the first reflected image are captured by a first linear image sensor, and the third image, the fourth image and the second reflected image are captured by a second linear image sensor.
CN 201010130510 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light Expired - Fee Related CN101847063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010130510 CN101847063B (en) 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light

Publications (2)

Publication Number Publication Date
CN101847063A CN101847063A (en) 2010-09-29
CN101847063B true CN101847063B (en) 2013-04-17

Family

ID=42771696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010130510 Expired - Fee Related CN101847063B (en) 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light

Country Status (1)

Country Link
CN (1) CN101847063B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019458B (en) * 2011-09-20 2016-05-18 原相科技股份有限公司 Optical touch control system and localization method thereof
CN103092430B (en) * 2011-11-03 2016-01-13 原相科技股份有限公司 Optical touch control system and localization method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006318512A (en) * 1998-10-02 2006-11-24 Semiconductor Energy Lab Co Ltd Information terminal equipment
CN201035553Y (en) * 2007-04-10 2008-03-12 北京汇冠新技术有限公司 Light path structure of touch panel using camera and reflector
US7460110B2 (en) * 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
CN100468303C (en) * 2003-02-14 2009-03-11 奈克斯特控股公司 Touch screen signal processing
CN101609381A (en) * 2008-06-18 2009-12-23 北京汇冠新技术股份有限公司 Use the touch-detection sensing device of camera and reflective mirror

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4043128B2 (en) * 1999-02-24 2008-02-06 富士通株式会社 Optical scanning touch panel

Also Published As

Publication number Publication date
CN101847063A (en) 2010-09-29

Similar Documents

Publication Publication Date Title
TWI453642B (en) Multiple-input touch panel and method for gesture recognition
CN101663637B (en) Touch screen system with hover and click input methods
JP2001142642A (en) Device for inputting coordinates
CN104035555A (en) System, Information Processing Apparatus, And Information Processing Method
CN103324358A (en) Optical Touch System
CN104094204B (en) Optical element with alternate mirror lens crystal face
CN102792249A (en) Touch system using optical components to image multiple fields of view on an image sensor
US8982101B2 (en) Optical touch system and optical touch-position detection method
CN103677445A (en) Position detection apparatus and image display apparatus
US20110199337A1 (en) Object-detecting system and method by use of non-coincident fields of light
US20110193969A1 (en) Object-detecting system and method by use of non-coincident fields of light
WO2010137843A2 (en) Touch screen apparatus adopting an infrared scan system
CN101847063B (en) System and method for detecting object by using non-coincident fields of light
CN201628947U (en) Touch electronic device
CN101923418B (en) Object sensing system and method
CN101667083A (en) Position detection system and arrangement method thereof
CN101819489B (en) Object detecting system
US8912482B2 (en) Position determining device and method for objects on a touch device having a stripped L-shaped reflecting mirror and a stripped retroreflector
CN110307822A (en) Apart from arrangement for detecting
US20130241882A1 (en) Optical touch system and optical touch position detecting method
CN102043543B (en) Optical touch control system and method
CN105278760B (en) Optical touch system
US20170185157A1 (en) Object recognition device
US20140043297A1 (en) Optical Touch System and Optical Touch Control Method
CN103092430B (en) Optical touch control system and localization method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130417

Termination date: 20160303