CN101847063A - System and method for detecting object by using non-coincident fields of light - Google Patents

System and method for detecting object by using non-coincident fields of light

Info

Publication number
CN101847063A
CN101847063A (application CN201010130510A)
Authority
CN
China
Prior art keywords
image
edge
unit
indication
surrounding member
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201010130510A
Other languages
Chinese (zh)
Other versions
CN101847063B (en)
Inventor
蔡华骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qisda Suzhou Co Ltd
Qisda Corp
Original Assignee
Qisda Suzhou Co Ltd
Qisda Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qisda Suzhou Co Ltd, Qisda Corp filed Critical Qisda Suzhou Co Ltd
Priority to CN 201010130510 priority Critical patent/CN101847063B/en
Publication of CN101847063A publication Critical patent/CN101847063A/en
Application granted granted Critical
Publication of CN101847063B publication Critical patent/CN101847063B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a system and method for detecting information about an object in an indication space, such as the target position the object indicates on an indication plane, in particular by capturing images of the indication space using non-coincident fields of light. The problems caused by prior art that uses coincident fields of light and expensive image sensors are thereby solved.

Description

Object-detecting system and method utilizing non-coincident fields of light
Technical Field
The present invention relates to an object-detecting system and method, and more particularly to an object-detecting system and method that utilize non-coincident fields of light and a single line image sensor.
Background Art
Because a touch screen allows an operator to intuitively input coordinates relative to a display by touching it, the touch screen has become a commonly provided input device for displays. Touch screens are widely used in all kinds of electronic products with displays, for example monitors, notebook computers, tablet computers, automatic teller machines, point-of-sale terminals, visitor guidance systems, and industrial control systems.
Besides traditional resistive and capacitive touch screens, which the operator must physically contact, coordinate-input schemes that use an image-capturing device, so that the operator does not need to actually touch the display, are also in use. For the related art of contactless touch screens (also called optical touch screens) that use image-capturing devices, please refer to U.S. Patent No. 4,507,557; the details are not repeated here.
To resolve the position of an input point more accurately, and even to support multi-point input, the prior art on optical touch screens has proposed various designs of light-source configurations, light-reflecting devices, and light-guiding components, so as to provide more angle information about the input-point position and thereby resolve that position precisely. For example, U.S. Patent No. 7,460,110 discloses using a waveguide, mirrors installed on two edges of the waveguide, and a light source to create two coincident fields of light, upper and lower, formed at the same time, so that an image-capturing unit can simultaneously capture different images of the upper and lower layers.
However, to capture different upper and lower images simultaneously, a higher-cost area image sensor, a multiple-line image sensor, or two line image sensors must be adopted. Moreover, with an area image sensor, a multiple-line image sensor, or two line image sensors, the optical touch screen must expend more computing resources to resolve the captured images, especially with an area image sensor. In addition, with these sensors, errors in system assembly can cause them to sense the wrong field of light, or to fail to sense a field of light at all, especially when two line image sensors are adopted.
Summary of the invention
Therefore, one object of the present invention is to provide an object-detecting system and method that likewise detect, by optical means, the target position of an object on an indication plane. In particular, the object-detecting system and method according to the present invention utilize non-coincident fields of light and a single line image sensor, so as to solve the above-mentioned problems caused by prior art that uses coincident fields of light and expensive image sensors.
In addition, another object of the present invention is to provide an object-detecting system and method for detecting object information, such as the object's shape, area, three-dimensional shape, and volume, of an object located in an indication space that includes the indication plane.
The object-detecting system according to one preferred embodiment of the present invention comprises a peripheral member, a light-reflecting device, a controlling/processing unit, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit, a fourth light-emitting unit, and a first image-capturing unit. The peripheral member defines an indication space, and the indication plane within the indication space is for the object to indicate a target position on the indication plane. The peripheral member and the object are movable relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-reflecting device is disposed on the peripheral member and located at the first edge. The first light-emitting unit is electrically connected to the controlling/processing unit, is disposed on the peripheral member, and is located at the first edge. The first light-emitting unit is controlled by the controlling/processing unit to emit first light. The first light passes through the indication space and thereby forms a first field of light. The second light-emitting unit is electrically connected to the controlling/processing unit, is disposed on the peripheral member, and is located at the second edge. The third light-emitting unit is electrically connected to the controlling/processing unit, is disposed on the peripheral member, and is located at the third edge. The fourth light-emitting unit is electrically connected to the controlling/processing unit, is disposed on the peripheral member, and is located at the fourth edge. The second, third, and fourth light-emitting units are controlled by the controlling/processing unit to emit second light. The second light passes through the indication space and thereby forms a second field of light. The first image-capturing unit is electrically connected to the controlling/processing unit and is disposed near the first corner. The first image-capturing unit defines a first image-capturing point. Controlled by the controlling/processing unit, the first image-capturing unit captures, when the first field of light is formed, a first image of the indication space showing the portion of the peripheral member at the first edge; and, when the second field of light is formed, a second image of the indication space showing the portion of the peripheral member at the second edge, together with a first reflected image of the indication space, via the light-reflecting device, showing the portions of the peripheral member at the second edge and the third edge. The controlling/processing unit processes the first image, the second image, and the first reflected image to determine the object information of the object located in the indication space.
In one specific embodiment, the light-reflecting device is a plane mirror or a prism.
In another specific embodiment, the light-reflecting device comprises a first reflecting surface and a second reflecting surface. The first reflecting surface and the second reflecting surface intersect substantially at a right angle and face the indication space. The indication plane defines a main extension plane. The first reflecting surface defines a first extension plane, and the second reflecting surface defines a second extension plane. Each of the first extension plane and the second extension plane intersects the main extension plane substantially at a 45-degree angle.
In one specific embodiment, the first image-capturing unit is a line image sensor.
The object-detecting system according to another preferred embodiment of the present invention further comprises a second image-capturing unit. The second image-capturing unit is electrically connected to the controlling/processing unit and is disposed near the second corner. The second image-capturing unit defines a second image-capturing point. Controlled by the controlling/processing unit, the second image-capturing unit captures, when the first field of light is formed, a third image of the indication space showing the portion of the peripheral member at the first edge. Also controlled by the controlling/processing unit, the second image-capturing unit captures, when the second field of light is formed, a fourth image of the indication space showing the portion of the peripheral member at the fourth edge, together with a second reflected image of the indication space, via the light-reflecting device, showing the portions of the peripheral member at the third edge and the fourth edge. The controlling/processing unit processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
In one specific embodiment, the second image-capturing unit is a line image sensor.
In one specific embodiment, the first, second, third, and fourth light-emitting units are each a line light source.
An object-detecting method according to a preferred embodiment of the present invention is carried out with a peripheral member, a light-reflecting device, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit, and a fourth light-emitting unit. The peripheral member defines an indication space, and the indication plane within the indication space is for the object to indicate a target position on the indication plane. The peripheral member and the object are movable relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-reflecting device is disposed on the peripheral member and located at the first edge. The first light-emitting unit is located at the first edge, the second light-emitting unit at the second edge, the third light-emitting unit at the third edge, and the fourth light-emitting unit at the fourth edge. The object-detecting method according to the present invention comprises the following steps. First, (a) the first light-emitting unit is controlled to emit first light, where the first light passes through the indication space and thereby forms a first field of light. Then, (b) when the first field of light is formed, a first image of the indication space showing the portion of the peripheral member at the first edge is captured at the first corner. Then, (c) the second, third, and fourth light-emitting units are controlled to emit second light, where the second light passes through the indication space and thereby forms a second field of light. Then, (d) when the second field of light is formed, a second image of the indication space showing the portion of the peripheral member at the second edge is captured at the first corner, together with a first reflected image of the indication space, via the light-reflecting device, showing the portions of the peripheral member at the second edge and the third edge. Finally, (e) the first image, the second image, and the first reflected image are processed to determine the object information of the object located in the indication space.
In one specific embodiment, the light-reflecting device is a plane mirror or a prism.
In one specific embodiment, the object information is the relative position of the target position with respect to the indication plane, the shape and/or area of the object as projected onto the indication plane, or the three-dimensional shape and/or volume of the object located in the indication space.
In one specific embodiment, step (b) also captures, at the second corner, a third image of the indication space showing the portion of the peripheral member at the first edge; step (d) also captures a fourth image of the indication space showing the portion of the peripheral member at the fourth edge, together with a second reflected image of the indication space, via the light-reflecting device, showing the portions of the peripheral member at the third edge and the fourth edge; and step (e) processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
In one specific embodiment, the first image, the second image, and the first reflected image are captured by a first line image sensor, and the third image, the fourth image, and the second reflected image are captured by a second line image sensor.
Compared with the prior art, the object-detecting system and method of the present invention utilize non-coincident fields of light and a single line image sensor, and can therefore operate with a lower-cost image sensor and fewer computing resources, solving the problems caused in the prior art by coincident fields of light and expensive image sensors.
The advantages and spirit of the present invention can be further understood through the following detailed description and the accompanying drawings.
Brief Description of the Drawings
FIG. 1A is a schematic configuration diagram of an object-detecting system according to one preferred embodiment of the present invention;
FIG. 1B is a cross-sectional view, taken along line A-A, of the first light-emitting unit, the light-reflecting device, and the peripheral member in FIG. 1A;
FIG. 2A illustrates the paths along which two input points P1 and P2 block light projected toward the first image-capturing unit and the second image-capturing unit when the first field of light and the second field of light are formed respectively;
FIG. 2B illustrates the images of the first field of light and of the second field of light captured by the first image-capturing unit at time points T0 and T1 respectively;
FIG. 2C illustrates the images of the first field of light and of the second field of light captured by the second image-capturing unit at time points T0 and T1 respectively;
FIG. 3 is a flowchart of an object-detecting method according to one preferred embodiment of the present invention.
Detailed Description of the Embodiments
The present invention provides an object-detecting system and method for detecting, by optical means, the target position of an object on an indication plane. In addition, the object-detecting system and method according to the present invention can detect object information, such as the object's shape, area, three-dimensional shape, and volume, of an object located in an indication space that includes the indication plane. In particular, the object-detecting system and method according to the present invention utilize non-coincident fields of light. Thereby, they can operate with a lower-cost image sensor and fewer computing resources. The following detailed description of preferred embodiments fully explains the features, spirit, advantages, and practicability of the present invention.
Please refer to FIG. 1A and FIG. 1B. FIG. 1A is a schematic configuration diagram of an object-detecting system 1 according to one preferred embodiment of the present invention. FIG. 1B is a cross-sectional view, taken along line A-A, of the peripheral member 14, the first light-emitting unit 122, and the light-reflecting device 13 in FIG. 1A. The object-detecting system 1 according to the present invention detects the position of at least one object (for example, a finger or a stylus) on the indication plane 10 (for example, the two positions P1 and P2 shown in FIG. 1A).
As shown in FIG. 1A, the object-detecting system 1 according to the present invention comprises a peripheral member 14 (not shown in FIG. 1A; see FIG. 1B), a light-reflecting device 13, a controlling/processing unit 11, a first light-emitting unit 122, a second light-emitting unit 124, a third light-emitting unit 126, a fourth light-emitting unit 128, and a first image-capturing unit 16. The peripheral member 14 defines an indication space, and the indication plane 10 within the indication space is for the object to indicate target positions (P1, P2) on the indication plane 10. The peripheral member 14 and the object are movable relative to each other. The indication plane 10 has a first edge 102, a second edge 104 adjacent to the first edge 102, a third edge 106 adjacent to the second edge 104, and a fourth edge 108 adjacent to the third edge 106 and the first edge 102. The third edge 106 and the fourth edge 108 form a first corner C1. The second edge 104 and the third edge 106 form a second corner C2.
Also as shown in FIG. 1A, the first light-emitting unit 122 is electrically connected to the controlling/processing unit 11, is disposed on the peripheral member 14, and is located at the first edge 102. The light-reflecting device 13 is disposed on the peripheral member 14 and is located at the first edge 102. The second light-emitting unit 124 is electrically connected to the controlling/processing unit 11, is disposed on the peripheral member 14, and is located at the second edge 104. The third light-emitting unit 126 is electrically connected to the controlling/processing unit 11, is disposed on the peripheral member 14, and is located at the third edge 106. The fourth light-emitting unit 128 is electrically connected to the controlling/processing unit 11, is disposed on the peripheral member 14, and is located at the fourth edge 108. The first image-capturing unit 16 is electrically connected to the controlling/processing unit 11 and is disposed near the first corner C1. The first image-capturing unit 16 defines a first image-capturing point.
As shown in FIG. 1B, the object-detecting system 1 according to the present invention also comprises the peripheral member 14, which protrudes above and surrounds the indication plane 10. The peripheral member 14 supports the first light-emitting unit 122, the light-reflecting device 13, the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, and the first image-capturing unit 16.
In one specific embodiment, the light-reflecting device 13 can be a plane mirror.
In another specific embodiment, as shown in FIG. 1B, the light-reflecting device 13 can comprise a first reflecting surface 132 and a second reflecting surface 134. The first reflecting surface 132 and the second reflecting surface 134 intersect substantially at a right angle and face the indication space. The indication plane 10 defines a main extension plane. The first reflecting surface 132 defines a first extension plane, and the second reflecting surface 134 defines a second extension plane. Each of the first extension plane and the second extension plane intersects the main extension plane substantially at a 45-degree angle. In practical applications, such a light-reflecting device 13 can be a prism.
In practical applications, the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 can each be a line light source. The line light source (122, 124, 126, 128) can consist of a bar-shaped light guide and a light-emitting diode (for example, an infrared light-emitting diode) installed at one end of the bar-shaped light guide. Light emitted by the light-emitting diode enters from one end of the bar-shaped light guide, whose structure guides the injected light toward the indication plane 10. The line light source (122, 124, 126, 128) can also be a row of light-emitting diodes.
The first light-emitting unit 122 is controlled by the controlling/processing unit 11 to emit first light. The first light passes through the indication space and thereby forms the first field of light. Controlled by the controlling/processing unit 11, the first image-capturing unit 16 captures, when the first field of light is formed, a first image of the indication space showing the portion of the peripheral member 14 at the first edge 102. The first image includes the blocking of the first light caused by an object in the indication space, that is, the shadow projected onto the first image.
The second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 are controlled by the controlling/processing unit 11 to emit second light. The second light passes through the indication space and thereby forms the second field of light. In particular, the controlling/processing unit 11 ensures that the first field of light and the second field of light do not occur at the same time. That is, when the controlling/processing unit 11 turns on the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128, it also turns off the first light-emitting unit 122. Also controlled by the controlling/processing unit 11, the first image-capturing unit 16 captures, when the second field of light is formed, a second image of the indication space showing the portion of the peripheral member 14 at the second edge 104, together with a first reflected image of the indication space, via the light-reflecting device 13, showing the portions of the peripheral member 14 at the second edge 104 and the third edge 106. The second image and the first reflected image include the blocking of the second light caused by an object in the indication space, that is, the shadows projected onto the second image and the first reflected image.
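The alternating lighting just described can be sketched as a simple capture cycle. The `LightUnit` and `LineCamera` classes below are hypothetical stand-ins (the patent specifies no software interface); they only make the timing constraint explicit: the two fields of light never coexist.

```python
class LightUnit:
    """Hypothetical stand-in for one light-emitting unit."""
    def __init__(self):
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

class LineCamera:
    """Records which units were lit at each capture (stands in for image data)."""
    def __init__(self, units):
        self.units = units
    def capture(self):
        return tuple(u.lit for u in self.units)

def capture_cycle(first_unit, other_units, camera):
    """One T0/T1 cycle of the controlling/processing unit's sequencing."""
    # T0: only the first light-emitting unit is on -> first field of light.
    first_unit.on()
    for u in other_units:
        u.off()
    first_image = camera.capture()           # shadow on the first edge

    # T1: first unit off, units 2-4 on -> second field of light.
    first_unit.off()
    for u in other_units:
        u.on()
    second_image = camera.capture()          # second edge + mirror-view shadows
    return first_image, second_image
```

Reversing the two halves of the cycle gives the alternative ordering mentioned below, where the second field of light is formed first.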
In practical applications, the first image-capturing unit 16 can be a line image sensor.
Finally, the controlling/processing unit 11 processes the first image, the second image, and the first reflected image to determine the object information of the object located in the indication space.
In one specific embodiment, the object information includes the relative position of the target position with respect to the indication plane 10. The controlling/processing unit 11 determines a first object point on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines a first reflected object point on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first direct path according to the line connecting the first image-capturing point and the first object point, determines a first reflected path according to the line connecting the first image-capturing point and the first reflected object point together with the light-reflecting device 13, and determines the relative position according to the intersection of the first direct path and the first reflected path.
In one specific embodiment, the object information includes the object's shape and/or area as projected onto the indication plane 10. The controlling/processing unit 11 determines a first object point and a second object point on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines a first reflected object point and a second reflected object point on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first direct planar path according to the lines connecting the first image-capturing point with the first object point and the second object point respectively, determines a first reflected planar path according to the lines connecting the first image-capturing point with the first reflected object point and the second reflected object point respectively together with the light-reflecting device 13, and determines the object's shape and/or area according to the shape and/or area of the region where the first direct planar path and the first reflected planar path intersect. Further, the object information can include the object's three-dimensional shape and/or volume in the indication space. The controlling/processing unit 11 divides the first image, the second image, and the first reflected image into a plurality of first sub-images, a plurality of second sub-images, and a plurality of first reflected sub-images respectively, determines a plurality of object shapes and/or object areas from these sub-images, and stacks the object shapes and/or object areas in sequence along the normal direction of the indication plane 10 to determine the object's three-dimensional shape and/or volume.
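The intersection-region step above amounts to intersecting two angular sectors, which are convex polygons, and measuring the result. The sketch below uses Sutherland-Hodgman clipping and the shoelace formula; the sector polygons are assumed to have been built elsewhere from the image-capturing point and the object points, and none of the function names come from the patent.

```python
def _cross(o, a, b):
    """z-component of (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def _isect(a, b, p, q):
    """Intersection of the infinite lines a-b and p-q."""
    d1 = (b[0] - a[0], b[1] - a[1])
    d2 = (q[0] - p[0], q[1] - p[1])
    det = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p[0] - a[0]) * d2[1] - (p[1] - a[1]) * d2[0]) / det
    return (a[0] + t * d1[0], a[1] + t * d1[1])

def clip(subject, clipper):
    """Sutherland-Hodgman: clip convex polygon `subject` by CCW convex `clipper`."""
    out = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        inp, out = out, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            p_in, q_in = _cross(a, b, p) >= 0, _cross(a, b, q) >= 0
            if p_in:
                out.append(p)
            if p_in != q_in:               # edge crosses the clip line
                out.append(_isect(a, b, p, q))
    return out

def area(poly):
    """Shoelace formula for a simple polygon."""
    s = sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
            - poly[(i + 1) % len(poly)][0] * poly[i][1]
            for i in range(len(poly)))
    return abs(s) / 2.0
```

Applied per sub-image, the resulting cross-section areas can then be stacked along the normal of the indication plane to estimate a volume, as the paragraph above describes.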
In one specific embodiment, the object information includes the object's three-dimensional shape and/or volume in the indication space. The controlling/processing unit 11 determines at least three object points on the first edge 102 according to the object in the first image, or on the second edge 104 according to the object in the second image. The controlling/processing unit 11 also determines at least three reflected object points on the second edge 104 and the third edge 106 according to the object in the first reflected image. The controlling/processing unit 11 then determines a first direct solid path according to the lines connecting the first image-capturing point with the at least three object points respectively, determines a first reflected solid path according to the lines connecting the first image-capturing point with the at least three reflected object points respectively together with the light-reflecting device 13, and determines the object's three-dimensional shape and/or volume according to the shape and/or volume of the space where the first direct solid path and the first reflected solid path intersect.
Also as shown in FIG. 1A, the object-detecting system 1 according to another preferred embodiment of the present invention further comprises a second image-capturing unit 18. The second image-capturing unit 18 is electrically connected to the controlling/processing unit 11 and is disposed near the second corner C2. The second image-capturing unit 18 defines a second image-capturing point.
The second image-capturing unit 18 is controlled by the controlling/processing unit 11 to capture, when the first field of light is formed, a third image of the indication space showing the portion of the peripheral member 14 at the first edge 102. The third image includes the blocking of the first light caused by an object in the indication space, that is, the shadow projected onto the third image. Also controlled by the controlling/processing unit 11, the second image-capturing unit 18 captures, when the second field of light is formed, a fourth image of the indication space showing the portion of the peripheral member 14 at the fourth edge 108, together with a second reflected image of the indication space, via the light-reflecting device 13, showing the portions of the peripheral member 14 at the third edge 106 and the fourth edge 108. The fourth image and the second reflected image include the blocking of the second light caused by an object in the indication space, that is, the shadows projected onto the fourth image and the second reflected image. In this preferred embodiment, the controlling/processing unit 11 processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image, and the second reflected image to determine the object information.
It should be emphasized that the controlling/processing unit 11 can also first turn on the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128 to form the second field of light first, and then turn on the first light-emitting unit 122 to form the first field of light.
In practical applications, the second image-capturing unit 18 can be a line image sensor.
In practical applications, the background level of the first reflected image and the second reflected image may be weak, which impairs the interpretation of the mirror-image shadows projected onto them. To solve this problem, the controlling/processing unit 11 can make the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, the first image-capturing unit 16, and the second image-capturing unit 18 stay on longer, or activate twice, so that the exposure time of the first reflected image and the second reflected image is longer than that of the first image and the third image. Alternatively, the total illumination of the second field of light can be made higher than that of the first field of light by controlling the gain values of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126, and the fourth light-emitting unit 128, the drive currents of their light-emitting diodes, or the number of light-emitting diodes that are lit.
Below will fall among Figure 1A in the indication plane 10 and be example by first image unit 16 and second image unit 18 with two input points (P1, P2), it forms light territory and picked image situation at different time according to object detecting system of the present invention 1 to use explanation.
As shown in Fig. 2A, the solid lines represent the paths (expressed as angles, marked X) along which the two input points P1 and P2 block the light projected toward first image unit 16 and second image unit 18 at time point T0, when control/processing unit 11 switches on first luminescence unit 122 to form the first light field. The dotted lines in Fig. 2A represent the paths along which P1 and P2 block the light projected toward first image unit 16 and second image unit 18 at time point T1, when control/processing unit 11 switches on second luminescence unit 124, third luminescence unit 126 and fourth luminescence unit 128 to form the second light field.
Also as shown in Fig. 2A, the paths along which the two input points P1 and P2 block the light projected toward first image unit 16 at the two time points T0 and T1 form the four angles φ1, φ2, φ3 and φ4. As shown in Fig. 2B, at time point T0, first image unit 16 captures image I1 of the first light field, carrying real-image shadows corresponding to angle φ3. At time point T1, first image unit 16 captures image I2 of the second light field, carrying real-image shadows corresponding to angle φ4 and mirror-image shadows corresponding to angles φ1 and φ2.
Also as shown in Fig. 2A, the paths along which the two input points P1 and P2 block the light projected toward second image unit 18 at the two time points T0 and T1 form the four angles θ1, θ2, θ3 and θ4. As shown in Fig. 2C, at time point T0, second image unit 18 captures image I3 of the first light field, carrying real-image shadows corresponding to angle θ3. At time point T1, second image unit 18 captures image I4 of the second light field, carrying real-image shadows corresponding to angle θ4 and mirror-image shadows corresponding to angles θ1 and θ2.
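The mirror-image shadows correspond to virtual images of the input points reflected across the mirror on the first edge. The following sketch assumes a coordinate frame with the mirror on the line y = edge_y; the frame and function name are illustrative conventions, not taken from the patent:

```python
def mirror_across_edge(point, edge_y=0.0):
    """Reflect a touch point across the mirror line y = edge_y.

    The light reflection subassembly on the first edge makes each input
    point appear a second time as this virtual point, which is why images
    I2 and I4 carry mirror-image shadows (angles such as phi1/phi2 and
    theta1/theta2) in addition to the real-image shadows.
    """
    x, y = point
    return (x, 2.0 * edge_y - y)
```

A ray drawn from an image unit toward a mirror-image shadow passes through this virtual point, so the mirror effectively gives each sensor a second viewing direction onto the same input point.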
Clearly, by resolving the angles indicated by the shadows on images I1, I2, I3 and I4, object detecting system 1 according to the invention can calculate the positions of the two input points P1 and P2 shown in Fig. 2A exactly. It should further be emphasized that first image unit 16 and second image unit 18 according to the invention can each be a single linear image sensor. Thereby, the object detecting system according to the invention can avoid adopting expensive image sensors, can avoid the situations in which an image sensor senses the wrong light field or fails to sense a light field during assembly, and can dispense with waveguide assemblies that would reduce the display resolution of a display.
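The position calculation from the shadow angles is standard two-ray triangulation. The sketch below assumes the two image units sit at corners (0, 0) and (width, 0) of the indication plane and that each angle is measured against the edge joining them; these coordinate conventions are assumptions for illustration, not definitions from the patent:

```python
import math

def triangulate(width, alpha, beta):
    """Locate a touch point from two shadow angles.

    Image unit A sits at (0, 0) and image unit B at (width, 0); alpha and
    beta are the angles (radians) each unit measures between the baseline
    A-B and the ray toward the point. Intersecting the two rays:
        y = x * tan(alpha)   and   y = (width - x) * tan(beta)
    """
    ta, tb = math.tan(alpha), math.tan(beta)
    x = width * tb / (ta + tb)
    return (x, x * ta)
```

With the mirror-image shadows, the same intersection can also be formed between a real ray and a ray through the reflected virtual point, which is what lets a single linear sensor contribute more than one constraint per input point.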
Please refer to Fig. 3, which illustrates the flow chart of object detecting method 2 according to one preferred embodiment of the invention. The basis for implementing object detecting method 2 according to the invention comprises a surrounding member, a light reflection subassembly, a first luminescence unit, a second luminescence unit, a third luminescence unit and a fourth luminescence unit. The surrounding member defines an indication space, and the indication plane in the indication space serves for an object to indicate a target location on the indication plane. The surrounding member and the object are movable relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light reflection subassembly is arranged on the surrounding member and located at the first edge. The first luminescence unit is located at the first edge, the second luminescence unit at the second edge, the third luminescence unit at the third edge, and the fourth luminescence unit at the fourth edge. For specific embodiments of the first, second, third and fourth luminescence units and the light reflection subassembly, please see Figure 1A and Figure 1B; they are not repeated here.
As shown in Fig. 3, object detecting method 2 according to the invention first executes step S20: control the first luminescence unit to emit first light, wherein the first light passes through the indication space and thereby forms a first light field.
Then, object detecting method 2 according to the invention executes step S22: when the first light field is formed, capture at the first corner a first image of the part of the surrounding member on the first edge presented in the indication space.
Then, object detecting method 2 according to the invention executes step S24: control the second luminescence unit, the third luminescence unit and the fourth luminescence unit to emit second light, wherein the second light passes through the indication space and thereby forms a second light field.
Then, object detecting method 2 according to the invention executes step S26: when the second light field is formed, capture at the first corner a second image of the part of the surrounding member on the second edge presented in the indication space, together with a first reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the second edge and the third edge.
Finally, object detecting method 2 according to the invention executes step S28: process the first image, the second image and the first reflected image to determine the object information of the object located in the indication space. The content of the object information and the manner of its determination are given in the detailed description above and are not repeated here.
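Steps S20 through S28 can be sketched as a single control sequence. The callback names below are hypothetical; the patent defines only the order of operations, not any programming interface:

```python
def run_detection(set_field, capture, process):
    """Drive steps S20-S28 with caller-supplied callbacks.

    set_field(name) switches the luminescence units for a light field,
    capture(what) grabs one image, and process(*images) decides the
    object information. All three are assumed interfaces for this sketch.
    """
    set_field("first")                      # S20: first unit lit -> first light field
    img1 = capture("first_edge")            # S22: first image at the first corner
    set_field("second")                     # S24: units 2-4 lit -> second light field
    img2 = capture("second_edge")           # S26: second image
    refl1 = capture("mirror_reflection")    # S26: first reflected image via the mirror
    return process(img1, img2, refl1)       # S28: decide the object information
```

The embodiment with a second image unit simply runs the two captures of S22 and S26 at both corners within the same two light-field windows.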
In object detecting method 2 according to another preferred embodiment of the invention, synchronously with step S22, a third image of the part of the surrounding member on the first edge presented in the indication space is captured at the second corner. Synchronously with step S26, a fourth image of the part of the surrounding member on the fourth edge presented in the indication space is captured, together with a second reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the third edge and the fourth edge. In step S28, the object information is then determined by processing at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image.
In a specific embodiment, the first image, the second image and the first reflected image can be captured by a single linear image sensor. The third image, the fourth image and the second reflected image can be captured by another linear image sensor.
In practical applications, the total illumination of the second light field is higher than the total illumination of the first light field.
The above detailed description of preferred embodiments is intended to describe the features and the spirit of the invention more clearly, not to limit the protection scope of the invention to the preferred embodiments disclosed above. On the contrary, the objective is that various changes and arrangements of equal effect be covered within the protection scope of the claims of the invention. Therefore, the protection scope of the claims applied for by the invention should be given the broadest interpretation according to the above explanation, so as to cover all possible changes and arrangements of equal effect.

Claims (10)

1. An object detecting system, characterized in that the object detecting system comprises:
a surrounding member, the surrounding member defining an indication space, the indication plane in the indication space serving for an object to indicate a target location on the indication plane, the surrounding member and the object being in a relative relationship, the indication plane having a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge, the third edge and the fourth edge forming a first corner, and the second edge and the third edge forming a second corner;
a light reflection subassembly, the light reflection subassembly being arranged on the surrounding member and located at the first edge;
a control/processing unit;
a first luminescence unit, the first luminescence unit being electrically connected to the control/processing unit, arranged on the surrounding member and located at the first edge, the first luminescence unit being controlled by the control/processing unit to emit first light, the first light passing through the indication space and thereby forming a first light field;
a second luminescence unit, the second luminescence unit being electrically connected to the control/processing unit and arranged on the surrounding member;
a third luminescence unit, the third luminescence unit being electrically connected to the control/processing unit and arranged on the surrounding member;
a fourth luminescence unit, the fourth luminescence unit being electrically connected to the control/processing unit and arranged on the surrounding member, the second luminescence unit, the third luminescence unit and the fourth luminescence unit being controlled by the control/processing unit to emit second light, the second light passing through the indication space and thereby forming a second light field; and
a first image unit, the first image unit being electrically connected to the control/processing unit and arranged at the periphery of the first corner, the first image unit defining a first camera point, the first image unit being controlled by the control/processing unit to capture, when the first light field is formed, a first image of the part of the surrounding member on the first edge presented in the indication space, and being controlled by the control/processing unit to capture, when the second light field is formed, a second image of the part of the surrounding member on the second edge presented in the indication space together with a first reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the second edge and the third edge;
wherein the control/processing unit processes the first image, the second image and the first reflected image to determine object information of the object located in the indication space.
2. The object detecting system as claimed in claim 1, characterized in that the light reflection subassembly is a plane mirror or a prism.
3. The object detecting system as claimed in claim 1, characterized in that the object information is the target location, the relative position of the object with respect to the indication plane within the indication space, the shape and/or area of the object projected onto the indication plane, or the three-dimensional shape and/or volume of the object located in the indication space.
4. The object detecting system as claimed in claim 1, characterized in that the object detecting system further comprises:
a second image unit, the second image unit being electrically connected to the control/processing unit and arranged at the periphery of the second corner, the second image unit defining a second camera point, the second image unit being controlled by the control/processing unit to capture, when the first light field is formed, a third image of the part of the surrounding member on the first edge presented in the indication space, and being controlled by the control/processing unit to capture, when the second light field is formed, a fourth image of the part of the surrounding member on the fourth edge presented in the indication space together with a second reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the third edge and the fourth edge;
wherein the control/processing unit processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image to determine the object information.
5. The object detecting system as claimed in claim 4, characterized in that the first image unit and the second image unit are each a linear image sensor.
6. An object detecting method, characterized in that a surrounding member defines an indication space, the indication plane in the indication space serves for an object to indicate a target location on the indication plane, the surrounding member and the object are in a relative relationship, the indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge, the third edge and the fourth edge form a first corner, the second edge and the third edge form a second corner, a light reflection subassembly is arranged on the surrounding member and located at the first edge, a first luminescence unit is located at the first edge, a second luminescence unit is located at the second edge, a third luminescence unit is located at the third edge, and a fourth luminescence unit is located at the fourth edge, the object detecting method comprising the following steps:
(a) controlling the first luminescence unit to emit first light, wherein the first light passes through the indication space and thereby forms a first light field;
(b) when the first light field is formed, capturing at the first corner a first image of the part of the surrounding member on the first edge presented in the indication space;
(c) controlling the second luminescence unit, the third luminescence unit and the fourth luminescence unit to emit second light, wherein the second light passes through the indication space and thereby forms a second light field;
(d) when the second light field is formed, capturing at the first corner a second image of the part of the surrounding member on the second edge presented in the indication space together with a first reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the second edge and the third edge; and
(e) processing the first image, the second image and the first reflected image to determine object information of the object located in the indication space.
7. The object detecting method as claimed in claim 6, characterized in that the light reflection subassembly is a plane mirror or a prism.
8. The object detecting method as claimed in claim 6, characterized in that the object information is the target location, the relative position of the object with respect to the indication plane within the indication space, the shape and/or area of the object projected onto the indication plane, or the three-dimensional shape and/or volume of the object located in the indication space.
9. The object detecting method as claimed in claim 6, characterized in that step (b) further captures at the second corner a third image of the part of the surrounding member on the first edge presented in the indication space, step (d) further captures a fourth image of the part of the surrounding member on the fourth edge presented in the indication space and a second reflected image, presented in the indication space by the light reflection subassembly, of the parts of the surrounding member on the third edge and the fourth edge, and step (e) processes at least two of the first image, the second image, the first reflected image, the third image, the fourth image and the second reflected image to determine the object information.
10. The object detecting method as claimed in claim 9, characterized in that the first image, the second image and the first reflected image are captured by a first linear image sensor, and the third image, the fourth image and the second reflected image are captured by a second linear image sensor.
CN 201010130510 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light Expired - Fee Related CN101847063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010130510 CN101847063B (en) 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010130510 CN101847063B (en) 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light

Publications (2)

Publication Number Publication Date
CN101847063A true CN101847063A (en) 2010-09-29
CN101847063B CN101847063B (en) 2013-04-17

Family

ID=42771696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010130510 Expired - Fee Related CN101847063B (en) 2010-03-03 2010-03-03 System and method for detecting object by using non-coincident fields of light

Country Status (1)

Country Link
CN (1) CN101847063B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010055006A1 (en) * 1999-02-24 2001-12-27 Fujitsu Limited Optical scanning-type touch panel
JP2006318512A (en) * 1998-10-02 2006-11-24 Semiconductor Energy Lab Co Ltd Information terminal equipment
CN201035553Y (en) * 2007-04-10 2008-03-12 北京汇冠新技术有限公司 Light path structure of touch panel using camera and reflector
US7460110B2 (en) * 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
CN100468303C (en) * 2003-02-14 2009-03-11 奈克斯特控股公司 Touch screen signal processing
CN101609381A (en) * 2008-06-18 2009-12-23 北京汇冠新技术股份有限公司 Use the touch-detection sensing device of camera and reflective mirror

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019458A (en) * 2011-09-20 2013-04-03 原相科技股份有限公司 Optical touch control system and positioning method thereof
CN103019458B (en) * 2011-09-20 2016-05-18 原相科技股份有限公司 Optical touch control system and localization method thereof
CN103092430A (en) * 2011-11-03 2013-05-08 原相科技股份有限公司 Optical touch system and positioning method thereof
CN103092430B (en) * 2011-11-03 2016-01-13 原相科技股份有限公司 Optical touch control system and localization method thereof

Also Published As

Publication number Publication date
CN101847063B (en) 2013-04-17

Similar Documents

Publication Publication Date Title
CN101663637B (en) Touch screen system with hover and click input methods
US8576200B2 (en) Multiple-input touch panel and method for gesture recognition
CN100576156C (en) Utilize the optical navigation system and the method for estimating motion of optics lift detection
JP2001142642A (en) Device for inputting coordinates
US9442607B2 (en) Interactive input system and method
CN104035555A (en) System, Information Processing Apparatus, And Information Processing Method
CN103324358A (en) Optical Touch System
US8982101B2 (en) Optical touch system and optical touch-position detection method
US20110199337A1 (en) Object-detecting system and method by use of non-coincident fields of light
CN104094204B (en) Optical element with alternate mirror lens crystal face
CN103677445A (en) Position detection apparatus and image display apparatus
CN101464745B (en) Back projection light source type touch recognition device and method thereof
WO2010137843A2 (en) Touch screen apparatus adopting an infrared scan system
US20110193969A1 (en) Object-detecting system and method by use of non-coincident fields of light
CN101847063B (en) System and method for detecting object by using non-coincident fields of light
CN101667083A (en) Position detection system and arrangement method thereof
CN101923418B (en) Object sensing system and method
CN101819489B (en) Object detecting system
US20160139735A1 (en) Optical touch screen
US8912482B2 (en) Position determining device and method for objects on a touch device having a stripped L-shaped reflecting mirror and a stripped retroreflector
US20130241882A1 (en) Optical touch system and optical touch position detecting method
US20170185157A1 (en) Object recognition device
CN102043543A (en) Optical touch control system and method
US20140043297A1 (en) Optical Touch System and Optical Touch Control Method
KR101258815B1 (en) reflection type touch screen

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130417

Termination date: 20160303

CF01 Termination of patent right due to non-payment of annual fee