CN101923418A - Object sensing system and method - Google Patents

Object sensing system and method

Info

Publication number
CN101923418A
CN101923418A CN201010143684A CN 201010143684
Authority
CN
China
Prior art keywords
edge
image
light
reflector
peripheral member
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010143684
Other languages
Chinese (zh)
Other versions
CN101923418B (en)
Inventor
唐建兴
蔡华骏
廖昱维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qisda Suzhou Co Ltd
Qisda Corp
Original Assignee
Qisda Suzhou Co Ltd
Qisda Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qisda Suzhou Co Ltd, Qisda Corp filed Critical Qisda Suzhou Co Ltd
Priority to CN 201010143684 priority Critical patent/CN101923418B/en
Publication of CN101923418A publication Critical patent/CN101923418A/en
Application granted granted Critical
Publication of CN101923418B publication Critical patent/CN101923418B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides an object sensing system and an object sensing method for sensing information about an object in an indication space, such as the target position the object indicates on an indication plane. In particular, images of the indication space are captured under fields of light that are formed at different times (non-coincident fields of light), and the object information is determined from the captured images.

Description

Object sensing system and method
Technical field
The invention relates to an object sensing system and method, and more particularly to an object sensing system and method that use non-coincident fields of light (fields of light that are not formed at the same time) and a single line image sensor.
Background art
Because a touch screen allows an operator to input coordinates relative to a display intuitively by touch, the touch screen has become a common input device for displays. Touch screens are widely used in all kinds of electronic products with displays, for example, monitors, notebook computers, tablet computers, automatic teller machines (ATMs), point-of-sale terminals, visitor guidance systems, industrial control systems, and so on.
Besides conventional resistive and capacitive touch screens that the operator must physically contact, coordinate-input schemes that use an image-capturing device, allowing the operator to operate without actually touching the display, are also in use. For the prior art of such contactless touch screens (also called optical touch screens) using image-capturing devices, please refer to U.S. Patent No. 4,507,557; the details are not repeated here. In the above optical-imaging manner, an object sensing system that determines an object position can be applied not only to touch screens but also to touch drawing tablets, touch controllers, and the like.
To resolve the position of an input point more accurately, and even to support multi-point input, various designs of light sources, reflectors and light guides have been proposed in the prior art of optical touch screens, so as to provide more angle information about the input point and thereby resolve its position precisely. For example, U.S. Patent No. 7,460,110 discloses an object with a radiating light source placed in the indication area, together with a waveguide and mirrors installed at two edges of the waveguide, so as to create two coincident fields of light, an upper one and a lower one, formed at the same time; an image-capturing unit can thereby capture the two different images, upper and lower, simultaneously.
However, to capture two different images simultaneously, the image-capturing unit must adopt a more costly area image sensor, a multiple-line image sensor, or two line image sensors. Moreover, with an area image sensor, a multiple-line image sensor or two line image sensors, the optical touch screen must spend more computing resources to parse the captured images, especially when an area image sensor is adopted. In addition, assembly errors of the system may cause these image sensors to sense the wrong field of light, or to fail to sense a field of light at all, especially when two line image sensors are adopted.
Furthermore, the optical touch screen of U.S. Patent No. 7,460,110 requires an object with a radiating light source, a waveguide and mirrors to cooperate simultaneously in order to create the two coincident upper and lower fields of light. Obviously, the architecture of U.S. Patent No. 7,460,110 is relatively complicated. Also, in the prior art of optical touch screens, the range over which the image-capturing unit can recognize the indication area, and the resolution of objects falling in the indication area, still need to be improved.
Summary of the invention
Therefore, one object of the present invention is to provide an object sensing system and method that likewise use an optical approach to sense the target position of an object on an indication plane. In particular, the object sensing system and method according to the invention use non-coincident fields of light (fields of light that are not formed at the same time) and a single line image sensor, so as to solve the above problems of the prior art that relies on coincident fields of light and expensive image sensors.
In addition, another object of the present invention is to provide an object sensing system and method for sensing object information, such as the object's shape, area, three-dimensional contour and volume, within an indication space that includes the indication plane.
According to one embodiment of the present invention, an object sensing system comprises a peripheral member, a light-filtering device, a reflector, a first retro-reflector, a second retro-reflector, a third retro-reflector, a controlling unit, a first light-emitting unit and a first image-capturing unit. The peripheral member defines an indication space, and an indication plane in the indication space, for an object to indicate a target position on the indication plane. The peripheral member and the object are relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-filtering device is disposed on the peripheral member at the first edge. The reflector is disposed on the peripheral member at the first edge, behind the light-filtering device. The first retro-reflector is disposed on the peripheral member at the first edge, above or below the reflector. The second retro-reflector is disposed on the peripheral member at the second edge. The third retro-reflector is disposed on the peripheral member at the third edge. The first light-emitting unit is electrically connected to the controlling unit and is disposed near the first corner. The first light-emitting unit comprises a first light source and a second light source. Controlled by the controlling unit, the first light-emitting unit drives the first light source to emit a first light; the first light passes through the indication space and forms a first field of light. Also controlled by the controlling unit, the first light-emitting unit drives the second light source to emit a second light; the second light passes through the indication space and forms a second field of light. The light-filtering device does not allow the first light to pass, but allows the second light to pass. The first image-capturing unit is electrically connected to the controlling unit and is disposed near the first corner. The first image-capturing unit defines a first image-capturing point. Controlled by the controlling unit, when the first field of light is formed, the first image-capturing unit captures a first image of the indication space relating to part of the peripheral member on the first edge and the second edge, as presented by the first retro-reflector and the second retro-reflector. Also controlled by the controlling unit, when the second field of light is formed, the first image-capturing unit captures a first reflected image of the indication space relating to part of the peripheral member on the third edge and the second edge, as presented by the third retro-reflector and the reflector. The controlling unit processes the first image and the first reflected image to determine object information about the object located in the indication space.
According to one embodiment of the present invention, the reflector may be a plane mirror or a prism.
According to one embodiment of the present invention, the reflector comprises a first reflective surface and a second reflective surface. The first reflective surface and the second reflective surface intersect substantially at a right angle and face the indication space. The indication plane defines a main extension plane; the first reflective surface defines a first extension plane; the second reflective surface defines a second extension plane. The first extension plane and the second extension plane each intersect the main extension plane substantially at 45 degrees.
According to one embodiment of the present invention, the reflector may be a prism.
According to one embodiment of the present invention, the first image-capturing unit is a line image sensor.
According to another embodiment of the present invention, the object sensing system further comprises a fourth retro-reflector, a second light-emitting unit and a second image-capturing unit. The fourth retro-reflector is disposed on the peripheral member at the fourth edge. The second light-emitting unit is electrically connected to the controlling unit and is disposed near the second corner. The second light-emitting unit comprises a third light source and a fourth light source. Controlled by the controlling unit, the second light-emitting unit drives the third light source to emit the first light. Also controlled by the controlling unit, the second light-emitting unit drives the fourth light source to emit the second light. The second image-capturing unit is electrically connected to the controlling unit and is disposed near the second corner. The second image-capturing unit defines a second image-capturing point. Controlled by the controlling unit, when the first field of light is formed, the second image-capturing unit captures a second image of the indication space relating to part of the peripheral member on the first edge and the fourth edge, as presented by the first retro-reflector and the fourth retro-reflector. Also controlled by the controlling unit, when the second field of light is formed, the second image-capturing unit captures a second reflected image of the indication space relating to part of the peripheral member on the third edge and the fourth edge, as presented by the third retro-reflector and the reflector. The controlling unit processes at least two of the first image, the second image, the first reflected image and the second reflected image to determine the object information.
According to another embodiment of the present invention, the second image-capturing unit is a line image sensor.
According to one embodiment of the present invention, an object sensing method is implemented on a basis that comprises a peripheral member, a light-filtering device, a reflector, a first retro-reflector, a second retro-reflector and a third retro-reflector. The peripheral member defines an indication space, and an indication plane in the indication space, for an object to indicate a target position on the indication plane. The peripheral member and the object are relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-filtering device is disposed on the peripheral member at the first edge. The reflector is disposed on the peripheral member at the first edge, behind the light-filtering device. The first retro-reflector is disposed on the peripheral member at the first edge, above or below the reflector. The second retro-reflector is disposed on the peripheral member at the second edge. The third retro-reflector is disposed on the peripheral member at the third edge. The object sensing method according to the invention first emits, at the first corner, a first light toward the indication space, the first light passing through the indication space to form a first field of light. Then, when the first field of light is formed, the method captures at the first corner a first image of the indication space relating to part of the peripheral member on the first edge and the second edge, as presented by the first retro-reflector and the second retro-reflector. Then, the method emits, at the first corner, a second light toward the indication space, wherein the light-filtering device does not allow the first light to pass but allows the second light to pass, and the second light passes through the indication space to form a second field of light. Then, when the second field of light is formed, the method captures at the first corner a first reflected image of the indication space relating to part of the peripheral member on the third edge and the second edge, as presented by the third retro-reflector and the reflector. Finally, the method processes the first image and the first reflected image to determine object information about the object located in the indication space.
Compared with the prior art, the present invention: 1. uses mirror images to increase the range over which the image-capturing units can fully recognize the indication area; 2. lengthens the optical path between the image-capturing units and the edges and corners of the indication area, thereby avoiding the low resolution, or even impossibility of recognition, that occurs when an object is near a corner; 3. does not require the real image and the mirror image of the object to be layered in the imaging of the image-capturing units; 4. uses two groups of light sources with different wavelengths; 5. does not require the object itself to be luminous; and 6. has a relatively simple architecture compared with the prior art that requires an object with a radiating light source, a waveguide and mirrors to cooperate simultaneously.
The present invention is described in detail below with reference to the accompanying drawings.
Description of drawings
FIG. 1A is a schematic diagram of the architecture of an object sensing system according to a preferred embodiment of the present invention.
FIG. 1B is a cross-sectional view, taken along line A-A, of the peripheral member, the light-filtering device, the reflector and the first retro-reflector in FIG. 1A.
FIG. 2A schematically illustrates, when the first field of light and the second field of light are formed respectively, the paths along which the two input points P1 and P2 block light projected toward the first image-capturing unit and the second image-capturing unit.
FIG. 2B schematically illustrates the images captured by the first image-capturing unit at the two time points T0 and T1, relating to the first field of light and the second field of light respectively.
FIG. 2C schematically illustrates the images captured by the second image-capturing unit at the two time points T0 and T1, relating to the first field of light and the second field of light respectively.
FIG. 3 is a flow chart of an object sensing method according to a preferred embodiment of the present invention.
Embodiment
The invention provides an object sensing system and method that likewise use an optical approach to sense the target position of an object on an indication plane. In addition, the object sensing system and method according to the invention can sense object information such as the object's shape, area, three-dimensional contour and volume within an indication space that includes the indication plane. In particular, the object sensing system and method according to the invention use fields of light that are not formed at the same time. Thereby, the invention can be carried out with a lower-cost image sensor and fewer computing resources. The features, spirit, advantages and feasibility of the invention are fully explained below through the detailed description of preferred embodiments.
Please refer to FIG. 1A and FIG. 1B. FIG. 1A is a schematic diagram of the architecture of an object sensing system 1 according to a preferred embodiment of the present invention. FIG. 1B is a cross-sectional view, taken along line A-A, of part of the peripheral member 19 (not shown in FIG. 1A), the light-filtering device 132, the reflector 134 and the first retro-reflector 122 in FIG. 1A. The object sensing system 1 according to the invention serves to sense the position of at least one object (for example, a finger, a stylus, etc.) on the indication plane 10 (for example, the two positions P1 and P2 shown in FIG. 1A).
As shown in FIG. 1A, the object sensing system 1 according to the invention comprises a polygonal peripheral member 19 (not shown in FIG. 1A; please refer to FIG. 1B), a light-filtering device 132, a reflector 134, a first retro-reflector 122, a second retro-reflector 124, a third retro-reflector 126, a controlling unit 11, a first light-emitting unit 14 and a first image-capturing unit 16. The peripheral member 19 defines the indication space S and the indication plane 10 in the indication space S; that is, the peripheral member 19 surrounds the indication space S and the indication plane 10. The peripheral member 19 is approximately as high as the indication space S, for the object to indicate the target positions (P1, P2) on the indication plane 10. The peripheral member 19 and the object are relative to each other. The indication plane 10 has a first edge 102, a second edge 104 adjacent to the first edge 102, a third edge 106 adjacent to the second edge 104, and a fourth edge 108 adjacent to the third edge 106 and the first edge 102. The third edge 106 and the fourth edge 108 form a first corner C1. The second edge 104 and the third edge 106 form a second corner C2.
Also as shown in FIG. 1A, the light-filtering device 132 is disposed on the peripheral member 19 at the first edge 102. As shown in FIG. 1B, the reflector 134 is disposed on the peripheral member 19 at the first edge 102, behind the light-filtering device 132. The first retro-reflector 122 is disposed on the peripheral member 19 at the first edge 102, above or below the reflector 134 (in this embodiment, above the reflector 134 as an example). The second retro-reflector 124 is disposed on the peripheral member 19 at the second edge 104. The third retro-reflector 126 is disposed on the peripheral member 19 at the third edge 106. Each retro-reflector reflects incident light L1 travelling in an incoming direction such that the reflected light L2 travels back substantially antiparallel to that incoming direction of the incident light L1, as shown in FIG. 1B.
Also as shown in FIG. 1A, the first light-emitting unit 14 is electrically connected to the controlling unit 11 and is disposed near the first corner C1. The first light-emitting unit 14 comprises a first light source 142 and a second light source 144. Controlled by the controlling unit 11, the first light-emitting unit 14 drives the first light source 142 to emit a first light. The first light passes through the indication space S and forms a first field of light. Also controlled by the controlling unit 11, the first light-emitting unit 14 drives the second light source 144 to emit a second light. The second light passes through the indication space S and forms a second field of light. In particular, as shown in FIG. 1B, the light-filtering device 132 does not allow the first light to pass, but allows the second light to pass. In FIG. 1B, the solid arrows represent the travel path of the first light and the dotted arrows represent the travel path of the second light. Also as shown in FIG. 1B, both the first light and the second light can be retro-reflected by the first retro-reflector 122, while the second light can pass through the light-filtering device 132 and then be specularly reflected by the reflector 134. The first light can neither pass through the light-filtering device 132 nor be reflected by it.
In practice, the first light source 142 may be an infrared emitter with an emission wavelength of 850 nm, and the second light source 144 may be an infrared emitter with an emission wavelength of 940 nm.
In one embodiment, the reflector 134 may be a plane mirror.
In another embodiment, as shown in FIG. 1B, the reflector 134 may comprise a first reflective surface 1342 and a second reflective surface 1344. The first reflective surface 1342 and the second reflective surface 1344 intersect substantially at a right angle and face the indication space S. The indication plane 10 defines a main extension plane; the first reflective surface 1342 defines a first extension plane; the second reflective surface 1344 defines a second extension plane. The first extension plane and the second extension plane each intersect the main extension plane substantially at 45 degrees. In practice, such a reflector 134 may be a prism.
The first image-capturing unit 16 is electrically connected to the controlling unit 11 and is disposed near the first corner C1. The first image-capturing unit 16 defines a first image-capturing point. Controlled by the controlling unit 11, when the first field of light is formed, the first image-capturing unit 16 captures a first image of the indication space S relating to part of the peripheral member 19 on the first edge 102 and the second edge 104, as presented by the first retro-reflector 122 and the second retro-reflector 124. The first image includes the blocking of the first light caused by any object in the indication space S, that is, a shadow projected onto the first image, for example, the shadows on image I1 shown in FIG. 2B (this case will be detailed below). Also controlled by the controlling unit 11, when the second field of light is formed, the first image-capturing unit 16 captures a first reflected image of the indication space S relating to part of the peripheral member 19 on the third edge 106 and the second edge 104, as presented by the third retro-reflector 126 and the reflector 134. The first reflected image includes the blocking of the second light caused by any object in the indication space S, that is, a shadow projected onto the first reflected image, for example, the shadows on image I2 shown in FIG. 2B (this case will be detailed below).
In practice, the first image-capturing unit 16 may be a line image sensor.
Finally, the controlling unit 11 processes the first image and the first reflected image to determine object information about the object located in the indication space S.
In one embodiment, the object information includes the relative position of the target position with respect to the indication plane 10. The controlling unit 11 determines a first object point on the first edge 102 or on the second edge 104 according to the object in the first image (for example, points O1 and O2 in FIG. 2A). The controlling unit 11 also determines a first reflecting object point on the third edge 106 according to the object in the first reflected image (for example, points R1 and R2 in FIG. 2A). The controlling unit 11 then determines a first straight path (for example, paths D1 and D2 in FIG. 2A) according to the line through the first image-capturing point (the coordinate point (0, 0) in FIG. 2A) and the first object point (points O1 and O2 in FIG. 2A), determines a first reflection path (for example, paths D3 and D4 in FIG. 2A) according to the line through the first image-capturing point (the coordinate point (0, 0) in FIG. 2A) and the first reflecting object point (points R1 and R2 in FIG. 2A) together with the reflector 134, and determines the relative position according to the intersection of the first straight path (paths D1 and D2 in FIG. 2A) and the first reflection path (paths D3 and D4 in FIG. 2A).
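To make the intersection step above concrete, the following is a minimal sketch of how a first straight path and a first reflection path could be intersected to recover the relative position. The coordinate frame (image-capturing point at the origin, first edge along a horizontal line), the idea of unfolding the reflection path across the reflector by mirroring the image-capturing point, and all function names are assumptions made for illustration; the patent only specifies that the relative position is determined from the intersection of the two paths.

```python
from typing import Optional, Tuple

Point = Tuple[float, float]

def mirror_across_horizontal_line(p: Point, y_mirror: float) -> Point:
    """Mirror a point across the horizontal line y = y_mirror (the first edge)."""
    x, y = p
    return (x, 2.0 * y_mirror - y)

def intersect_lines(p1: Point, p2: Point, p3: Point, p4: Point) -> Optional[Point]:
    """Intersection of the line through p1, p2 with the line through p3, p4."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None                            # parallel lines, no intersection
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def locate_target(camera: Point, object_point: Point,
                  reflecting_object_point: Point, y_first_edge: float) -> Optional[Point]:
    """Assumed layout: the first image-capturing point sits at the first corner,
    the first object point lies on the first or second edge (first straight path),
    and the first reflecting object point lies on the third edge, reached via the
    reflector on the first edge (first reflection path)."""
    # Unfolding the reflection path: mirroring the image-capturing point across
    # the first edge turns the folded path into a straight line to the
    # reflecting object point.
    virtual_camera = mirror_across_horizontal_line(camera, y_first_edge)
    return intersect_lines(camera, object_point,
                           virtual_camera, reflecting_object_point)

# Made-up numbers: image-capturing point at (0, 0), first edge along y = 10;
# both paths pass through the target at (4, 6), which is what the call returns.
print(locate_target(camera=(0.0, 0.0), object_point=(20.0 / 3.0, 10.0),
                    reflecting_object_point=(40.0 / 7.0, 0.0), y_first_edge=10.0))
```

With the sample numbers shown, the direct path and the unfolded reflection path both pass through (4, 6), which the sketch returns as the intersection.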
Also as shown in FIG. 1A, the object sensing system 1 according to another preferred embodiment of the present invention further comprises a fourth retro-reflector 128, a second light-emitting unit 15 and a second image-capturing unit 18.
The fourth retro-reflector 128 is disposed on the peripheral member 19 at the fourth edge 108. The second light-emitting unit 15 is electrically connected to the controlling unit 11 and is disposed near the second corner C2. The second light-emitting unit 15 comprises a third light source 152 and a fourth light source 154. Controlled by the controlling unit 11, the second light-emitting unit 15 drives the third light source 152 to emit the first light. In practice, the first light source 142 and the third light source 152 are driven to emit the first light at the same time, and the first light passes through the indication space S to form the first field of light.
Also controlled by the controlling unit 11, the second light-emitting unit 15 drives the fourth light source 154 to emit the second light. In practice, the second light source 144 and the fourth light source 154 are driven to emit the second light at the same time, and the second light passes through the indication space S to form the second field of light.
The second image-capturing unit 18 is electrically connected to the controlling unit 11 and is disposed near the second corner C2. The second image-capturing unit 18 defines a second image-capturing point. Controlled by the controlling unit 11, when the first field of light is formed, the second image-capturing unit 18 captures a second image of the indication space S relating to part of the peripheral member 19 on the first edge 102 and the fourth edge 108, as presented by the first retro-reflector 122 and the fourth retro-reflector 128. The second image includes the blocking of the first light caused by any object in the indication space S, that is, a shadow projected onto the second image, for example, the shadows on image I3 shown in FIG. 2C (this case will be detailed below). Also controlled by the controlling unit 11, when the second field of light is formed, the second image-capturing unit 18 captures a second reflected image of the indication space S relating to part of the peripheral member 19 on the third edge 106 and the fourth edge 108, as presented by the third retro-reflector 126 and the reflector 134. The second reflected image includes the blocking of the second light caused by any object in the indication space S, that is, a shadow projected onto the second reflected image, for example, the shadows on image I4 shown in FIG. 2C (this case will be detailed below). In this preferred embodiment, the controlling unit 11 processes at least two of the first image, the second image, the first reflected image and the second reflected image to determine the object information.
It should be emphasized that the controlling unit 11 may also first drive the second light source 144 and the fourth light source 154 to emit the second light and form the second field of light, and then drive the first light source 142 and the third light source 152 to emit the first light and form the first field of light.
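The alternating sequence described above (one field of light at a time, each capture tied to one field, and the order of the two fields freely swappable) can be summarized by a short control-loop sketch. The driver and sensor functions here are hypothetical placeholders; the patent does not define any software interface, so this only illustrates the timing, not the controlling unit 11 itself.

```python
# Hypothetical capture cycle; drive_* and read_line_sensors are placeholder
# callables standing in for hardware drivers, not part of the patent.
import time

def capture_cycle(drive_first_sources, drive_second_sources,
                  read_line_sensors, second_field_first=False):
    """Form the two non-coincident fields of light one after the other and
    return the frames captured under the first field and under the second field."""
    phases = [("first", drive_first_sources), ("second", drive_second_sources)]
    if second_field_first:
        phases.reverse()
    captured = {}
    for name, drive in phases:
        drive(True)                  # e.g. 850 nm sources for one field, 940 nm for the other
        time.sleep(0.001)            # placeholder delay for the field of light to form
        captured[name] = read_line_sensors()   # one frame per line image sensor
        drive(False)
    return captured["first"], captured["second"]

# Example usage with dummy drivers (illustration only):
frames_first_field, frames_second_field = capture_cycle(
    drive_first_sources=lambda on: print("850 nm sources", "on" if on else "off"),
    drive_second_sources=lambda on: print("940 nm sources", "on" if on else "off"),
    read_line_sensors=lambda: ("frame from unit 16", "frame from unit 18"))
```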
In practice, the second image-capturing unit 18 may be a line image sensor.
In the following, two input points (P1, P2) falling on the indication plane 10 in FIG. 1A are taken as an example to explain how the object sensing system 1 according to the invention forms the fields of light at different times and how the first image-capturing unit 16 and the second image-capturing unit 18 capture images.
As shown in FIG. 2A, the solid lines represent the paths along which, at time point T0, the controlling unit 11 drives the first light source 142 and the third light source 152 to emit the first light and form the first field of light, and the two input points P1 and P2 block the retro-reflection of the first light toward the first image-capturing unit 16 and the second image-capturing unit 18. The dashed lines in FIG. 2A represent the paths along which, at time point T1, the controlling unit 11 drives the second light source 144 and the fourth light source 154 to emit the second light and form the second field of light, and the two input points P1 and P2 block the retro-reflection and specular reflection of the second light toward the first image-capturing unit 16 and the second image-capturing unit 18.
Also as shown in FIG. 2A, the paths along which the two input points P1 and P2 block the first light and the second light reflected toward the first image-capturing unit 16 at the two time points T0 and T1 respectively form four angle quantities φ2, φ1, φ4 and φ3. As shown in FIG. 2B, at time point T0 the first image-capturing unit 16 captures an image I1 of the first field of light, carrying real-image shadows corresponding to angles φ2 and φ1. At time point T1 the first image-capturing unit 16 captures an image I2 of the second field of light, carrying mirror-image shadows corresponding to angles φ4 and φ3. Because the two input points P1 and P2 also sit in the second field of light, they would likewise cause real-image shadows corresponding to angles φ2 and φ1 on image I2. To save computing resources and shorten the processing time, at time point T1 the first image-capturing unit 16 captures only the sub-image corresponding to the first edge 102 and does not capture the sub-image corresponding to the second edge 104. Therefore, besides the mirror-image shadows corresponding to angles φ4 and φ3, image I2 in FIG. 2B also carries the real-image shadow corresponding to angle φ2, but not the real-image shadow corresponding to angle φ1.
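For a single line image sensor such as the first image-capturing unit 16, each shadow appears as a dark run of pixels, and angle quantities such as φ1 to φ4 can be recovered from where that run sits across the sensor. The sketch below assumes a simplified linear mapping from pixel position to angle over a known field of view; the actual optics and calibration of the image-capturing unit are not specified in the patent, so the threshold, field of view and function name are illustrative only.

```python
def shadow_angles(line_image, dark_threshold, fov_deg):
    """Return one angle (in degrees, measured across the field of view) per
    shadow found in a one-dimensional line image, taken at the shadow's centre
    pixel. Assumes the field of view is spread linearly over the pixels, which
    is a simplification of a real lens model."""
    width = len(line_image)
    angles, start = [], None
    for i, value in enumerate(line_image):
        if value < dark_threshold and start is None:
            start = i                          # a shadow begins here
        elif value >= dark_threshold and start is not None:
            centre = (start + i - 1) / 2.0     # centre pixel of the shadow
            angles.append(centre / (width - 1) * fov_deg)
            start = None
    if start is not None:                      # shadow touches the image border
        angles.append((start + width - 1) / 2.0 / (width - 1) * fov_deg)
    return angles

# Toy line image: bright background (255) with two dark runs (shadows).
toy = [255] * 20 + [10] * 4 + [255] * 30 + [12] * 6 + [255] * 40
print(shadow_angles(toy, dark_threshold=50, fov_deg=90.0))   # about [19.5, 51.4]
```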
Also as shown in FIG. 2A, the paths along which the two input points P1 and P2 block the first light and the second light reflected toward the second image-capturing unit 18 at the two time points T0 and T1 respectively form four angle quantities θ22, θ21, θ24 and θ23. As shown in FIG. 2C, at time point T0 the second image-capturing unit 18 captures an image I3 of the first field of light, carrying real-image shadows corresponding to angles θ22 and θ21. At time point T1 the second image-capturing unit 18 captures an image I4 of the second field of light, carrying mirror-image shadows corresponding to angles θ24 and θ23. Because the two input points P1 and P2 also sit in the second field of light, they would likewise cause real-image shadows corresponding to angles θ22 and θ21 on image I4. To save computing resources and shorten the processing time, at time point T1 the second image-capturing unit 18 captures only the sub-image corresponding to the first edge 102 and does not capture the sub-image corresponding to the fourth edge 108. Therefore, besides the mirror-image shadows corresponding to angles θ24 and θ23, image I4 in FIG. 2C also carries the real-image shadow corresponding to angle θ22, but not the real-image shadow corresponding to angle θ21.
Obviously, by parsing the angle quantities indicated by the shadows on images I1, I2, I3 and I4, the object sensing system 1 according to the invention can calculate the positions of the two input points P1 and P2 shown in FIG. 2A exactly. It should further be emphasized that both the first image-capturing unit 16 and the second image-capturing unit 18 according to the invention can be single line image sensors. Thereby, the object sensing system according to the invention does not need to adopt expensive image sensors, and also avoids the situation in which an image sensor, because of assembly errors, senses the wrong field of light or fails to sense a field of light. The main differences between the present invention and the prior art are that the invention: 1. uses mirror images to increase the range over which the image-capturing units can fully recognize the indication area; 2. lengthens the optical path between the image-capturing units and the edges and corners of the indication area, thereby avoiding the low resolution, or even impossibility of recognition, that occurs when an object is near a corner; 3. does not require the real image and the mirror image of the object to be layered in the imaging of the image-capturing units; 4. uses two groups of light sources with different wavelengths; 5. does not require the object itself to be luminous; and 6. has a relatively simple architecture compared with the prior art that requires an object with a radiating light source, a waveguide and mirrors to cooperate simultaneously.
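When both image-capturing units are used, the direct-path angles observed from the two corners already locate an input point by intersecting two rays, and the mirror-path angles supply additional rays that help confirm the result for multi-point cases such as P1 and P2. The sketch below shows only the basic two-ray intersection; the corner coordinates, the angle convention and the function name are assumptions made for illustration, not the patented computation itself.

```python
import math

def triangulate(corner1, angle1_deg, corner2, angle2_deg):
    """Intersect a ray leaving corner1 at angle1 with a ray leaving corner2 at
    angle2 (angles measured counter-clockwise from the positive x-axis).
    Returns the intersection point, i.e. the candidate input position."""
    x1, y1 = corner1
    x2, y2 = corner2
    d1 = (math.cos(math.radians(angle1_deg)), math.sin(math.radians(angle1_deg)))
    d2 = (math.cos(math.radians(angle2_deg)), math.sin(math.radians(angle2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None                       # parallel rays: no usable intersection
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# Example with made-up geometry: first corner C1 at (0, 0), second corner C2 at
# (16, 0); a point seen at about 56.3 degrees from C1 and 153.4 degrees from C2
# lies near (4, 6).
print(triangulate((0.0, 0.0), 56.3, (16.0, 0.0), 153.4))
```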
Please refer to FIG. 3, which is a flow chart of an object sensing method 2 according to a preferred embodiment of the present invention. The object sensing method 2 according to the invention is implemented on a basis that comprises a peripheral member, a light-filtering device, a reflector, a first retro-reflector, a second retro-reflector and a third retro-reflector. The peripheral member defines an indication space, and an indication plane in the indication space, for an object to indicate a target position on the indication plane. The peripheral member and the object are relative to each other. The indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge. The third edge and the fourth edge form a first corner. The second edge and the third edge form a second corner. The light-filtering device is disposed on the peripheral member at the first edge. The reflector is disposed on the peripheral member at the first edge, behind the light-filtering device. The first retro-reflector is disposed on the peripheral member at the first edge, above or below the reflector. The second retro-reflector is disposed on the peripheral member at the second edge. The third retro-reflector is disposed on the peripheral member at the third edge.
Embodiments of the peripheral member, the light-filtering device, the reflector, the first retro-reflector, the second retro-reflector and the third retro-reflector are as shown in FIG. 1A and FIG. 1B and are not repeated here.
As shown in FIG. 3, the object sensing method 2 according to the invention first performs step S20: at the first corner, emit a first light toward the indication space, the first light passing through the indication space to form a first field of light.
Then, the object sensing method 2 according to the invention performs step S22: when the first field of light is formed, capture, at the first corner, a first image of the indication space relating to part of the peripheral member on the first edge and the second edge, as presented by the first retro-reflector and the second retro-reflector.
Then, the object sensing method 2 according to the invention performs step S24: at the first corner, emit a second light toward the indication space, wherein the light-filtering device does not allow the first light to pass but allows the second light to pass, and the second light passes through the indication space to form a second field of light.
Then, the object sensing method 2 according to the invention performs step S26: when the second field of light is formed, capture, at the first corner, a first reflected image of the indication space relating to part of the peripheral member on the third edge and the second edge, as presented by the third retro-reflector and the reflector.
Finally, the object sensing method 2 according to the invention performs step S28: process the first image and the first reflected image to determine object information about the object located in the indication space. The content of the object information and the manner of determining it are detailed above and are not repeated here.
The basis of an object sensing method 2 according to another preferred embodiment of the present invention further comprises a fourth retro-reflector. The fourth retro-reflector is disposed on the peripheral member at the fourth edge.
Step S20 further emits the first light toward the indication space at the second corner. Step S22 further captures, at the second corner, a second image of the indication space relating to part of the peripheral member on the first edge and the fourth edge, as presented by the first retro-reflector and the fourth retro-reflector. Step S24 further emits the second light toward the indication space at the second corner. Step S26 further captures a second reflected image of the indication space relating to part of the peripheral member on the third edge and the fourth edge, as presented by the third retro-reflector and the reflector. Step S28 processes at least two of the first image, the second image, the first reflected image and the second reflected image to determine the object information.
In one embodiment, the first image and the first reflected image can be captured by a single line image sensor, and the second image and the second reflected image can be captured by another single line image sensor.
Through the above detailed description of the preferred embodiments, it is hoped that the features and spirit of the present invention are described more clearly; the scope of the claims of the invention is not to be limited by the preferred embodiments disclosed above. On the contrary, the intention is to cover various changes and arrangements of equal effect within the scope of the claims of the present invention.

Claims (10)

1. An object sensing system, characterized by comprising:
a peripheral member, the peripheral member defining an indication space and an indication plane in the indication space, for an object to indicate a target position on the indication plane, the peripheral member and the object being relative to each other, the indication plane having a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge, the third edge and the fourth edge forming a first corner, and the second edge and the third edge forming a second corner;
a light-filtering device, the light-filtering device being disposed on the peripheral member at the first edge;
a reflector, the reflector being disposed on the peripheral member at the first edge, behind the light-filtering device;
a first retro-reflector, the first retro-reflector being disposed on the peripheral member at the first edge, above or below the reflector;
a second retro-reflector, the second retro-reflector being disposed on the peripheral member at the second edge;
a third retro-reflector, the third retro-reflector being disposed on the peripheral member at the third edge;
a controlling unit;
a first light-emitting unit, the first light-emitting unit being electrically connected to the controlling unit and disposed near the first corner, the first light-emitting unit comprising a first light source and a second light source, the first light-emitting unit being controlled by the controlling unit to drive the first light source to emit a first light, the first light passing through the indication space to form a first field of light, the first light-emitting unit also being controlled by the controlling unit to drive the second light source to emit a second light, the second light passing through the indication space to form a second field of light, wherein the light-filtering device does not allow the first light to pass but allows the second light to pass; and
a first image-capturing unit, the first image-capturing unit being electrically connected to the controlling unit and disposed near the first corner, the first image-capturing unit defining a first image-capturing point, the first image-capturing unit being controlled by the controlling unit, when the first field of light is formed, to capture a first image of the indication space relating to part of the peripheral member on the first edge and the second edge as presented by the first retro-reflector and the second retro-reflector, and, when the second field of light is formed, to capture a first reflected image of the indication space relating to part of the peripheral member on the third edge and the second edge as presented by the third retro-reflector and the reflector;
wherein the controlling unit processes the first image and the first reflected image to determine object information about the object located in the indication space.
2. The object sensing system of claim 1, characterized in that the reflector is a plane mirror or a prism.
3. The object sensing system of claim 1, characterized in that the reflector comprises a first reflective surface and a second reflective surface, the first reflective surface and the second reflective surface intersecting at a right angle and facing the indication space, the indication plane defining a main extension plane, the first reflective surface defining a first extension plane, the second reflective surface defining a second extension plane, and the first extension plane and the second extension plane each intersecting the main extension plane at 45 degrees.
4. The object sensing system of claim 1, characterized in that the first image-capturing unit is a line image sensor.
5. The object sensing system of claim 1, characterized in that the object information comprises a relative position of the target position with respect to the indication plane, and the controlling unit determines a first object point on the first edge or on the second edge according to the object in the first image, determines a first reflecting object point on the third edge according to the object in the first reflected image, determines a first straight path according to the line through the first image-capturing point and the first object point, determines a first reflection path according to the line through the first image-capturing point and the first reflecting object point together with the reflector, and determines the relative position according to the intersection of the first straight path and the first reflection path.
6. The object sensing system of claim 1, characterized by further comprising:
a fourth retro-reflector, the fourth retro-reflector being disposed on the peripheral member at the fourth edge;
a second light-emitting unit, the second light-emitting unit being electrically connected to the controlling unit and disposed near the second corner, the second light-emitting unit comprising a third light source and a fourth light source, the second light-emitting unit being controlled by the controlling unit to drive the third light source to emit the first light, and also being controlled by the controlling unit to drive the fourth light source to emit the second light; and
a second image-capturing unit, the second image-capturing unit being electrically connected to the controlling unit and disposed near the second corner, the second image-capturing unit defining a second image-capturing point, the second image-capturing unit being controlled by the controlling unit, when the first field of light is formed, to capture a second image of the indication space relating to part of the peripheral member on the first edge and the fourth edge as presented by the first retro-reflector and the fourth retro-reflector, and, when the second field of light is formed, to capture a second reflected image of the indication space relating to part of the peripheral member on the third edge and the fourth edge as presented by the third retro-reflector and the reflector;
wherein the controlling unit processes at least two of the first image, the second image, the first reflected image and the second reflected image to determine the object information.
7. The object sensing system of claim 6, characterized in that the second image-capturing unit is a line image sensor.
8. An object sensing method, wherein: a peripheral member defines an indication space and an indication plane in the indication space, for an object to indicate a target position on the indication plane; the peripheral member and the object are relative to each other; the indication plane has a first edge, a second edge adjacent to the first edge, a third edge adjacent to the second edge, and a fourth edge adjacent to the third edge and the first edge; the third edge and the fourth edge form a first corner; the second edge and the third edge form a second corner; a light-filtering device is disposed on the peripheral member at the first edge; a reflector is disposed on the peripheral member at the first edge, behind the light-filtering device; a first retro-reflector is disposed on the peripheral member at the first edge, above or below the reflector; a second retro-reflector is disposed on the peripheral member at the second edge; and a third retro-reflector is disposed on the peripheral member at the third edge; characterized in that the object sensing method comprises the following steps:
(a) at the first corner, emitting a first light toward the indication space, wherein the first light passes through the indication space to form a first field of light;
(b) when the first field of light is formed, capturing, at the first corner, a first image of the indication space relating to part of the peripheral member on the first edge and the second edge as presented by the first retro-reflector and the second retro-reflector;
(c) at the first corner, emitting a second light toward the indication space, wherein the light-filtering device does not allow the first light to pass but allows the second light to pass, and the second light passes through the indication space to form a second field of light;
(d) when the second field of light is formed, capturing, at the first corner, a first reflected image of the indication space relating to part of the peripheral member on the third edge and the second edge as presented by the third retro-reflector and the reflector; and
(e) processing the first image and the first reflected image to determine object information about the object located in the indication space.
9. The object sensing method of claim 8, characterized in that: in step (b), a first image-capturing point is defined; and in step (e), the object information comprises a relative position of the target position with respect to the indication plane, a first object point is determined on the first edge or on the second edge according to the object in the first image, a first reflecting object point is determined on the third edge according to the object in the first reflected image, a first straight path is determined according to the line through the first image-capturing point and the first object point, a first reflection path is determined according to the line through the first image-capturing point and the first reflecting object point together with the reflector, and the relative position is determined according to the intersection of the first straight path and the first reflection path.
10. The object sensing method of claim 8, characterized in that: a fourth retro-reflector is disposed on the peripheral member at the fourth edge; step (a) further emits the first light toward the indication space at the second corner; step (b) further captures, at the second corner, a second image of the indication space relating to part of the peripheral member on the first edge and the fourth edge as presented by the first retro-reflector and the fourth retro-reflector; step (c) further emits the second light toward the indication space at the second corner; step (d) further captures a second reflected image of the indication space relating to part of the peripheral member on the third edge and the fourth edge as presented by the third retro-reflector and the reflector; and step (e) processes at least two of the first image, the second image, the first reflected image and the second reflected image to determine the object information.
CN 201010143684 2010-03-07 2010-03-07 Object sensing system and method Expired - Fee Related CN101923418B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010143684 CN101923418B (en) 2010-03-07 2010-03-07 Object sensing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010143684 CN101923418B (en) 2010-03-07 2010-03-07 Object sensing system and method

Publications (2)

Publication Number Publication Date
CN101923418A true CN101923418A (en) 2010-12-22
CN101923418B CN101923418B (en) 2013-01-16

Family

ID=43338386

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010143684 Expired - Fee Related CN101923418B (en) 2010-03-07 2010-03-07 Object sensing system and method

Country Status (1)

Country Link
CN (1) CN101923418B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186291A (en) * 2011-12-29 2013-07-03 原相科技股份有限公司 Optical touch system
TWI456464B (en) * 2011-12-21 2014-10-11 Pixart Imaging Inc Optical touch system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006318512A (en) * 1998-10-02 2006-11-24 Semiconductor Energy Lab Co Ltd Information terminal equipment
US7460110B2 (en) * 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
CN100468303C (en) * 2003-02-14 2009-03-11 奈克斯特控股公司 Touch screen signal processing
CN101609381A (en) * 2008-06-18 2009-12-23 北京汇冠新技术股份有限公司 Use the touch-detection sensing device of camera and reflective mirror

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006318512A (en) * 1998-10-02 2006-11-24 Semiconductor Energy Lab Co Ltd Information terminal equipment
CN100468303C (en) * 2003-02-14 2009-03-11 奈克斯特控股公司 Touch screen signal processing
US7460110B2 (en) * 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
CN101609381A (en) * 2008-06-18 2009-12-23 北京汇冠新技术股份有限公司 Use the touch-detection sensing device of camera and reflective mirror

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI456464B (en) * 2011-12-21 2014-10-11 Pixart Imaging Inc Optical touch system
US9389731B2 (en) 2011-12-21 2016-07-12 Pixart Imaging Inc Optical touch system having an image sensing module for generating a two-dimensional image and converting to a one-dimensional feature
CN103186291A (en) * 2011-12-29 2013-07-03 原相科技股份有限公司 Optical touch system
CN103186291B (en) * 2011-12-29 2015-12-02 原相科技股份有限公司 Optical touch control system

Also Published As

Publication number Publication date
CN101923418B (en) 2013-01-16

Similar Documents

Publication Publication Date Title
TWI453642B (en) Multiple-input touch panel and method for gesture recognition
CN101663637B (en) Touch screen system with hover and click input methods
CN100576156C (en) Utilize the optical navigation system and the method for estimating motion of optics lift detection
CN103019474A (en) Optical touch scanning device
CN103324356B (en) Optical touch system and optical touch position detection method
CN103419944A (en) Air bridge and automatic abutting method therefor
CN102232209A (en) Stereo optical sensors for resolving multi-touch in a touch detection system
US20110109565A1 (en) Cordinate locating method, coordinate locating device, and display apparatus comprising the coordinate locating device
US20110199337A1 (en) Object-detecting system and method by use of non-coincident fields of light
US11966811B2 (en) Machine vision system and method with on-axis aimer and distance measurement assembly
CN103677445A (en) Position detection apparatus and image display apparatus
US20110193969A1 (en) Object-detecting system and method by use of non-coincident fields of light
CN101923418B (en) Object sensing system and method
CN111397586A (en) Measuring system
CN105278228B (en) Laser projection display and its color alignment method
CN101847063B (en) System and method for detecting object by using non-coincident fields of light
CN101819489B (en) Object detecting system
TWI521413B (en) Optical touch screen
US20160004385A1 (en) Input device
EP4276682A1 (en) Biometric acquisition and recognition system and method, and terminal device
US8912482B2 (en) Position determining device and method for objects on a touch device having a stripped L-shaped reflecting mirror and a stripped retroreflector
TWI587196B (en) Optical touch system and optical detecting method for touch position
US20170185157A1 (en) Object recognition device
CN110618677A (en) Mobile robot
US9104269B2 (en) Touch control apparatus and associated selection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130116

Termination date: 20160307