CN101663637B - Touch screen system with hover and click input methods - Google Patents


Info

Publication number: CN101663637B
Authority: CN (China)
Prior art keywords: touch, detector, touch area, screen, touch screen
Legal status: Expired - Fee Related (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN2008800117959A
Other languages: Chinese (zh)
Other versions: CN101663637A (en)
Inventor: John Newton (约翰·牛顿)
Current Assignee: Next Holdings Ltd. (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: Next Holdings Ltd.
(The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Next Holdings Ltd
Priority claimed from PCT/US2008/060102 (WO2008128096A2)
Publication of CN101663637A; application granted; publication of CN101663637B

Abstract

A touch screen system that can approximate tracking and dragging states regardless of the user's orientation and without reliance on direct sensing of touch pressure or area. A first detector generates a signal representing a first image of an object interacting with the touch screen. A second detector generates a signal representing a second image of the object. A signal processor processes the first signal to determine approximated coordinates of a first pair of outer edges of the object and processes the second signal to determine approximated coordinates of a second pair of outer edges of the object. The signal processor then calculates an approximated touch area based on the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object. If the approximated touch area is less than or equal to a threshold touch area, the signal processor determines that the object interacting with the touch screen indicates a tracking state. If the approximated touch area is greater than the threshold touch area, the signal processor determines that the object interacting with the touch screen indicates a selection state. The threshold touch area may be established by calibrating the touch screen system when the object interacting with the touch screen is known to indicate the tracking state.

Description

Touch screen system utilizing hover and click input methods
Related application
This application claims priority to New Zealand provisional patent application No. 554,416, entitled "Touch Screen with Hover and Click Input Methods," filed with the New Zealand Patent Office on April 11, 2007.
Technical field
The present invention relates generally to touch-sensitive displays, also referred to as touch screens. More specifically, the present invention relates to systems and methods that use signal processing to optically detect user interaction with a touch screen, where the interaction represents tracking, selection, and dragging operations.
Background
Prior art touch screen systems can be classified into the following technology groups: resistive, surface capacitive, projected capacitive, surface acoustic wave (SAW), infrared (IR), frustrated total internal reflection (FTIR), optical, and dispersive signal (bending wave). Each touch screen technology has its own features, advantages, and drawbacks. Among all of these technologies, the size of the human finger and the lack of sensing precision can make precise touch screen interactions difficult to sense. Most conventional touch screen systems do not address the need of the modern user interface for at least four different interaction states: (1) out-of-range; (2) tracking (also known as "hover" or "proximity"); (3) selection (also known as "click"); and (4) dragging.
By comparison, conventional computer input devices, such as mice, pens, and touch pads, allow a user to perform tracking, dragging, and selection operations. A mouse, for example, allows a user to track a cursor around a computer display screen independently of clicking a button, or to perform a drag operation by holding the button down while moving the mouse. Pens and touch pads can directly measure contact pressure and can therefore use variations in the detected pressure over time to distinguish among tracking, dragging, and selection states. The ability to position a cursor and then optionally press or trigger is important in many software applications and allows more precise input modes. Such functionality is therefore generally needed in the touch screen technology field.
To detect tracking and dragging states, any touch screen system must continuously detect and report the position of the user's finger or stylus. Most conventional touch screen systems, however, register a selection ("click") only when contact between the user's finger or stylus and the touch screen surface is either made or broken, and thus cannot provide independent tracking and dragging operations. As one exception, Benko et al. demonstrated an FTIR touch screen system in their paper entitled "Precise Selection Techniques for Multi-Touch Screens" (Proc. ACM CHI 2006: Human Factors in Computing Systems, pp. 1263-1272); because that FTIR system directly senses touch area, it can use changes in the detected touch area over time to approximate tracking and dragging states. The technique described by Benko et al., referred to as SimPress, is said to reduce motion errors during clicking and to allow the simulation of a hover state on devices unable to sense proximity.
The SimPress technique is similar to those used in pressure-sensitive touch pads and contact surfaces for computers. All of these techniques require the ability to directly sense the pressure or area of a touch (i.e., the surface area of the touch screen contacted by a finger or stylus), and they are therefore unavailable to touch screen systems that lack such ability, including infrared and optical touch screen systems. In addition, because of the way in which changes in touch area are computed, the SimPress technique works only when the user always approaches the tabletop touch screen from the same direction. There is therefore a need for a touch screen system that can approximate tracking and dragging states regardless of the user's orientation and without relying on direct sensing of touch pressure or area.
Infrared touch screen technology relies on the interruption of a grid of infrared light beams in front of the display screen. A "touch frame" or "opto-matrix frame" typically contains a row of infrared LEDs and a row of photo-transistors mounted on two opposite sides to create a grid of invisible infrared light. The frame assembly comprises printed wiring boards on which the optoelectronic devices are mounted, concealed behind an infrared-transparent bezel. The bezel shields the optoelectronic devices from the operating environment while allowing the infrared beams to pass through.
An infrared controller sequentially pulses the LEDs to create a grid of infrared light beams. When a stylus or finger enters the grid, it obstructs some of the beams. One or more photo-transistors detect the absence of light and transmit a signal that can be used to identify the x and y coordinates of the touch. Infrared touch screen systems are often used in manufacturing and medical applications because they can be completely sealed and operated using any number of hard or soft objects. The major issue with infrared touch screen systems is that the touch frame "senses" a touch slightly above the screen; such systems are therefore prone to "early activation" before the finger or stylus actually touches the screen. The cost of manufacturing the infrared bezel is also quite high.
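As an illustration of the beam-interruption principle described above, the following sketch shows how blocked beam indices might be converted into touch coordinates by taking the centroid of the interrupted rows and columns. This is a hypothetical reconstruction for illustration only, not code from the patent.

```python
def grid_touch_coords(blocked_x, blocked_y):
    """Approximate a touch position on an opto-matrix frame.

    blocked_x / blocked_y are the indices of the interrupted beams on
    each axis; the centroid of the blocked beams approximates the touch
    coordinate on that axis. Returns None when either axis is clear.
    """
    if not blocked_x or not blocked_y:
        return None
    return (sum(blocked_x) / len(blocked_x),
            sum(blocked_y) / len(blocked_y))

# A finger wide enough to block beams 4-6 horizontally and 9-10 vertically:
print(grid_touch_coords([4, 5, 6], [9, 10]))  # (5.0, 9.5)
```

Note that a beam grid of this kind reports only presence and position; as the surrounding text explains, it cannot measure touch pressure or touch area directly.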
Optical touch screen systems rely on a combination of line-scan or area-image cameras, digital signal processing, front or back illumination, and algorithms for determining points of touch. Many optical touch screen systems use line-scan cameras, oriented along the touch screen surface, to image a bezel. In this way, the system can track the movement of any object close to the surface of the touch screen by detecting variations in the illumination emitted by an illumination source (for example, an infrared source). For example, infrared light-emitting diodes (IR-LEDs) or special reflective surfaces can emit infrared light across the surface of the touch screen. Optical touch screen technology shares some of the advantages and disadvantages of infrared touch screen technology. One such disadvantage is that touches are typically registered just before the finger or object actually touches the touch screen surface. The most significant advantages of optical touch screen technology are that its cost increases relatively little with size, and that it offers substantially higher resolution and data rates, which translate into better drag-and-drop performance.
Summary of the invention
The present invention provides a touch screen system for distinguishing between user interaction states. The touch screen system of the present invention can approximate tracking and dragging states regardless of the user's orientation and without relying on direct sensing of touch pressure or touch area. The touch screen system includes a touch screen, at least two detectors positioned adjacent to the touch screen, and a signal processor. The detectors may be line-scan cameras, area-scan cameras, or photo-transistors. The touch screen system will typically include a light source for illuminating the object; the detectors detect variations in illumination level caused by an object interacting with the touch screen.
A first detector generates a first signal representing a first image of the object interacting with the touch screen. A second detector generates a second signal representing a second image of the object interacting with the touch screen. The signal processor executes computer-executable instructions for processing the first signal to determine approximated coordinates of a first pair of outer edges of the object, and for processing the second signal to determine approximated coordinates of a second pair of outer edges of the object. For example, the approximated coordinates may be determined using triangulation calculations.
The signal processor then calculates an approximated touch area based on the approximated coordinates of the first pair of outer edges and the second pair of outer edges of the object. If the approximated touch area is less than or equal to a threshold touch area, the signal processor determines that the object interacting with the touch screen indicates a tracking state. If the approximated touch area is greater than the threshold touch area, the signal processor determines that the object indicates a selection state. The threshold touch area may be established by calibrating the touch screen system while the object interacting with the touch screen is known to indicate the tracking state.
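The tracking/selection decision described above amounts to a single threshold comparison, with the threshold learned during a calibration pass while the user is known to be hovering. The sketch below is an illustrative assumption (including the calibration margin and all names), not the patent's implementation.

```python
def calibrate_threshold(tracking_areas, margin=1.2):
    """Derive the threshold touch area from areas sampled while the
    object is known to indicate the tracking state; the margin adds
    headroom above the largest hover-state area observed."""
    return max(tracking_areas) * margin

def classify(touch_area, threshold):
    """Area <= threshold indicates tracking; area > threshold indicates selection."""
    return "tracking" if touch_area <= threshold else "selection"

threshold = calibrate_threshold([8.0, 9.5, 10.0])  # -> 12.0
print(classify(10.0, threshold))  # tracking
print(classify(15.0, threshold))  # selection
```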
If the object interacting with the touch screen indicates the selection state, the signal processor monitors subsequent signals from the detectors to determine whether the object moves relative to the touch screen. If the object moves relative to the touch screen, the signal processor recalculates the approximated touch area and determines whether the recalculated touch area remains greater than or equal to the threshold touch area. If the recalculated touch area is greater than or equal to the threshold touch area, the object is determined to indicate a dragging state. If the recalculated touch area is less than the threshold touch area, the signal processor determines that the object indicates the tracking state. Whether the object indicates the selection state, the dragging state, or the tracking state, the system also determines whether the object becomes undetectable by the first detector and the second detector. If the object becomes undetectable by both detectors, the object is determined to indicate the out-of-range state.
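The paragraph above describes transitions among the four interaction states. A small state-machine sketch follows; the function and state names are assumptions made for illustration, not the patent's code.

```python
def next_state(state, detected, touch_area, moved, threshold):
    """One update step over the four interaction states described above."""
    if not detected:
        # Object no longer seen by both detectors.
        return "out_of_range"
    if state in ("selection", "dragging") and moved:
        # Movement while pressed: recompute the area; staying at or above
        # the threshold indicates dragging, falling below it indicates tracking.
        return "dragging" if touch_area >= threshold else "tracking"
    if state == "dragging":
        return "dragging" if touch_area >= threshold else "tracking"
    return "selection" if touch_area > threshold else "tracking"

s = next_state("tracking", True, 15.0, False, 12.0)  # press -> selection
s = next_state(s, True, 15.0, True, 12.0)            # move while pressed -> dragging
s = next_state(s, True, 8.0, True, 12.0)             # lighter touch -> tracking
s = next_state(s, False, 0.0, False, 12.0)           # leaves the plane -> out_of_range
print(s)  # out_of_range
```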
The object interacting with the touch screen may be a finger, a stylus, or any other object capable of producing a relatively small first touch area and a relatively large second touch area. For example, the object may comprise a stylus having a spring-loaded plunger protruding from its tip, the plunger producing a relatively small touch area when interacting with the touch screen. When sufficient pressure is applied to compress the spring, the plunger retracts into the tip of the stylus, such that the tip of the stylus contacts the touch screen and produces a relatively large touch area. These and other features of the invention are described further in the detailed description below, in conjunction with the accompanying drawings and the claims.
Brief description of the drawings
Fig. 1 illustrates a touch screen system according to certain exemplary embodiments of the present invention;
Fig. 2 is a block diagram of the components of a touch screen system, including a computing device, according to certain exemplary embodiments of the present invention;
Fig. 3, comprising Figs. 3A and 3B, illustrates a finger interacting with a touch screen in tracking mode, according to certain exemplary embodiments of the present invention;
Fig. 4, comprising Figs. 4A and 4B, illustrates a finger interacting with a touch screen in selection mode, according to certain exemplary embodiments of the present invention;
Fig. 5 is a reference diagram for understanding exemplary triangulation calculations that may be used to approximate a touch area, according to certain exemplary embodiments of the present invention;
Fig. 6, comprising Figs. 6A and 6B, illustrates a specialized stylus that may be used in certain exemplary embodiments of the present invention;
Fig. 7 is a flow chart illustrating an exemplary method for distinguishing between a tracking state and a selection state in a touch screen system according to certain exemplary embodiments of the present invention; and
Fig. 8 is a state diagram illustrating the sequence of operations of certain exemplary embodiments.
Detailed description
The present invention provides touch screen systems and methods for approximating at least four interaction states: (1) out-of-range; (2) tracking; (3) selection; and (4) dragging. The systems and methods of the present invention provide functionality for distinguishing among these interaction states regardless of the orientation of the user's finger, stylus, or other touch object, and without relying on direct sensing of touch pressure or area. Exemplary embodiments of the present invention are illustrated and described below, with like reference numerals designating like elements throughout the drawings.
Fig. 1 is an illustration of an exemplary touch screen system 100. As used herein, the term "touch screen system" refers to a touch screen 110 together with the hardware and/or software components that provide touch detection functionality. The exemplary touch screen system 100 is shown adjacent to a display device (i.e., video monitor) 190. The display device 190 may be connected to a personal computer or other computing device (see Fig. 2) that executes software for detecting touches on or near the touch screen 110. The illustration in Fig. 1 of the touch screen system 100 adjacent to the display device 190 represents one exemplary application of the touch screen system 100. For example, the touch screen system 100 may be positioned and/or secured in front of the display device 190 so that a user can view the visual output of the display device 190 and interact with it through the touch screen 110.
The touch screen system 100 may thus serve as an overlay or retrofit for an existing display device 190. It should be understood, however, that other applications of the exemplary touch screen system 100 are contemplated by the present invention. For example, the touch screen system 100 may be used as an integrated component of the display device 190 and may, in that case, also function as the display screen for the display device 190. The exemplary touch screen system 100 may be used in combination with display devices 190 of all sizes and dimensions, including, but not limited to, the display screens of smaller handheld devices such as mobile phones, personal digital assistants (PDAs), pagers, and the like.
At least a portion of the touch screen 110 is typically transparent and/or translucent, so that images or other objects can be viewed through the touch screen 110 and so that light and/or other forms of energy can be transmitted within or through the touch screen 110 (e.g., by reflection or refraction). For example, the touch screen 110 may be constructed of a plastic or thermoplastic material (e.g., acrylic, Plexiglas, polycarbonate, or the like) and/or a glass-type material. In certain embodiments, the touch screen may be polycarbonate or a glass material bonded to an acrylic material. The touch screen 110 may also be constructed of other materials, as will be apparent to those skilled in the art. The touch screen 110 may also be configured with a durable (e.g., scratch- and break-resistant) coating. The touch screen 110 may or may not include a frame or bezel, i.e., a casing or housing that surrounds the perimeter of the touch screen 110.
The touch screen system 100 includes an energy source 120 that is configured to emit energy, for example, in the form of pulses, waves, beams, or the like (generally referred to herein as "energy beams" for simplicity). The energy source 120 is typically positioned within or adjacent to (e.g., proximate to) one or more edges of the touch screen 110. The energy source 120 may emit one or more of various types of energy. For example, the energy source 120 may emit infrared (IR) energy. Alternatively, the energy source 120 may emit visible light energy (e.g., at one or more frequencies or spectra).
The energy source 120 may include one or more separate emission sources (emitters, generators, etc.). For example, the energy source 120 may include one or more infrared light-emitting diodes (LEDs). As other examples, the energy source 120 may include one or more microwave energy transmitters or one or more acoustic wave generators. The energy source 120 is positioned and configured such that it emits energy beams 140 across the surface of the touch screen 110, so as to create an energized plane adjacent to the touch screen surface. For example, suitable reflective or refractive components (such as reflective tape, paint, metal or plastic, mirrors, prisms, etc.) may be used to form and position the energized plane.
Energy beams 150 that are reflected across the front surface 111 of the touch screen 110 are detected by detectors 130, 131. The detectors 130, 131 may be configured to monitor and/or detect variations (changes, etc.) in the energy beams 150. Depending on the orientation of the energy source 120 and the detectors 130, 131, the energy beams 150 may have a "back-lighting" or "fore-lighting" effect on a finger, stylus, or other object that touches the touch screen 110. In a back-lighting scenario, a touch on or near the front surface of the touch screen 110 may cause an interruption of the reflected energy beams 150 such that the touch location appears as a shadow or silhouette (i.e., an absence of energy) when detected by the detectors 130, 131. In a fore-lighting scenario, energy reflected by the finger, stylus, or other object will appear to the detectors 130, 131 as an area of increased energy intensity.
In some embodiments, filtering may be employed by the detectors 130, 131 and/or by software to enhance the detection of variations in energy beam intensity. However, the intensity differential between the energy beams 150 and ambient noise may be sufficient to eliminate the need for filtering. As discussed below with reference to Fig. 2, information signals generated by the detectors 130, 131 may be processed by a video processing unit (e.g., a digital signal processor) and/or a computing device.
The detectors 130, 131 may be positioned within or adjacent to (e.g., proximate to) the touch screen 110, such that they can monitor and/or detect the energy beams 150 in the energized plane adjacent to the touch screen surface. If desired, depending on the placement of the detectors 130, 131, reflectors and/or prisms may be used to allow the detectors 130, 131 to detect the energy beams 150. In the example shown in Fig. 1, the detectors 130, 131 are positioned within or along the bottom edge of the touch screen 110, with one detector in each corner. In preferred embodiments, at least two separate detectors are included, so that the location of a touch can be determined using triangulation techniques, as described below.
A detector 130, 131 may be any device capable of detecting (e.g., imaging, monitoring, etc.) variations in the energy beams 150 reflected across the front surface of the touch screen 110. For example, a suitable detector 130, 131 may be one of various types of cameras, such as an area-scan or line-scan (e.g., digital) camera. Such area-scan or line-scan cameras may be based on complementary metal-oxide-semiconductor (CMOS) or charge-coupled device (CCD) technologies, which are well known in the art. Furthermore, monochrome (e.g., gray-scale) cameras may be sufficient, because the detectors 130, 131 do not need to acquire detailed color images.
While cameras are typically more expensive than other types of detector devices that could be used in the touch screen system 100, such as photo-detectors (e.g., photo-diodes or photo-transistors), cameras provide greater touch detection precision. As is known in the art, area-scan or line-scan cameras (particularly those with monochrome capability) are typically less expensive than cameras configured to acquire detailed images and/or to provide color detection. Thus, relatively cost-effective area-scan or line-scan cameras can provide the touch screen system 100 with precise touch screen capability. It should be understood, however, that other devices may be used to provide the functions of the detectors 130, 131 in accordance with other embodiments of the invention.
The touch screen system 100 of the present invention is thus configured to detect a touch (e.g., by a finger, stylus, or other object) based on detected variations in the energy beams 150 that form an energized plane adjacent to the touch screen surface. The energy beams 150 are monitored by the detectors 130, 131, which may be configured to detect variations (e.g., a decrease or increase) in the intensity of the energy beams 150. As will be apparent to those skilled in the art, the output capacity required of the energy source 120 to allow adequate detection by the detectors may depend on various factors, such as the size of the touch screen 110, the expected losses within the touch screen system 100 (e.g., 1/distance² losses) and those caused by the surrounding medium (e.g., air), the speed or exposure-time characteristics of the detectors 130, 131, ambient light characteristics, and the like. As discussed with reference to the following figures, the detectors 130, 131 transmit data regarding the energy beams 150 (or variations therein) to a computing device (not shown) that executes software for processing the data and calculating the location of the touch relative to the touch screen 110.
Fig. 2 is a block diagram illustrating the exemplary touch screen system 100 interfaced to an exemplary computing device 201 in accordance with certain embodiments of the present invention. The computing device 201 may be functionally coupled to the touch screen system 100 by hardwire or wireless connections. The exemplary computing device 201 may be any type of processor-driven device, such as a personal computer, a laptop computer, a handheld computer, a personal digital assistant (PDA), a digital and/or cellular telephone, a pager, a video game device, and the like. These and other types of processor-driven devices will be apparent to those skilled in the art. As used herein, the term "processor" can refer to any type of programmable logic device, including a microprocessor or any other type of similar device.
The computing device 201 may include, for example, a processor 202, a system memory 204, and various system interface components 206. The processor 202, system memory 204, a digital signal processing (DSP) unit 205, and the system interface components 206 may be functionally connected via a system bus 208. The system interface components 206 may enable the processor 202 to communicate with peripheral devices. For example, a storage device interface 210 can provide an interface between the processor 202 and a storage device 211 (e.g., removable and/or non-removable), such as a disk drive. A network interface 212 may also be provided as an interface between the processor 202 and a network communications device (not shown), so that the computing device 201 can be connected to a network.
A display screen interface 214 can provide an interface between the processor 202 and the display device 190 (shown in Fig. 1). The touch screen 110 of the touch screen system 100 may be positioned in front of a display device 190 having its own display screen 192, or otherwise connected or mounted to the display device 190. Alternatively, the touch screen 110 may function as the display screen 192 of the display device 190. One or more input/output ("I/O") port interfaces 216 may be provided as interfaces between the processor 202 and various input and/or output devices. For example, the detectors 130, 131 of the touch screen system 100, or other suitable components, may be connected to the computing device 201 via an input port and may provide input signals to the processor 202 via an input port interface 216. Similarly, the energy source 120 of the touch screen system 100 may be connected to the computing device 201 via an output port and may receive output signals from the processor 202 via an output port interface 216.
A number of program modules may be stored in the system memory 204 and/or in any other computer-readable media associated with the storage device 211 (e.g., a hard disk drive). The program modules may include an operating system 217. The program modules may also include an information display program module 219 comprising computer-executable instructions for displaying images or other information on the display screen 192. Other aspects of this exemplary embodiment of the invention may be embodied in a touch screen control program module 221 for controlling the energy source 120 and/or the detectors 130, 131 of the touch screen system 100, and/or for calculating touch locations relative to the touch screen 110 and discerning interaction states based on signals received from the detectors 130, 131.
Certain embodiments of the invention may include a DSP unit for performing some or all of the functions attributed to the touch screen control program module 221. As is known in the art, the DSP unit 205 may be configured to perform many types of calculations, including filtering, data sampling, triangulation, and other calculations, and to control the modulation of the energy source 120. The DSP unit 205 may include a series of scanning imagers, digital filters, and comparators implemented in software. The DSP unit 205 may therefore be programmed for calculating touch locations relative to the touch screen 110 and discerning interaction states, as described herein.
The processor 202, which may be controlled by the operating system 217, can be configured to execute the computer-readable instructions of the various program modules. The methods of the present invention may be embodied in such computer-readable instructions. Furthermore, the images or other information displayed by the information display program module 219 may be stored in one or more information data files 223, which may be stored on any computer-readable medium associated with the computing device 201.
As discussed above, when a user touches on or near the touch screen 110, a variation will occur in the intensity of the energy beams 150 that are directed across the surface of the touch screen 110. The detectors 130, 131 are configured to detect the intensity of the energy beams 150 reflected across the surface of the touch screen 110 and should be sensitive enough to detect variations in that intensity. Information signals produced by the detectors 130, 131 and/or other components of the touch screen system 100 may be used by the computing device 201 to determine the location of the touch relative to the touch screen 110 (and therefore relative to the display screen 192) and to discern whether the touch is indicative of the selection state, the tracking state, or the dragging state. The computing device 201 may also determine the appropriate response to a touch on or near the touch screen 110.
In accordance with some embodiments of the invention, data from the detectors 130, 131 may be processed periodically by the computing device 201 to monitor the typical intensity level of the energy beams 150 directed across the surface of the touch screen 110 when no touch is present. This allows the system to account for, and thereby reduce the effects of, changes in ambient light levels and other environmental conditions. The computing device 201 may optionally increase or decrease the intensity of the energy beams 150 emitted by the energy source 120, as needed. Subsequently, if the detectors 130, 131 detect a variation in the intensity of the energy beams 150, the computing device 201 can process this information to determine that a touch has occurred on or near the touch screen 110.
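The periodic no-touch baseline monitoring described above could be sketched as follows. The exponential smoothing factor and the shadow-detection ratio are illustrative assumptions, and the sketch assumes the back-lighting case, in which a touch reduces the detected intensity.

```python
def update_baseline(baseline, sample, alpha=0.05):
    """Exponentially track the no-touch ambient intensity, so that slow
    changes in ambient light are absorbed into the baseline."""
    return (1 - alpha) * baseline + alpha * sample

def touch_detected(baseline, sample, ratio=0.5):
    """Flag a touch when the intensity falls well below the baseline
    (a finger casting a shadow in the energized plane)."""
    return sample < baseline * ratio

baseline = update_baseline(200.0, 190.0)  # slow ambient drift -> 199.5
print(touch_detected(baseline, 195.0))    # False (minor fluctuation)
print(touch_detected(baseline, 60.0))     # True (shadow of a finger)
```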
For example, the location of a touch relative to the touch screen 110 may be determined by processing information received from each detector 130, 131 and performing one or more well-known triangulation calculations. By way of illustration, the computing device 201 may receive information from each detector 130, 131 that can be used to identify, relative to each detector 130, 131, the position of an area of increased or decreased energy beam intensity. The position of an area of decreased energy beam intensity relative to each detector 130, 131 may be determined in terms of the coordinates of one or more pixels or virtual pixels of the touch screen 110. The location of the area of increased or decreased energy beam intensity relative to each detector may then be triangulated, based on the geometry between the detectors 130, 131, to determine the actual location of the touch relative to the touch screen 110. Calculations for determining the interaction state indicated by a touch are explained with reference to the following figures. Any such calculations for determining touch location and/or interaction state may include algorithms to compensate for discrepancies (e.g., lens distortions, ambient conditions, damage to the touch screen 110 or obstructions on it, etc.), as applicable.
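A standard two-camera triangulation consistent with this description can be sketched as follows. The placement of the detectors at the two bottom corners and the angle convention (measured from the bottom edge of the screen) are assumptions made for illustration.

```python
import math

def triangulate(width, angle0, angle1):
    """Camera 0 sits at (0, 0) and camera 1 at (width, 0); each reports
    the bearing angle (radians) from the bottom edge of the screen to
    the touch. The intersection of the two rays gives the touch point."""
    m0 = math.tan(angle0)   # slope of the ray from camera 0
    m1 = -math.tan(angle1)  # slope of the ray from camera 1
    x = (m1 * width) / (m1 - m0)
    y = m0 * x
    return (x, y)

# A touch at (2, 2) on a width-4 screen is seen at 45 degrees by both cameras:
x, y = triangulate(4.0, math.radians(45), math.radians(45))
print(round(x, 6), round(y, 6))  # 2.0 2.0
```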
FIG. 3, comprising FIG. 3A and FIG. 3B, illustrates a user interaction with the exemplary touch screen 110. The user interaction in the illustrated embodiment is intended to indicate the tracking state. A portion of the user's finger 302 (or another object) enters the energized plane (formed by the energy beams 150) proximate the touch screen surface and either "hovers" near the touch screen surface without contacting it, or contacts the touch screen with relatively little pressure. The two detectors 130, 131 (referred to for convenience as Camera0 and Camera1) generate information signals indicating a variation in the intensity of the energized plane and, thus, the presence of a touch.
The image data captured by the detectors 130, 131 may be processed and interpreted to approximate the interaction state indicated by the touch. For example, the output of Camera0 may be processed, in a known manner, to determine the slopes (m0a and m0b) of the lines extending from a first reference point (e.g., corner 303 of the touch screen 110) to a first pair of outer edges 304, 306 of the portion of the user's finger 302 within the field of view of the detector 130. Similarly, the output of Camera1 may be processed to determine the slopes (m1a and m1b) of the lines extending from a second reference point (e.g., corner 305 of the touch screen 110) to a second pair of outer edges 308, 310 of the portion of the user's finger 302 within the field of view of the detector 131. The choice of reference points (e.g., corners 303 and 305) of course depends on the geometry of the detectors 130, 131 relative to the touch screen 110. The intersections of the four calculated slope lines (m0a, m0b, m1a and m1b) can then be used to approximate the surface area (S) of the portion of the user's finger 302 within the field of view of the detectors 130, 131. That surface area is referred to herein as the "touch area," although it should be understood, as noted above, that actual contact between the finger 302 (or other object) and the touch screen 110 is not necessarily required.
In contrast to the tracking state embodiment of FIG. 3, the user interaction shown in FIG. 4A and FIG. 4B is intended to indicate the selection or "clicking" state. A portion of the user's finger 302 (or another object) enters (or remains in) the energized plane proximate the touch screen surface and contacts the touch screen surface with greater pressure than in the embodiment of FIG. 3. Again, the two detectors 130, 131 generate information signals indicating a variation in the intensity of the energized plane and, thus, the presence of a touch. In the embodiment of FIG. 4, the user's finger 302 may enter the energized plane from an out-of-bounds position. Alternatively, the position of the user's finger within the energized plane may change such that the finger moves from a prior hover (non-contact) position into contact with the touch screen surface, or increases its pressure on the touch screen surface.
Again, the output of Camera0 may be processed in a known manner to determine the slopes (m′0a and m′0b) of the lines extending from the first reference point (e.g., corner 303 of the touch screen 110) to a first pair of outer edges 304′, 306′ of the portion of the user's finger 302 within the field of view of the detector 130. Similarly, the output of Camera1 may be processed to determine the slopes (m′1a and m′1b) of the lines extending from the second reference point (e.g., corner 305 of the touch screen 110) to a second pair of outer edges 308′, 310′ of the portion of the user's finger 302 within the field of view of the detector 131. The intersections of the four calculated slope lines (m′0a, m′0b, m′1a and m′1b) can then be used to approximate the touch area (S′).
For comparison, FIG. 4A shows in solid lines the slope lines (m′0a, m′0b, m′1a and m′1b) and touch area (S′) indicating the selection state, and shows in dashed lines the slope lines (m0a, m0b, m1a and m1b) and touch area (S) indicating the tracking state (FIG. 3). As shown, the touch area (S′) indicating the selection state is larger than the touch area (S) indicating the tracking state. This is because the user's finger 302 is pliant; when the user contacts the touch screen (or increases pressure on the touch screen) to make a selection, the finger 302 deforms at the point of contact (or deforms further as the contact pressure increases) to cover a larger area of the touch screen surface.
The computing device 201 can be used to calibrate the touch screen system 100 so as to designate a threshold touch area indicative of the tracking state. Following calibration, the computing device 201 may be programmed to designate as a "selection" any detected touch whose calculated touch area exceeds the threshold touch area. Those skilled in the art will appreciate that an exemplary calibration method involves prompting the user to perform a tracking operation relative to the touch screen 110, calculating the touch area while the user performs the tracking operation, and then storing that calculated touch area, plus an optional error or "hysteresis" value, as the threshold touch area.
In certain embodiments, the calibration method may be performed automatically while the user's finger 302 or a stylus is at rest. Such a calibration method assumes that the user's finger or stylus remains in a steady "tracking" mode for some period of time before additional pressure is applied to indicate a "selection" operation. Other methods for calibrating the exemplary touch screen system 100 will be apparent to those of ordinary skill in the art and are therefore considered to be within the scope of the present invention.
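The calibration just described can be sketched as follows. This is an illustrative reading, not the patent's implementation: the function names and the use of a multiplicative hysteresis margin are our assumptions (the patent leaves the error value unspecified).

```python
def calibrate_threshold(tracking_areas, hysteresis=0.1):
    """Store the largest touch area observed during a known tracking
    operation, plus a hysteresis margin, as the threshold touch area."""
    return max(tracking_areas) * (1.0 + hysteresis)

def classify(touch_area, threshold):
    """Areas at or below the threshold indicate tracking; larger
    areas indicate a selection ("click")."""
    return "selection" if touch_area > threshold else "tracking"
```

The hysteresis margin keeps ordinary jitter in a hovering finger's apparent area from being misread as a click.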
In certain embodiments, the following exemplary trigonometric calculations can be used to approximate the touch area. These formulas are best understood with reference to FIG. 5. It should be noted, however, that FIG. 5 is provided as an exemplary reference only.
First, let:

the cameras be located at y = 0, one unit of distance apart;
m0a = the slope of the first edge seen by Camera0;
m0b = the slope of the second edge seen by Camera0;
m0c = the average of m0a and m0b;
m1a = the slope of the first edge seen by Camera1;
m1b = the slope of the second edge seen by Camera1;
m1c = the average of m1a and m1b;
(x0a, y0a) = the intersection of m0a and m1c;
(x0b, y0b) = the intersection of m0b and m1c;
(x0c, y0c) = the intersection of m0c and m1c, which is the center of the touch;
(x1a, y1a) = the intersection of m1a and m0c;
(x1b, y1b) = the intersection of m1b and m0c;
(x1c, y1c) = the intersection of m1c and m0c, which is the same as (x0c, y0c);
r0 = the distance from Camera0 to the center of the touch;
r1 = the distance from Camera1 to the center of the touch;
w0 = the width, or distance, from point (x0a, y0a) to point (x0b, y0b); and
w1 = the width, or distance, from point (x1a, y1a) to point (x1b, y1b).
Then, the width (w0) of the touch area as seen by Camera0 is calculated using the following formulas:

x0a = m1c/(m0a - m1c)
y0a = m0a * x0a
x0b = m1c/(m0b - m1c)
y0b = m0b * x0b
x0c = m1c/(m0c - m1c)
y0c = m0c * x0c
r0 = sqrt(x0c^2 + y0c^2)
w0 = sqrt((x0a - x0b)^2 + (y0a - y0b)^2)

Similar formulas can be used to calculate the width (w1) of the touch area as seen by Camera1. Once the widths have been solved for, the touch area (S) can be calculated using the formula:

S = w0 * w1

where w0 is the width of the touch area detected by Camera0 and w1 is the width of the touch area detected by Camera1.
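The computation above can be sketched in general form as follows. This is an illustrative implementation under the FIG. 5 assumptions (cameras one unit apart on y = 0), not the patent's code: the intersection is written for arbitrary line origins, the function names are ours, and r0 (which the listing derives but does not use in S) is omitted.

```python
import math

def intersect(p0, m0, p1, m1):
    """Intersection of the line through p0 with slope m0 and the
    line through p1 with slope m1."""
    x = (m0 * p0[0] - m1 * p1[0] + p1[1] - p0[1]) / (m0 - m1)
    y = p0[1] + m0 * (x - p0[0])
    return x, y

def touch_area(m0a, m0b, m1a, m1b, cam0=(0.0, 0.0), cam1=(1.0, 0.0)):
    """Approximate S = w0 * w1 from the four edge slopes."""
    m0c = (m0a + m0b) / 2.0  # center line from Camera0
    m1c = (m1a + m1b) / 2.0  # center line from Camera1
    # Each camera's edge lines are cut by the other camera's center line.
    x0a, y0a = intersect(cam0, m0a, cam1, m1c)
    x0b, y0b = intersect(cam0, m0b, cam1, m1c)
    x1a, y1a = intersect(cam1, m1a, cam0, m0c)
    x1b, y1b = intersect(cam1, m1b, cam0, m0c)
    w0 = math.hypot(x0a - x0b, y0a - y0b)  # width seen by Camera0
    w1 = math.hypot(x1a - x1b, y1a - y1b)  # width seen by Camera1
    return w0 * w1
```

For an object whose edges, as seen from both cameras, fall at x = 0.45 and x = 0.55 on the line y = 0.5, the approximated area comes out on the order of 0.005 square units; widening the edges (a harder press) yields a larger S, which is exactly the property the threshold comparison relies on.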
FIG. 6, comprising FIG. 6A and FIG. 6B, illustrates a simple stylus 602 that has been modified to enable multiple touch areas depending on the pressure applied. The stylus 602 includes a spring-loaded plunger 604 designed to retract into the tip 606 of the stylus 602 when sufficient pressure is applied to the spring 608. Thus, when the stylus 602 hovers near the touch screen 110, or contacts the touch screen 110 with insufficient pressure to compress the spring 608, the plunger 604 remains protruded from the tip 606. The detectors 130, 131 detect the presence of the plunger 604, and the computing device 201 calculates the touch area (S) based on the detected size of the plunger. Conversely, when the stylus 602 contacts the touch screen 110 with sufficient pressure to compress the spring 608, the plunger 604 retracts into the tip 606 and the tip 606 contacts the touch screen 110. The computing device 201 then calculates the enlarged touch area (S′) based on the detected size of the stylus tip 606.
The stylus 602 of FIG. 6 is designed to operate in a manner similar to the finger 302, which produces an enlarged touch area when pressure is applied. Other stylus designs can accomplish a similar function. For example, a stylus having a rubber tip that expands (in area) when pressure is applied to it can provide a similar function. Accordingly, any stylus or other object that can be used to indicate both a smaller area and a larger area can be used in accordance with embodiments of the present invention.
FIG. 7 is a flow chart illustrating an exemplary method 700 for discriminating between the tracking state, the selection state and the out-of-bounds state. The method 700 begins at starting block 701 and proceeds to step 702, where a determination is made as to whether a finger or stylus is detected in the energized plane proximate the touch screen. If no finger or stylus is detected, the method proceeds to step 704, where the "out-of-bounds" interaction state is indicated. Following step 704, the method returns to step 702 for further processing. When a finger or stylus is detected at step 702, the method proceeds to step 706, where the image captured by the first detector is processed to determine approximated coordinates of a first pair of outer edges of the finger or stylus. For example, slope line calculations may be used to determine these coordinates. Next, at step 708, the image captured by the second detector is processed to determine approximated coordinates of a second pair of outer edges of the finger or stylus. At step 710, an approximated touch area is calculated based on the two pairs of approximated outer edge coordinates of the finger or stylus.
After the approximated touch area has been calculated at step 710, the method proceeds to step 712 for a determination as to whether the approximated touch area exceeds the threshold touch area. The threshold touch area may be established by calibration of the touch screen system 100, or may be specified by a system operator or administrator. If the approximated touch area is greater than the threshold touch area, the selection state is indicated at step 712. If the approximated touch area is not greater than the threshold touch area, the tracking state is indicated at step 714. From either step 712 or step 714, the method returns to step 702 for further processing.
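The per-iteration decision of method 700 reduces to a small comparison. A sketch, with hypothetical names (the step numbers appear only in the comments, mirroring the figure):

```python
def interaction_state(object_detected, approximated_area, threshold):
    """One iteration of method 700's classification."""
    if not object_detected:            # steps 702/704
        return "out-of-bounds"
    if approximated_area > threshold:  # step 712
        return "selection"
    return "tracking"                  # step 714
```

Run once per captured frame, this yields the stream of states that the dragging logic described next consumes.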
It will be apparent to those of ordinary skill in the art that touch position calculations can be performed in sequence with, or in parallel with, the approximation of the interaction state. Thus, if a continuous selection state is indicated over successive iterations of the exemplary method 700 and movement of the finger or stylus is detected, that continuous selection state may be interpreted as a dragging state. An indication of a continuous tracking state combined with movement of the finger or stylus may be interpreted, for example, as an indication that the cursor should follow the finger or stylus.
FIG. 8 is a state diagram illustrating the operational sequence of certain exemplary embodiments of the present invention. The tracking state 802 is indicated when the user's finger or stylus is detected in the energized plane proximate the touch screen 110 and the calculated touch area is determined to be less than or equal to the threshold touch area. If the finger or stylus is not moving (i.e., the detected velocity is approximately zero), the stationary state 804 is indicated. During the stationary state 804, the threshold touch area may optionally be calibrated, for example as a background process. From the stationary state 804, if the finger or stylus begins to move (i.e., the detected velocity is greater than zero) and the calculated touch area remains less than or equal to the threshold touch area, the tracking state 802 is again indicated.
From the stationary state 804, if the calculated touch area is determined to be greater than the threshold touch area, the selection state 806 is indicated. If the finger or stylus begins to move while the selection state 806 is indicated, the dragging state 808 is indicated. From either the selection state 806 or the dragging state 808, if the calculated touch area is determined to be less than or equal to the threshold touch area (i.e., the finger or stylus is at least partially lifted away from the touch screen 110), the stop selection state 810 is indicated. From the stop selection state 810, if the finger or stylus remains in the energized plane, the stationary state 804 is again indicated. From the tracking state 802, the stationary state 804 or the stop selection state 810, if the finger or stylus is lifted completely away from the touch screen 110, the out-of-bounds state 812 is indicated.
Those skilled in the art will recognize that the state diagram of FIG. 8 is provided by way of example only, and that additional and/or alternative states and state transitions are possible. For instance, other embodiments of the present invention may be configured to transition directly from the tracking state 802 to the selection state 806 or the dragging state 808. Similarly, certain embodiments of the invention may be configured to transition directly from the selection state 806 or the dragging state 808 to the tracking state 802 or the out-of-bounds state 812. Accordingly, the scope of the present invention is not intended to be limited by the exemplary state diagram of FIG. 8, nor by the exemplary flow chart of FIG. 7.
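One way to render the FIG. 8 transitions in code is the sketch below. It is illustrative only: as just noted, additional or alternative transitions are possible, and the state names and function signature are our assumptions.

```python
def next_state(state, in_plane, area, threshold, moving):
    """Single transition of the FIG. 8 state machine.

    States: "tracking" (802), "stationary" (804), "selection" (806),
    "drag" (808), "stop-selection" (810), "out-of-bounds" (812).
    """
    if not in_plane:                  # lifted fully away: 812
        return "out-of-bounds"
    if area > threshold:
        if state in ("selection", "drag") and moving:
            return "drag"             # selection plus movement: 808
        if state == "drag":
            return "drag"             # keep dragging while pressed
        return "selection"            # area above threshold: 806
    if state in ("selection", "drag"):
        return "stop-selection"       # partially lifted: 810
    return "tracking" if moving else "stationary"
```

Driving this function with one (in_plane, area, moving) sample per frame reproduces the hover/click/drag sequence described above.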
Those skilled in the art will also appreciate that certain functionality of exemplary embodiments of the present invention may be provided by way of any type and number of program modules, created in any programming language, which may or may not be stored locally at the computing device 201. For example, the computing device 201 may comprise a network server, client or appliance configured to execute program modules that are stored on another network device and/or configured to control a remotely situated touch screen system.
Based on the foregoing, it can be seen that the present invention provides an improved touch screen system that can approximate tracking and dragging states, regardless of the orientation of the touch, without reliance on direct sensing of touch pressure or area. Many other modifications, features and embodiments of the present invention will become evident to those of skill in the art. For example, those skilled in the art will recognize that embodiments of the present invention are useful and applicable to a variety of touch screens, including, but not limited to, optical touch screens, IR touch screens and capacitive touch screens. It should be appreciated, therefore, that many aspects of the present invention were described above by way of example only and are not intended as required or essential elements of the invention unless explicitly stated otherwise.
It should be understood, therefore, that the foregoing relates only to certain embodiments of the invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims. It should also be understood that the invention is not restricted to the illustrated embodiments and that various modifications can be made within the scope of the invention.

Claims (15)

1. A method for discriminating between user interaction states in a touch screen system, comprising:
receiving a first signal from a first detector of said touch screen system, said first signal representing a first image of an object interacting with a touch screen;
receiving a second signal from a second detector of said touch screen system, said second signal representing a second image of the object interacting with said touch screen;
processing said first signal to determine approximated coordinates of a first pair of outer edges of said object;
processing said second signal to determine approximated coordinates of a second pair of outer edges of said object;
calculating an approximated touch area based on the approximated coordinates of the first pair of outer edges of said object and the approximated coordinates of the second pair of outer edges of said object;
if said approximated touch area is less than or equal to a threshold touch area, determining that the object interacting with said touch screen indicates a tracking state;
if said approximated touch area is greater than said threshold touch area, determining that the object interacting with said touch screen indicates a selection state;
if the object interacting with said touch screen indicates the selection state, determining whether said object moves relative to said touch screen;
if said object moves relative to said touch screen, recalculating said approximated touch area and determining whether the recalculated touch area remains greater than or equal to said threshold touch area; and
if the recalculated touch area remains greater than or equal to said threshold touch area, determining that the object interacting with said touch screen indicates a dragging state.
2. the method for claim 1, wherein utilize parallax to calculate to confirm said first pair of outer peripheral approximate coordinates and said second pair of outer peripheral approximate coordinates.
3. the method for claim 1, wherein when the object indicators track state of known and said touch screen interaction, set up said threshold touch area through calibrating said touch-screen system.
4. the method for claim 1, wherein the operator through said touch-screen system sets up said threshold touch area.
5. The method of claim 4, further comprising:
if the object interacting with said touch screen indicates the selection state or the tracking state, determining whether said object becomes undetectable by said first detector and said second detector; and
if said object becomes undetectable by said first detector and said second detector, determining that the object interacting with said touch screen indicates an out-of-bounds state.
6. the method for claim 1 further comprises:
If the said touch area that recomputates not is more than or equal to said threshold touch area, then confirm object indicators track state with said touch screen interaction.
7. the method for claim 1 further comprises:
If indicate selection mode, drag state or tracking mode with the object of said touch screen interaction, confirm then whether said object becomes and can't be detected by said first detector and second detector;
Can't be detected by said first detector and second detector if said object becomes, then confirm to indicate out-of-bounds state with the object of said touch screen interaction.
8. the method for claim 1, wherein said first detector and second detector all are selected from line scan cameras, face smear camera and phototransistor.
9. A device for discriminating between user interaction states in a touch screen system, comprising:
means for receiving a first signal from a first detector of said touch screen system, said first signal representing a first image of an object interacting with a touch screen of said touch screen system;
means for receiving a second signal from a second detector of said touch screen system, said second signal representing a second image of the object interacting with said touch screen;
means for processing said first signal to determine approximated coordinates of a first pair of outer edges of said object;
means for processing said second signal to determine approximated coordinates of a second pair of outer edges of said object;
means for calculating an approximated touch area based on the approximated coordinates of the first pair of outer edges of said object and the approximated coordinates of the second pair of outer edges of said object;
means for determining, if said approximated touch area is less than or equal to a threshold touch area, that the object interacting with said touch screen indicates a tracking state;
means for determining, if said approximated touch area is greater than said threshold touch area, that the object interacting with said touch screen indicates a selection state;
means for determining, if the object interacting with said touch screen indicates the selection state, whether said object moves relative to said touch screen;
means for recalculating said approximated touch area, if said object moves relative to said touch screen, and determining whether the recalculated touch area remains greater than or equal to said threshold touch area; and
means for determining, if the recalculated touch area remains greater than or equal to said threshold touch area, that the object interacting with said touch screen indicates a dragging state.
10. The device of claim 9, wherein said means for processing said first signal to determine the approximated coordinates of the first pair of outer edges of said object comprises means for determining said approximated coordinates using parallax calculations, and wherein said means for processing said second signal to determine the approximated coordinates of the second pair of outer edges of said object comprises means for determining said approximated coordinates using parallax calculations.
11. The device of claim 9, further comprising: means for establishing said threshold touch area by calibrating said touch screen system while the object interacting with said touch screen is known to indicate the tracking state.
12. The device of claim 11, further comprising:
means for determining, if the object interacting with said touch screen indicates the selection state or the tracking state, whether said object becomes undetectable by said first detector and said second detector; and
means for determining, if said object becomes undetectable by said first detector and said second detector, that the object interacting with said touch screen indicates an out-of-bounds state.
13. The device of claim 9, further comprising:
means for determining, if the recalculated touch area is less than said threshold touch area, that the object interacting with said touch screen indicates the tracking state.
14. The device of claim 9, further comprising:
means for determining, if the object interacting with said touch screen indicates the selection state, the dragging state or the tracking state, whether said object becomes undetectable by said first detector and said second detector; and
means for determining, if said object becomes undetectable by said first detector and said second detector, that the object interacting with said touch screen indicates an out-of-bounds state.
15. The device of claim 9, wherein said first detector and said second detector are each selected from the group consisting of a line scan camera, an area scan camera and a phototransistor.
CN2008800117959A 2007-04-11 2008-04-11 Touch screen system with hover and click input methods Expired - Fee Related CN101663637B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
NZ554,416 2007-04-11
NZ5544107 2007-04-11
PCT/US2008/060102 WO2008128096A2 (en) 2007-04-11 2008-04-11 Touch screen system with hover and click input methods

Publications (2)

Publication Number Publication Date
CN101663637A CN101663637A (en) 2010-03-03
CN101663637B true CN101663637B (en) 2012-08-22

Family

ID=41790653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008800117959A Expired - Fee Related CN101663637B (en) 2007-04-11 2008-04-11 Touch screen system with hover and click input methods

Country Status (1)

Country Link
CN (1) CN101663637B (en)


Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8115753B2 (en) 2007-04-11 2012-02-14 Next Holdings Limited Touch screen system with hover and click input methods
TWI428804B (en) 2010-10-20 2014-03-01 Pixart Imaging Inc Optical screen touch system and method thereof
CN102478999B (en) * 2010-11-22 2015-07-01 原相科技股份有限公司 Optical touch control system and sensing method thereof
CN103348305B (en) * 2011-02-04 2016-11-16 皇家飞利浦有限公司 Controlled attitude system uses proprioception to create absolute reference system
US20140082559A1 (en) * 2011-02-22 2014-03-20 Bradley Neal Suggs Control area for facilitating user input
TWI494830B (en) * 2011-04-15 2015-08-01 Elan Microelectronics Corp Touch-controlled device, identifying method and computer program product thereof
JP2012247936A (en) * 2011-05-26 2012-12-13 Sony Corp Information processor, display control method and program
CN102221932B (en) * 2011-06-29 2013-04-24 华为终端有限公司 Touch screen command input method and user equipment
TWI498787B (en) * 2013-01-18 2015-09-01 Wistron Corp Optical touch system, method of touch sensing, and computer program product
JP5567727B1 (en) * 2013-09-17 2014-08-06 株式会社フジクラ Electronic device and control method of electronic device
US9395824B2 (en) * 2013-10-18 2016-07-19 Synaptics Incorporated Active pen with improved interference performance
CN109612398B (en) * 2018-12-07 2021-10-08 佳格科技(浙江)股份有限公司 Touch screen object off-screen detection method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6421042B1 (en) * 1998-06-09 2002-07-16 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
CN1653411A * 2002-05-06 2005-08-10 3M Innovative Properties Company Method for improving positioned accuracy for a determined touch input


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8289299B2 (en) 2003-02-14 2012-10-16 Next Holdings Limited Touch screen signal processing
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US8466885B2 (en) 2003-02-14 2013-06-18 Next Holdings Limited Touch screen signal processing
US8508508B2 (en) 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration
US8384693B2 (en) 2007-08-30 2013-02-26 Next Holdings Limited Low profile touch panel systems
US8432377B2 (en) 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
US8405637B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly with convex imaging window

Also Published As

Publication number Publication date
CN101663637A (en) 2010-03-03

Similar Documents

Publication Publication Date Title
CN101663637B (en) Touch screen system with hover and click input methods
US8115753B2 (en) Touch screen system with hover and click input methods
EP2353069B1 (en) Stereo optical sensors for resolving multi-touch in a touch detection system
JP4442877B2 (en) Coordinate input device and control method thereof
EP1759378B1 (en) Touch panel display system with illumination and detection provided from a single edge
CN101231450B (en) Multipoint and object touch panel arrangement as well as multipoint touch orientation method
US20100295821A1 (en) Optical touch panel
US20160154533A1 (en) Integrated light guide and touch screen frame
US20090278795A1 (en) Interactive Input System And Illumination Assembly Therefor
US8803845B2 (en) Optical touch input system and method of establishing reference in the same
CN102754047A (en) Methods and systems for position detection using an interactive volume
KR20120058594A (en) Interactive input system with improved signal-to-noise ratio (snr) and image capture method
KR20100055516A (en) Optical touchscreen with improved illumination
CN102792249A (en) Touch system using optical components to image multiple fields of view on an image sensor
CN101930306A (en) Multi-touch device and detection method thereof
TWI461990B (en) Optical imaging device and image processing method for optical imaging device
TWI511006B (en) Optical imaging system and imaging processing method for optical imaging system
US20160139735A1 (en) Optical touch screen
US9652081B2 (en) Optical touch system, method of touch detection, and computer program product
KR101308477B1 (en) Method for Detecting Touch and Display Device Using the Same
JP2006350908A (en) Optical information input device
CN101859189B (en) Optical input system and method
KR101125824B1 (en) Infrared touch screen devices
KR20120025333A (en) Infrared touch screen devices

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: NEXT HOLDINGS LTD.

Free format text: FORMER OWNER: NEXT HOLDINGS LTD.

Effective date: 20101202

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: CALIFORNIA, USA TO: AUCKLAND, NEW ZEALAND

TA01 Transfer of patent application right

Effective date of registration: 20101202

Address after: Auckland

Applicant after: Next Holdings Ltd.

Address before: California, USA

Applicant before: Next Holdings Ltd.

C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120822

Termination date: 20150411

EXPY Termination of patent right or utility model