CA2453873C - Illuminated bezel and touch system incorporating the same - Google Patents

Illuminated bezel and touch system incorporating the same

Info

Publication number
CA2453873C
Authority
CA
Canada
Prior art keywords
source
light
illumination
touch surface
touch
Prior art date
Legal status
Expired - Lifetime
Application number
CA002453873A
Other languages
French (fr)
Other versions
CA2453873A1 (en)
Inventor
Trevor M. Akitt
Neil Gordon Bullock
Gerald D. Morrison
Current Assignee
Smart Technologies ULC
Original Assignee
Smart Technologies ULC
Priority date
Filing date
Publication date
Family has litigation
Application filed by Smart Technologies ULC
Publication of CA2453873A1
Application granted
Publication of CA2453873C
Anticipated expiration
Current status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface

Abstract

A passive touch system includes a touch surface and at least one source of backlight illumination projecting backlighting across the touch surface. At least two image sensors are associated with the touch surface and acquire images of the touch surface from different locations. A digital signal processor is associated with each image sensor. The digital signal processors select pixel subsets of images acquired by the image sensors and process pixel data acquired by the selected pixel subsets to generate pointer characteristic data when a pointer exists in the acquired images. A master digital signal processor in communication with the digital signal processors triangulates the pointer characteristic data to determine the location of the pointer relative to the touch surface.

Description

ILLUMINATED BEZEL AND TOUCH SYSTEM
INCORPORATING THE SAME

Field of the Invention

The present invention relates generally to touch systems and in particular to an illuminated bezel for a touch system and to a touch system incorporating the same.
Background of the Invention

Touch systems are well known in the art and typically include a touch screen having a touch surface on which contacts are made using a pointer. Pointer contacts with the touch surface are detected and are used to generate output pointer position data representing areas of the touch surface where the contacts are made. There are basically two general types of touch systems available and they can be broadly classified as "active" touch systems and "passive" touch systems.
Active touch systems allow a user to generate pointer position data by contacting the touch surface with a special pointer that usually requires some form of on-board power source, typically batteries. The special pointer emits signals such as infrared light, visible light, ultrasonic frequencies, electromagnetic frequencies, etc. that activate the touch surface.
Passive touch systems allow a user to generate pointer position data by contacting the touch surface with a passive pointer and do not require the use of a special pointer in order to activate the touch surface. A passive pointer can be a finger, a cylinder of some material, or any suitable object that can be used to contact some predetermined area of interest on the touch surface. Since special active pointers are not necessary in passive touch systems, battery power levels and/or pointer damage, theft, or pointer misplacement are of no concern to users.
International PCT Application No. PCT/CA01/00930 filed on July 5, 2001 and published under number WO 02/03316 on January 10, 2002, assigned to the assignee of the present invention, discloses a camera-based touch system comprising a touch screen that includes a passive touch surface on which a computer-generated image is presented. A rectangular bezel or frame surrounds the touch surface and supports digital cameras at its corners.
The digital cameras have overlapping fields of view that encompass and look along the touch surface. The digital cameras acquire images of the touch surface from different locations and generate image data. The image data is processed by digital signal processors to determine if a pointer exists in the captured image data. When it is determined that a pointer exists in the captured image data, the digital signal processors convey pointer characteristic data to a master controller, which in turn processes the pointer characteristic data to determine the location of the pointer relative to the touch surface using triangulation. The pointer location data is conveyed to a computer executing one or more application programs. The computer uses the pointer location data to update the computer-generated image that is presented on the touch surface. Pointer contacts on the touch surface can therefore be recorded as writing or drawing or used to control execution of the application programs executed by the computer.
Although this touch system works extremely well, it has been found that when the digital camera frame rates are high, in less favorable light conditions the ability to determine the existence of a pointer in the captured image data is diminished. As a result, there exists a need to improve the lighting environment for the digital cameras to ensure high resolution irrespective of ambient lighting conditions.
The concept of providing an illumination source for a touch surface has been considered. For example, U.S. Patent No. 4,144,449 to Funk et al. discloses a position detection apparatus for detecting the position of a passive object. The position detection apparatus includes a generally rectangular frame having an open interior. Fluorescent tube continuous light sources extend along three sides of the frame for illuminating the open interior of the frame. Linear image detectors are mounted at opposite corners of the fourth side of the frame. Aperture-defining devices are located between the linear image detectors and the open interior of the frame for configuring coincident fields of light from the open interior for the linear image detectors to view. Unfortunately, the light emitted by fluorescent tubes is limited to a very narrow frequency range within the visible light spectrum. This makes the
position detection apparatus very susceptible to interference by ambient light.
It is therefore an object of the present invention to provide a novel illuminated bezel and a touch system incorporating the same.
Summary of the Invention

According to one aspect of the present invention there is provided in a touch system including a touch surface and at least one optical sensor looking along the touch surface to acquire images of a pointer in proximity thereto, an illumination source to provide backlighting to said at least one optical sensor comprising:
at least one light source; and a diffuser disposed between said at least one light source and said at least one optical sensor, said diffuser diffusing light projected by said at least one light source prior to said light being directed across the touch surface to said at least one optical sensor.
In one embodiment, the at least one light source includes a plurality of spaced discrete light sources. The discrete light sources are arranged in at least one row and are generally equally spaced. The light projected by the discrete light sources onto the diffuser is expanded so that the illumination source appears as a generally continuous illumination source to the at least one optical sensor.
In a preferred embodiment, the diffuser is generally transparent in a specified frequency range and generally opaque in a different specified frequency range. In one embodiment, the discrete light sources are infrared light emitting diodes and the diffuser is generally transparent in the infrared range and generally opaque in the visible range.
In an alternative embodiment, the at least one light source is a continuous light source. The continuous light source may project light in the infrared spectrum or project light in the visible spectrum. When the continuous light source projects light in the visible spectrum, the illumination source further includes a color filter either adjacent to or incorporated into the diffuser.
According to another aspect of the present invention there is provided a touch system comprising:
at least one optical sensor associated with a touch surface and having a field of view encompassing and looking across at least a portion of said touch surface;
at least one source of backlight illumination directing light into the field of view of said at least one optical sensor, said at least one illumination source including at least one light source and a diffusion medium to expand light projected by said at least one light source prior to said light being directed into said field of view; and a pass filter associated with said at least one optical sensor to pass the light directed by said source of backlight illumination to said at least one optical sensor.
According to yet another aspect of the present invention there is provided a passive touch system comprising:
a touch surface;
at least one source of backlight illumination projecting backlighting across said touch surface;
at least two image sensors associated with said touch surface, said at least two image sensors acquiring images of said touch surface from different locations and having overlapping fields of view;
a pass filter associated with each of said image sensors generally to blind said image sensors except to said projected backlighting;
a digital signal processor associated with each image sensor, the digital signal processors associated with said at least two image sensors selecting pixel subsets of images acquired by said at least two image sensors and processing pixel data acquired by the selected pixel subsets to generate pointer characteristic data when a pointer exists in said acquired images; and a master digital signal processor in communication with said digital signal processors, said master digital signal processor receiving pointer characteristic data from said digital signal processors and triangulating the
pointer characteristic data to determine the location of said pointer relative to said touch surface.
According to yet another aspect of the present invention there is provided a touch system comprising:
at least two CMOS image sensors associated with a touch surface, said at least two CMOS image sensors acquiring images of said touch surface from different locations and having overlapping fields of view;
at least one source of backlight illumination projecting backlighting across said touch surface through a diffuser;
a pass filter associated with each of said image sensors generally to blind said image sensors except to said projected backlighting;
and at least one processor receiving and processing image data acquired by said at least two CMOS image sensors to detect the existence of a pointer in said images and to determine the location of said pointer relative to said touch surface.
According to still yet another aspect of the present invention there is provided a touch system comprising:
at least two optical recording devices associated with a touch surface, said at least two optical recording devices acquiring images of said touch surface from different locations and having overlapping fields of view;
at least one source of backlight illumination projecting backlighting across said touch surface;
a pass filter associated with said optical recording devices generally to blind said optical recording devices except to said projected backlighting; and a processor receiving and processing image data acquired by said at least two optical recording devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said touch surface, wherein said processor includes first and second processing stages, said first processing stage including a plurality of digital signal processors each associated with a respective one of said optical recording
devices, said digital signal processors processing pixel data from pixel subsets of said optical recording devices and generating pointer parameter data, said second processing stage processing said pointer parameter data from said first processing stage to determine the location of the pointer.
According to still yet another aspect of the present invention there is provided an imaging assembly comprising:
a substantially rectangular bezel to surround a display surface;
at least one optical sensor mounted on said bezel, said at least one optical sensor being oriented to have a field of view looking along said display surface;
at least one source of backlight illumination within said bezel projecting backlight illumination into said field of view through a diffuser;
and a pass filter associated with said at least one optical sensor to pass the light directed by said source of backlight illumination to said at least one optical sensor.
According to still yet another aspect of the present invention there is provided a method of detecting the position of a pointer relative to a touch surface comprising the steps of:
acquiring multiple images of a pointer relative to said touch surface;
selecting pixel subsets of said acquired images;
processing pixel data acquired by the pixel subsets to detect the existence of said pointer therein and to determine the location of the pointer relative to the touch surface using triangulation; and during said acquiring providing backlight illumination across said touch surface and acquiring said images based on said backlight illumination.
The present invention provides advantages in that the illuminated bezel provides good backlighting for the optical sensors allowing the optical sensors to detect the presence of a pointer in close proximity to the touch surface in a wide range of ambient lighting conditions. This of course increases the resolution of the touch system.
Brief Description of the Drawings
Embodiments of the present invention will now be described more fully with reference to the accompanying drawings in which:
Figure 1 is a schematic diagram of a camera-based touch system;
Figure 2 is a front elevation view of a touch screen forming part of the touch system of Figure 1 including an illuminated bezel in accordance with the present invention;
Figure 3 is a cross-sectional view of a side frame assembly forming part of the illuminated bezel of Figure 2;
Figure 4 is a perspective view of the side frame assembly of Figure 3;
Figure 5 shows the radiation pattern of a discrete light source forming part of the illuminated bezel of Figure 2;
Figure 6a is a perspective view of a portion of a continuous illumination source including a row of discrete light sources and a diffuser forming part of the illuminated bezel of Figure 2;
Figure 6b is a front elevation view of the diffuser of Figure 6a showing illumination spots projected thereon by the discrete light sources;
Figure 7 is a schematic diagram of a digital camera forming part of the touch screen of Figure 2;
Figure 8 is a schematic diagram of a master controller forming part of the touch system of Figure 1;
Figure 9 is a front elevation view of the touch screen of Figure 2 showing the illumination sources and the fields of view of the digital cameras;
Figures 10a and 10b show backlight illumination projected by a continuous illumination source as seen by a digital camera with and without a diffuser;
Figures 11a to 11c are front elevation views of a diffuser showing illumination spots projected thereon at different discrete light source spacings;

Figure 12 is a front elevation view of a diffuser showing an illumination spot projected thereon at an increased discrete light source throw;
Figure 13a is a perspective view showing an alternative discrete light source orientation for an illumination source;
Figure 13b is a front elevation view of a diffuser showing an illumination spot projected thereon by the discrete light source of Figure 13a;
Figure 14a is a side elevation view showing an alternative diffuser profile for an illumination source;
Figure 14b is a front elevation view of the diffuser of Figure 14a showing an illumination spot projected thereon by a discrete light source; and Figures 15a and 15b are front elevation views of alternative discrete light source arrangements for an illumination source.
Detailed Description of the Preferred Embodiments

The present invention relates generally to a touch system including at least one optical sensor having a field of view encompassing a touch surface. At least one source of backlight illumination directs light towards the at least one optical sensor to enable pointer contacts with the touch surface to be clearly detected by the at least one optical sensor in a variety of ambient lighting conditions. Preferred embodiments of the present invention will now be described.
Turning now to Figure 1, a camera-based touch system in accordance with the present invention is shown and is generally identified by reference numeral 50. Camera-based touch system 50 is similar to that disclosed in International PCT Application Serial No. WO 02/03316, assigned to SMART Technologies Inc., assignee of the present invention. As can be seen, touch system 50 includes a touch screen 52 coupled to a digital signal processor (DSP) based master controller 54. Master controller 54 is also coupled to a computer 56. Computer 56 executes one or more application programs and provides computer-generated image output to the touch screen 52. The touch screen 52, master controller 54 and computer 56 form a closed-loop so that pointer contacts with the touch screen 52 can be recorded as writing or drawing or used to control execution of application programs executed by the computer 56.
Figure 2 better illustrates the touch screen 52. Touch screen 52 in the present embodiment includes a high-resolution display device such as a plasma display 58, the front surface of which defines a touch surface 60. The touch surface 60 is bordered by an illuminated bezel or frame 62 coupled to the display device. Illuminated bezel 62 includes elongate side frame assemblies 64 that are coupled to the sides of the plasma display 58. Each side frame assembly 64 accommodates a generally continuous illumination source 66 (see Figure 3) as will be described. The ends of the side frame assemblies 64 are joined by corner pieces 68 that house DSP-based CMOS
digital cameras 70 (see Figure 7). Each digital camera 70 is mounted within its respective corner piece 68 so that its field of view encompasses and looks across the entire touch surface 60.
One of the side frame assemblies 64 is shown in Figures 3 and 4. As can be seen, each side frame assembly 64 includes an extrusion 64a that snaps onto a side of the plasma display 58. The extrusion 64a has an open face 64b directed towards the touch surface 60 and defines a housing 64c to accommodate the generally continuous illumination source 66.
Each generally continuous illumination source 66 includes a row of discrete light sources 66a mounted on the forward surface of a printed circuit board 66b and a diffuser 66c covering the open face 64b of the extrusion 64a. The top and bottom edges of the printed circuit board 66b are received by channels 64d formed within the extrusion 64a to maintain the printed circuit board 66b in an orientation generally orthogonal to the plane of the touch surface 60. In the present embodiment, the discrete light sources 66a are in the form of infrared light emitting diodes (IR LEDs) aimed at the diffuser 66c. The spacing between each IR LED 66a is equal and is in the range of from about 1 to 2 inches. The IR LEDs 66a are oriented generally perpendicular to the plane of the diffuser 66c and are spaced from the diffuser 66c by approximately 0.8 inches. Figure 5 shows the radiation pattern of each IR LED 66a and as can be seen, the half power field of view is approximately 120°.
The printed circuit board and IR LED arrangement is made in strips of fixed length, in this case twelve (12) inch strips. A feed through power terminal 66d is provided on the rearward side of the printed circuit board 66b and is coupled to each IR LED 66a on the strip. By providing the printed circuit board and IR LED arrangement in strips, illuminated bezels 62 for a wide variety of touch screen sizes can easily be constructed by populating the extrusions 64a with the appropriate numbers of strips and attaching power lines to the feed through terminals 66d.
Each diffuser 66c is formed of plastic that is semi-transparent or transparent (i.e. generally transparent) within a specified frequency range, in this case the infrared range, but substantially opaque in the visible light spectrum. As a result, the diffuser 66c obscures the internal components of the illuminated bezel 62 from view, making the illuminated bezel more aesthetic. The diffuser 66c acts to diffuse or expand light emitted by the IR
LEDs 66a so that the illumination sources 66 are seen by the digital cameras 70 as generally continuous illumination sources. In the present embodiment, the spacing between adjacent IR LEDs 66a, the throw of the IR LEDs 66a and the distance between the IR LEDs 66a and the diffusers 66c is such that the illumination spots 72 projected onto the diffusers 66c by the IR LEDs 66a partially overlap at the diffusers 66c and remain within the boundaries of the diffusers as shown in Figures 6a and 6b. The slightly curved shape of the diffusers 66c results in the illumination spots 72 taking on a generally elliptical shape.
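As a rough illustration of the geometry described above, the following sketch estimates the size of the half-power illumination spot cast on the diffuser from the quoted LED throw and beam angle and compares it with the quoted LED spacing; the simple cone model and the helper below are illustrative assumptions rather than part of the patent.

```python
import math

def spot_diameter(throw_in: float, half_power_fov_deg: float) -> float:
    """Approximate diameter of the half-power spot cast on a flat diffuser by an
    LED aimed perpendicular to it, using a simple cone model."""
    half_angle = math.radians(half_power_fov_deg / 2.0)
    return 2.0 * throw_in * math.tan(half_angle)

# Figures quoted in the description: ~0.8 inch LED-to-diffuser distance,
# ~120 degree half-power field of view, 1 to 2 inch spacing between IR LEDs.
diameter = spot_diameter(0.8, 120.0)
print(f"spot diameter ~ {diameter:.2f} in")            # roughly 2.8 in

for spacing in (1.0, 2.0):
    print(f"spacing {spacing:.1f} in -> overlap ~ {max(diameter - spacing, 0.0):.2f} in")
```

With these numbers adjacent spots overlap at either end of the quoted spacing range, which is consistent with the partially overlapping spots shown in Figures 6a and 6b.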
One of the digital cameras 70 within a corner piece 68 is shown in Figure 7. As can be seen, each digital camera 70 includes a two-dimensional CMOS image sensor and associated lens assembly 80, a first-in-first-out (FIFO) buffer 82 coupled to the image sensor and lens assembly 80 by a data bus and a digital signal processor (DSP) 84 coupled to the FIFO 82 by a data bus and to the image sensor and lens assembly 80 by a control bus.
A boot EPROM 86 and a power supply subsystem 88 are also included. In the present embodiment, the CMOS camera image sensor is configured for a 20x640 pixel subarray that can be operated to capture image frames at rates in excess of 200 frames per second since arbitrary pixel rows can be selected. Also, since the pixel rows can be arbitrarily selected, the pixel subarray can be exposed for a greater duration for a given digital camera frame rate allowing for good operation in dark rooms as well as well-lit rooms.
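The advantage of reading only a small pixel subarray can be pictured with a short timing calculation: the less time spent reading rows out of the sensor, the more of each frame period is left for exposure at a given frame rate. The per-row readout time and the 480-row full-frame comparison below are invented for illustration and are not taken from the patent.

```python
def max_exposure_ms(frame_rate_hz: float, rows_read: int, row_readout_us: float) -> float:
    """Exposure time left per frame after reading out the selected rows, assuming a
    simple read-then-expose model (an illustrative assumption, not the sensor's spec)."""
    frame_period_ms = 1000.0 / frame_rate_hz
    readout_ms = rows_read * row_readout_us / 1000.0
    return max(frame_period_ms - readout_ms, 0.0)

ROW_READOUT_US = 20.0   # hypothetical per-row readout time

print(max_exposure_ms(200.0, 20, ROW_READOUT_US))    # 20-row subarray at 200 fps: ~4.6 ms left to expose
print(max_exposure_ms(200.0, 480, ROW_READOUT_US))   # hypothetical 480-row full frame: 0.0, readout alone exceeds the 5 ms frame period
```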
The DSP 84 provides control information to the image sensor and lens assembly 80 via the control bus. The control information allows the DSP 84 to control parameters of the image sensor and lens assembly 80 such as exposure, gain, array configuration, reset and initialization. The DSP 84 also provides clock signals to the image sensor and lens assembly 80 to control the frame rate of the image sensor and lens assembly 80.
An infrared pass filter 89 is provided on the digital camera image sensor and lens assembly 80 to blind the digital camera 70 to frequencies of light other than the light broadcast by the illuminated bezel 62.
Master controller 54 is best illustrated in Figure 8 and includes a DSP 90, a boot EPROM 92, a serial line driver 94 and a power supply subsystem 95. The DSP 90 communicates with the DSPs 84 of the digital cameras 70 over a data bus via a serial port 96 and communicates with the computer 56 over a data bus via a serial port 98 and the serial line driver 94.
The master controller 54 and each digital camera 70 follow a communication protocol that enables bi-directional communications via a common serial cable similar to a universal serial bus (USB). The transmission bandwidth is divided into thirty-two (32) 16-bit channels. Of the thirty-two channels, six (6) channels are assigned to each of the DSPs 84 in the digital cameras 70 and to the DSP 90 in the master controller 54 and the remaining two (2) channels are unused. The master controller 54 monitors the twenty-four (24) channels assigned to the DSPs 84 while the DSPs 84 monitor the six (6) channels assigned to the DSP 90 of the master controller 54.
Communications between the master controller 54 and the digital cameras 70 are performed as background processes in response to interrupts.
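The channel budget described above can be tabulated for a four-camera system such as the one shown in Figure 2; the dictionary layout and names below are purely illustrative and do not describe the protocol's actual framing.

```python
TOTAL_CHANNELS = 32       # thirty-two 16-bit channels of transmission bandwidth
CHANNELS_PER_DSP = 6
NUM_CAMERAS = 4           # one DSP 84 per digital camera, plus the master controller DSP 90

allocation = {}
next_channel = 0
for owner in [f"camera_dsp_{i}" for i in range(NUM_CAMERAS)] + ["master_dsp"]:
    allocation[owner] = list(range(next_channel, next_channel + CHANNELS_PER_DSP))
    next_channel += CHANNELS_PER_DSP
allocation["unused"] = list(range(next_channel, TOTAL_CHANNELS))

for owner, channels in allocation.items():
    print(f"{owner:>12}: {channels}")
# Four camera DSPs and the master DSP each take six channels (30 in total),
# leaving the two unused channels noted in the description.
```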

The operation of the touch system 50 will now be described.
Each digital camera 70 acquires images looking along the touch surface 60 within the field of view of its image sensor and lens assembly 80 at a desired frame rate and processes each acquired image to determine if a pointer is in the acquired image. If a pointer is in the acquired image, the image is further processed to determine characteristics of the pointer contacting or hovering above the touch surface 60. Pointer information packets (PIPs) including pointer characteristics, status and/or diagnostic information are then generated by the digital cameras 70 and the PIPs are queued for transmission to the master controller 54. The digital cameras 70 also receive and respond to command PIPs generated by the master controller 54.
The master controller 54 polls the digital cameras 70 for PIPs. If the PIPs include pointer characteristic information, the master controller 54 triangulates pointer characteristics in the PIPs to determine the position of the pointer relative to the touch surface 60 in Cartesian rectangular coordinates.
The master controller 54 in turn transmits calculated pointer position data, status and/or diagnostic information to the computer 56. In this manner, the pointer position data transmitted to the computer 56 can be recorded as writing or drawing or can be used to control execution of application programs executed by the computer 56. The computer 56 also updates the computer-generated image output conveyed to the plasma display 58 so that information presented on the touch surface 60 reflects the pointer activity.
The master controller 54 also receives commands from the computer 56 and responds accordingly as well as generates and conveys command PIPs to the digital cameras 70. Specifics concerning the processing of acquired images and the triangulation of pointer characteristics in PIPs are described in PCT Application No. WO 02/03316 and therefore will not be described further herein.
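Although the triangulation itself is left to WO 02/03316, a minimal sketch of the underlying geometry for two cameras mounted at adjacent corners of the touch surface is given below; the coordinate conventions, function name and example numbers are assumptions for illustration and are not the patent's algorithm.

```python
import math

def triangulate(angle_left_rad: float, angle_right_rad: float, baseline_m: float):
    """Intersect two camera bearings to recover the pointer position.

    The left camera is placed at (0, 0) and the right camera at (baseline_m, 0).
    Each angle is measured from the baseline toward the touch surface, so the
    pointer satisfies y = x * tan(angle_left) and y = (baseline_m - x) * tan(angle_right).
    """
    t_left = math.tan(angle_left_rad)
    t_right = math.tan(angle_right_rad)
    if t_left + t_right == 0.0:
        raise ValueError("bearings are parallel; no intersection")
    x = baseline_m * t_right / (t_left + t_right)
    return x, x * t_left

# A pointer centred on a 1.2 m wide surface, 0.4 m below the camera baseline.
print(triangulate(math.atan2(0.4, 0.6), math.atan2(0.4, 0.6), 1.2))   # -> (0.6, 0.4)
```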
To provide adequate backlighting for the digital cameras 70, the IR LEDs 66a within each side frame assembly 64 are powered and project infrared light onto the diffusers 66c. The diffusers 66c in turn, diffuse and hence, expand the illumination spots 72 so that the intensity of light passing through the diffusers into the region encompassed by the illuminated bezel 62 is generally even across the surfaces of the diffusers 66c. As a result, the illumination sources 66 appear as generally continuous illumination sources to the digital cameras 70. Since the digital cameras 70 include infrared pass filters 89, the digital cameras 70 are effectively blind to the background and only see the infrared light broadcast by the illuminated bezel 62. This backlight illumination in conjunction with the pass filters 89 allows the digital cameras 70 to capture distinct images of a pointer in proximity to the touch surface 60 since the pointer occludes some of the backlight illumination. As a result, this helps to binarize the images captured by the digital cameras 70.
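The effect of the pass filters and backlighting on a single camera scan line can be sketched as follows: the illuminated bezel appears as a run of bright pixels and a pointer as a dark gap within it. The helper, the threshold value and the sample data are arbitrary illustrative choices, not values from the patent.

```python
def find_pointer_columns(scanline, threshold=128):
    """Return the (first, last) column indices where the backlight is occluded,
    or None if no pixel on the line falls below the threshold."""
    dark = [i for i, value in enumerate(scanline) if value < threshold]
    if not dark:
        return None
    return dark[0], dark[-1]

# One backlit row of pixel intensities with a pointer occluding columns 5 to 7.
row = [230, 228, 231, 229, 232, 40, 35, 42, 230, 229, 231, 228]
print(find_pointer_columns(row))   # -> (5, 7)
```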
Figure 9 shows the fields of view of the digital cameras 70 and as can be seen in this arrangement each digital camera 70 receives backlight illumination directly from two illumination sources 66. Figure 10a shows a continuous illumination source of backlight illumination as seen by one of the digital cameras 70. For contrast, Figure 10b shows the continuous illumination source of backlight illumination as seen by one of the digital cameras 70 with the diffuser 66c removed.
In the preferred embodiment, the spacing between the IR LEDs 66a is such that the illumination spots 72 projected onto the diffusers 66c partially overlap as shown in Figures 6a and 6b. The optical properties of the diffusers 66c are such that the diffusers 66c expand the illumination spots 72 so that light passing through the diffusers has a generally even intensity over the entire surfaces of the diffusers 66c. As will be appreciated, alternative arrangements are possible. The IR LEDs 66a can be spaced so that the illumination spots 72 projected onto the diffusers 66c significantly overlap as shown in Figure 11a, abut as shown in Figure 11b or are spaced apart as shown in Figure 11c. In the case where the illumination spots 72 are spaced apart, if the optical properties of the diffusers 66c are such that the illumination spots 72 cannot be adequately expanded, the digital cameras 70 will see the illumination sources 66 as being discontinuous or discrete.
Although a particular IR LED throw, distance between the IR LEDs 66a and diffusers 66c, and angular orientation of the IR LEDs 66a with respect to the diffusers 66c have been disclosed, those of skill in the art will appreciate that the IR LED throw, distance between the IR LEDs 66a and the diffusers 66c, and the angular orientation of the IR LEDs 66a with respect to the diffusers 66c may be altered to suit the particular environment. An increase in IR LED throw or distance between the IR LEDs 66a and the diffusers 66c will result in expanded illumination spots 72 projected onto the diffusers 66c as shown in Figure 12.
Changes in the angular orientation of the IR LEDs 66a with respect to the diffusers 66c determine the geometry of the illumination spots 72 as shown in Figures 13a and 13b. In this example, the angular orientation of the IR LED 66a results in an elongate illumination spot being projected onto the diffuser 66c. The profile geometry of the diffusers 66c will also alter the profiles of the illumination spots 72 as shown in Figures 14a and 14b. In this example, the diffuser profile geometry results in circular illumination spots being projected onto the diffuser 66c rather than elliptical illumination spots as shown in Figures 6a and 6b.
Although the IR LEDs 66a have been described as being equally spaced along the lengths of the printed circuit boards 66b, those of skill in the art will appreciate that the spacing between the IR LEDs need not be equal along the lengths of the printed circuit boards. For example, the spacing between the IR LEDs 66a may be non-linear and correspond to the resolution of the digital cameras 70. In addition, although each illumination source 66 is described as including a single row of IR LEDs 66a, it will be appreciated by those of skill in the art that an array of IR LEDs 66a including stacked rows or other two-dimensional arrays of IR LEDs may be provided in each illumination source 66 to enhance the backlight illumination provided to the digital cameras 70. In the stacked row IR LED arrangement, the rows of IR LEDs 66a can be aligned as shown in Figure 15a or staggered as shown in Figure 15b.

Although the diffusers 66c are described as being formed of plastic that is generally transparent in the IR range and generally opaque in the visible range, those of skill in the art will appreciate that the diffusers 66c may be formed of other suitable materials and/or have alternative optical properties. For example, the diffusers 66c may be formed of a polymer impregnated with a suitable material to aid in light diffusion. Furthermore, the diffusers may also be designed to act as polarizers to polarize the light emitted by the illumination sources 66. The diffusers 66c can also be modified to control the backlight illumination as seen by the digital cameras 70. For example, the diffusers 66c may be provided with horizontal slits therein defining apertures to limit the vertical backlight illumination as seen by the digital cameras. In this case, backlight illumination projected by the illumination sources 66 is effectively cropped to remove top and bottom fringe effects thereby to provide a more continuous source of backlight illumination.
Rather than using discrete light sources, continuous light sources in conjunction with colour filters incorporated into the diffusers or in close proximity thereto to block unwanted frequencies can be used to provide the desired backlight illumination for the digital cameras 70. For example, the IR LEDs 66a can be replaced with electroluminescent wire extending around the illuminated bezel 62 within the side frame assemblies 64. As is known, electroluminescent wire when powered casts continuous light in the visible range in one of eight frequencies. Of course other continuous sources of IR
illumination can be used. As will be appreciated, when non-infrared light sources are used in the illumination sources 66, the filters 89 of the digital cameras 70 are selected to pass the appropriate frequencies of light broadcast by the illuminated bezel 62 and blind the digital cameras 70 to the background.
Although the touch system 50 has been described as including a plasma display 58 to present images on the touch surface 60, those of skill in the art will appreciate that this is not required. The touch screen 52 may be a rear or front projection display device or virtually any surface on which a computer generated image is projected. Alternatively, the touch system 50 may be a writeboard where images are not projected thereon.
Also, although the touch system 50 is described as including a master controller 54 separate from the digital cameras 70, if desired one of the digital cameras 70 can be conditioned to function as both a camera and the master controller and poll the other digital cameras for PIPs. In this case, it is preferred that the digital camera functioning as the master controller includes a faster DSP 84 than the remaining digital cameras.
Furthermore, although the touch system 50 has been described as including four digital cameras 70, each mounted adjacent a corner of the illuminated bezel 62, those of skill in the art will appreciate that other image sensing arrangements can be used. The touch system 50 may include basically any number of optical sensors to acquire images along the touch surface 60 and one or more illumination sources 66 to provide the desired backlight illumination.
Although preferred embodiments of the present invention have been described, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.

Claims (69)

What is claimed is:
1. In a touch system including a touch surface and at least one optical sensor looking along the touch surface to acquire images of a pointer in proximity thereto, an illumination source to provide backlighting to said at least one optical sensor comprising:
at least one light source; and a diffuser disposed between said at least one light source and said at least one optical sensor, said diffuser diffusing light projected by said at least one light source prior to said light being directed across the touch surface to said at least one optical sensor.
2. An illumination source according to claim 1 wherein said at least one light source is at least one discrete light source.
3. An illumination source according to claim 2 wherein said at least one light source includes a plurality of spaced discrete light sources.
4. An illumination source according to claim 3 wherein said discrete light sources are arranged in at least one row.
5. An illumination source according to claim 4 wherein said discrete light sources are arranged in a two-dimensional array.
6. An illumination source according to claim 5 wherein said two-dimensional array includes a plurality of stacked rows of discrete light sources.
7. An illumination source according to claim 3 wherein said discrete light sources are generally equally spaced.
8. An illumination source according to claim 3 wherein the spacing between said discrete light sources is non-linear.
9. An illumination source according to claim 3 wherein said discrete light sources are spaced such that the light projected onto the diffuser by adjacent discrete light sources overlaps.
10. An illumination source according to claim 9 wherein said spacing is such that the light projected onto the diffuser by adjacent discrete light sources significantly overlaps.
11. An illumination source according to claim 3 wherein said discrete light sources are spaced such that the light projected onto the diffuser by adjacent discrete light sources abuts.
12. An illumination source according to claim 3 wherein said discrete light sources are spaced such that the light projected onto the diffuser by adjacent discrete light sources is spaced apart.
13. An illumination source according to claim 3 wherein said diffuser diffuses the light projected by said discrete light sources so that the light directed to said at least one optical sensor is generally continuous across said illumination source.
14. An illumination source according to claim 3 wherein said diffuser is generally transparent in a specified frequency range.
15. An illumination source according to claim 14 wherein said diffuser is generally opaque in a different specified frequency range.
16. An illumination source according to claim 15 wherein said discrete light sources are infrared light emitting diodes (IR LEDs) and wherein said diffuser is generally transparent in the infrared range.
17. An illumination source according to claim 16 wherein said diffuser is generally opaque in the visible light spectrum.
18. An illumination source according to claim 3 wherein said diffuser further polarizes the light prior to being directed to said at least one optical sensor.
19. An illumination source according to claim 1 wherein said at least one light source is a continuous light source.
20. An illumination source according to claim 19 wherein said continuous light source projects light in the infrared spectrum.
21. An illumination source according to claim 19 wherein said continuous light source projects light in the visible spectrum and wherein said illumination source further includes a color filter through which said light passes.
22. An illumination source according to claim 21 wherein said color filter is adjacent to said diffuser.
23. An illumination source according to claim 21 wherein said diffuser acts as said color filter.
24. A touch system comprising:
at least one optical sensor associated with a touch surface and having a field of view encompassing and looking across at least a portion of said touch surface;

at least one source of backlight illumination directing light into the field of view of said at least one optical sensor, said at least one illumination source including at least one light source and a diffusion medium to expand light projected by said at least one light source prior to said light being directed into said field of view; and a pass filter associated with said at least one optical sensor to pass the light directed by said source of backlight illumination to said at least one optical sensor.
25. A touch system according to claim 24 wherein said source of backlight illumination extends at least partially along one side of said touch surface opposite said at least one optical sensor.
26. A touch system according to claim 25 wherein said source of backlight illumination extends generally the length of said one side.
27. A touch system according to claim 26 wherein said source of backlight illumination includes a plurality of spaced discrete light sources.
28. A touch system according to claim 27 wherein said discrete light sources are spaced such that the light projected onto the diffusion medium by adjacent discrete light sources overlaps.
29. A touch system according to claim 28 wherein said diffusion medium expands the light projected by said discrete light sources so that the light directed to said at least one optical sensor is generally continuous across said at least one source of backlight illumination.
30. A touch system according to claim 29 wherein said diffusion medium is generally transparent in a specified frequency range.
31. A touch system according to claim 30 wherein said diffusion medium is generally opaque in a different specified frequency range.
32. A touch system according to claim 31 wherein said discrete light sources are infrared light emitting diodes (IR LEDs) and wherein said diffusion medium is generally transparent in the infrared range, said pass filter being an infrared pass filter.
33. A touch system according to claim 32 wherein said diffusion medium is generally opaque in the visible light spectrum.
34. A touch system according to claim 27 wherein said diffusion medium further polarizes the light being directed to said at least one optical sensor.
35. A touch system according to claim 24 wherein said at least one light source is a continuous light source.
36. A touch system according to claim 35 wherein said continuous light source projects light in the visible spectrum and wherein said source of backlight illumination further includes a color filter through which said backlight illumination passes.
37. A touch system according to claim 36 wherein said color filter is adjacent to said diffusion medium.
38. A touch system according to claim 37 wherein said diffusion medium acts as said color filter.
39. A touch system according to claim 35 wherein said continuous light source projects light in the infrared spectrum.
40. A touch system according to claim 26 including a plurality of optical sensors adjacent corners of said touch surface and a plurality of sources of backlight illumination extending along sides of said touch surface.
41. A passive touch system comprising:
a touch surface;
at least one source of backlight illumination projecting backlighting across said touch surface;
at least two image sensors associated with said touch surface, said at least two image sensors acquiring images of said touch surface from different locations and having overlapping fields of view;
a pass filter associated with each of said image sensors generally to blind said image sensors except to said projected backlighting;
a digital signal processor associated with each image sensor, the digital signal processors associated with said at least two image sensors selecting pixel subsets of images acquired by said at least two image sensors and processing pixel data acquired by the selected pixel subsets to generate pointer characteristic data when a pointer exists in said acquired images; and a master digital signal processor in communication with said digital signal processors, said master digital signal processor receiving pointer characteristic data from said digital signal processors and triangulating the pointer characteristic data to determine the location of said pointer relative to said touch surface.
42. A passive touch system according to claim 41 including a plurality of sources of backlight illumination.
43. A passive touch system according to claim 42 wherein each source of backlight illumination extends along a different side of said touch surface.
44. A passive touch system according to claim 43 wherein said sources of backlight illumination project light directly into the fields of view of said at least two image sensors.
45. A passive touch system according to claim 44 wherein said sources of backlight illumination appear as continuous light sources to said at least two image sensors.
46. A passive touch system according to claim 45 including an image sensor at each corner of said touch surface and a source of backlight illumination along each side of said touch surface.
47. A touch system comprising:
at least two CMOS image sensors associated with a touch surface, said at least two CMOS image sensors acquiring images of said touch surface from different locations and having overlapping fields of view;
at least one source of backlight illumination projecting backlighting across said touch surface through a diffuser;
a pass filter associated with each of said image sensors generally to blind said image sensors except to said projected backlighting;
and at least one processor receiving and processing image data acquired by said at least two CMOS image sensors to detect the existence of a pointer in said images and to determine the location of said pointer relative to said touch surface.
48. A touch system according to claim 47 including a plurality of sources of backlight illumination.
49. A touch system according to claim 48 wherein each source of backlight illumination extends along a different side of said touch surface.
50. A touch system according to claim 49 wherein said sources of backlight illumination project light directly into the fields of view of said at least two image sensors.
51. A touch system according to claim 50 wherein said sources of backlight illumination appear as continuous light sources to said at least two CMOS image sensors.
52. A touch system according to claim 51 including a CMOS image sensor at each corner of said touch surface and a source of backlight illumination along each side of said touch surface.
53. A touch system comprising:
at least two optical recording devices associated with a touch surface, said at least two optical recording devices acquiring images of said touch surface from different locations and having overlapping fields of view;
at least one source of backlight illumination projecting backlighting across said touch surface;
a pass filter associated with said optical recording devices generally to blind said optical recording devices except to said projected backlighting; and a processor receiving and processing image data acquired by said at least two optical recording devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said touch surface, wherein said processor includes first and second processing stages, said first processing stage including a plurality of digital signal processors each associated with a respective one of said optical recording devices, said digital signal processors processing pixel data from pixel subsets of said optical recording devices and generating pointer parameter data, said second processing stage processing said pointer parameter data from said first processing stage to determine the location of the pointer.
54. A touch system according to claim 53 wherein said second processing stage includes a master digital signal processor receiving said pointer parameter data from said digital signal processors, said master digital signal processor triangulating said pointer parameter data.
55. A touch system according to any one of claims 53 to 54, wherein a third processing stage includes a personal computer receiving the location of said pointer relative to said touch surface from said master digital signal processor.
56. A touch system according to any one of claims 53 to 55 including a plurality of sources of backlight illumination.
57. A touch system according to claim 56 wherein each source of backlight illumination extends along a different side of said touch surface.
58. A touch system according to claim 57 wherein said sources of backlight illumination project light directly into the fields of view of said at least two optical recording devices.
59. A touch system according to claim 58 wherein said sources of backlight illumination appear as continuous light sources to said at least two optical recording devices.
60. A touch system according to claim 59 including an optical recording device at each corner of said touch surface and a source of backlight illumination along each side of said touch surface.
61. An imaging assembly comprising:
a substantially rectangular bezel to surround a display surface;
at least one optical sensor mounted on said bezel, said at least one optical sensor being oriented to have a field of view looking along said display surface;

at least one source of backlight illumination within said bezel projecting backlight illumination into said field of view through a diffuser;
and a pass filter associated with said at least one optical sensor to pass the light directed by said source of backlight illumination to said at least one optical sensor.
62. An imaging assembly according to claim 61 including an optical sensor at each corner of said bezel and a source of backlight illumination within each side of said bezel.
63. An imaging assembly according to claim 62 wherein said sources of backlight illumination appear as continuous light sources to said optical sensors.
64. An imaging assembly according to claim 63 wherein said bezel includes side frame assemblies joined by corner pieces, said side frame assemblies including housing elements engaging said display surface and accommodating said sources of backlight illumination and said corner pieces accommodating said optical sensors.
65. An imaging assembly according to any one of claims 61 to 64 wherein each optical sensor is a CMOS digital camera.
66. An imaging assembly according to claim 65, wherein each CMOS digital camera has a selectable pixel array.
67. An imaging assembly according to claim 64 wherein each source of backlight illumination includes a plurality of spaced discrete light sources.
68. An imaging assembly according to claim 63 wherein each source of backlight illumination includes a continuous light source.
69. A method of detecting the position of a pointer relative to a touch surface comprising the steps of:
acquiring multiple images of a pointer relative to said touch surface;
selecting pixel subsets of said acquired images;
processing pixel data acquired by the pixel subsets to detect the existence of said pointer therein and to determine the location of the pointer relative to the touch surface using triangulation; and during said acquiring providing backlight illumination across said touch surface and acquiring said images based on said backlight illumination.
CA002453873A 2003-01-30 2003-12-18 Illuminated bezel and touch system incorporating the same Expired - Lifetime CA2453873C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/354,168 US6972401B2 (en) 2003-01-30 2003-01-30 Illuminated bezel and touch system incorporating the same
US10/354,168 2003-01-30

Publications (2)

Publication Number Publication Date
CA2453873A1 CA2453873A1 (en) 2004-07-30
CA2453873C true CA2453873C (en) 2007-12-04

Family

ID=32770322

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002453873A Expired - Lifetime CA2453873C (en) 2003-01-30 2003-12-18 Illuminated bezel and touch system incorporating the same

Country Status (2)

Country Link
US (1) US6972401B2 (en)
CA (1) CA2453873C (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8115753B2 (en) 2007-04-11 2012-02-14 Next Holdings Limited Touch screen system with hover and click input methods
US8149221B2 (en) 2004-05-07 2012-04-03 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8289299B2 (en) 2003-02-14 2012-10-16 Next Holdings Limited Touch screen signal processing
US8384693B2 (en) 2007-08-30 2013-02-26 Next Holdings Limited Low profile touch panel systems
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
US8432377B2 (en) 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US8508508B2 (en) 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration

Families Citing this family (157)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
EP1717684A3 (en) 1998-01-26 2008-01-23 Fingerworks, Inc. Method and apparatus for integrating manual input
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US7808479B1 (en) 2003-09-02 2010-10-05 Apple Inc. Ambidextrous mouse
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
JP4052498B2 (en) 1999-10-29 2008-02-27 株式会社リコー Coordinate input apparatus and method
JP2001184161A (en) 1999-12-27 2001-07-06 Ricoh Co Ltd Method and device for inputting information, writing input device, method for managing written data, method for controlling display, portable electronic writing device, and recording medium
CN1310126C (en) 2000-07-05 2007-04-11 智能技术公司 Camera-based touch system
US6803906B1 (en) 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
US7030861B1 (en) 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
US9213443B2 (en) 2009-02-15 2015-12-15 Neonode Inc. Optical touch screen systems using reflected light
US9052777B2 (en) 2001-11-02 2015-06-09 Neonode Inc. Optical elements with alternating reflective lens facets
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US9164654B2 (en) * 2002-12-10 2015-10-20 Neonode Inc. User interface for mobile computer unit
US9052771B2 (en) * 2002-11-04 2015-06-09 Neonode Inc. Touch screen calibration and update methods
US9471170B2 (en) * 2002-11-04 2016-10-18 Neonode Inc. Light-based touch screen with shift-aligned emitter and receiver lenses
US8674966B2 (en) 2001-11-02 2014-03-18 Neonode Inc. ASIC controller for light-based touch screen
US9778794B2 (en) 2001-11-02 2017-10-03 Neonode Inc. Light-based touch screen
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US7358963B2 (en) 2002-09-09 2008-04-15 Apple Inc. Mouse having an optically-based scrolling feature
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
US8896575B2 (en) * 2002-11-04 2014-11-25 Neonode Inc. Pressure-sensitive touch screen
US8587562B2 (en) * 2002-11-04 2013-11-19 Neonode Inc. Light-based touch screen using elliptical and parabolic reflectors
US6954197B2 (en) * 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
US8902196B2 (en) * 2002-12-10 2014-12-02 Neonode Inc. Methods for determining a touch location on a touch screen
US8403203B2 (en) * 2002-12-10 2013-03-26 Neonode Inc. Component bonding using a capillary effect
US9389730B2 (en) * 2002-12-10 2016-07-12 Neonode Inc. Light-based touch screen using elongated light guides
US9195344B2 (en) * 2002-12-10 2015-11-24 Neonode Inc. Optical surface using a reflected image for determining three-dimensional position information
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US7232986B2 (en) * 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
GB2424269A (en) 2004-04-01 2006-09-20 Robert Michael Lipman Control apparatus
US7460110B2 (en) 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
US7492357B2 (en) 2004-05-05 2009-02-17 Smart Technologies Ulc Apparatus and method for detecting a pointer relative to a touch surface
US8120596B2 (en) 2004-05-21 2012-02-21 Smart Technologies Ulc Tiled touch system
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
CN100555200C (en) 2004-08-16 2009-10-28 Apple Inc. Touch sensitive devices and method of improving the spatial resolution of touch sensitive devices
WO2008111079A2 (en) 2007-03-14 2008-09-18 Power2B, Inc. Interactive devices
US10452207B2 (en) 2005-05-18 2019-10-22 Power2B, Inc. Displays and information input devices
US20070115397A1 (en) * 2005-06-24 2007-05-24 Fakespace Labs, Inc. Projection display with internal calibration bezel for video
JP2009508205A (en) 2005-09-08 2009-02-26 Power2B, Inc. Display and information input device
US20070064004A1 (en) * 2005-09-21 2007-03-22 Hewlett-Packard Development Company, L.P. Moving a graphic element
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
US7591165B2 (en) * 2006-03-29 2009-09-22 Tekscan Incorporated Control circuit for sensor array and related methods
US7538760B2 (en) 2006-03-30 2009-05-26 Apple Inc. Force imaging input device and system
US7978181B2 (en) 2006-04-25 2011-07-12 Apple Inc. Keystroke tactility arrangement on a smooth touch surface
US8279180B2 (en) 2006-05-02 2012-10-02 Apple Inc. Multipoint touch surface controller
US8552989B2 (en) 2006-06-09 2013-10-08 Apple Inc. Integrated display and touch screen
CN104965621B (en) 2006-06-09 2018-06-12 Apple Inc. Touch screen LCD and its operating method
KR101295943B1 (en) 2006-06-09 2013-08-13 Apple Inc. Touch screen liquid crystal display
KR100748469B1 (en) * 2006-06-26 2007-08-10 Samsung Electronics Co., Ltd. User interface method based on keypad touch and mobile device thereof
US7333095B1 (en) 2006-07-12 2008-02-19 Lumio Inc Illumination for optical touch panel
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
US8493330B2 (en) 2007-01-03 2013-07-23 Apple Inc. Individual channel phase delay scheme
US9710095B2 (en) 2007-01-05 2017-07-18 Apple Inc. Touch screen stack-ups
US20080221930A1 (en) 2007-03-09 2008-09-11 Spacelabs Medical, Inc. Health data collection tool
US8471830B2 (en) * 2007-07-06 2013-06-25 Neonode Inc. Scanning of a touch screen
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US8330730B1 (en) * 2007-09-04 2012-12-11 Imaging Systems Technology, Inc. Calibrating of interactive touch system for image compositing
US8102377B2 (en) * 2007-09-14 2012-01-24 Smart Technologies Ulc Portable interactive media presentation system
KR101407301B1 (en) 2007-12-03 2014-06-13 LG Display Co., Ltd. Touch panel display apparatus
US8963796B2 (en) * 2008-01-07 2015-02-24 Smart Technologies Ulc Method of launching a selected application in a multi-monitor computer system and multi-monitor computer system employing the same
WO2009094494A2 (en) * 2008-01-23 2009-07-30 Ashoff Richard D Programmable, progressive, directing lighting systems: apparatus and method
US7781722B2 (en) * 2008-02-07 2010-08-24 Lumio Inc Optical touch screen assembly
KR100969977B1 (ko) * 2008-02-25 2010-07-15 Samsung SDI Co., Ltd. Plasma display device
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US8887063B2 (en) * 2008-05-21 2014-11-11 Smart Technologies Ulc Desktop sharing method and system
US8508488B2 (en) * 2008-06-12 2013-08-13 Samsung Sdi Co., Ltd. Display apparatus having touch screen function
WO2009152715A1 (en) * 2008-06-18 2009-12-23 Beijing Irtouch Systems Co., Ltd. Sensing apparatus for touch detection
US9513705B2 (en) 2008-06-19 2016-12-06 Tactile Displays, Llc Interactive display with tactile feedback
US8217908B2 (en) 2008-06-19 2012-07-10 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US8115745B2 (en) 2008-06-19 2012-02-14 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US8665228B2 (en) 2008-06-19 2014-03-04 Tactile Displays, Llc Energy efficient interactive display with energy regenerative keyboard
US20100001978A1 (en) * 2008-07-02 2010-01-07 Stephen Brian Lynch Ambient light interference reduction for optical input devices
US8427453B2 (en) * 2008-07-10 2013-04-23 Pixart Imaging Inc. Optical sensing system
TWI397847B (en) 2009-09-17 2013-06-01 Pixart Imaging Inc Optical touch device and locating method thereof
US8305345B2 (en) * 2008-08-07 2012-11-06 Life Technologies Co., Ltd. Multimedia playing device
WO2010025557A1 (en) 2008-09-03 2010-03-11 Smart Technologies Ulc Method of displaying applications in a multi-monitor computer system and multi-monitor computer system employing the method
US20100083109A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method
US20100079409A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Touch panel for an interactive input system, and interactive input system incorporating the touch panel
US8810522B2 (en) * 2008-09-29 2014-08-19 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
US8339378B2 (en) 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
KR100982331B1 (ko) * 2008-12-01 2010-09-15 Samsung SDI Co., Ltd. Plasma display device
KR20110112831A (en) 2009-01-05 2011-10-13 Smart Technologies ULC Gesture recognition method and interactive input system employing same
US20100201637A1 (en) * 2009-02-11 2010-08-12 Interacta, Inc. Touch screen display system
US20100201812A1 (en) * 2009-02-11 2010-08-12 Smart Technologies Ulc Active display feedback in interactive input systems
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US9063614B2 (en) 2009-02-15 2015-06-23 Neonode Inc. Optical touch screens
TWM363032U (en) * 2009-02-25 2009-08-11 Pixart Imaging Inc Optical touch control module
US8502803B2 (en) * 2009-04-07 2013-08-06 Lumio Inc Drift compensated optical touch screen
US8770815B2 (en) * 2009-05-18 2014-07-08 Sony Corporation Active bezel edge lighting with diffuser layer
US20100289942A1 (en) * 2009-05-18 2010-11-18 Sony Corporation And Sony Electronics Feedback system for optimizing exposure
US20100309169A1 (en) * 2009-06-03 2010-12-09 Lumio Inc. Optical Touch Screen with Reflectors
KR20100131634A (ko) * 2009-06-08 2010-12-16 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN102597796B (en) * 2009-06-16 2015-02-04 百安托国际有限公司 Two-dimensional position sensing systems and sensors therefor
WO2011003171A1 (en) * 2009-07-08 2011-01-13 Smart Technologies Ulc Three-dimensional widget manipulation on a multi-touch panel
US8692768B2 (en) * 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
US8654524B2 (en) 2009-08-17 2014-02-18 Apple Inc. Housing as an I/O device
MX2012002504A (en) * 2009-09-01 2012-08-03 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (snr) and image capture method.
TWI412975B (en) * 2009-09-02 2013-10-21 Pixart Imaging Inc Gesture recognition method and interactive system using the same
EP2488837A4 (en) 2009-10-16 2017-11-15 Spacelabs Healthcare LLC Light enhanced flow tube
US9604020B2 (en) 2009-10-16 2017-03-28 Spacelabs Healthcare Llc Integrated, extendable anesthesia system
US20110095989A1 (en) * 2009-10-23 2011-04-28 Smart Technologies Ulc Interactive input system and bezel therefor
KR101070864B1 (ko) * 2009-12-11 2011-10-10 Kim Sung-Han Optical touch screen
US8502789B2 (en) * 2010-01-11 2013-08-06 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US8624835B2 (en) * 2010-01-13 2014-01-07 Smart Technologies Ulc Interactive input system and illumination system therefor
US20110176082A1 (en) * 2010-01-18 2011-07-21 Matthew Allard Mounting Members For Touch Sensitive Displays
US20110187678A1 (en) * 2010-01-29 2011-08-04 Tyco Electronics Corporation Touch system using optical components to image multiple fields of view on an image sensor
US20110193969A1 (en) * 2010-02-09 2011-08-11 Qisda Corporation Object-detecting system and method by use of non-coincident fields of light
KR101167061B1 (ko) * 2010-03-05 2012-07-27 Crucialtec Co., Ltd. Optical joystick and mobile device having it
US9383864B2 (en) 2010-03-31 2016-07-05 Smart Technologies Ulc Illumination structure for an interactive input system
US8872772B2 (en) 2010-04-01 2014-10-28 Smart Technologies Ulc Interactive input system and pen tool therefor
US10719131B2 (en) 2010-04-05 2020-07-21 Tactile Displays, Llc Interactive display with tactile feedback
US20200393907A1 (en) 2010-04-13 2020-12-17 Tactile Displays, Llc Interactive display with tactile feedback
US8338725B2 (en) 2010-04-29 2012-12-25 Au Optronics Corporation Camera based touch system
CN103119476B (en) * 2010-06-09 2018-03-27 百安托国际有限公司 Modularization position sensing and method
CN102375793A (zh) * 2010-08-12 2012-03-14 Shanghai Kedou Electronic Technology Co., Ltd. USB (universal serial bus) communication touch screen
KR101362170B1 (ko) * 2010-10-19 2014-02-13 LG Display Co., Ltd. Optical Sensing Frame and Display Device Including the Same
EP2447811B1 (en) * 2010-11-02 2019-12-18 LG Display Co., Ltd. Infrared sensor module, touch sensing method thereof, and auto calibration method applied to the same
US20140159921A1 (en) * 2010-11-19 2014-06-12 Spacelabs Healthcare Llc Configurable, Portable Patient Monitoring System
US9047747B2 (en) 2010-11-19 2015-06-02 Spacelabs Healthcare Llc Dual serial bus interface
US8804056B2 (en) * 2010-12-22 2014-08-12 Apple Inc. Integrated touch screens
US9629566B2 (en) 2011-03-11 2017-04-25 Spacelabs Healthcare Llc Methods and systems to determine multi-parameter managed alarm hierarchy during patient monitoring
TW201239710A (en) 2011-03-29 2012-10-01 Genius Electronic Optical Co Ltd Optical touch system
US8600107B2 (en) 2011-03-31 2013-12-03 Smart Technologies Ulc Interactive input system and method
CN102736792A (en) * 2011-04-01 2012-10-17 玉晶光电股份有限公司 Optical profile type touch control system
FR2976093B1 (en) * 2011-06-01 2013-08-16 Thales Sa OPTICAL TRANSMITTER AND RECEIVER TOUCH SYSTEM
US9292109B2 (en) 2011-09-22 2016-03-22 Smart Technologies Ulc Interactive input system and pen tool therefor
WO2013104062A1 (en) 2012-01-11 2013-07-18 Smart Technologies Ulc Interactive input system and method
JP6259818B2 (ja) * 2012-05-15 2018-01-10 Spacelabs Healthcare LLC Configurable portable patient monitoring system
US9557846B2 (en) 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
US9785291B2 (en) 2012-10-11 2017-10-10 Google Inc. Bezel sensitive touch screen system
US9921661B2 (en) 2012-10-14 2018-03-20 Neonode Inc. Optical proximity sensor and associated user interface
US9207800B1 (en) 2014-09-23 2015-12-08 Neonode Inc. Integrated light guide and touch screen frame and multi-touch determination method
US10282034B2 (en) 2012-10-14 2019-05-07 Neonode Inc. Touch sensitive curved and flexible displays
US9164625B2 (en) 2012-10-14 2015-10-20 Neonode Inc. Proximity sensor for determining two-dimensional coordinates of a proximal object
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
US10987026B2 (en) 2013-05-30 2021-04-27 Spacelabs Healthcare Llc Capnography module with automatic switching between mainstream and sidestream monitoring
US9504620B2 (en) 2014-07-23 2016-11-29 American Sterilizer Company Method of controlling a pressurized mattress system for a support structure
KR102471578B1 (ko) * 2016-05-25 2022-11-25 ams Sensors Singapore Pte. Ltd. Microlens Array Diffusers
EP3339951A1 (en) * 2016-12-20 2018-06-27 Nokia Technologies Oy Fill lighting apparatus
CN110300950B (en) 2017-02-06 2023-06-16 FlatFrog Laboratories AB Optical coupling in touch sensing systems
PL422371A1 (en) * 2017-07-27 2019-01-28 Paweł Ryłko Method for obtaining tactile function on images projected by still projectors on any flat surfaces
WO2019172826A1 (en) * 2018-03-05 2019-09-12 Flatfrog Laboratories Ab Improved touch-sensing apparatus
WO2019190481A1 (en) * 2018-03-27 2019-10-03 Ford Global Technologies, Llc Display for an autonomous taxi
EP3887192B1 (en) 2018-11-28 2023-06-07 Neonode Inc. Motorist user interface sensor
US11842014B2 (en) 2019-12-31 2023-12-12 Neonode Inc. Contactless touch input system
JP2023512682A (ja) 2020-02-10 2023-03-28 FlatFrog Laboratories AB Improved touch detector
KR20230074269A (en) 2020-09-30 2023-05-26 네오노드, 인크. optical touch sensor
CN112307956A (zh) * 2020-10-30 2021-02-02 Vivo Mobile Communication Co., Ltd. Display screen and electronic equipment

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US611538A (en) * 1898-09-27 Permutation whip-lock
JPS5936295B2 1981-06-23 1984-09-03 Hitachi Kokusai Electric Inc. Optical coordinate input device
US4507557A (en) * 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
DE3616490A1 (en) * 1985-05-17 1986-11-27 Alps Electric Co Ltd OPTICAL COORDINATE INPUT DEVICE
JPS61262917A (en) * 1985-05-17 1986-11-20 Alps Electric Co Ltd Filter for photoelectric touch panel
JPS6375918A (en) * 1986-09-19 1988-04-06 Alps Electric Co Ltd Coordinate input device
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5317140A (en) * 1992-11-24 1994-05-31 Dunthorn David I Diffusion-assisted position location particularly for visual pen detection
US5502568A (en) * 1993-03-23 1996-03-26 Wacom Co., Ltd. Optical position detecting unit, optical coordinate input unit and optical position detecting method employing a pattern having a sequence of 1's and 0's
JPH08240407A (en) 1995-03-02 1996-09-17 Matsushita Electric Ind Co Ltd Position detecting input device
US5786810A (en) * 1995-06-07 1998-07-28 Compaq Computer Corporation Method of determining an object's position and associated apparatus
JPH0991094A (en) 1995-09-21 1997-04-04 Sekisui Chem Co Ltd Coordinate detector for touch panel
JP3807779B2 (en) 1996-05-29 2006-08-09 Fujitsu Ltd. Coordinate detection device
US5936615A (en) * 1996-09-12 1999-08-10 Digital Equipment Corporation Image-based touchscreen
JP3624070B2 (ja) * 1997-03-07 2005-02-23 Canon Inc. Coordinate input device and control method thereof
JP3876942B2 (en) 1997-06-13 2007-02-07 Wacom Co., Ltd. Optical digitizer
WO1999040562A1 (en) 1998-02-09 1999-08-12 Joseph Lev Video camera computer touch screen system
JP4033582B2 (ja) * 1998-06-09 2008-01-16 Ricoh Co., Ltd. Coordinate input / detection device and electronic blackboard system
US6335724B1 (en) * 1999-01-29 2002-01-01 Ricoh Company, Ltd. Method and device for inputting coordinate-position and a display board system
JP4057200B2 (ja) * 1999-09-10 2008-03-05 Ricoh Co., Ltd. Coordinate input device and recording medium for coordinate input device
JP3934846B2 (ja) * 2000-03-06 2007-06-20 Ricoh Co., Ltd. Coordinate input / detection device, electronic blackboard system, light receiving element positional deviation correction method, and storage medium
JP2001265516A (en) * 2000-03-16 2001-09-28 Ricoh Co Ltd Coordinate input device
JP2001282445A (en) * 2000-03-31 2001-10-12 Ricoh Co Ltd Coordinate input/detecting device and information display input device
US6531999B1 (en) * 2000-07-13 2003-03-11 Koninklijke Philips Electronics N.V. Pointing direction calibration in video conferencing and other camera-based system applications

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8289299B2 (en) 2003-02-14 2012-10-16 Next Holdings Limited Touch screen signal processing
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US8466885B2 (en) 2003-02-14 2013-06-18 Next Holdings Limited Touch screen signal processing
US8508508B2 (en) 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration
US8149221B2 (en) 2004-05-07 2012-04-03 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8115753B2 (en) 2007-04-11 2012-02-14 Next Holdings Limited Touch screen system with hover and click input methods
US8384693B2 (en) 2007-08-30 2013-02-26 Next Holdings Limited Low profile touch panel systems
US8432377B2 (en) 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
US8405637B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly with convex imaging window

Also Published As

Publication number Publication date
CA2453873A1 (en) 2004-07-30
US20040149892A1 (en) 2004-08-05
US6972401B2 (en) 2005-12-06

Similar Documents

Publication Publication Date Title
CA2453873C (en) Illuminated bezel and touch system incorporating the same
CA2819551C (en) Multi-touch input system with re-direction of radiation
US8872772B2 (en) Interactive input system and pen tool therefor
US8466885B2 (en) Touch screen signal processing
CA2786338C (en) Interactive system with synchronous, variable intensity of illumination
JP4668897B2 (en) Touch screen signal processing
US20080068352A1 (en) Apparatus for detecting a pointer within a region of interest
US8619027B2 (en) Interactive input system and tool tray therefor
KR100829172B1 (en) Infrared Touch Screen Apparatus and Method for Calculation of Coordinates at a Touch Point of the Same
WO2010122762A1 (en) Optical position detection apparatus
US20130100022A1 (en) Interactive input system and pen tool therefor
US8102377B2 (en) Portable interactive media presentation system
KR20120058594A (en) Interactive input system with improved signal-to-noise ratio (snr) and image capture method
KR20080098374A (en) Uniform illumination of interactive display panel
EP2435893A1 (en) Optical position detection apparatus
US20100207909A1 (en) Detection module and an optical detection device comprising the same
TWI737591B (en) Touch screen rear projection display
US20110170253A1 (en) Housing assembly for imaging assembly and fabrication method therefor
US8913035B2 (en) Optical touch panel and light guide module thereof
WO2011120145A1 (en) Interactive input device with palm reject capabilities
KR20150080298A (en) Touch recognition apparatus of curved display
KR101196404B1 (en) System of touch panel
JPH0681027U (en) Photoelectric switch device

Legal Events

Date Code Title Description
EEER Examination request
MKEX Expiry

Effective date: 20231218