WO2012172360A2 - Touch-sensitive display devices - Google Patents

Touch-sensitive display devices

Info

Publication number
WO2012172360A2
Authority
WO
WIPO (PCT)
Prior art keywords
light
camera
touch
image
plane
Application number
PCT/GB2012/051379
Other languages
French (fr)
Other versions
WO2012172360A3 (en)
Inventor
Gareth John McCaughan
Paul Richard Routley
Original Assignee
Light Blue Optics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Light Blue Optics Ltd filed Critical Light Blue Optics Ltd
Publication of WO2012172360A2 publication Critical patent/WO2012172360A2/en
Publication of WO2012172360A3 publication Critical patent/WO2012172360A3/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/745 Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00 Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/22 Processes or apparatus for obtaining an optical image from holograms
    • G03H1/2294 Addressing the hologram to an active spatial light modulator

Definitions

  • This invention relates to touch sensing systems, to related methods and to corresponding processor control code. More particularly the invention relates to systems employing image projection techniques in combination with a touch sensing system which projects a plane of light adjacent the displayed image.
  • aspects of the invention relate to the suppression of flicker from alternating current powered ambient light in such systems.
  • a touch sensitive device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said signal processor is further configured to: detect when an oscillatory component of ambient light captured by said camera has a frequency different to an integral multiple of a frame rate of said camera; and adjust said frame rate of said camera responsive to said detection to an adjusted frame rate, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
  • Embodiments of the touch sensitive display systems employ differencing between touch sensor light source (infra-red laser) on and off frames to subtract out ambient light, in particular ambient infra-red light in a captured touch sense image.
  • the subsequent processing in particular the finger location algorithms, are relatively insensitive to slow changes in background level across a captured image. This is broadly speaking because they employ a procedure which looks for a local increase in scattered light from a finger or other object, that is by identifying local areas or regions of increased brightness within a captured touch sense image.
  • Artificial illumination, in particular fluorescent lighting, can beat with the effective frame capture rate.
  • the effective frame capture rate may be, for example, 30 frames per second (fps) for a camera operating at 60 fps and processing alternate frames to capture scattered light (for identifying finger or other object positions).
  • laser-off frames may not be needed, for example where a bright laser light source is employed and/or where the subsequent processing is less sensitive to ambient light (for example where such processing includes correlating with a shape characteristic of the scattered light from a finger - typically a generally half moon shape but optionally an oval, teardrop or arcuate shape).
  • the effective frame rate may be the same as the camera frame rate.
  • the camera frame rate is chosen so that an integral multiple of this frame rate is substantially equal to the illumination flicker frequency, typically 100Hz or 120Hz for 50 Hz/60 Hz mains respectively.
  • the ambient light flicker frequency is close to an integral multiple of the effective camera frame rate, a slow or substantially stationary beating is observed. The effect of this depends, in part, on the type of camera employed, more particularly whether or not the camera is a rolling shutter camera. Nonetheless, broadly speaking, where the beating is slow or stationary the effect on a captured touch sense image is generally to provide a slowly changing gradient or broad fuzzy bar of ambient illumination across the captured image. This varies slowly over time.
  • the finger location processing is not substantially affected by an effect of this type which varies relatively slowly.
  • if the camera is operating at, say, 60 fps whilst the mains frequency is, say, 50 Hz then a relatively fast flicker can result which can interfere with finger detection/location (for example, at a 30 fps effective rate the 100 Hz flicker beats against its nearest integral multiple, 90 Hz, leaving a 10 Hz residual flicker).
  • this is addressed by switching to an adjusted camera frame rate so that an integral multiple of this adjusted frame rate (which integral multiple may be unity) is within 10% of the illumination flicker frequency; preferably within 8%, 5%, 3%, 2% or 1% of the target frequency.
  • the effect of an error here is to produce a spurious signal at the difference between the effective frame rates, and thus preferably the absolute difference in (effective) frame rate is less than 2Hz.
  • an integral multiple of the adjusted frame rate is within 2Hz of the illumination flicker frequency.
  • the signal processor is configured to determine an instantaneous level of the oscillatory component of the captured ambient light by averaging pixel values in two orthogonal directions over part or all of a captured frame.
  • this may be achieved by correlating a pattern of variation of light level in a captured image in a direction of motion of the rolling shutter with a variation of the light level at an expected flicker frequency, for example trying each of 100 Hz/120 Hz. (In a rolling shutter camera the pixel levels along a single row have substantially the same phase, but different phases at different vertical positions).
  • Embodiments of the signal processing determine a difference frequency signal level dependent upon a level of a frequency component at a difference frequency between an effective frame rate of the camera and an expected flicker frequency. This signal level can then be used to determine whether or not to switch camera frame rates.
  • the difference frequency level is determined by correlating light intensity level data (dependent on the artificial illumination) from the captured images with one or more waveforms at the expected frequency.
  • a sinusoidal or square waveform may be used.
  • two 90° out-of-phase waveforms are employed. The correlation may be performed by summing over a block of samples having a length determined by a period of the one or more waveforms.
  • a signal dependent upon the captured ambient light level may be digitally filtered to detect a level of a frequency component in the signal, in particular using an IIR (Infinite Impulse Response) filter.
  • the signal processing determines a measure of a difference frequency level for each of two expected illumination flicker frequencies, and then compares one against the other to determine whether or not to switch the camera frame rate.
  • a constant and/or average ambient light level may also be included in this comparison to provide a degree of hysteresis/noise immunity.
  • more sophisticated algorithms may be employed to determine whether or not to switch the frame rate of the touch sense camera, for example by employing flicker level history data, comprising data defining a relative or absolute level of flicker in the captured images when the camera was last at a different effective frame rate.
  • the field of view of the touch sense camera will extend beyond the region of the plane/fan of light.
  • portions of the touch sense image beyond the plane of light may be used for compensating for illumination flicker even when the laser is on.
  • the invention provides a method of ambient light interference suppression in a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; the method comprising: correlating an ambient light intensity signal from said touch sense image with a difference frequency between a frame rate or an integral fraction of a frame rate of said camera and an expected frequency of an oscillatory component of said ambient light captured by said camera; determining when said oscillatory component of said ambient light captured by said camera has a frequency different to an integral multiple of said frame rate or said integral fraction of said frame rate of said camera; and adjusting said frame rate of said camera to an adjusted frame rate, responsive to said determining, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
  • the invention also provides a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and wherein said signal processor is further configured to: determine a level of ambient light captured by said camera from portions of said touch sense image beyond said plane of light for captured touch sense images when said projection is on; and compensate for said determined level of ambient light to identify said scattered light in a said touch sense image for identifying said location of said object.
  • the amplitude modulation modulates the laser output between a state in which the laser is on and a state in which the laser is substantially off; the modulation may thus be on-off modulation.
  • a rolling shutter camera is employed, a still further approach is contemplated, in which the laser turns on (or off) whilst a frame is captured. In this case part of the frame will have the laser off, and part of the frame will have laser on. This can be used for enhanced ambient light level compensation (whether or not 'flicker compensation' is also employed), by determining a background level of illumination from an off portion of a frame. This may then be subtracted from a spatially corresponding (or non-corresponding) laser-on frame or frame portion.
  • a touch sensitive device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and wherein said camera is a rolling shutter camera; wherein said rolling shutter captures light from a portion of a field of view of said camera where said projection is off; and wherein said signal processor is further configured to determine said level of ambient light from a portion of said touch sense image within said plane of light when said projection is turned from on to off.
  • a touch sensitive device as described above may be an image display device, in particular comprising an image projector, such as a holographic image projector, to project a displayed image onto the surface in front of the device.
  • the touch sensor light source is then configured to project the plane of light above the displayed image
  • the signal processor is configured to process a touch sense image to identify a lateral location of the or each of said objects relative to the displayed image.
  • references to on-off modulation include amplitude modulation in which the laser output changes between a state in which the laser is on and a state in which the laser is substantially off as well as binary on-off modulation.
  • the invention further provides processor control code configured to implement the above described signal processing.
  • the code is provided on a physical carrier such as a disk, CD- or DVD-ROM, programmed memory or other physical computer-readable medium.
  • Code and/or data to implement embodiments of the invention may comprise source, object or executable code in one or more conventional programming languages (interpreted or compiled), or the code may comprise code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array) or code for a hardware description language such as Verilog, VHDL, or SystemC.
  • Embodiments of each of the above described aspects of the invention are not limited to use with any particular type of projection technology.
  • the techniques of the invention may also be applied to other forms of projection technology including, but not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc.
  • Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device suitable for implementing embodiments of the invention, and details of a plane of light-based touch sensing system for the device;
  • Figures 2a and 2b show, respectively, a holographic image projection system for use with the device of Figure 1, and a functional block diagram of the device of Figure 1;
  • Figures 3a to 3c show, respectively, an embodiment of a touch sensitive image display device according to an aspect of the invention, use of a crude peak locator to find finger centroids, and the resulting finger locations.
  • Figures 1a and 1b show an example touch sensitive holographic image projection device 100 comprising a holographic image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102.
  • a proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device.
  • a holographic image projector is merely described by way of example; the techniques we describe herein may be employed with any type of image projection system.
  • the holographic image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop. This entails projecting at an acute angle onto the display surface (the angle between a line joining the centre of the output of the projection optics and the middle of the displayed image and a line in a plane of the displayed image is less than 90°).
  • We sometimes refer to projection onto a horizontal surface, conveniently but not essentially non-orthogonally, as "table down projection". A holographic image projector is particularly suited to this application because it can provide a wide throw angle, long depth of field, and substantial distortion correction without significant loss of brightness/efficiency. Boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
  • the touch sensing system 250, 258, 260 comprises an infrared laser illumination system (IR line generator) 250 configured to project a sheet of infrared light 256 just above, for example ~1 mm above, the surface of the displayed image 150 (although in principle the displayed image could be distant from the touch sensing surface).
  • the laser illumination system 250 may comprise an IR LED or laser 252, preferably collimated, then expanded in one direction by light sheet optics 254, which may comprise a negative or cylindrical lens.
  • light sheet optics 254 may include a 45 degree mirror adjacent the base of the housing 102 to fold the optical path to facilitate locating the plane of light just above the displayed image.
  • a CMOS imaging sensor (touch camera) 260 is provided with an IR-pass lens 258 and captures light scattered by touching the displayed image 150, with an object such as a finger, through the sheet of infrared light 256.
  • the boundaries of the CMOS imaging sensor field of view are indicated by lines 257, 257a,b.
  • the touch camera 260 provides an output to touch detect signal processing circuitry as described further later.
  • Figure 2a shows an example holographic image projection system architecture 200 in which the SLM may advantageously be employed.
  • the architecture of Figure 2 uses dual SLM modulation - low resolution phase modulation and higher resolution amplitude (intensity) modulation. This can provide substantial improvements in image quality, power consumption and physical size.
  • the primary gain of holographic projection over imaging is one of energy efficiency.
  • the low spatial frequencies of an image can be rendered holographically to maintain efficiency and the high-frequency components can be rendered with an intensity-modulating imaging panel, placed in a plane conjugate to the hologram SLM.
  • the hologram SLM is preferably a fast multi-phase device, for example a pixellated MEMS-based piston actuator device.
  • SLM1 is a pixellated MEMS-based piston actuator SLM as described above, to display a hologram - for example a 160 × 160 pixel device with physically small lateral dimensions, e.g. <5 mm or <1 mm.
  • L1, L2 and L3 are collimation lenses (optional, depending upon the laser output) for the respective red, green and blue lasers.
  • M1, M2 and M3 are dichroic mirrors, implemented as a prism assembly.
  • M4 is a beam-turning mirror.
  • SLM2 is an imaging SLM and has a resolution at least equal to the target image resolution (e.g. 854 × 480); it may comprise an LCOS (liquid crystal on silicon) or DMD (Digital Micromirror Device) panel.
  • Diffraction optics 210 comprises lenses LD1 and LD2, forms an intermediate image plane on the surface of SLM2, and has effective focal length f such that the replay field fλ/Δ (where λ is the illumination wavelength and Δ the hologram pixel pitch) covers the active area of imaging SLM2.
  • optics 210 perform a spatial Fourier transform to form a far field illumination pattern in the Fourier plane, which illuminates SLM2.
  • PBS2 (Polarising Beam Splitter 2) transmits incident light to SLM2, and reflects emergent light into the relay optics 212 (liquid crystal SLM2 rotates the polarisation by 90 degrees).
  • PBS2 preferably has a clear aperture at least as large as the active area of SLM2.
  • Relay optics 212 relay light to the diffuser D1.
  • M5 is a beam-turning mirror.
  • D1 is a diffuser to reduce speckle.
  • Projection optics 214 project the object formed on D1 by the relay optics 212, and preferably provide a large throw angle, for example >90°, for angled projection down onto a table top (the design is simplified by the relatively low scatter from the diffuser).
  • the different colours are time-multiplexed and the sizes of the replayed images are scaled to match one another, for example by padding a target image for display with zeros (the field size of the displayed image depends upon the pixel size of the SLM not on the number of pixels in the hologram).
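To make the zero-padding concrete, below is a minimal numpy sketch. It assumes only that the replay field size scales with wavelength (field ∝ λ/Δ), so the content for a longer wavelength must occupy a proportionally smaller fraction of its target array; the function name, nearest-neighbour resampling and centring are illustrative, not from the patent:

    import numpy as np

    def scale_into_field(target, lam, lam_min, field_pixels):
        """Zero-pad one colour's target image so the replayed images of the
        time-multiplexed colours come out the same physical size: the replay
        field grows with wavelength, so the content is drawn smaller
        (fraction ~ lam_min/lam) for longer wavelengths."""
        h, w = target.shape
        frac = lam_min / lam
        nh, nw = max(1, int(h * frac)), max(1, int(w * frac))
        yi = np.arange(nh) * h // nh               # nearest-neighbour row indices
        xi = np.arange(nw) * w // nw               # nearest-neighbour column indices
        small = target[yi][:, xi]
        out = np.zeros((field_pixels, field_pixels), dtype=target.dtype)
        top, left = (field_pixels - nh) // 2, (field_pixels - nw) // 2
        out[top:top + nh, left:left + nw] = small  # content centred in zero padding
        return out

    # blue = scale_into_field(img, 450e-9, 450e-9, 1024)  # reference, little padding
    # red  = scale_into_field(img, 640e-9, 450e-9, 1024)  # drawn smaller, more padding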
  • a system controller and hologram data processor 202 inputs image data and provides low spatial frequency hologram data 204 to SLM1 and higher spatial frequency intensity modulation data 206 to SLM2.
  • the controller also provides laser light intensity control data 208 to each of the three lasers.
  • For details of a suitable hologram calculation procedure reference may be made to WO2010/007404 (hereby incorporated by reference).
  • a system controller 110 is coupled to a touch sensing module 112 from which it receives data defining one or more touched locations on the display area, either in rectangular or in distorted coordinates (in the latter case the system controller may perform keystone distortion compensation).
  • the touch sensing module 112 in embodiments comprises a CMOS sensor driver and touch-detect processing circuitry.
  • the system controller 110 is also coupled to an input/output module 114 which provides a plurality of external interfaces, in particular for buttons, LEDs, optionally a USB and/or Bluetooth (RTM) interface, and a bi-directional wireless communication interface, for example using WiFi (RTM).
  • the wireless interface may be employed to download data for display either in the form of images or in the form of hologram data.
  • this data may include price data for price updates, and the interface may provide a backhaul link for placing orders, handshaking to enable payment and the like.
  • Non-volatile memory 116, for example Flash RAM, is provided to store data for display, including hologram data, as well as distortion compensation data and touch sensing control data (identifying regions and associated actions/links).
  • Non-volatile memory 116 is coupled to the system controller and to the I/O module 114, as well as to an optional image-to-hologram engine 118 as previously described (also coupled to system controller 110), and to an optical module controller 120 for controlling the optics shown in Figure 2a.
  • (The image-to-hologram engine is optional as the device may receive hologram data for display from an external source.)
  • the optical module controller 120 receives hologram data for display and drives the hologram display SLM, as well as controlling the laser output powers in order to compensate for brightness variations caused by varying coverage of the display area by the displayed image (for more details see, for example, our WO2008/075096).
  • the laser power(s) is (are) controlled dependent on the "coverage" of the image, with coverage defined as the sum of the image pixel values, each preferably raised to a power gamma (where gamma is typically 2.2).
  • the laser power is inversely dependent on (but not necessarily inversely proportional to) the coverage; in preferred embodiments a lookup table is employed to apply a programmable transfer function between coverage and laser power, as sketched below.
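A minimal sketch of that coverage computation and table lookup; the normalisation to 8-bit pixels and the table values are assumptions of the sketch, not values from the patent:

    import numpy as np

    GAMMA = 2.2
    # Hypothetical programmable transfer function: maps normalised coverage
    # (0..1) to a relative laser power. Decreasing, since the laser power is
    # inversely dependent on coverage; values are illustrative only.
    POWER_LUT = np.linspace(1.0, 0.25, 256)

    def laser_power(image):
        """Coverage = sum of image pixel values raised to the power gamma,
        normalised here so the lookup index stays in range."""
        norm = image.astype(float) / 255.0            # assume 8-bit pixel values
        coverage = np.power(norm, GAMMA).sum() / norm.size
        index = int(coverage * (len(POWER_LUT) - 1))
        return POWER_LUT[index]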
  • Preferred embodiments of the device also include a power management system 122 to control battery charging, monitor power consumption, invoke a sleep mode and the like.
  • the system controller controls loading of the image/hologram data into the non-volatile memory, where necessary conversion of image data to hologram data, and loading of the hologram data into the optical module and control of the laser intensities.
  • the system controller also performs distortion compensation and controls which image to display when and how the device responds to different "key" presses and includes software to keep track of a state of the device.
  • the controller is also configured to transition between states (images) on detection of touch events with coordinates in the correct range, a detected touch triggering an event such as a display of another image and hence a transition to another state.
  • the system controller 110 also, in embodiments, manages price updates of displayed menu items, and optionally payment, and the like.
  • Figure 3a shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention.
  • the system comprises an infra-red laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light.
  • the system also includes an image projector 118, for example a holographic image projector, also as previously described.
  • a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260, and controls projector 118.
  • images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra red.
  • the image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-800 nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20 nm, and thus it is desirable to suppress ambient IR.
  • subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA).
  • module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later. Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
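In the device this stage sits in hardware (an FPGA); the Python below is only a behavioural sketch of the same subtract-then-bin operation, with assumed bin sizes:

    import numpy as np

    def bin_and_subtract(frame_on, frame_off, bin_y=8, bin_x=8):
        """Subtract a laser-off frame from a laser-on frame to remove ambient
        IR, then sum pixels into coarse bins (e.g. down to roughly 80 x 50) to
        reduce downstream processing and memory. Assumes a linear (gamma ~ 1)
        camera so the subtraction operates on a linear signal."""
        diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
        diff = np.clip(diff, 0, None)                # negative residues are noise
        h, w = diff.shape
        h, w = h - h % bin_y, w - w % bin_x          # trim to a whole number of bins
        return (diff[:h, :w]
                .reshape(h // bin_y, bin_y, w // bin_x, bin_x)
                .sum(axis=(1, 3)))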
  • Because the camera 260 is directed down towards the plane of light at an angle, it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers. Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed on a linear signal.
  • module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region.
  • some image scaling may also be performed in this module.
  • a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present.
  • Figure 3b illustrates an example of such a coarse (decimated) grid.
  • the spots indicate the first estimation of the centre-of-mass.
  • a centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location.
  • Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found.
  • the system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion such as barrel distortion, from the lens of imaging optics 258.
  • the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable.
  • the thresholding may be position sensitive (at a higher level for nearer image parts); alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
  • the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region.
  • the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel).
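A behavioural sketch of this locate-then-centroid step; the fixed window radius here stands in for the distance limit on region growth, and all names and values are assumptions:

    import numpy as np

    def locate_touch(binned, threshold, max_radius=3):
        """Find the brightest block; if it exceeds the threshold, take the
        above-threshold blocks within a distance limit of it (approximating
        the connected-region growth and avoiding a runaway flood fill) and
        return the centre of mass computed on the unthresholded values."""
        img = binned.astype(float)
        cy, cx = np.unravel_index(np.argmax(img), img.shape)
        if img[cy, cx] < threshold:
            return None                              # no touch present
        ys = slice(max(cy - max_radius, 0), min(cy + max_radius + 1, img.shape[0]))
        xs = slice(max(cx - max_radius, 0), min(cx + max_radius + 1, img.shape[1]))
        roi = img[ys, xs]
        w = np.where(roi >= threshold, roi, 0.0)     # note: values not squared
        yy, xx = np.mgrid[ys, xs]
        total = w.sum()
        return float((w * xx).sum() / total), float((w * yy).sum() / total)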
  • For an order-n centre of mass over a region of interest (ROI) of H × V blocks the centroid may be computed as, for example, x_CoM = (Σ_{x=1..H} Σ_{y=1..V} x · I(x,y)^n) / (Σ_{x=1..H} Σ_{y=1..V} I(x,y)^n), and similarly for y, where n is the order of the CoM calculation and H and V are the sizes of the ROI.
  • C_x and C_y represent polynomial coefficients in matrix form, and x and y are the vectorised powers of x and y respectively. Then we may design C_x and C_y such that we can assign a projected-space grid location (i.e. memory location) by evaluation of the polynomial, for example loc = ⌊x^T C_x y⌋ + N_x ⌊x^T C_y y⌋, where N_x is the number of grid locations in the x-direction in projector space and ⌊·⌋ is the floor operator.
  • the polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration. Further background can be found in our published PCT application WO2010/073024.
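A sketch of the evaluation in plain monomial form (a Chebyshev variant would be analogous); the matrix shapes and names are assumptions, and the loc expression follows the reconstruction given above:

    import numpy as np

    def grid_location(x, y, Cx, Cy, Nx, order=3):
        """Map a touch position (x, y) to a projector-space grid (memory)
        location via loc = floor(x^T Cx y) + Nx * floor(x^T Cy y), with Cx
        and Cy the (order+1) x (order+1) calibration coefficient matrices."""
        xp = np.power(x, np.arange(order + 1))   # vectorised powers [1, x, x^2, ...]
        yp = np.power(y, np.arange(order + 1))
        gx = int(np.floor(xp @ Cx @ yp))         # grid column in projector space
        gy = int(np.floor(xp @ Cy @ yp))         # grid row
        return gx + Nx * gy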
  • Module 314 tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events.
  • this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter.
  • In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
  • the field of view of the touch sense camera system is larger than the displayed image.
  • touch events outside the displayed image area may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
  • a touch sensing system of the type described above is potentially vulnerable to "flicker" resulting from a mismatch between the camera's frame rate and the frequency of the AC current powering the source(s) of ambient light.
  • the ambient light in frame-pair n can be described by a slowly varying background plus an oscillatory beat term, for example of the general form a_n = b_n + A · sin(2π · f_d · n · T + φ), where f_d is the difference between the illumination flicker frequency and the nearest integral multiple of the effective frame rate, T is the frame-pair period, and b_n varies only slowly.
  • the camera frame rate will not be exactly 50Hz or 60Hz; the mains frequency will not be exactly 60Hz or 50Hz; either may drift slightly; and there will also be slow variations in the ambient light that do not arise from oscillations in the light source. There may also be not-so-slow variations, from moving shadows and the like.
  • the signal processing in the touch sensitive display device is configured to determine when this is happening, so that the camera can change to a different frame rate.
  • the signal processing in the touch sensitive display device is configured to turn the camera's pixel-by-pixel measurements into data which can be processed to look for the periodic signal we expect. There are many ways to do this, and some preferred approaches are given below. (Note that "pixel” here can mean either native camera pixels, or the grossly downsampled “bins” these are grouped into, or something in between).
  • the illumination where the system is measuring may then be less representative of the illumination within the image itself.
  • the first option (averaging all the pixels) is simple and good enough in practice.
  • image enhancement may be applied to the camera image before using it for finger detection.
  • the system may compute the median pixel value and subtract it from every pixel; or the system may convolve the captured image with a filter designed to make large smooth dim artefacts/objects disappear or be suppressed and/or to make fingers show up brightly. If such image enhancement is employed then it is preferable to perform it before the ambient light measurement stage.
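The median-subtraction variant, for instance, is only a couple of lines (clipping at zero is an assumption of this sketch):

    import numpy as np

    def enhance(image):
        """Subtract the median pixel value so large, smooth, dim artefacts are
        suppressed while compact bright features (fingers) remain; applied
        before the ambient light measurement stage."""
        return np.clip(image.astype(float) - np.median(image), 0, None)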
  • the signal processing has extracted a sequence of observations which is expected to have a component that is periodic - and probably roughly sinusoidal - with a particular period, if the camera frame rate is wrong in a predicted way; otherwise we expect the observations to be varying only slowly.
  • the system may instead accumulate blocks of p samples, just as above, and then feed these into an IIR filter; then one can look every p samples instead. If the system uses this approach then it can be preferable to allow the filter coefficient α (and optionally also β) to vary as more samples are accumulated. (After some accumulation the values will settle down, but at the very beginning - note that preferably the system resets the filter whenever the camera frame rate is changed - the optimal values will vary.)
  • this approach is less preferable to that previously described.
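For completeness, one possible IIR realisation is a complex demodulate-and-smooth one-pole filter; the class name, structure and default coefficient below are assumptions, not the patent's design:

    import numpy as np

    class FlickerIIR:
        """Track the amplitude of the expected beat component: demodulate
        each ambient sample at the beat frequency, then low-pass with a
        one-pole IIR. Reset whenever the camera frame rate changes; alpha
        may start high and be reduced as samples accumulate."""
        def __init__(self, beat_hz, sample_rate_hz, alpha=0.05):
            self.w = 2 * np.pi * beat_hz / sample_rate_hz   # radians per sample
            self.alpha = alpha
            self.reset()

        def reset(self):
            self.z, self.n = 0j, 0

        def update(self, sample):
            rot = np.exp(-1j * self.w * self.n)             # demodulate to DC
            self.z = (1 - self.alpha) * self.z + self.alpha * sample * rot
            self.n += 1
            return abs(self.z)       # current estimate of the oscillatory level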
  • the numbers referred to as S and T are (in the simpler average-over-a-block algorithm) two of the components in the Fourier decomposition of our signal; the level of the oscillatory component may then be estimated from them, for example as F = √(S² + T²).
  • the system should switch camera frame rate only when the system is reasonably confident that there really is a sinusoidal component large enough that it might affect the touch sensing performance.
  • "Large enough" in this context may be defined as "the measured oscillatory ambient light is at least k times the average ambient level" for some choice of k that depends on how the system is measuring ambient light, how big we expect fingers to be, and so forth.
  • the system should switch frequency if the measured level satisfies a criterion of the general form F² > (k · F̄)² + r² · W² for some r, where F̄ is the average ambient level and W is a measure of the noise in the estimate.
  • the system could instead use (for instance) the value at a single pixel, perform the above computations separately for several pixels, and then switch frame rate only if the results from different pixels agree sufficiently well.
  • the system may employ multiple camera rates, and the processing may be configured to detect multiple different flicker rates.
  • the calculations are essentially the same as those discussed above, but performed n - 1 times with different parameters. Identifying a particular frequency component as strongly present may then trigger a switch to the corresponding frame rate.
  • the system may gain some robustness by adding a criterion of the following form: do not switch unless the strongest oscillatory component found is substantially bigger than all the others the signal processing looked for.
  • suppose the ambient light actually contains oscillatory components at both 100Hz and 120Hz. (This could happen, for example, in Tokyo, where there are two different mains frequencies.) Then the system might find itself repetitively hopping from one frame rate to another and back again, which is undesirable. On the other hand, suppose the system thinks it sees some flicker and therefore switches frame rate. If the system then sees much worse flicker it may switch back, and should be biased towards staying switched back.
  • the system maintains a guess at how much flickering illumination there might be at the "other" frame rate. (If there are more than two possible frame rates, the system maintains a guess for each possible other frame rate).
  • when the system next measures at a given frame rate it updates this guess, since it then has actual knowledge.
  • the system also remembers how long it is since it last measured at any given other frame rate - the longer the elapsed time, the less confident the system is about how much flicker it expects to see at that frame rate.
  • the system compares the observed flicker at the current frame rate with the best-guess flicker at the other frame rate, and switches only if there is reasonable confidence that the other rate would provide an improvement (less flicker) on the present frame rate.
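Putting the "large enough" test and this hysteresis together, a hedged sketch of the switching decision; every name and constant here is illustrative, and inflating the stale estimate is just one way to encode declining confidence over time:

    def should_switch(flicker_now, mean_ambient, other_guess, secs_since_other,
                      k=0.1, margin=1.5, growth=0.02):
        """Switch frame rate only if (a) the flicker measured at the current
        rate is large enough relative to the average ambient level to matter,
        and (b) it is comfortably worse than our best guess of the flicker at
        the other rate, that guess being trusted less the older it is."""
        if flicker_now < k * mean_ambient:
            return False                     # too small to disturb touch sensing
        stale_other = other_guess * (1.0 + growth * secs_since_other)
        return flicker_now > margin * stale_other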
  • the above described flicker detection and camera frame rate control may be implemented by a software (and/or hardware) flicker suppression module 350.
  • the flicker suppression module 350 may receive touch sense images either directly from camera 260 or, for example, from module 302, after binning. The module then implements a signal processing procedure as described above and controls camera 260, either directly or via controller 320, to adjust the frame capture rate if necessary.
  • the plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface. The skilled person will appreciate that whilst a relatively thin, flat plane of light is desirable this is not essential and some tilting and/or divergence or spreading of the beam may be acceptable with some loss of precision. Alternatively some convergence of the beam towards the far edge of the display area may be helpful in at least partially compensating for the reduction in brightness of the touch sensor illumination as the light fans out. Further, in embodiments the light defining the touch sheet need not be light defining a continuous plane - instead structured light such as a comb or fan of individual beams and/or one or more scanned light beams may be employed to define the touch sheet.

Abstract

A touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said signal processor is further configured to: detect when an oscillatory component of ambient light captured by said camera has a frequency different to an integral multiple of a frame rate of said camera; and adjust said frame rate of said camera responsive to said detection to an adjusted frame rate, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.

Description

Touch-Sensitive Display Devices
FIELD OF THE INVENTION
This invention relates to touch sensing systems, to related methods and to corresponding processor control code. More particularly the invention relates to systems employing image projection techniques in combination with a touch sensing system which projects a plane of light adjacent the displayed image.
Aspects of the invention relate to the suppression of flicker from alternating current powered ambient light in such systems.
BACKGROUND TO THE INVENTION
Background prior art relating to touch sensing systems employing a plane of light can be found in US6,281,878 (Montellese), and in various later patents of Lumio/VKB Inc, such as US7,305,368, as well as in similar patents held by Canesta Inc, for example US6,710,770. Broadly speaking these systems project a fan-shaped plane of infrared (IR) light just above a displayed image and use a camera to detect the light scattered from this plane by a finger or other object reaching through to approach or touch the displayed image.
Further background prior art can be found in: WO01/93006; US6650318; US7305368; US7084857; US7268692; US7417681; US7242388 (US2007/222760); US2007/019103; WO01/93006; WO01/93182; WO2008/038275; US2006/187199; US6,614,422; US6,710,770 (US2002021287); US7,593,593; US7599561; US7519223; US7394459; US6611921; USD595785; US6,690,357; US6,377,238; US5767842; WO2006/108443; WO2008/146098; US6,367,933 (WO00/21282); WO02/101443; US6,491,400; US7,379,619; US2004/0095315; US6281878; US6031519; GB2,343,023A; US4384201; DE4121180A; and US2006/244720.
We have previously described techniques for improved touch sensitive holographic displays, in particular in our earlier patent applications: WO2010/073024; WO2010/073045; and WO2010/073047. However research by the inventors has shown that a touch sensitive image display system of this type has an unexpected type of vulnerability to AC (alternating current) ambient illumination.
SUMMARY OF THE INVENTION
According to a first aspect of the invention there is therefore provided a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said signal processor is further configured to: detect when an oscillatory component of ambient light captured by said camera has a frequency different to an integral multiple of a frame rate of said camera; and adjust said frame rate of said camera responsive to said detection to an adjusted frame rate, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
Embodiments of the touch sensitive display systems we describe employ differencing between touch sensor light source (infra-red laser) on and off frames to subtract out ambient light, in particular ambient infra-red light in a captured touch sense image. The subsequent processing, in particular the finger location algorithms, are relatively insensitive to slow changes in background level across a captured image. This is broadly speaking because they employ a procedure which looks for a local increase in scattered light from a finger or other object, that is by identifying local areas or regions of increased brightness within a captured touch sense image. Artificial illumination, in particular fluorescent lighting, can beat with the effective frame capture rate. The effective frame capture rate may be, for example, 30 frames per second (fps) for a camera operating at 60 fps and processing alternate frames to capture scattered light (for identifying finger or other object positions). In other approaches laser-off frames may not be needed, for example where a bright laser light source is employed and/or where the subsequent processing is less sensitive to ambient light (for example where such processing includes correlating with a shape characteristic of the scattered light from a finger - typically a generally half moon shape but optionally an oval, teardrop or arcuate shape). In such a case the effective frame rate may be the same as the camera frame rate.
In either case the camera frame rate is chosen so that an integral multiple of this frame rate is substantially equal to the illumination flicker frequency, typically 100Hz or 120Hz for 50 Hz/60 Hz mains respectively. Where the ambient light flicker frequency is close to an integral multiple of the effective camera frame rate, a slow or substantially stationary beating is observed. The effect of this depends, in part, on the type of camera employed, more particularly whether or not the camera is a rolling shutter camera. Nonetheless, broadly speaking, where the beating is slow or stationary the effect on a captured touch sense image is generally to provide a slowly changing gradient or broad fuzzy bar of ambient illumination across the captured image. This varies slowly over time.
The finger location processing is not substantially affected by an effect of this type which varies relatively slowly. However if the camera is operating at, say, 60 fps whilst the mains frequency is, say, 50 Hz then a relatively fast flicker can result which can interfere with finger detection/location. In embodiments of the invention this is addressed by switching to an adjusted camera frame rate so that an integral multiple of this adjusted frame rate (which integral multiple may be unity) is within 10% of the illumination flicker frequency; preferably within 8%, 5%, 3%, 2% or 1% of the target frequency. However the effect of an error here is to produce a spurious signal at the difference between the effective frame rates, and thus preferably the absolute difference in (effective) frame rate is less than 2Hz. Thus in embodiments an integral multiple of the adjusted frame rate is within 2Hz of the illumination flicker frequency.
In one embodiment of the device the signal processor is configured to determine an instantaneous level of the oscillatory component of the captured ambient light by averaging pixel values in two orthogonal directions over part or all of a captured frame. However where a rolling shutter camera is employed this may be achieved by correlating a pattern of variation of light level in a captured image in a direction of motion of the rolling shutter with a variation of the light level at an expected flicker frequency, for example trying each of 100 Hz/120 Hz. (In a rolling shutter camera the pixel levels along a single row have substantially the same phase, but different phases at different vertical positions).
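By way of illustration, a sketch of the rolling-shutter variant (function and parameter names are assumptions; it relies on each row sharing a phase and on successive rows advancing the flicker phase by a known row readout period):

    import numpy as np

    def row_flicker_level(frame, row_period_s, flicker_hz):
        """Correlate the vertical (row-mean) brightness profile of one
        rolling-shutter frame with the flicker phase expected at each row,
        for a candidate flicker frequency (e.g. 100.0 or 120.0 Hz)."""
        rows = frame.mean(axis=1)             # one sample per row
        rows = rows - rows.mean()             # remove the constant ambient level
        phase = 2 * np.pi * flicker_hz * np.arange(len(rows)) * row_period_s
        s = np.dot(rows, np.sin(phase))       # in-phase component
        c = np.dot(rows, np.cos(phase))       # quadrature component
        return np.hypot(s, c) / len(rows)     # amplitude of the matching variation

    # e.g. compare row_flicker_level(f, 30e-6, 100.0) with row_flicker_level(f, 30e-6, 120.0)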
Embodiments of the signal processing determine a difference frequency signal level dependent upon a level of a frequency component at a difference frequency between an effective frame rate of the camera and an expected flicker frequency. This signal level can then be used to determine whether or not to switch camera frame rates. In embodiments the difference frequency level is determined by correlating light intensity level data (dependent on the artificial illumination) from the captured images with one or more waveforms at the expected frequency. Thus, for example, a sinusoidal or square waveform may be used. In embodiments two 90° out-of-phase waveforms are employed. The correlation may be performed by summing over a block of samples having a length determined by a period of the one or more waveforms.
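A minimal sketch of the block correlation described here, assuming one ambient-light sample per effective frame and a block of one whole beat period (the names and the normalisation are assumptions, not from the patent):

    import numpy as np

    def difference_frequency_level(samples, frame_rate_hz, beat_hz):
        """Correlate per-frame ambient samples with two 90-degree
        out-of-phase waveforms at the expected difference (beat) frequency;
        summing over a whole number of periods keeps the pair orthogonal."""
        n = min(len(samples), int(round(frame_rate_hz / beat_hz)))
        x = np.asarray(samples[:n], dtype=float)
        x = x - x.mean()                               # remove the DC component
        phase = 2 * np.pi * beat_hz * np.arange(n) / frame_rate_hz
        s = np.dot(x, np.sin(phase))
        t = np.dot(x, np.cos(phase))
        return 2 * np.hypot(s, t) / n                  # beat amplitude estimate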
In an alternative approach, a signal dependent upon the captured ambient light level may be digitally filtered to detect a level of a frequency component in the signal, in particular using an IIR (Infinite Impulse Response) filter.
In embodiments, the signal processing determines a measure of a difference frequency level for each of two expected illumination flicker frequencies, and then compares one against the other to determine whether or not to switch the camera frame rate. Optionally a constant and/or average ambient light level may also be included in this comparison to provide a degree of hysteresis/noise immunity.
In embodiments more sophisticated algorithms may be employed to determine whether or not to switch the frame rate of the touch sense camera, for example by employing flicker level history data, comprising data defining a relative or absolute level of flicker in the captured images when the camera was last at a different effective frame rate.
In general the field of view of the touch sense camera will extend beyond the region of the plane/fan of light. In a system where differencing between laser-on and laser-off frames is employed for ambient light suppression, portions of the touch sense image beyond the plane of light may be used for compensating for illumination flicker even when the laser is on.
In a related aspect the invention provides a method of ambient light interference suppression in a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; the method comprising: correlating an ambient light intensity signal from said touch sense image with a difference frequency between a frame rate or an integral fraction of a frame rate of said camera and an expected frequency of an oscillatory component of said ambient light captured by said camera; determining when said oscillatory component of said ambient light captured by said camera has a frequency different to an integral multiple of said frame rate or said integral fraction of said frame rate of said camera; and adjusting said frame rate of said camera to an adjusted frame rate, responsive to said determining, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
The invention also provides a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and wherein said signal processor is further configured to: determine a level of ambient light captured by said camera from portions of said touch sense image beyond said plane of light for captured touch sense images when said projection is on; and compensate for said determined level of ambient light to identify said scattered light in a said touch sense image for identifying said location of said object. In embodiments the amplitude modulation modulates the laser output between a state in which the laser is on and a state in which the laser is substantially off; the modulation may thus be on-off modulation. Where a rolling shutter camera is employed, a still further approach is contemplated, in which the laser turns on (or off) whilst a frame is captured. In this case part of the frame will have the laser off, and part of the frame will have the laser on. This can be used for enhanced ambient light level compensation (whether or not 'flicker compensation' is also employed), by determining a background level of illumination from an off portion of a frame. This may then be subtracted from a spatially corresponding (or non-corresponding) laser-on frame or frame portion.
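As an illustrative sketch of this mid-frame transition approach with a rolling shutter (the switch row and the per-column median background are assumptions of the sketch):

    import numpy as np

    def compensate_mid_frame(frame, switch_row):
        """The laser changed state during readout: rows [0, switch_row) saw
        laser-on, the rest laser-off. Estimate the ambient background from
        the laser-off rows and subtract it from the laser-on part."""
        on_part = frame[:switch_row].astype(float)
        off_part = frame[switch_row:].astype(float)
        background = np.median(off_part, axis=0)    # per-column ambient estimate
        return np.clip(on_part - background, 0, None)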
Thus according to a further aspect of the invention there is provided a touch sensitive device, the device comprising: a touch sensor light source to project a plane of light above a surface; a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object; wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and wherein said camera is a rolling shutter camera; wherein said rolling shutter captures light from a portion of a field of view of said camera where said projection is off; and wherein said signal processor is further configured to determine said level of ambient light from a portion of said touch sense image within said plane of light when said projection is turned from on to off.
In embodiments a touch sensitive device as described above may be an image display device, in particular comprising an image projector, such as a holographic image projector, to project a displayed image onto the surface in front of the device. The touch sensor light source is then configured to project the plane of light above the displayed image, and the signal processor is configured to process a touch sense image to identify a lateral location of the or each of said objects relative to the displayed image.
The invention also provides methods corresponding to the above described operational features of a touch sensitive image display device. In general herein, references to on-off modulation include amplitude modulation in which the laser output changes between a state in which the laser is on and a state in which the laser is substantially off as well as binary on-off modulation.
The invention further provides processor control code configured to implement the above described signal processing. The code is provided on a physical carrier such as a disk, CD- or DVD-ROM, programmed memory or other physical computer-readable medium. Code and/or data to implement embodiments of the invention may comprise source, object or executable code in one or more conventional programming languages (interpreted or compiled), or the code may comprise code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array) or code for a hardware description language such as Verilog, VHDL, or SystemC. As the skilled person will appreciate such code and/or data may be distributed between a plurality of coupled components in communication with one another.
The skilled person will appreciate that, in general, the signal processing we describe may be implemented in software, or in hardware (circuitry), or in a combination of the two.
Embodiments of each of the above described aspects of the invention are not limited to use with any particular type of projection technology. Thus although we will describe later an example of a holographic image projector, the techniques of the invention may also be applied to other forms of projection technology including, but not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, by way of example only, with reference to the accompanying figures in which: Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device suitable for implementing embodiments of the invention, and details of a plane of light-based touch sensing system for the device; Figures 2a and 2b show, respectively, a holographic image projection system for use with the device of Figure 1, and a functional block diagram of the device of Figure 1;
Figures 3a to 3c show, respectively, an embodiment of a touch sensitive image display device according to an aspect of the invention, use of a crude peak locator to find finger centroids, and the resulting finger locations.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS Figures 1a and 1b show an example touch sensitive holographic image projection device 100 comprising a holographic image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102. A proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device. A holographic image projector is merely described by way of example; the techniques we describe herein may be employed with any type of image projection system.
The holographic image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop. This entails projecting at an acute angle onto the display surface (the angle between a line joining the centre of the output of the projection optics and the middle of the displayed image and a line in a plane of the displayed image is less than 90°). We sometimes refer to projection onto a horizontal surface, conveniently but not essentially non-orthogonally, as "table down projection". A holographic image projector is particularly suited to this application because it can provide a wide throw angle, long depth of field, and substantial distortion correction without significant loss of brightness/efficiency. Boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
The touch sensing system 250, 258, 260 comprises an infrared laser illumination system (IR line generator) 250 configured to project a sheet of infrared light 256 just above, for example ~1 mm above, the surface of the displayed image 150 (although in principle the displayed image could be distant from the touch sensing surface). The laser illumination system 250 may comprise an IR LED or laser 252, preferably collimated, then expanded in one direction by light sheet optics 254, which may comprise a negative or cylindrical lens. Optionally light sheet optics 254 may include a 45 degree mirror adjacent the base of the housing 102 to fold the optical path to facilitate locating the plane of light just above the displayed image.
A CMOS imaging sensor (touch camera) 260, provided with an IR-pass lens 258, captures light scattered from the sheet of infrared light 256 when the displayed image 150 is touched with an object such as a finger. The boundaries of the CMOS imaging sensor field of view are indicated by lines 257a,b. The touch camera 260 provides an output to touch detect signal processing circuitry as described further later.
Example holographic image projection system
Figure 2a shows an example holographic image projection system architecture 200 in which the SLM may advantageously be employed. The architecture of Figure 2a uses dual SLM modulation - low resolution phase modulation and higher resolution amplitude (intensity) modulation. This can provide substantial improvements in image quality, power consumption and physical size. The primary gain of holographic projection over imaging is one of energy efficiency. Thus the low spatial frequencies of an image can be rendered holographically to maintain efficiency and the high-frequency components can be rendered with an intensity-modulating imaging panel, placed in a plane conjugate to the hologram SLM. Effectively, diffracted light from the hologram SLM device (SLM1) is used to illuminate the imaging SLM device (SLM2). Because the high-frequency components contain relatively little energy, the light blocked by the imaging SLM does not significantly decrease the efficiency of the system, unlike in a conventional imaging system. The hologram SLM is preferably a fast multi-phase device, for example a pixellated MEMS-based piston actuator device.
In Figure 2a: • SLM1 is a pixellated MEMS-based piston actuator SLM as described above, to display a hologram - for example a 160 × 160 pixel device with physically small lateral dimensions, e.g. <5 mm or <1 mm.
• L1, L2 and L3 are collimation lenses (optional, depending upon the laser output) for respective Red, Green and Blue lasers.
• M1, M2 and M3 are dichroic mirrors implemented as a prism assembly.
• M4 is a turning beam mirror.
• SLM2 is an imaging SLM and has a resolution at least equal to the target image resolution (e.g. 854 × 480); it may comprise an LCOS (liquid crystal on silicon) or DMD (Digital Micromirror Device) panel.
• Diffraction optics 210 comprises lenses LD1 and LD2, forms an intermediate image plane on the surface of SLM2, and has an effective focal length f such that fλ/Δ (where λ is the illumination wavelength and Δ is the hologram pixel pitch) covers the active area of imaging SLM2. Thus optics 210 perform a spatial Fourier transform to form a far field illumination pattern in the Fourier plane, which illuminates SLM2.
• PBS2 (Polarising Beam Splitter 2) transmits incident light to SLM2, and reflects emergent light into the relay optics 212 (liquid crystal SLM2 rotates the polarisation by 90 degrees). PBS2 preferably has a clear aperture at least as large as the active area of SLM2.
• Relay optics 212 relay light to the diffuser D1.
• M5 is a beam turning mirror.
• D1 is a diffuser to reduce speckle.
• Projection optics 214 project the object formed on D1 by the relay optics 212, and preferably provide a large throw angle, for example >90°, for angled projection down onto a table top (the design is simplified by the relatively low étendue from the diffuser).
The different colours are time-multiplexed and the sizes of the replayed images are scaled to match one another, for example by padding a target image for display with zeros (the field size of the displayed image depends upon the pixel size of the SLM not on the number of pixels in the hologram).
A system controller and hologram data processor 202, implemented in software and/or dedicated hardware, inputs image data and provides low spatial frequency hologram data 204 to SLM1 and higher spatial frequency intensity modulation data 206 to SLM2. The controller also provides laser light intensity control data 208 to each of the three lasers. For details of an example hologram calculation procedure reference may be made to WO2010/007404 (hereby incorporated by reference).
Control system
Referring now to Figure 2b, this shows a block diagram of the device 100 of Figure 1. A system controller 110 is coupled to a touch sensing module 112 from which it receives data defining one or more touched locations on the display area, either in rectangular or in distorted coordinates (in the latter case the system controller may perform keystone distortion compensation). The touch sensing module 112 in embodiments comprises a CMOS sensor driver and touch-detect processing circuitry. The system controller 110 is also coupled to an input/output module 114 which provides a plurality of external interfaces, in particular for buttons, LEDs, optionally a USB and/or Bluetooth (RTM) interface, and a bi-directional wireless communication interface, for example using WiFi (RTM). In embodiments the wireless interface may be employed to download data for display either in the form of images or in the form of hologram data. In an ordering/payment system this data may include price data for price updates, and the interface may provide a backhaul link for placing orders, handshaking to enable payment and the like. Non-volatile memory 116, for example Flash RAM, is provided to store data for display, including hologram data, as well as distortion compensation data, and touch sensing control data (identifying regions and associated actions/links). Non-volatile memory 116 is coupled to the system controller and to the I/O module 114, as well as to an optional image-to-hologram engine 118 as previously described (also coupled to system controller 110), and to an optical module controller 120 for controlling the optics shown in Figure 2a. (The image-to-hologram engine is optional as the device may receive hologram data for display from an external source.) In embodiments the optical module controller 120 receives hologram data for display and drives the hologram display SLM, as well as controlling the laser output powers in order to compensate for brightness variations caused by varying coverage of the display area by the displayed image (for more details see, for example, our WO2008/075096). In embodiments the laser power(s) is(are) controlled dependent on the "coverage" of the image, with coverage defined as the sum of the image pixel values, each preferably raised to a power gamma (where gamma is typically 2.2). The laser power is inversely dependent on (but not necessarily inversely proportional to) the coverage; in preferred embodiments a lookup table is employed to apply a programmable transfer function between coverage and laser power. The hologram data stored in the non-volatile memory, optionally received by interface 114, therefore in embodiments comprises data defining a power level for one or each of the lasers together with each hologram to be displayed; the hologram data may define a plurality of temporal holographic subframes for a displayed image. Preferred embodiments of the device also include a power management system 122 to control battery charging, monitor power consumption, invoke a sleep mode and the like.
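Purely as an illustrative sketch of this coverage-dependent power control (the gamma value is the typical 2.2 quoted above; the table contents, the normalisation and the function names are assumptions rather than the disclosed implementation):

```python
import numpy as np

GAMMA = 2.2  # typical display gamma, per the description above

def coverage(image):
    # Sum of the image pixel values, each raised to the power gamma;
    # 'image' is assumed normalised to [0, 1].
    return float(np.sum(image ** GAMMA))

# Hypothetical programmable transfer function between coverage and laser
# power: power falls as coverage rises, but not necessarily in inverse
# proportion; in practice the table would be tuned at calibration.
COVERAGE_TO_POWER = np.linspace(1.0, 0.25, 256)

def laser_power(image, max_power):
    c = coverage(image) / image.size                    # normalised coverage
    i = min(int(c * (len(COVERAGE_TO_POWER) - 1)), len(COVERAGE_TO_POWER) - 1)
    return max_power * COVERAGE_TO_POWER[i]
```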
In operation the system controller controls loading of the image/hologram data into the non-volatile memory, where necessary conversion of image data to hologram data, and loading of the hologram data into the optical module and control of the laser intensities. The system controller also performs distortion compensation, controls which image to display when, determines how the device responds to different "key" presses, and includes software to keep track of a state of the device. The controller is also configured to transition between states (images) on detection of touch events with coordinates in the correct range, a detected touch triggering an event such as a display of another image and hence a transition to another state. The system controller 110 also, in embodiments, manages price updates of displayed menu items, and optionally payment, and the like.
Touch Sensing Systems
Referring now to Figure 3a, this shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention. The system comprises an infrared laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light. The system also includes an image projector 118, for example a holographic image projector, also as previously described.
In the arrangement of Figure 3a a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260 and controls projector 118. In the illustrated example images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infrared. The image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-800 nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20 nm, and thus it is desirable to suppress ambient IR. In the embodiment of Figure 3a subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA). In embodiments module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later. Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
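A minimal sketch of this alternate-frame differencing and binning, assuming NumPy arrays and an illustrative 8x8 binning factor (the function names and the clipping are assumptions of the sketch):

```python
import numpy as np

def bin_pixels(frame, factor=8):
    # Sum factor x factor blocks of camera pixels into single bins
    # (e.g. 640x400 down to 80x50) to cut processing power/memory needs.
    h, w = frame.shape
    f = factor
    return frame[:h - h % f, :w - w % f].reshape(h // f, f, w // f, f).sum(axis=(1, 3))

def ambient_subtracted(frame_ir_on, frame_ir_off):
    # Difference of alternate laser-on / laser-off frames subtracts out
    # ambient infrared; assumes linear (gamma ~ 1) sensor data.
    on = bin_pixels(frame_ir_on.astype(np.int64))
    off = bin_pixels(frame_ir_off.astype(np.int64))
    return np.clip(on - off, 0, None)
```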
Because the camera 260 is directed down towards the plane of light at an angle it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers. Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that the subtraction is performed on a linear signal.
Various different techniques for locating candidate finger/object touch positions will be described. In the illustrated example, however, an approach is employed which detects intensity peaks in the image and then employs a centroid finder to locate candidate finger positions. In embodiments this is performed in software. Processor control code and/or data to implement the aforementioned FPGA and/or software modules shown in Figure 3 may be provided on a disk 318 or another physical storage medium.
Thus in embodiments module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module. Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present. Figure 3b illustrates an example of such a coarse (decimated) grid. In the Figure the spots indicate the first estimation of the centre-of-mass. We then take a 32x20 (say) grid around each of these. This is preferably used in conjunction with a differential approach to minimize noise, i.e. one frame laser on, next laser off. A centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location. Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found. The system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion, such as barrel distortion, from the lens of imaging optics 258. In one embodiment the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image, and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable.
Because nearer parts of a captured touch sense image may be brighter than further parts, the thresholding may be position sensitive (at a higher level for nearer image parts); alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
In one embodiment of the crude peak locator 308 the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region. In embodiments the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel).
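An illustrative simplification of such a peak search is sketched below; it masks a fixed window around each found peak rather than literally growing a connected region block by block, and all names, the window half-width and the peak count limit are assumptions of the sketch:

```python
import numpy as np

def crude_peak_locate(image, threshold, max_radius=10, max_peaks=5):
    # Repeatedly take the brightest remaining pixel above 'threshold' and
    # mask off a window of half-width 'max_radius' around it (the distance
    # limit guarding against an accidental flood fill) before searching again.
    work = image.astype(np.float64).copy()
    peaks = []
    for _ in range(max_peaks):
        y, x = np.unravel_index(int(np.argmax(work)), work.shape)
        if work[y, x] <= threshold:
            break
        y0, y1 = max(0, y - max_radius), min(work.shape[0], y + max_radius + 1)
        x0, x1 = max(0, x - max_radius), min(work.shape[1], x + max_radius + 1)
        peaks.append((y, x, (y0, y1, x0, x1)))  # peak location plus its ROI
        work[y0:y1, x0:x1] = 0.0                # mask the region off
    return peaks
```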
A simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest); the centroid of the intensity distribution R(x, y) may be estimated thus:

$$\bar{x} = \frac{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} x_s \, R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} R^n(x_s, y_s)}, \qquad \bar{y} = \frac{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} y_s \, R^n(x_s, y_s)}{\sum_{y_s=0}^{Y-1} \sum_{x_s=0}^{X-1} R^n(x_s, y_s)}$$

where n is the order of the CoM calculation, and X and Y are the sizes of the ROI.
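By way of illustration, a minimal NumPy sketch of this calculation (the function name and the empty-ROI handling are assumptions):

```python
import numpy as np

def centroid(roi, n=1):
    # Order-n centre of mass of the intensity R over an X-by-Y ROI, per the
    # formulas above; n = 1 (no squaring) reduces sensitivity to noise.
    r = roi.astype(np.float64) ** n
    total = r.sum()
    if total == 0.0:
        return None
    ys, xs = np.indices(roi.shape)
    return float((xs * r).sum() / total), float((ys * r).sum() / total)
```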
In embodiments the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space. Say the transformed coordinates from camera space $(x, y)$ into projected space $(x', y')$ are related by the bivariate polynomials $x' = \mathbf{x} C_x \mathbf{y}^T$ and $y' = \mathbf{x} C_y \mathbf{y}^T$, where $C_x$ and $C_y$ represent polynomial coefficients in matrix form, and $\mathbf{x}$ and $\mathbf{y}$ are the vectorised powers of $x$ and $y$ respectively. Then we may design $C_x$ and $C_y$ such that we can assign a projected space grid location (i.e. memory location) by evaluation of the polynomial:

$$\lfloor x' \rfloor + X \lfloor y' \rfloor$$

where $X$ is the number of grid locations in the x-direction in projector space, and $\lfloor \cdot \rfloor$ is the floor operator. The polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration. Further background can be found in our published PCT application WO2010/073024.
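A minimal sketch of this mapping follows. It evaluates the polynomials in plain monomial form rather than the Chebyshev form mentioned above, and the function names, the coefficient-matrix shape and the grid_width parameter are assumptions of the sketch:

```python
import numpy as np

def _powers(v, degree):
    # Vectorised powers [1, v, v^2, ..., v^degree] of one coordinate.
    return np.array([v ** k for k in range(degree + 1)])

def camera_to_projector(x, y, Cx, Cy):
    # Evaluate x' = x_vec Cx y_vec^T and y' = x_vec Cy y_vec^T, where Cx and
    # Cy are (d+1)x(d+1) coefficient matrices assigned at calibration.
    d = Cx.shape[0] - 1
    xv, yv = _powers(x, d), _powers(y, d)
    return float(xv @ Cx @ yv), float(xv @ Cy @ yv)

def grid_location(x, y, Cx, Cy, grid_width):
    # Assign a projected-space grid (memory) location using the floor
    # operator, with grid_width grid locations in the x-direction.
    xp, yp = camera_to_projector(x, y, Cx, Cy)
    return int(np.floor(xp)) + grid_width * int(np.floor(yp))
```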
Once a set of candidate finger positions has been identified, these are passed to a module 314 which tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events. In embodiments this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter. In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
In general the field of view of the touch sense camera system is larger than the displayed image. To improve robustness of the touch sensing system touch events outside the displayed image area (which may be determined by calibration) may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
Flicker suppression
As previously described, a touch sensing system of the type described above is potentially vulnerable to "flicker" resulting from a mismatch between the camera's frame rate and the frequency of the AC current powering the source(s) of ambient light. We will now detail some techniques to detect this flicker so that, in response, the system can change to a better frame rate.
Suppose that some of the ambient light is powered by a sinusoidal current with period τ. Then the ambient light will comprise a slowly varying component (from, say, sunlight) plus something roughly sinusoidal with period τ/2. Now, suppose the time between camera samples is t; then the ambient light seen by the camera in frame n will be approximately

$$A + B\cos\left(\frac{4\pi t}{\tau} n + \phi\right)$$

where A, B describe the relative brightness of the non-oscillating and oscillating light sources, and φ is a phase offset. If the system uses alternating "on" and "off" frames then the effective camera frame rate is half the actual frame rate. More precisely, the difference in ambient light across frame-pair n can be described by:

$$\left[A + B\cos\left(\frac{4\pi t}{\tau}(2n+1) + \phi\right)\right] - \left[A + B\cos\left(\frac{4\pi t}{\tau}\,2n + \phi\right)\right]$$

which equals $-2B \sin\left(\frac{2\pi t}{\tau}\right) \sin\left(\frac{4\pi t}{\tau}\left(2n + \tfrac{1}{2}\right) + \phi\right)$, which we can also write as:

$$C \sin\left(\frac{8\pi t}{\tau} n + \psi\right), \qquad C = -2B\sin\left(\frac{2\pi t}{\tau}\right), \quad \psi = \frac{2\pi t}{\tau} + \phi.$$
Related expressions apply if we look separately at the "off" or the "on" frames.
Typical cases for the system we describe are:

• 50Hz power, 60fps camera, giving a phase advance of 2π/3 per frame pair, i.e. a signal periodic with period 3.

• 60Hz power, 50fps camera, giving a phase advance of 8π/5 per frame pair, i.e. a signal periodic with period 5.
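This arithmetic may be checked in a few lines (illustrative only; the per-pair phase advance follows from the expressions above):

```python
from math import pi

# Per-frame phase advance of the flicker is 4*pi*t/tau (t = frame interval,
# tau = mains period); a frame pair advances by twice that. Reducing modulo
# 2*pi recovers the stated periodicities.
for mains_hz, fps in [(50, 60), (60, 50)]:
    t, tau = 1.0 / fps, 1.0 / mains_hz
    advance = (8 * pi * t / tau) % (2 * pi)
    print(f"{mains_hz}Hz power, {fps}fps: {advance / (2 * pi):.3f} cycles per pair")
    # -> 1/3 of a cycle (period 3) and 4/5 of a cycle (period 5) respectively
```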
In practice, the camera frame rate will not be exactly 50Hz or 60Hz; the mains frequency will not be exactly 60Hz or 50Hz; either may drift slightly; and there will also be slow variations in the ambient light that do not arise from oscillations in the light source. There may also be not-so-slow variations, from moving shadows and the like.
The calculations above describe the light level at any particular pixel. If we use a rolling-shutter camera, then the value of φ will depend on the y-coordinate of the pixel being measured.
More precisely: there will be some row-to-row delay δ, and then pixels in row k will be measured later than pixels in row 0 by a time kδ. Hence the ambient light seen in row k will be approximately

$$A + B\cos\left(\frac{4\pi}{\tau}(tn + k\delta) + \phi\right)$$

The signal processing in the touch sensitive display device is configured to determine when this is happening, so that the camera can change to a different frame rate. In the simplest case we have two potential frequencies for the ambient light, and for each we have one frame rate that would substantially null out the ambient flicker.
Extracting the signal

The signal processing in the touch sensitive display device is configured to turn the camera's pixel-by-pixel measurements into data which can be processed to look for the periodic signal we expect. There are many ways to do this, and some preferred approaches are given below. (Note that "pixel" here can mean either native camera pixels, or the grossly downsampled "bins" these are grouped into, or something in between.)
• Compute the image's average (or total) pixel value.
This is somewhat disturbed by the presence or absence of the fingers the system is trying to detect, and also by any other variations in ambient light.
• Compute the image's average (or total) pixel value, after clipping or discarding pixels above some threshold level.
This is affected less by fingers, but the computation will be disrupted if the actual oscillatory ambient light reaches the threshold level.
• Compute the image's median (or some other quantile) pixel value.
This is affected much less by small extra-bright or extra-dark artefacts, but it is more expensive to compute. It may be approximated by the median (or other quantile) of a smaller sample of pixels, either in fixed places (for example, every nth, say 4th, pixel on each axis) or chosen at random for each measurement.
• Pick some fixed location or set of locations, and sum or average the pixel values there.
If a location outside the region illuminated by the laser is picked, fingers matter less. On the other hand, the illumination where the system is measuring may then be less representative of the illumination within the image itself.
• Pick some fixed location or set of locations, process each separately, and combine the results.
This is discussed further later - this can provide some improvement in robustness.
• Measure the light using an entirely different sensor, e.g. a photosensor on the top of the device.
This is completely immune to fingers, but requires extra hardware.
The first option (averaging all the pixels) is simple and good enough in practice.
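A minimal sketch of the first three of these options, reducing one frame to a single scalar observation (the function name, parameters and default quantile are assumptions of the sketch):

```python
import numpy as np

def ambient_summary(frame, method="mean", clip_level=None, quantile=0.5):
    # Reduce one touch sense frame to a single scalar observation z_n.
    # "mean" is the simple option described as good enough in practice.
    pixels = frame.ravel().astype(np.float64)
    if method == "clipped":
        pixels = pixels[pixels < clip_level]   # discard finger-bright pixels
    if method == "quantile":
        return float(np.quantile(pixels, quantile))
    return float(pixels.mean())
```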
Optionally image enhancement may be applied to the camera image before using it for finger detection. For example the system may compute the median pixel value and subtract it from every pixel; or the system may convolve the captured image with a filter designed to make large smooth dim artefacts/objects disappear or be suppressed and/or to make fingers show up brightly. If such image enhancement is employed then it is preferable to perform this before the ambient light measurement stage.
There is a choice of which frames to use for the signal extraction, depending on the desired trade-offs:
• Measure only the "on minus off" values. This does a reasonable job of eliminating slowly-varying ambient light. Optionally additional processing may be employed to suppress image variations due to fingers, to reduce the risk of being confused by fingers.
• Measure only the "off" values. This avoids confusion by fingers, but makes the system more susceptible to disturbance from slowly-varying ambient light.
• Measure both "off" and "on" values. The system obtains twice as many samples per unit time, hence is potentially able to respond faster. Again, optionally additional processing may be employed to suppress image variations due to fingers, to reduce the risk of being confused by fingers.
Measuring the amount of oscillation
At this stage the signal processing has extracted a sequence of observations which is expected to have a component that is periodic - and probably roughly sinusoidal - with a particular period, if the camera frame rate is wrong in a predicted way; otherwise we expect the observations to be varying only slowly.
Call the observations $z_n$; we expect that

$$z_n \approx A + B\cos\left(\frac{2\pi q}{p} n + \phi\right) + \varepsilon_n$$

where p is the expected period (in samples) of the oscillatory ambient light, q determines what the oscillation is expected to look like within that period, A is the amount of slowly-varying ambient light, B is the amount of oscillatory ambient light, and $\varepsilon_n$ is an error term arising from noise, non-sinusoidal oscillation, and the like. We are mostly interested in detecting the situation where B is substantial when compared with the other terms.
By way of example, with a frame rate of 60fps, flicker from 50Hz power, and paired frames, we take p = 3 and q = 1 ; with a frame rate of 50fps, flicker from 60Hz power, and paired frames, we take p = 5 and q = 4 . Unpaired frames would give 3,2 and 5,2 respectively.
Thus, let $\omega = \exp(-2\pi i q / p)$. Note that ω is a complex number.
One example procedure is as follows: choose a number of samples m that is a multiple of p, take that many samples, and compute

$$S = (1/m)\sum_{n=0}^{m-1} z_n$$

and

$$T = (1/m)\sum_{n=0}^{m-1} \omega^n z_n.$$

Equivalently, one may write

$$T = (1/m)\sum_{n=0}^{m-1} \left[\cos\left(\frac{2\pi q n}{p}\right) - i\sin\left(\frac{2\pi q n}{p}\right)\right] z_n.$$
Now, ignoring the error terms arising from $\varepsilon_n$, we have $S = A$ and $|T|^2 = B^2/4$. An example suitable criterion is then: we have flickering illumination if

$$|T|^2 > \lambda S^2 + \mu$$

for some suitably chosen λ and μ (for example selected by routine experiment). This delivers a verdict every m samples.
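By way of illustration, a minimal sketch of this block test (the function name is an assumption, and the criterion follows the form reconstructed above, with λ and μ left to routine experiment):

```python
import numpy as np

def flicker_verdict(z, p, q, lam, mu):
    # Block test per the expressions above: z holds m observations, m a
    # multiple of p. S estimates the steady component A; |T|^2 ~ B^2/4
    # estimates the oscillatory power at q/p cycles per sample.
    z = np.asarray(z, dtype=np.float64)
    n = np.arange(len(z))
    omega = np.exp(-2j * np.pi * q / p)
    S = z.mean()
    T = (omega ** n * z).mean()
    return abs(T) ** 2 > lam * S ** 2 + mu   # True -> flickering illumination
```

For example, `flicker_verdict(z, p=3, q=1, lam, mu)` would test paired-frame observations at 60fps for 50Hz-powered flicker.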
The system may instead accumulate blocks of p samples, just as above, and then feed these into an IIR filter; then one can look every p samples instead. If the system uses this approach then it can be preferable to allow μ (and optionally also λ) to vary as more samples are accumulated. (After some accumulation the samples will settle down, but at the very beginning the optimal values will vary; note that preferably the system resets the filter whenever the camera frame rate is changed.)
It is also possible to use a sample-by-sample IIR filter (which may, for example, mean updating a value u by $u_n = (1-h)u_{n-1} + h\,\omega^n z_n$). However this approach is less preferable than that previously described.
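A sketch of such a sample-by-sample filter, under the same assumptions as the block test above (the class name and the default h are illustrative):

```python
import numpy as np

class FlickerIIR:
    # Sample-by-sample IIR accumulation of the S and T statistics; h sets
    # the effective averaging length. The filter should be reset whenever
    # the camera frame rate is changed.
    def __init__(self, p, q, h=0.05):
        self.omega = np.exp(-2j * np.pi * q / p)
        self.h, self.n = h, 0
        self.S, self.T = 0.0, 0j

    def update(self, z_n):
        w = self.omega ** self.n
        self.S = (1.0 - self.h) * self.S + self.h * z_n
        self.T = (1.0 - self.h) * self.T + self.h * w * z_n
        self.n += 1
        return self.S, self.T
```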
Variance estimation

The parameter labelled μ above is largely a measure of how much ambient light variation one expects to see that does not fit into a constant-plus-sinusoid pattern. The system may try to estimate how much is actually seen, instead of just guessing.
The numbers referred to as S and T are (in the simpler average-over-a-block algorithm) two of the components in the Fourier decomposition of our signal. The sum of the squares (more precisely: the squared absolute values) of all the components in the Fourier decomposition is $V^2 = (1/m)\sum |z_n|^2$. So let us compare $S^2$, $|T|^2$ and $V^2$.
Let $W^2 = V^2 - (S^2 + |T|^2)$; this is the amount of variance in the ambient light left unexplained by the sinusoidal component we've singled out for our attention.
(If the system uses the more complicated IIR-filter algorithm, one may compute $V^2$ as follows: feed a constant-1 sequence into the filter, getting out $F_1$, say. Also, feed the sequence of $|z_n|^2$ into the filter, getting out $F_z$. Then $V^2 = F_z / F_1$.)
It can then be reasonable to suppose that the $\varepsilon_n$ are (something like) normally distributed with variance $W^2$.
Preferably the system should switch camera frame rate only when it is reasonably confident that there really is a sinusoidal component large enough that it might affect the touch sensing performance. "Large enough" in this context may be defined as "the measured oscillatory ambient light is at least k" for some choice of k that depends on how the system is measuring ambient light, how big we expect fingers to be, and so forth.
Then the system should switch frequency if $|T|^2 > (kF_1)^2 + r^2 W^2$ for some r. Here r is how many standard deviations of the residual noise the system requires above the threshold before switching (a typical value may be $r^2 = 4$).

Using multiple sets of samples

Instead of taking $z_n$ to be some figure that summarizes the ambient light in the whole image, the system could instead use (for instance) the value at a single pixel, perform the above computations separately for several pixels, and then switch frame rate only if the results from different pixels agree sufficiently well.
More than two frequencies
There may be more than two frequencies at which the ambient light is able to oscillate. The system may then employ multiple camera rates, and the processing may be configured to detect multiple different flicker rates. The calculations are essentially the same as those discussed above, but performed n − 1 times with different parameters. Identifying a particular frequency component as strongly present may then trigger a switch to the corresponding frame rate. The system may gain some robustness by adding a criterion of the following form: do not switch unless the strongest oscillatory component found is substantially bigger than all the others the signal processing looked for.
Frame rate switching
The calculations described above provide, at fairly frequent regular intervals, an indication of how likely it is that the system is detecting an appreciable amount of oscillatory ambient light. The system may simply change frame rate whenever it seems likely enough that this is so, but there may be better approaches:
Suppose, for instance, that the ambient light actually contains oscillatory components at both 100Hz and 120Hz. (This could happen, for example, in Japan, where two different mains frequencies are in use.) Then the system might find itself repetitively hopping from one frame rate to another and back again, which is undesirable. On the other hand, suppose the system thinks it sees some flicker and therefore switches frame rate. If it then sees much worse flicker it may switch back, and should then be biased towards staying switched back.
An example algorithm to implement this is as follows: the system maintains a guess at how much flickering illumination there might be at the "other" frame rate. (If there are more than two possible frame rates, the system maintains a guess for each possible other frame rate.) When the system is operating at a particular frame rate, it updates this guess, since it now has actual knowledge. Preferably the system also remembers how long it is since it last measured at any given other frame rate; the longer the elapsed time, the less confident the system is about how much flicker it expects to see at that frame rate. When the system is considering a switch from one frame rate to another, it compares the observed flicker at the current frame rate with the best-guess flicker at the other frame rate, and switches only if there is reasonable confidence that the other rate would provide an improvement (less flicker) on the present frame rate.
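A hypothetical sketch of such a biased switching policy follows; the class name, the margin factor and the staleness growth rate are illustrative assumptions, not disclosed values:

```python
import time

class FrameRateSwitcher:
    # Keep a best-guess flicker level for every other frame rate, inflate
    # it as the measurement goes stale, and switch only when another rate
    # looks confidently better than the current one.
    def __init__(self, rates, margin=0.8, staleness_per_s=0.01):
        self.rates = list(rates)
        self.current = self.rates[0]
        self.margin = margin                  # other rate must beat this fraction
        self.staleness_per_s = staleness_per_s
        self.guess = {r: (0.0, time.time()) for r in self.rates}

    def observe(self, flicker_level):
        now = time.time()
        self.guess[self.current] = (flicker_level, now)  # actual knowledge
        for r in self.rates:
            if r == self.current:
                continue
            level, when = self.guess[r]
            # The longer since rate r was measured, the less it is trusted.
            pessimistic = level + self.staleness_per_s * (now - when)
            if pessimistic < self.margin * flicker_level:
                self.current = r   # caller should also reset any IIR filters
                break
        return self.current
```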
Referring again to Figure 3a, in an embodiment of a touch sensing display device 300 according to the invention the above described flicker detection and camera frame rate control may be implemented by a software (and/or hardware) flicker suppression module 350. The flicker suppression module 350 may receive touch sense images either directly from camera 260 or, for example, from module 302, after binning. The module then implements a signal processing procedure as described above and controls camera 260, either directly or via controller 320, to adjust the frame capture rate if necessary.
It will be appreciated that for the touch sensing system to work a user need not actually touch the displayed image. The plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface. The skilled person will appreciate that whilst a relatively thin, flat plane of light is desirable this is not essential and some tilting and/or divergence or spreading of the beam may be acceptable with some loss of precision. Alternatively some convergence of the beam towards the far edge of the display area may be helpful in at least partially compensating for the reduction in brightness of the touch sensor illumination as the light fans out. Further, in embodiments the light defining the touch sheet need not be light defining a continuous plane - instead structured light such as a comb or fan of individual beams and/or one or more scanned light beams, may be employed to define the touch sheet.
No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.


CLAIMS:
1. A touch sensitive device, the device comprising:
a touch sensor light source to project a plane of light above a surface;
a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object;
wherein said signal processor is further configured to:
detect when an oscillatory component of ambient light captured by said camera has a frequency different to an integral multiple of a frame rate of said camera; and adjust said frame rate of said camera responsive to said detection to an adjusted frame rate, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
2. A touch sensitive device as claimed in claim 1 wherein said signal processor is configured to determine an instantaneous level of said oscillatory component of said ambient light captured by said camera by determining an average light intensity value over a plurality of pixels extending in two orthogonal directions over said touch sense image.
3. A touch sensitive device as claimed in claim 1 wherein said camera is a rolling shutter camera, and wherein said signal processor is configured to determine an instantaneous level of said oscillatory component of ambient light captured by said camera by correlating a pattern of variation of light level in said touch sense image in a direction of motion of said rolling shutter with a variation of said light level at an expected frequency of said oscillatory component of said ambient light.
4. A touch sensitive device as claimed in claim 1, 2 or 3 wherein said signal processor is configured to determine a difference frequency signal level dependent on a level of a frequency component in said touch sense image at a difference frequency between a frame rate, or integral fraction of a frame rate, of said camera and an expected frequency of said oscillatory component of said ambient light, and wherein said detection is responsive to said difference frequency determination.
5. A touch sensitive device as claimed in claim 4 wherein said signal processor is configured to determine said difference frequency level by determining a correlation between light intensity level data from said touch sense image and one or more waveforms at said expected frequency.
6. A touch sensitive device as claimed in claim 4 wherein said signal processor is configured to determine said difference frequency level by digitally filtering light intensity level data from said touch sense image.
7. A touch sensitive device as claimed in claim 4, 5 or 6 wherein said signal processor is configured to compare said difference frequency level with a second said difference frequency level determined from an alternative said expected frequency.
8. A touch sensitive device as claimed in claim 4, 5, 6 or 7 wherein said signal processor is configured to compare said difference frequency level with an average ambient light level determined from a said touch sense image.
9. A touch sensitive device as claimed in any preceding claim, wherein said touch sensor light projection is amplitude modulated, wherein said touch sense image region extends beyond said plane of light, and wherein said signal processor is configured to determine an instantaneous level of said oscillatory component of said ambient light captured by said camera from portions of said touch sense image beyond said plane of light for captured touch sense images when said projection is on.
10. A touch sensitive device as claimed in any preceding claim, wherein said signal processor is configured to suppress the effect of changes between captured touch sense images resulting from changes in average ambient light level while identifying said lateral location of said object.
11. A touch sensitive device as claimed in any preceding claim wherein the device is an image display device, the device further comprising an image projector to project a displayed image onto said surface in front of the device; wherein said touch sensor light source is configured to project said plane of light above said displayed image, and wherein said signal processor is configured to process a said touch sense image to identify said lateral location of the or each of said objects relative to said displayed image.
12. A method of ambient light interference suppression in a touch sensitive device, the device comprising:
a touch sensor light source to project a plane of light above a surface;
a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object;
the method comprising:
correlating an ambient light intensity signal from said touch sense image with a difference frequency between a frame rate or an integral fraction of a frame rate of said camera and an expected frequency of an oscillatory component of said ambient light captured by said camera;
determining when said oscillatory component of said ambient light captured by said camera has a frequency different to said integral multiple of said frame rate or said integral fraction of said frame rate of said camera; and
adjusting said frame rate of said camera to an adjusted frame rate, responsive to said determining, such that said oscillatory component of said ambient light captured by said camera has a frequency within 10% of an integral multiple of said adjusted frame rate.
13. A physical data carrier carrying processor control code to implement the method of claim 12.
14. A touch sensitive device, the device comprising:
a touch sensor light source to project a plane of light above a surface;
a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object;
wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and
wherein said signal processor is further configured to:
determine a level of ambient light captured by said camera from portions of said touch sense image beyond said plane of light for captured touch sense images when said projection is on; and
compensate for said determined level of ambient light to identify said scattered light in a said touch sense image for identifying said location of said object.
15. A touch sensitive device as claimed in claim 14 wherein said camera is a rolling shutter camera, and wherein said signal processor is configured to determine said level of ambient light from a portion of said touch sense image within said plane of light when said projection is turned from on to off, and wherein said rolling shutter captures light from a portion of a field of view of said camera where said light projection is off.
16. A touch sensitive device as claimed in claim 14 or 15 wherein said signal processor is configured to determine said level of ambient light additionally from portions of said touch sense image within said plane of light when said projection is off.
17. A touch sensitive device, the device comprising:
a touch sensor light source to project a plane of light above a surface;
a camera directed to capture a touch sense image from a region including at least a portion of said plane of light, said touch sense image comprising light scattered from said plane of light by an object approaching or touching said surface; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a lateral location of said object;
wherein said touch sensor light projection is amplitude modulated, and wherein said touch sense image region extends beyond said plane of light; and
wherein said camera is a rolling shutter camera;
wherein said rolling shutter captures light from a portion of a field of view of said camera where said projection is off; and wherein said signal processor is further configured to determine said level of ambient light from a portion of said touch sense image within said plane of light when said projection is turned from on to off.
18. A touch sensitive device as claimed in any one of claims 14 to 17 wherein the device is an image display device, the device further comprising an image projector to project a displayed image onto said surface in front of the device; wherein said touch sensor light source is configured to project said plane of light above said displayed image, and wherein said signal processor is configured to process a said touch sense image to identify said lateral location of said object relative to said displayed image.

Patent Citations (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4384201A (en) 1978-04-24 1983-05-17 Carroll Manufacturing Corporation Three-dimensional protective interlock apparatus
DE4121180A1 (en) 1991-06-27 1993-01-07 Bosch Gmbh Robert Finger input type interactive screen display system for road vehicle navigation - has panel screen with matrix of sensing elements that can be of infrared or ultrasonic proximity devices or can be touch foil contacts
US5767842A (en) 1992-02-07 1998-06-16 International Business Machines Corporation Method and device for optical input of commands or data
US6377238B1 (en) 1993-04-28 2002-04-23 Mcpheters Robert Douglas Holographic control arrangement
US6281878B1 (en) 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
US6031519A (en) 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
WO2000021282A1 (en) 1998-10-02 2000-04-13 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
US6367933B1 (en) 1998-10-02 2002-04-09 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
US6690357B1 (en) 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
GB2343023A (en) 1998-10-21 2000-04-26 Global Si Consultants Limited Apparatus for order control
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US20020021287A1 (en) 2000-02-11 2002-02-21 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US7084857B2 (en) 2000-05-29 2006-08-01 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2001093006A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Data input device
US7305368B2 (en) 2000-05-29 2007-12-04 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2001093182A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
US6650318B1 (en) 2000-10-13 2003-11-18 Vkb Inc. Data input device
US6491400B1 (en) 2000-10-24 2002-12-10 Eastman Kodak Company Correcting for keystone distortion in a digital image displayed by a digital projector
US20070222760A1 (en) 2001-01-08 2007-09-27 Vkb Inc. Data input device
US7242388B2 (en) 2001-01-08 2007-07-10 Vkb Inc. Data input device
WO2002101443A2 (en) 2001-06-12 2002-12-19 Silicon Optix Inc. System and method for correcting keystone distortion
US6611921B2 (en) 2001-09-07 2003-08-26 Microsoft Corporation Input device with two input signal generating means having a power state where one input means is powered down and the other input means is cycled between a powered up state and a powered down state
US7417681B2 (en) 2002-06-26 2008-08-26 Vkb Inc. Multifunctional integrated image sensor and application to virtual interface technology
US20040095315A1 (en) 2002-11-12 2004-05-20 Steve Montellese Virtual holographic input method and device
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060187199A1 (en) 2005-02-24 2006-08-24 Vkb Inc. System and method for projection
US7379619B2 (en) 2005-03-09 2008-05-27 Texas Instruments Incorporated System and method for two-dimensional keystone correction for aerial imaging
WO2006108443A1 (en) 2005-04-13 2006-10-19 Sensitive Object Method for determining the location of impacts by acoustic imaging
US20060244720A1 (en) 2005-04-29 2006-11-02 Tracy James L Collapsible projection assembly
US20070019103A1 (en) 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US7599561B2 (en) 2006-02-28 2009-10-06 Microsoft Corporation Compact interactive tabletop with projection-vision
WO2008038275A2 (en) 2006-09-28 2008-04-03 Lumio Inc. Optical touch panel
WO2008075096A1 (en) 2006-12-21 2008-06-26 Light Blue Optics Ltd Holographic image display systems
US7268692B1 (en) 2007-02-01 2007-09-11 Lumio Inc. Apparatus and method for monitoring hand propinquity to plural adjacent item locations
WO2008146098A1 (en) 2007-05-28 2008-12-04 Sensitive Object Method for determining the position of an excitation on a surface and device for implementing such a method
USD595785S1 (en) 2007-11-09 2009-07-07 Igt Standalone, multi-player gaming table apparatus with an electronic display
WO2010007404A2 (en) 2008-07-16 2010-01-21 Light Blue Optics Limited Holographic image display systems
WO2010073047A1 (en) 2008-12-24 2010-07-01 Light Blue Optics Limited Touch sensitive image display device
WO2010073024A1 (en) 2008-12-24 2010-07-01 Light Blue Optics Ltd Touch sensitive holographic displays
WO2010073045A2 (en) 2008-12-24 2010-07-01 Light Blue Optics Ltd Display device
