US20110254939A1 - Detecting User Input Provided To A Projected User Interface - Google Patents

Detecting User Input Provided To A Projected User Interface

Info

Publication number
US20110254939A1
US20110254939A1 (application US 13/018,549)
Authority
US
United States
Prior art keywords
true
user
condition
input
pixels classified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/018,549
Inventor
Tatiana Pavlovna Kadantseva
Ricardo Te Lim
Raymond Chow
George Lyons
Manfred Wittmeir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Epson Corp filed Critical Seiko Epson Corp
Priority to US13/018,549 priority Critical patent/US20110254939A1/en
Assigned to EPSON EUROPE ELECTRONICS GMBH reassignment EPSON EUROPE ELECTRONICS GMBH ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WITTMEIR, MANFRED
Assigned to EPSON RESEARCH & DEVELOPMENT, INC. reassignment EPSON RESEARCH & DEVELOPMENT, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHOW, RAYMOND, KADANTSEVA, TATIANA PAVLOVNA, LIM, RICARDO TE, LYONS, GEORGE
Assigned to SEIKO EPSON CORPORATION reassignment SEIKO EPSON CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EPSON EUROPE ELECTRONICS GMBH
Assigned to SEIKO EPSON CORPORATION reassignment SEIKO EPSON CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EPSON RESEARCH & DEVELOPMENT, INC.
Publication of US20110254939A1 publication Critical patent/US20110254939A1/en
Priority to US14/087,479 priority patent/US9582070B2/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J27/00 Cooking-vessels
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J36/00 Parts, details or accessories of cooking-vessels
    • A47J36/32 Time-controlled igniting mechanisms or alarm devices
    • A47J36/321 Time-controlled igniting mechanisms or alarm devices the electronic control being performed over a network, e.g. by means of a handheld device
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C15/00 Details
    • F24C15/20 Removing cooking fumes
    • F24C15/2021 Arrangement or mounting of control or safety systems
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00 Stoves or ranges heated by electric energy
    • F24C7/08 Arrangement or mounting of control or safety devices
    • F24C7/082 Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination
    • F24C7/083 Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination on tops, hot plates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B1/00 Details of electric heating devices
    • H05B1/02 Automatic switching arrangements specially adapted to apparatus; Control of heating devices
    • H05B1/0227 Applications
    • H05B1/0252 Domestic applications
    • H05B1/0258 For cooking
    • H05B1/0261 For cooking of food
    • H05B1/0263 Ovens

Definitions

  • This application relates generally to detecting whether a user intends to provide an input to a system and in particular to detecting a user's intent to provide input to an image of a control.
  • Physical switches, buttons, knobs, and controls are useful for detecting whether a user intends to provide an input to a system.
  • One problem is that known techniques are expensive.
  • a technique may infer that some activity indicates that a user intends to provide an input, but the activity may also sometimes be consistent with a lack of user intent to provide input.
  • when the technique detects the activity and infers intent to provide input, but the user, in fact, does not intend to provide input, the technique provides a false positive.
  • the problem of false positives tends to become more common when known techniques are employed in an inexpensive fashion.
  • low cost is important. Accordingly, there is a need for low-cost, robust methods and apparatus for detecting user input provided to a projected user interface.
  • the apparatus may include a first camera to capture two or more images of the surface and a unit.
  • the unit may determine whether first, second, and third conditions are true.
  • the first condition being that a particular number of pixels classified as skin color are present within one cell of the two or more images, the cell having a location substantially coinciding with the image of the control.
  • the second condition being that the pixels classified as skin color persist for at least a particular time period.
  • the third condition being that the pixels classified as skin color have a first shape.
  • the unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, and third conditions is true.
  • the apparatus may include a projector to project one or more user controls onto the surface.
  • the camera and the surface may be in a fixed spatial relationship with one another.
  • the apparatus may include a second camera to capture two or more images of the surface, the second camera being spaced apart from the first camera.
  • the unit may determine whether a fourth condition is true, the fourth condition being that the pixels classified as skin color and having the first shape are within a first distance from the surface.
  • the unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, and fourth conditions is true.
  • the unit may provide a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, and fourth conditions are true.
  • the unit may determine whether a fifth condition is true, the fifth condition being that a particular number of pixels classified as finger-nail color are present within the cell of the two or more images, and that a first count of the pixels classified as finger-nail color at a first time is greater than a second count of the pixels classified as finger-nail color at a second time.
  • the unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, and fifth conditions is true.
  • the unit may determine whether a sixth condition is true, the sixth condition being that a first position at a first time of the pixels classified as skin color is different from a second position at a second time of the pixels classified as skin color.
  • the unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, fifth, and sixth conditions is true.
  • the unit may provide a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, fourth, fifth, and sixth conditions are true.
  • Embodiments are also directed to methods for determining whether a user intends to provide an input using an image of a control appearing on a surface.
  • FIGS. 1A and 1B illustrate front and side views of a projector and a surface according to an embodiment.
  • FIG. 2 illustrates a plan view of the surface of FIGS. 1A and 1B .
  • FIGS. 3A and 3B illustrate projected controls
  • FIG. 3C illustrates cells corresponding with the projected controls according to one embodiment.
  • FIG. 4A is a front side view of a camera and a surface according to an embodiment.
  • FIG. 4B shows an exemplary frame captured by the camera.
  • FIG. 5 is a front view of a projector, a camera, and a surface according to an embodiment.
  • FIGS. 6A and 6B show flowcharts of methods for determining whether a user intends to provide input using a projected control according to embodiments.
  • FIG. 7 is a flow chart of an embodiment of a method for performing validation tests according to an embodiment.
  • FIG. 8 depicts a finger overlaying an exemplary cell.
  • FIG. 9 is a front view of a projector, first and second cameras, and a surface according to an embodiment.
  • FIG. 10 depicts captured images of a finger overlaying an exemplary cell.
  • FIG. 11 illustrates first and second cameras, and a surface according to an embodiment.
  • FIG. 12 depicts captured images of a finger overlaying an exemplary cell.
  • FIG. 13 is a flowchart of a method according to an embodiment.
  • FIG. 14 depicts a finger overlaying an exemplary region of interest according to an embodiment.
  • FIG. 15 is a flowchart of a method according to an embodiment.
  • FIGS. 1A and 1B illustrate front and side views, respectively, of an embodiment in which a projector 100 projects a visible image onto a planar surface 104 .
  • FIG. 2 illustrates a plan view of the planar surface 104 .
  • the projected image includes a control 102 (“projected control”).
  • the projected control 102 may be projected anywhere within a projection area 106 and may be projected so as to appear at a known location within the projection area.
  • the projection area 106 may be rectangular, but this is not critical as other shapes may be used.
  • the projector 100 may project two or more projected controls 102 .
  • the surface 104 may be made from any desired material and may be any desired color.
  • the surface 104 may be a horizontal surface, such as a countertop.
  • a horizontal surface 104 may be a table for playing games, such as card games.
  • the surface 104 may also be a vertical surface, such as a wall or a side of an appliance.
  • a vertical surface 104 may be a side of a kiosk or vending machine.
  • the surface 104 may be a cooktop or a side of a refrigerator.
  • the surface 104 may lie in a plane that is neither horizontal nor vertical.
  • the surface 104 may lie in a plane that makes an 80 degree angle with the horizontal.
  • the surface 104 may be non-planar.
  • the surface 104 may be concave or convex, or the part of the surface 104 within the projection area 106 may be, wholly or partially, concave or convex.
  • the projector 100 and surface 104 may be installed at particular locations so that the projector and surface are in a fixed spatial relationship with one another.
  • the projection area 106 may have fixed dimensions and may appear at a fixed location on the surface 104 .
  • the projection area 106 may be 40×30 cm.
  • the digital image that is input to the projector 100 and used by the projector to create a projected image may have fixed horizontal and vertical dimensions.
  • the input image may be 800×600 pixels, and points “a” and “d” in FIG. 2 may correspond with pixel coordinates (0, 0) and (800, 600), respectively, of the input image.
  • the input image may be mapped to the projection area 106 .
  • 16 pixels of input image may be mapped to 1 cm of the projection area 106 on the surface 104 .
  • the coordinate location of each pixel corresponds with a horizontal and vertical distance from a corner point such as point “a.”
  • the projected control 102 may be projected to appear at a particular coordinate location in projection area 106 on the surface 104 .
  • the coordinate location of each pixel thus corresponds with a physical location and is a known distance from one of the corner points: a, b, c, and d, each of which is projected onto known locations on the surface 104 .
  • FIG. 3A shows a plurality of projected controls 300 projected onto projection area 304 of a cooktop surface 302 according to one embodiment.
  • An image of a projected control may be any type of image and any desired size.
  • a projected control may be a button or a sliding switch.
  • FIG. 3B shows exemplary projected controls 306 , 308 , 310 , and 312 in projection area 304 for selecting a heating element 305 , for increasing, and for decreasing the temperature of a selected element.
  • Each control 306 , 308 , 310 , and 312 is displayed within a cell.
  • FIG. 3C shows cells 314 , 316 , 318 , and 320 that correspond with the projected controls 306 , 308 , 310 , and 312 , respectively.
  • the cells are not displayed on the cooktop; FIG. 3C merely shows the locations of the cells.
  • each cell is 60×60 pixels and corresponds with a 30×30 mm region of the surface 104, which is approximately 3/2 the width of an exemplary human finger.
  • Each cell corresponds with a particular location on the surface 302 .
  • a visible light camera 400 may capture images of the projection area 106 on the surface 104 .
  • the visible light camera 400 may capture digital images at a frame rate of 30 fps, but this is not critical as other frame rates may be used.
  • the camera 400 may include a CCD or CMOS image sensor having an array of sensor pixels.
  • the camera 400 and surface 104 may be installed at particular locations so that the camera and surface are in a fixed spatial relationship with one another.
  • FIG. 4B shows an exemplary frame 402 captured by camera 400 .
  • the digital image 402 may have fixed dimensions.
  • the frame 402 may be 800×600 pixels.
  • pixels of images 402 captured by the camera 400 may be mapped to physical locations in the projection area 106 .
  • points “e” and “h” of the frame 402 may be mapped to points “a” and “d” of projected area 106 ( FIG. 2 ).
  • the camera 400 may capture projected controls 102 that are projected onto the projection area 106 by projector 100 . Accordingly, the spatial locations of projected controls 102 on a projection surface may be determined from the control's pixel coordinates in captured frame 402 .
  • the camera 400 may capture any physical objects within the projection area 106 . For example, a human user's hand or finger situated between the surface 104 and camera 400 may be captured by the camera 400 .
  • the spatial locations of physical objects situated between the surface 104 and camera 400 may also be determined from object's pixel coordinates in captured frame 402 .
  • FIG. 5 shows one example of the projector 100 and camera 400 positioned to one side of the surface 104 .
  • the projector 100 may be positioned so that the projector is centered with respect to the surface 104 .
  • the camera 400 may be positioned so that the camera is centered with respect to the surface 104.
  • the projector 100 may be positioned to one side of the surface 104 .
  • the camera 400 may be positioned to one side of the surface 104 .
  • the projector 100 and the camera 400 may be positioned on any side of the surface 104 , e.g., front, back, left, or right sides.
  • the projector 100 may be positioned to one side and the camera 400 may be positioned to a different side of the projection area.
  • a control unit 502 may be coupled (wired or wirelessly) with the projector 100 and camera 400 according to one embodiment.
  • the control unit 502 may provide images to the projector 100 and receive frames 402 from the camera 400 .
  • the control unit 502 may include a memory.
  • the control unit 502 may perform operations for determining whether a user intends to provide input using a projected control.
  • the control unit 502 may generate a signal indicative of whether a user intends to provide input using a projected control.
  • the control unit 502 may execute instructions stored in a memory.
  • the control unit 502 may be a CPU, a digital signal processor (DSP), or another type of processor, or a state machine.
  • the control unit 502 may be formed on an integrated circuit (IC).
  • FIG. 5 also shows a processing unit 504 , which may be coupled with a memory 506 and an appliance control unit 508 .
  • the processing unit 504 may execute instructions stored in a memory.
  • the processing unit 504 may issue commands to the appliance control unit 508 .
  • the processing unit 504 may be a CPU, a digital signal processor (DSP), or another type of processor, or a state machine.
  • the processing unit 504 may be formed on an IC. Instructions or software executed by the processing unit 504 may enable it to perform known processing and communication operations. In addition, in one embodiment, the instructions or software enable the processing unit 504 to perform operations for determining whether a user intends to provide input using a projected control.
  • the appliance control unit 508 may be a unit of an appliance and may control the operation of the appliance.
  • control unit 502 provides a signal to the processing unit 504 indicative of a user's intent to provide input using a projected control.
  • the control unit 502 and processing unit 504 may be coupled with one another via wireless transceivers 510 , 512 .
  • the units 502 , 504 may be coupled by any other desired means, e.g., wire or optical fiber.
  • the processing unit 504 may cause an action in an appliance in response to receipt of a signal indicative of a user's intent to provide input using a projected control.
  • Projecting an image onto a surface at an angle results in distortion of the image's dimensions known as a “keystone effect.”
  • the projected image may be warped prior to projection in order to prevent or minimize keystone distortion.
  • capturing an image from a planar surface at an angle results in distortion of the image's dimensions similar to the “keystone effect.”
  • the captured image may be inverse-warped to prevent or minimize inverse-keystone distortion prior to determining the physical locations of objects in the captured image.
  • FIG. 6A shows a flowchart of a method 600 for determining whether a user intends to provide input using a projected control, e.g., to activate or deactivate the control.
  • one cell corresponding with a physical location of a projected control is repeatedly searched at periodic intervals. Searching may involve continuously scanning the pixels of a cell at periodic intervals, e.g., as each frame or two or more frames are captured. If activity is detected within the cell, it may be tentatively concluded that the user intends to provide input using a projected control. The conclusion may be treated as tentative because the activity may also be consistent with a lack of user intent to provide input.
  • the activity may be a “false positive.” If activity is detected within a cell, the method proceeds to operation 604 in which one or more validation tests may be conducted to determine whether the tentative conclusion should be treated as true. If activity is not detected within a cell, the searching of the cell at periodic intervals may continue.
  • FIG. 6B shows a flowchart of an alternative method 606 .
  • two or more cells of a frame are repeatedly searched for activity at periodic intervals. If activity is detected within one cell, the method proceeds to operation 610 where it is determined if activity is also detected in another of the two or more cells. If activity is also detected in another cell, the method returns to operation 608 .
  • the detection of activity in more than one cell may be inconsistent with an intent to select a single projected control. For example, a user's finger may be in motion, passing over a control within the sample time period. If activity is not detected in another cell, the method proceeds to operation 612 where one or more validation tests are performed to determine whether the tentative conclusion that the user intends to provide input using a projected control should be treated as true. Otherwise, searching of the two or more cells continues at periodic intervals.
  • a search for activity may include examining each pixel within a cell and classifying pixels as either skin-colored or non-skin-colored pixels.
  • pixels of frame 402 are in an RGB color space and a pixel may be classified according to the following known test. Specifically, a pixel may be classified as skin-colored if the following conditions are true:
  • Pixels not satisfying the above conditions are classified as non-skin colored.
  • the pixel classifying test may classify a region of the finger occupied by a finger nail as non-skin-colored; however, this is not essential.
  • alternative tests for classifying pixels into skin-colored and non-skin-colored pixels may be used.
  • Alternative tests may operate on pixels in color spaces other than RGB, e.g., YUV.
  • a count of skin-colored pixels may be generated. If the number of pixels within a cell that are classified as skin-colored exceeds a particular threshold, then it may be tentatively concluded that the user's finger is within the boundaries of the cell and the user intends to provide input using the projected control.
  • the threshold may be 2400 pixels for a 60×60 pixel cell.
  • a search for activity e.g., operations 602 and 608 , may include any known edge detection method. If one or more edges between a non-skin colored region and a skin colored region are detected within a cell, it may be tentatively concluded that the user's finger is within the boundaries of the cell.
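  • As a concrete illustration of the activity search in operations 602 and 608, the minimal sketch below scans one cell of a captured frame, counts the pixels a skin classifier marks as skin-colored, and compares the count with the 2400-pixel threshold quoted above for a 60×60-pixel cell. The function names and the idea of passing the classifier in as a parameter are assumptions; one candidate classifier is sketched at the end of this document.

```python
import numpy as np

def cell_activity(frame_rgb, cell_rect, classify_skin, min_skin_pixels=2400):
    """Tentatively report activity when enough pixels inside one cell are
    classified as skin-colored (2400 for a 60x60-pixel cell, as above).
    `classify_skin` returns a boolean mask for an h x w x 3 RGB array."""
    x, y, w, h = cell_rect                     # cell location in frame pixels
    mask = classify_skin(frame_rgb[y:y + h, x:x + w])
    return int(np.count_nonzero(mask)) >= min_skin_pixels

def scan_cells(frame_rgb, cells, classify_skin):
    """Operation-608-style search: report which cells show activity. A single
    active cell is a tentative selection; more than one active cell is
    treated as inconsistent with selecting a single control."""
    return [i for i, rect in enumerate(cells)
            if cell_activity(frame_rgb, rect, classify_skin)]
```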
  • FIG. 7 is a flow chart of an embodiment of a method 614 for performing validation tests.
  • the method 614 may include a valid time operation 618 , a valid shape operation 620 , and a valid height operation 622 .
  • in operation 624, if all of the operations return a valid or confirming result, the tentative conclusion that the user's finger is within the boundaries of the cell and the user intends to provide input using the projected control is confirmed.
  • in operation 626, if any of the operations does not return a valid result, the tentative conclusion is not confirmed.
  • either operation 604 or 612 may include the method 614 .
  • either operation 604 or 612 may include any one or more of the individual validation operations 618, 620, and 622.
  • the valid time operation 618 determines if activity is present for a time period exceeding a threshold time interval. For example, if a sufficient number of skin-color pixels are present for a time period of 0.5 second or longer, then the operation returns a confirming result. On the other hand, if a sufficient number of skin-color pixels is present but for less than the time threshold, then the operation returns an invalid result and it may be concluded that activity is not present.
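  • A minimal sketch of the valid time operation 618, assuming a fixed camera frame rate (30 fps below, matching the example rate given earlier); the class name and the frame-counting approach are illustrative and not taken from the patent.

```python
class ValidTimeCheck:
    """Valid-time test: activity must persist for at least `min_seconds`
    (0.5 s in the text) before it is treated as confirming."""

    def __init__(self, min_seconds=0.5, fps=30):
        self.required = max(1, int(round(min_seconds * fps)))
        self.consecutive = 0

    def update(self, activity_detected):
        """Feed one frame's activity flag; returns True once activity has
        been present in enough consecutive frames."""
        self.consecutive = self.consecutive + 1 if activity_detected else 0
        return self.consecutive >= self.required
```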
  • the valid shape operation 620 may include determining if the shape of the skin-colored pixel region matches a valid shape. If the detected shape matches a valid shape, then the operation returns a confirming result. On the other hand, if the detected shape fails to match a valid shape, then the operation returns an invalid result and it may be concluded that activity is not present.
  • FIG. 8 illustrates a finger overlaying an exemplary 30×30 mm cell.
  • the cell may be subdivided into two regions 800, 802. Any known edge-detection algorithm may be employed to determine edges of the skin-colored region. Two side edges of a finger may be determined; however, this is not critical. In one embodiment, one side or front edge may be determined.
  • first and second widths W 1 and W 2 of the skin-colored region are determined.
  • a valid shape may be any other desired shape, e.g., polygonal or elliptical.
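  • The text leaves the exact shape criterion open, so the sketch below makes one plausible choice: measure the widths W1 and W2 of the skin-colored region in the upper and lower halves of the cell and accept shapes whose widths fall in a finger-like range with nearly parallel sides. The width bounds and taper limit are assumptions.

```python
import numpy as np

def region_widths(skin_mask):
    """Widths W1 and W2 of the skin-colored region (True pixels) in the upper
    and lower halves of a cell mask."""
    def width(half):
        cols = np.flatnonzero(half.any(axis=0))
        return 0 if cols.size == 0 else int(cols[-1] - cols[0] + 1)
    mid = skin_mask.shape[0] // 2
    return width(skin_mask[:mid]), width(skin_mask[mid:])

def valid_shape(skin_mask, min_w=20, max_w=50, max_taper=10):
    """Accept roughly finger-like regions: both measured widths within a
    plausible finger range (in pixels) and the two widths nearly equal."""
    w1, w2 = region_widths(skin_mask)
    return (min_w <= w1 <= max_w and min_w <= w2 <= max_w
            and abs(w1 - w2) <= max_taper)
```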
  • the valid height operation 622 determines if the height of the skin-colored pixel region is less than a threshold height. If the detected height is less than the threshold height, then the operation returns a confirming result. On the other hand, if the detected height is greater than the threshold height, then the operation returns an invalid result and it may be concluded that activity is not present.
  • FIG. 9 illustrates an embodiment for determining if the detected height is less than a threshold height. In the shown embodiment, the projector 100 and the camera 400 are positioned to one side of the surface 104 . In addition, a second camera 900 is positioned to the same side of the surface 104 .
  • Each camera obtains frames that include an image of the cell, e.g., the camera 400 may capture a cell image 1000 and the camera 900 may capture a cell image 1002 , as shown in FIG. 10 .
  • Any known edge-detection algorithm may be employed to determine one or more edges for each of the skin-colored regions of the images 1000 and 1002 .
  • the valid height operation 622 may include determining a distance “C” between comparable edges of the images 1000 , 1002 . If the distance C is less than or equal to a particular offset distance, then the detected height is less than the threshold height and the operation 622 may return a confirming result. A low or zero detected height is consistent with a user's finger touching a surface displaying a projected control.
  • the touching of a control is consistent with a user intent to provide an input via the projected control.
  • if the distance C is greater than the particular offset distance, then the detected height is greater than the threshold height and the operation 622 may return an invalid result.
  • a detected height above a threshold height is consistent with a user's finger hovering above the surface displaying the projected control, which is inconsistent with a user intent to provide an input via the projected control.
  • the particular threshold height may be determined empirically.
  • the particular offset distance C may be determined from geometry.
  • FIG. 11 illustrates how elevation of an object captured by two cameras may be determined from the offset distance C according to the following expression:
  • the elevation of object 1100 above a projection surface 104 is “Y” and the elevation of the cameras 400 and 900 is “H.”
  • the distance between the cameras 400 and 900 is “L.” While FIG. 11 shows both cameras 400 and 900 to one side of the object 1100, expression (1) may also be used where cameras 400 and 900 are on opposite sides of the object 1100.
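  • The extract above omits expression (1) itself, so the sketch below uses the relationship that follows from similar triangles for the quantities defined in the text (cameras at height H separated by L, object at elevation Y, surface-plane offset C between the two views): C = L·Y/(H − Y), i.e. Y = C·H/(L + C). Treat the formula, the function names, and the height threshold as assumptions.

```python
def height_above_surface(c, h, l):
    """Elevation Y of an object above the projection surface, given the
    offset C between comparable edges in the two surface-registered views,
    camera height H and camera separation L (all in the same units)."""
    return c * h / (l + c)

def valid_height(c, h, l, max_height=5.0):
    """Valid-height test: confirm only when the estimated elevation is at or
    below a small threshold, consistent with a finger touching the surface
    rather than hovering above it (5 mm is an assumed value; the text says
    the threshold may be determined empirically)."""
    return height_above_surface(c, h, l) <= max_height
```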
  • FIGS. 6A and 6B show exemplary methods 600 and 606 for determining whether a user intends to provide input using a projected control.
  • FIG. 7 shows an exemplary method 614 for validating a determination that a user intends to provide input using a projected control.
  • if each of the valid time operation 618, valid shape operation 620, and valid height operation 622 returns a result confirming the determination of one of the methods 600, 606 that a user intends to provide input using a projected control, then it is determined that a user input is detected (operation 624).
  • FIG. 12 shows images 1200 , 1202 , and 1204 of a particular cell captured by a camera, e.g., the camera 400 .
  • a human finger is situated between the surface 104 and camera 400 .
  • FIG. 13 illustrates operations in a method 1300 for determining whether a tentative conclusion that the user's finger is within the boundaries of the cell and that the user intends to provide input using the projected control should be confirmed.
  • either operation 604 or 612 may include the method 1300 .
  • either operation 604 or 612 may include the method 1300 separately or in addition to any one or more of the individual validation operations 618, 620, and 622.
  • the operation 1302 may include examining each pixel within a cell and classifying each pixel as either a skin-colored pixel or a non-skin-colored pixel.
  • pixels of frame 402 may be received from the camera 400 in a YUV color space or the received frame may be converted to a YUV color space, and a pixel may be classified as skin-colored according to the following test: If
  • the pixel may be considered skin color.
  • the area 1206 of image 1200 shows an area where pixels may be classified as skin-colored pixels.
  • the operation 1302 is not limited to the skin-color test set forth above. In other embodiments, any suitable alternative skin-color test may be used.
  • the operation 1304 may include examining each pixel within a cell and classifying each pixel as either a finger-nail-colored pixel or a non-finger-nail-colored pixel.
  • pixels of frame 402 may be classified according to the following known test: If
  • pixel is nail color.
  • the area 1208 of image 1202 shows an area where pixels may be classified as finger-nail-colored pixels.
  • the operation 1304 is not limited to the nail-color test set forth above. In other embodiments, any suitable alternative nail-color test may be used.
  • One advantage of using a nail-color test is that the variation in color of fingernails among humans of various races is believed to be smaller than the variation in color of skin.
  • the operation 1306 may include comparing the number of nail-colored pixels with a minimum threshold. This operation is useful for detecting situations where a condition is present that may interfere with a test described below. For example, if the user is wearing nail polish, the counted number of nail-colored pixels is not likely to satisfy the minimum threshold. The wearing of nail polish may interfere with a test described below.
  • the operation 1308 may include comparing a count of the number of nail-colored pixels for a current frame with a corresponding count from a previous frame.
  • the previous frame may be any frame within a particular preceding time period, e.g., ½ second.
  • the count of fingernail colored pixels may change as a result of the finger being pressed against the surface 104 .
  • an area 1210 of the fingernail turns white when the finger is pressed against the surface 104 due to blood being forced out of part of the tissue under and adjacent to the fingernail.
  • while the area 1210 may take a variety of shapes and sizes depending on the particular user, it will be captured by the camera 400 as an area of generally white pixels not satisfying either of the classification tests for skin-colored or finger-nail-colored pixels. Accordingly, when the finger is pressed against the surface 104, the count of finger-nail-colored pixels will be lower than at times when the finger is not pressed against the surface, e.g., the count of finger-nail-colored pixels in image 1202 will be greater than the count of finger-nail-colored pixels in image 1204.
  • the operation 1310 may include determining whether a difference between a count of a current frame and a previous frame is greater than a threshold.
  • a pixel difference threshold may be 30 pixels.
  • the operation 1308 may determine if the number of white pixels in the fingernail region or in the cell exceeds a threshold. The presence of white pixels is due, as mentioned, to a portion of the fingernail turning white when the finger is pressed against a surface.
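  • A minimal sketch of the fingernail-whitening test in operations 1304–1310, assuming a 30 fps camera and a half-second lookback; the 30-pixel drop threshold is the example given above, while the minimum nail-pixel count used to abstain (e.g. when nail polish hides the nail) is an assumed value.

```python
from collections import deque

class NailPressCheck:
    """Detect a press by a drop in the count of fingernail-colored pixels
    relative to a frame roughly half a second earlier."""

    def __init__(self, fps=30, lookback_seconds=0.5, min_drop=30, min_nail_pixels=50):
        self.history = deque(maxlen=max(1, int(round(fps * lookback_seconds))))
        self.min_drop = min_drop
        self.min_nail_pixels = min_nail_pixels

    def update(self, nail_pixel_count):
        """Returns True (press detected), False (no press), or None (abstain:
        too few nail-colored pixels to apply the test)."""
        earlier = self.history[0] if self.history else None
        self.history.append(nail_pixel_count)
        if earlier is None:
            return False
        if max(earlier, nail_pixel_count) < self.min_nail_pixels:
            return None
        return earlier - nail_pixel_count >= self.min_drop
```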
  • a user may wear fingernail polish, which may interfere with the classification of particular pixels as nail colored and the comparing of counts of nail-colored pixels.
  • the hands and fingers of all users have some degree of tremor, tremble, or involuntary shaking movement, i.e., a user's outstretched hand will generally have at least some tremble, the degree of tremble depending on the particular individual.
  • the tremor or tremble generally stops.
  • the operations 1312 and 1314 may be employed where it is difficult or impossible to compare counts of nail-colored pixels.
  • the operation 1312 evaluates a region of interest 1400 comprised of a matrix of skin-colored and non-skin-colored pixels.
  • FIG. 14 illustrates an exemplary region of interest.
  • FIG. 14 also illustrates one exemplary row “R” of pixel locations and one exemplary column “C” of pixel locations of the matrix.
  • the Y value of non-skin-colored pixels may be set to zero.
  • the operation 1312 may scan and calculate a row sum for each row R of the region of interest. If the Y value for a pixel location is greater than a minimum value, e.g., 50, the column number of the location is added to the row sum. For example, if only the pixels in columns 10, 11, 12, 13, 14, and 15 of a particular row have Y values greater than 50, then the sum for that row is 10+11+12+13+14+15 = 75.
  • the operation 1312 may calculate a sum of the row sums (“grand total row”). In addition to calculating row sums, the operation 1312 may scan and calculate a column sum for each column C of the region of interest. The column sum may be calculated in a manner analogous to the method for calculating row sums.
  • the operation 1312 may include calculating a sum of the column sums (“grand total column”). Additionally, the operation 1312 may determine a position metric for a frame by adding the grand total row and grand total column sums. Further, the operation 1312 may include comparing a position metric for a first frame with that of a second frame. Equal-valued position metrics evidence a lack of movement, while a difference between the position metrics of the first and second frames indicates movement of an area of skin-colored pixels. A lack of movement of an area of skin-colored pixels within the region of interest is consistent with a user's finger contacting the projection surface. A user's finger in contact with the projection surface is consistent with a user intent to provide an input via the projected control.
  • the operation 1312 may include calculating a difference between the position metrics of first and second frames, and determining whether the position difference is greater than a threshold positional difference amount.
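  • The sketch below implements the position metric as described: for every pixel in the region of interest whose Y (luma) value exceeds 50, add its column index to the row sums and its row index to the column sums, total both, and compare the totals between frames. The movement threshold is an assumed value.

```python
import numpy as np

def position_metric(y_channel, min_luma=50):
    """Grand-total row sum plus grand-total column sum for a region of
    interest whose non-skin pixels have had their Y values set to zero."""
    rows, cols = np.nonzero(np.asarray(y_channel) > min_luma)
    grand_total_row = int(cols.sum())   # each qualifying pixel adds its column index
    grand_total_col = int(rows.sum())   # each qualifying pixel adds its row index
    return grand_total_row + grand_total_col

def finger_is_still(y_prev, y_curr, max_difference=100):
    """Sixth-condition sketch: a (near-)unchanged position metric between two
    frames indicates the natural hand tremor has stopped, which the text
    treats as consistent with the finger pressing on the surface."""
    return abs(position_metric(y_prev) - position_metric(y_curr)) <= max_difference
```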
  • the operations 1312 and 1314 may be performed in addition to operations 1308 and 1310 to provide additional confirmation or confidence.
  • the operation 1312 may be performed subsequent to operation 1310 as shown by the dashed line in FIG. 13 .
  • FIG. 15 shows an exemplary method 1500 for combining validation test results.
  • in operation 1502, one or more cells corresponding with a physical location of a projected control may be repeatedly searched. If activity is detected within a cell, it may be tentatively concluded that the user intends to provide input using a projected control.
  • in operation 1504, it is determined if activity is also detected in another cell (in cases where two or more cells are searched). If activity is also detected in another cell, the method returns to operation 1502.
  • the operations 1502 and 1504 may be the same as or similar to any corresponding operation of the operations 602 - 612 .
  • operations 1506 , 1508 , 1510 , 1512 , and 1514 may be invoked. Each of the operations 1506 - 1514 may independently determine whether a tentative conclusion should be confirmed and may return an indication of its determination.
  • the operations 1506, 1508, and 1510 correspond respectively with the operations 618, 620, and 622 described above.
  • the operation 1512 corresponds with the operations 1302 , 1304 , 1306 , 1308 , and 1310 .
  • the operation 1514 corresponds with the operations 1302 , 1304 , 1306 , 1312 , and 1314 .
  • a decision operation 1516 receives the confirming/non-confirming indications of each of the operations 1506 - 1514 .
  • each indication is given one vote and a majority of votes determines whether to confirm the tentative conclusion that the user intends to use the projected control.
  • the operation 1512 may return an “abstaining” indication if the operation is unable to detect a sufficient number of fingernail-colored pixels.
  • the operation 1516 may include a decision based on a weighted polling of validation tests. The method 1500 provides the advantage that a group of tests will always outperform most of the individual tests. A further advantage of the method 1500 is that each of the tests is relatively inexpensive to implement.
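  • A minimal sketch of decision operation 1516: each validation test casts a confirming, non-confirming, or abstaining indication, and the tentative conclusion is confirmed when the (optionally weighted) confirming votes form a majority of the non-abstaining weight. Equal weights reproduce the one-vote-per-test scheme; the weight values themselves are not specified in the text.

```python
def combine_validations(indications, weights=None):
    """Confirm the tentative conclusion by (weighted) majority vote.
    `indications` holds True (confirm), False (reject), or None (abstain)."""
    if weights is None:
        weights = [1.0] * len(indications)
    cast = [(vote, w) for vote, w in zip(indications, weights) if vote is not None]
    if not cast:
        return False                  # nothing but abstentions: do not confirm
    total = sum(w for _, w in cast)
    confirming = sum(w for vote, w in cast if vote)
    return confirming > total / 2

# Example: time, shape, and stillness tests confirm, height rejects, and the
# fingernail test abstains -> the tentative conclusion is confirmed.
print(combine_validations([True, True, False, None, True]))  # True
```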
  • projected controls include advantages such as the appearance of the control not being degraded (e.g., wearing down) with repeated physical contact, the appearance of the control being readily modifiable, and the ability to hide the control when it is not needed. While embodiments have been described in terms of detecting a user's intent to provide input to a projected user interface, it is not essential that the image of the control be a projected image. In one embodiment the one or more projected controls described in the various embodiments above may be replaced with a non-projected image on the surface 104 , such as a painted image, an engraved image, an image applied as a decal, label, or sticker, or other non-projected image.
  • any of the operations described in this specification that form part of the embodiments are useful machine operations.
  • some embodiments relate to a device or an apparatus specially constructed for performing these operations.
  • the embodiments may be employed in a general purpose computer selectively activated or configured by a computer program stored in the computer.
  • various general purpose computer systems may be used with computer programs written in accordance with the teachings herein. Accordingly, it should be understood that the embodiments may also be embodied as computer readable code on a computer readable medium.
  • a computer readable medium is any data storage device that can store data which can be thereafter read by a computer system.
  • Examples of the computer readable medium include, among other things, flash drives, floppy disks, memory cards, hard drives, RAMs, ROMs, EPROMs, compact disks, and magnetic tapes.
  • any method described above may be stored as a program of instructions on a computer readable medium.

Abstract

Apparatus and methods for determining whether a user intends to provide an input using an image of a control appearing on a surface. An apparatus may include a first camera to capture two or more images of the surface, and a unit to determine whether various conditions are true. A first condition is that a particular number of pixels classified as skin color are present within one cell of the two or more images. A cell has a location substantially coinciding with the image of the control. A second condition is that the pixels classified as skin color persist for at least a particular time period. A third condition is that the pixels classified as skin color have a first shape. Additional conditions are disclosed. A signal indicative of an intent of a user to provide an input may be provided if each of the first, second, and third conditions is true.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit under 35 USC Section 119(e) of U.S. Provisional Patent Application Ser. No. 61/325,088, filed Apr. 16, 2010, entitled “Projected User Interface.” The present application is based on and claims priority from this provisional application, the disclosure of which is hereby expressly incorporated herein by reference in its entirety.
  • FIELD
  • This application relates generally to detecting whether a user intends to provide an input to a system and in particular to detecting a user's intent to provide input to an image of a control.
  • BACKGROUND
  • The appearance of electro-mechanical and other types of switches, buttons, knobs, and controls tends to degrade with repeated use. In addition, the appearance of physical switches and controls is generally fixed so that modifying the language or an iconic image on a physical switch or control requires replacement of the control. Moreover, it is sometimes desired to hide controls when they are not needed. Generally, this is not possible with physical controls without introducing additional structure, such as a movable panel.
  • Physical switches, buttons, knobs, and controls are useful for detecting whether a user intends to provide an input to a system. There are problems with known techniques for detecting whether a user intends to provide an input where a physical control is not present. One problem is that known techniques are expensive. There is also a problem of “false positives.” A technique may infer that some activity indicates that a user intends to provide an input, but the activity may also sometimes be consistent with a lack of user intent to provide input. When the technique detects the activity and infers intent to provide input, but the user, in fact, does not intend to provide input, the technique provides a false positive. The problem of false positives tends to become more common when known techniques are employed in an inexpensive fashion. However, low cost is important. Accordingly, there is a need for low-cost, robust methods and apparatus for detecting user input provided to a projected user interface.
  • SUMMARY
  • One aspect is directed to an apparatus for determining whether a user intends to provide an input using an image of a control appearing on a surface. The apparatus may include a first camera to capture two or more images of the surface and a unit. The unit may determine whether first, second, and third conditions are true. The first condition being that a particular number of pixels classified as skin color are present within one cell of the two or more images, the cell having a location substantially coinciding with the image of the control. The second condition being that the pixels classified as skin color persist for at least a particular time period. The third condition being that the pixels classified as skin color have a first shape. The unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, and third conditions is true.
  • In one embodiment, the apparatus may include a projector to project one or more user controls onto the surface. In addition, the camera and the surface may be in a fixed spatial relationship with one another.
  • In one embodiment, the apparatus may include a second camera to capture two or more images of the surface, the second camera being spaced apart from the first camera. The unit may determine whether a fourth condition is true, the fourth condition being that the pixels classified as skin color and having the first shape are within a first distance from the surface. The unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, and fourth conditions is true. In one alternative embodiment, the unit may provide a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, and fourth conditions are true.
  • In one embodiment, the unit may determine whether a fifth condition is true, the fifth condition being that a particular number of pixels classified as finger-nail color are present within the cell of the two or more images, and that a first count of the pixels classified as finger-nail color at a first time is greater than a second count of the pixels classified as finger-nail color at a second time. The unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, and fifth conditions is true.
  • In one embodiment, the unit may determine whether a sixth condition is true, the sixth condition being that a first position at a first time of the pixels classified as skin color is different from a second position at a second time of the pixels classified as skin color. The unit may provide a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, fifth, and sixth conditions is true. In one alternative embodiment, the unit may provide a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, fourth, fifth, and sixth conditions are true.
  • Embodiments are also directed to methods for determining whether a user intends to provide an input using an image of a control appearing on a surface.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A and 1B illustrate front and side views of a projector and a surface according to an embodiment.
  • FIG. 2 illustrates a plan view of the surface of FIGS. 1A and 1B.
  • FIGS. 3A and 3B illustrate projected controls, and FIG. 3C illustrates cells corresponding with the projected controls according to one embodiment.
  • FIG. 4A is a front side view of a camera and a surface according to an embodiment. FIG. 4B shows an exemplary frame captured by the camera.
  • FIG. 5 is a front view of a projector, a camera, and a surface according to an embodiment.
  • FIGS. 6A and 6B show flowcharts of methods for determining whether a user intends to provide input using a projected control according to embodiments.
  • FIG. 7 is a flow chart of an embodiment of a method for performing validation tests according to an embodiment.
  • FIG. 8 depicts a finger overlaying an exemplary cell.
  • FIG. 9 is a front view of a projector, first and second cameras, and a surface according to an embodiment.
  • FIG. 10 depicts captured images of a finger overlaying an exemplary cell.
  • FIG. 11 illustrates first and second cameras, and a surface according to an embodiment.
  • FIG. 12 depicts captured images of a finger overlaying an exemplary cell.
  • FIG. 13 is a flowchart of a method according to an embodiment.
  • FIG. 14 depicts a finger overlaying an exemplary region of interest according to an embodiment.
  • FIG. 15 is a flowchart of a method according to an embodiment.
  • DETAILED DESCRIPTION
  • While embodiments may be described generally below, it will be appreciated that the principles and concepts described in this specification may be implemented in a wide variety of contexts, including controls for games, computers, kiosks, vending machines, or other types of machines found in a home, office, or factory. In particular, principles and concepts described in this specification are applicable to home appliances, such as those found in the kitchen or laundry, and to entertainment devices found in the home, such as games and audio/video entertainment.
  • FIGS. 1A and 1B illustrate front and side views, respectively, of an embodiment in which a projector 100 projects a visible image onto a planar surface 104. FIG. 2 illustrates a plan view of the planar surface 104. The projected image includes a control 102 (“projected control”). The projected control 102 may be projected anywhere within a projection area 106 and may be projected so as to appear at a known location within the projection area. The projection area 106 may be rectangular, but this is not critical as other shapes may be used. In addition, the projector 100 may project two or more projected controls 102. The surface 104 may be made from any desired material and may be any desired color. The surface 104 may be a horizontal surface, such as a countertop. In one embodiment, a horizontal surface 104 may be a table for playing games, such as card games. The surface 104 may also be a vertical surface, such as a wall or a side of an appliance. In one embodiment, a vertical surface 104 may be a side of a kiosk or vending machine. As additional examples, the surface 104 may be a cooktop or a side of a refrigerator. In one embodiment, the surface 104 may lie in a plane that is neither horizontal nor vertical. For example, the surface 104 may lie in a plane that makes an 80 degree angle with the horizontal. In one embodiment, the surface 104 may be non-planar. For example, the surface 104 may be concave or convex, or the part of the surface 104 within the projection area 106 may be, wholly or partially, concave or convex.
  • The projector 100 and surface 104 may be installed at particular locations so that the projector and surface are in a fixed spatial relationship with one another. The projection area 106 may have fixed dimensions and may appear at a fixed location on the surface 104. As one example, the projection area 106 may be 40×30 cm. In addition, the digital image that is input to the projector 100 and used by the projector to create a projected image may have fixed horizontal and vertical dimensions. As one example, the input image may be 800×600 pixels, and points “a” and “d” in FIG. 2 may correspond with pixel coordinates (0, 0) and (800, 600), respectively, of the input image. The input image may be mapped to the projection area 106. For example, 16 pixels of input image may be mapped to 1 cm of the projection area 106 on the surface 104. For every pixel of the input image, there is a coordinate location in the projection area 106 on the surface 104. In addition, the coordinate location of each pixel corresponds with a horizontal and vertical distance from a corner point such as point “a.” Accordingly, the projected control 102 may be projected to appear at a particular coordinate location in projection area 106 on the surface 104. The coordinate location of each pixel thus corresponds with a physical location and is a known distance from one of the corner points: a, b, c, and d, each of which is projected onto known locations on the surface 104.
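  • The fixed mapping between input-image pixels and physical locations can be captured in a couple of lines; the sketch below simply scales coordinates using the example 800×600-pixel image and 40×30 cm projection area quoted above. Any other fixed projector/surface geometry would supply its own constants, and the function names are illustrative.

```python
def pixel_to_surface_cm(px, py, image_size=(800, 600), area_cm=(40.0, 30.0)):
    """Physical location on the projection area, measured from corner point
    "a", of an input-image pixel coordinate (px, py)."""
    return (px * area_cm[0] / image_size[0],
            py * area_cm[1] / image_size[1])

def surface_cm_to_pixel(x_cm, y_cm, image_size=(800, 600), area_cm=(40.0, 30.0)):
    """Inverse mapping: where to draw a projected control in the input image
    so that it appears at a chosen physical location on the surface."""
    return (x_cm * image_size[0] / area_cm[0],
            y_cm * image_size[1] / area_cm[1])
```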
  • As one example of a projected control and projection area, FIG. 3A shows a plurality of projected controls 300 projected onto projection area 304 of a cooktop surface 302 according to one embodiment. An image of a projected control may be any type of image and any desired size. For example, a projected control may be a button or a sliding switch. FIG. 3B shows exemplary projected controls 306, 308, 310, and 312 in projection area 304 for selecting a heating element 305, for increasing, and for decreasing the temperature of a selected element. Each control 306, 308, 310, and 312 is displayed within a cell. FIG. 3C shows cells 314, 316, 318, and 320 that correspond with the projected controls 306, 308, 310, and 312, respectively. The cells are not displayed on the cooktop; FIG. 3C merely shows the locations of the cells. In one embodiment, each cell is 60×60 pixels and corresponds with a 30×30 mm region of the surface 104, which is approximately 3/2 the width of an exemplary human finger. Each cell corresponds with a particular location on the surface 302.
  • Turning now to FIG. 4A, a visible light camera 400 may capture images of the projection area 106 on the surface 104. The visible light camera 400 may capture digital images at a frame rate of 30 fps, but this is not critical as other frame rates may be used. The camera 400 may include a CCD or CMOS image sensor having an array of sensor pixels. The camera 400 and surface 104 may be installed at particular locations so that the camera and surface are in a fixed spatial relationship with one another. FIG. 4B shows an exemplary frame 402 captured by camera 400. The digital image 402 may have fixed dimensions. For example, the frame 402 may be 800×600 pixels. Like the input image, pixels of images 402 captured by the camera 400 may be mapped to physical locations in the projection area 106. For example, points “e” and “h” of the frame 402 may be mapped to points “a” and “d” of projected area 106 (FIG. 2). The camera 400 may capture projected controls 102 that are projected onto the projection area 106 by projector 100. Accordingly, the spatial locations of projected controls 102 on a projection surface may be determined from the control's pixel coordinates in captured frame 402. In addition, the camera 400 may capture any physical objects within the projection area 106. For example, a human user's hand or finger situated between the surface 104 and camera 400 may be captured by the camera 400. The spatial locations of physical objects situated between the surface 104 and camera 400 may also be determined from object's pixel coordinates in captured frame 402.
  • FIG. 5 shows one example of the projector 100 and camera 400 positioned to one side of the surface 104. As this example shows, it is not essential that the projector 100 be positioned so that the projector is centered with respect to the surface 104. In addition, it is not essential that the camera 400 be positioned so that the camera is centered with respect to the surface 104. In alternative embodiments, the projector 100 may be positioned to one side of the surface 104. Further, in alternative embodiments, the camera 400 may be positioned to one side of the surface 104. The projector 100 and the camera 400 may be positioned on any side of the surface 104, e.g., front, back, left, or right sides. In addition, it is not critical that the projector 100 and camera 400 be positioned so that they are close to or adjacent one another. For example, the projector 100 may be positioned to one side and the camera 400 may be positioned to a different side of the projection area.
  • Still referring to FIG. 5, a unit 502 may be coupled (wired or wirelessly) with the projector 100 and camera 400 according to one embodiment. The control unit 502 may provide images to the projector 100 and receive frames 402 from the camera 400. The control unit 502 may include a memory. The control unit 502 may perform operations for determining whether a user intends to provide input using a projected control. The control unit 502 may generate a signal indicative of whether a user intends to provide input using a projected control. The control unit 502 may execute instructions stored in a memory. The control unit 502 may be a CPU, a digital signal processor (DSP), or another type of processor, or a state machine. The control unit 502 may be formed on an integrated circuit (IC).
  • FIG. 5 also shows a processing unit 504, which may be coupled with a memory 506 and an appliance control unit 508. The processing unit 504 may execute instructions stored in a memory. The processing unit 504 may issue commands to the appliance control unit 508. The processing unit 504 may be a CPU, a digital signal processor (DSP), or another type of processor, or a state machine. The processing unit 504 may be formed on an IC. Instructions or software executed by the processing unit 504 may enable it to perform known processing and communication operations. In addition, in one embodiment, the instructions or software enable the processing unit 504 to perform operations for determining whether a user intends to provide input using a projected control. The appliance control unit 508 may be a unit of an appliance and may control the operation of the appliance.
  • In one embodiment, the control unit 502 provides a signal to the processing unit 504 indicative of a user's intent to provide input using a projected control. The control unit 502 and processing unit 504 may be coupled with one another via wireless transceivers 510, 512. Alternatively, the units 502, 504 may be coupled by any other desired means, e.g., wire or optical fiber. The processing unit 504 may cause an action in an appliance in response to receipt of a signal indicative of a user's intent to provide input using a projected control.
  • Projecting an image onto a surface at an angle results in distortion of the image's dimensions known as a “keystone effect.” In embodiments where the projector 100 is positioned to one side of the surface 104, the projected image may be warped prior to projection in order to prevent or minimize keystone distortion. In addition, capturing an image of a planar surface at an angle results in distortion of the image's dimensions similar to the keystone effect. The captured image may be inverse-warped to remove or minimize this distortion prior to determining the physical locations of objects in the captured image.
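  • As one possible illustration of such inverse-warping, the sketch below applies a four-point perspective (homography) transform using OpenCV, assuming the frame coordinates of the four projection-area corners are known from calibration; the specific corner values and output size shown are illustrative assumptions rather than values from the description above.

```python
import cv2
import numpy as np

def unwarp_frame(frame, corners_px, out_size=(800, 600)):
    """Inverse-warp a captured frame so the projection area appears as an
    undistorted rectangle, removing keystone-like distortion.

    corners_px: four (x, y) frame coordinates of the projection-area corners,
    ordered top-left, top-right, bottom-right, bottom-left (from calibration).
    """
    w, h = out_size
    src = np.float32(corners_px)
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, dst)   # 4-point homography
    return cv2.warpPerspective(frame, H, (w, h))

# Illustrative corner coordinates for a camera viewing the surface at an angle.
frame = np.zeros((600, 800, 3), dtype=np.uint8)
rectified = unwarp_frame(frame, [(120, 80), (700, 60), (760, 560), (60, 540)])
```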
  • FIG. 6A shows a flowchart of a method 600 for determining whether a user intends to provide input using a projected control, e.g., to activate or deactivate the control. In operation 602, one cell corresponding with a physical location of a projected control is repeatedly searched at periodic intervals. Searching may involve scanning the pixels of the cell at periodic intervals, e.g., as each frame is captured or after every two or more frames are captured. If activity is detected within the cell, it may be tentatively concluded that the user intends to provide input using a projected control. The conclusion may be treated as tentative because the activity may also be consistent with a lack of user intent to provide input. In other words, the activity may be a “false positive.” If activity is detected within a cell, the method proceeds to operation 604 in which one or more validation tests may be conducted to determine whether the tentative conclusion should be treated as true. If activity is not detected within a cell, the searching of the cell at periodic intervals may continue.
  • FIG. 6B shows a flowchart of an alternative method 606. In operation 608, two or more cells of a frame are repeatedly searched for activity at periodic intervals. If activity is detected within one cell, the method proceeds to operation 610 where it is determined if activity is also detected in another of the two or more cells. If activity is also detected in another cell, the method returns to operation 608. The detection of activity in more than one cell may be inconsistent with an intent to select a single projected control. For example, a user's finger may be in motion, passing over a control within the sample time period. If activity is not detected in another cell, the method proceeds to operation 612 where one or more validation tests are performed to determine whether the tentative conclusion that the user intends to provide input using a projected control should be treated as true. Otherwise, searching of the two or more cells continues at periodic intervals.
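  • The following is a minimal sketch of the repeated searching of operations 602 and 608-610, assuming hypothetical get_frame, detect_activity, and validate callables (for example, the skin-pixel count test and validation tests described below); the polling period is illustrative.

```python
import time

def search_cells(get_frame, cells, detect_activity, validate, period_s=1.0 / 30):
    """Repeatedly search one or more cells for activity (operations 602/608).

    If activity is found in exactly one cell, the validation tests
    (operations 604/612) are run; activity in more than one cell is treated
    as inconsistent with selecting a single control, and searching continues.
    """
    while True:
        frame = get_frame()
        active = [c for c in cells if detect_activity(frame, c)]
        if len(active) == 1 and validate(frame, active[0]):
            return active[0]          # confirmed user input on this control
        time.sleep(period_s)          # sample at roughly the frame rate
```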
  • A search for activity, e.g., operations 602 and 608, may include examining each pixel within a cell and classifying each pixel as either a skin-colored pixel or a non-skin-colored pixel. In one embodiment, pixels of frame 402 are in an RGB color space and a pixel may be classified according to the following known test. Specifically, a pixel may be classified as skin-colored if the following conditions are true:

  • R > 95, G > 40, B > 20, and  (1)
  • (max{R, G, B} − min{R, G, B}) > 15, and  (2)
  • |R − G| > 15, and  (3)
  • R > G, R > B;  (4)
  • or
  • R > 220, G > 210, B > 170, and  (1)
  • |R − G| ≦ 15, and  (2)
  • R > B, G > B.  (3)
  • Pixels not satisfying the above conditions are classified as non-skin-colored. The pixel classifying test may classify the region of the finger occupied by a fingernail as non-skin-colored; however, this is not essential. In alternative embodiments, alternative tests for classifying pixels into skin-colored and non-skin-colored pixels may be used. Alternative tests may operate on pixels in color spaces other than RGB, e.g., YUV. As each pixel within a cell is examined, a count of skin-colored pixels may be generated. If the number of pixels within a cell that are classified as skin-colored exceeds a particular threshold, then it may be tentatively concluded that the user's finger is within the boundaries of the cell and the user intends to provide input using the projected control. As one example of a threshold, the threshold may be 2400 pixels for a 60×60 pixel cell. In one alternative, a search for activity, e.g., operations 602 and 608, may include any known edge detection method. If one or more edges between a non-skin-colored region and a skin-colored region are detected within a cell, it may be tentatively concluded that the user's finger is within the boundaries of the cell.
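  • As an illustration, the following is a minimal sketch of the skin-color classification of conditions (1)-(4) above and the per-cell count, assuming 8-bit RGB values and a frame indexable as frame[y][x]; the 2400-pixel threshold for a 60×60 pixel cell is taken from the example above, and the cell representation is an illustrative assumption.

```python
def is_skin_rgb(r, g, b):
    """RGB skin-color rule used in the search of operations 602/608
    (conditions (1)-(4) above, with the alternative bright-skin rule)."""
    rule1 = (r > 95 and g > 40 and b > 20 and
             max(r, g, b) - min(r, g, b) > 15 and
             abs(r - g) > 15 and r > g and r > b)
    rule2 = (r > 220 and g > 210 and b > 170 and
             abs(r - g) <= 15 and r > b and g > b)
    return rule1 or rule2

def cell_has_activity(frame, cell, threshold=2400):
    """Count skin-colored pixels inside a cell and compare to a threshold.
    frame: indexable as frame[y][x] -> (r, g, b); cell: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = cell
    count = sum(1 for y in range(y0, y1) for x in range(x0, x1)
                if is_skin_rgb(*frame[y][x]))
    return count > threshold
```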
  • FIG. 7 is a flow chart of an embodiment of a method 614 for performing validation tests. The method 614 may include a valid time operation 618, a valid shape operation 620, and a valid height operation 622. In one embodiment, if all of the operations return a valid or confirming result, the tentative conclusion that the user's finger is within the boundaries of the cell and the user intends to provide input using the projected control is confirmed (operation 624). On the other hand, if any of the operations does not return a valid result, the tentative conclusion is not confirmed (operation 626). In one embodiment, either operation 604 or 612 may include the method 614. Alternatively, either operation 604 or 612 may include any one or more of the individual validation operations 618, 620, and 622.
  • The valid time operation 618 determines if activity is present for a time period exceeding a threshold time interval. For example, if a sufficient number of skin-colored pixels is present for a time period of 0.5 second or longer, then the operation returns a confirming result. On the other hand, if a sufficient number of skin-colored pixels is present but for less than the time threshold, then the operation returns an invalid result and it may be concluded that activity is not present.
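  • A minimal sketch of the valid time operation 618 follows, assuming a monotonic clock and the 0.5 second threshold from the example above; the class and method names are illustrative.

```python
import time

class ValidTime:
    """Valid time operation 618: activity must persist for at least min_s seconds."""
    def __init__(self, min_s=0.5):
        self.min_s = min_s
        self.start = None

    def update(self, activity_present, now=None):
        """Call once per frame; returns True once activity has persisted long enough."""
        now = time.monotonic() if now is None else now
        if not activity_present:
            self.start = None            # activity interrupted: reset the timer
            return False
        if self.start is None:
            self.start = now             # first frame of a new activity period
        return (now - self.start) >= self.min_s
```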
  • The valid shape operation 620 may include determining if the shape of the skin-colored pixel region matches a valid shape. If the detected shape matches a valid shape, then the operation returns a confirming result. On the other hand, if the detected shape fails to match a valid shape, then the operation returns an invalid result and it may be concluded that activity is not present. FIG. 8 illustrates a finger overlaying an exemplary 30×30 mm cell. For purposes of the valid shape operation 620, the cell may be subdivided into two regions 800, 802. Any known edge-detection algorithm may be employed to determine edges of the skin-colored region. Two side edges of a finger may be determined; however, this is not critical. In one embodiment, one side or front edge may be determined. After edge determination, first and second widths W1 and W2 of the skin-colored region are determined. A valid shape may be rectangular, and it may be concluded that the skin-colored region has a rectangular shape if W1 equals W2. It is not critical that W1=W2, and in one embodiment a confirming result may be returned if the absolute value of the difference between W1 and W2 is less than a particular margin (abs{W2−W1}<margin), e.g., 2.5 mm. In addition to rectangular, a valid shape may be any other desired shape, e.g., polygonal or elliptical.
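  • The width comparison of the valid shape operation 620 may be sketched as follows, assuming the widths W1 and W2 have already been measured in millimetres by an edge-detection step; the 2.5 mm margin is taken from the example above.

```python
def valid_shape(width_w1_mm, width_w2_mm, margin_mm=2.5):
    """Valid shape operation 620: treat the skin-colored region as roughly
    rectangular if the widths measured in the two sub-regions agree to
    within a small margin (abs(W2 - W1) < margin)."""
    return abs(width_w2_mm - width_w1_mm) < margin_mm

# Example: a fingertip measuring 16.0 mm and 17.5 mm across the two regions
# is accepted; 16.0 mm versus 22.0 mm is rejected.
print(valid_shape(16.0, 17.5), valid_shape(16.0, 22.0))  # True False
```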
  • Referring again to FIG. 7, the valid height operation 622 determines if the height of the skin-colored pixel region is less than a threshold height. If the detected height is less than the threshold height, then the operation returns a confirming result. On the other hand, if the detected height is greater than the threshold height, then the operation returns an invalid result and it may be concluded that activity is not present. FIG. 9 illustrates an embodiment for determining if the detected height is less than a threshold height. In the shown embodiment, the projector 100 and the camera 400 are positioned to one side of the surface 104. In addition, a second camera 900 is positioned to the same side of the surface 104. Each camera obtains frames that include an image of the cell, e.g., the camera 400 may capture a cell image 1000 and the camera 900 may capture a cell image 1002, as shown in FIG. 10. Any known edge-detection algorithm may be employed to determine one or more edges for each of the skin-colored regions of the images 1000 and 1002. The valid height operation 622 may include determining a distance “C” between comparable edges of the images 1000, 1002. If the distance C is less than or equal to a particular offset distance, then the detected height is less than the threshold height and the operation 622 may return a confirming result. A low or zero detected height is consistent with a user's finger touching a surface displaying a projected control. The touching of a control, in turn, is consistent with a user intent to provide an input via the projected control. On the other hand, if the distance C is greater than the offset distance, then the detected height is greater than the threshold height. A detected height above a threshold height is consistent with a user's finger hovering above the surface displaying the projected control, which is inconsistent with a user intent to provide an input via the projected control. The particular threshold height may be determined empirically. The particular offset distance C may be determined from geometry.
  • FIG. 11 illustrates how elevation of an object captured by two cameras may be determined from the offset distance C according to the following expression:
  • Y = (C × H) / (L + C)  (1)
  • The elevation of the object 1100 above the projection surface 104 is “Y” and the elevation of the cameras 400 and 900 is “H.” The distance between the cameras 400 and 900 is “L.” While FIG. 11 shows both cameras 400 and 900 to one side of the object 1100, expression (1) may also be used where cameras 400 and 900 are on opposite sides of the object 1100.
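  • A minimal sketch of expression (1) and the height comparison of the valid height operation 622 follows; the 5 mm height threshold is an illustrative assumption, since the description above leaves the particular threshold to be determined empirically.

```python
def elevation_from_offset(c_mm, camera_height_mm, camera_spacing_mm):
    """Expression (1): Y = (C * H) / (L + C), where C is the offset between
    comparable edges seen by the two cameras, H is the camera elevation above
    the surface, and L is the distance between the cameras."""
    return (c_mm * camera_height_mm) / (camera_spacing_mm + c_mm)

def valid_height(c_mm, camera_height_mm, camera_spacing_mm, max_height_mm=5.0):
    """Valid height operation 622: confirm only if the object's elevation is
    below a threshold; the 5 mm value here is an illustrative assumption."""
    y_mm = elevation_from_offset(c_mm, camera_height_mm, camera_spacing_mm)
    return y_mm <= max_height_mm
```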
  • To summarize, FIGS. 6A and 6B show exemplary methods 600 and 606 for determining whether a user intends to provide input using a projected control. FIG. 7 shows an exemplary method 614 for validating a determination that a user intends to provide input using a projected control. In one embodiment, if each of the valid time operation 618, valid shape operation 620, and valid height operation 622 returns a result confirming the determination of one of the methods 600, 606 that a user intends to provide input using a projected control, then it is determined that a user input is detected (operation 624). On the other hand, if any of the operations 618, 620, and 622 does not confirm the determination of method 600 or 606, then it is determined that a user input is not detected (operation 626). These operations may be performed, in whole or in part, by the control unit 502, the processing unit 504, or any other suitable unit.
  • Referring now to FIGS. 12 and 13, an alternative embodiment for performing validation tests is shown. FIG. 12 shows images 1200, 1202, and 1204 of a particular cell captured by a camera, e.g., the camera 400. In the images, a human finger is situated between the surface 104 and the camera 400. FIG. 13 illustrates operations in a method 1300 for determining whether a tentative conclusion that the user's finger is within the boundaries of the cell and that the user intends to provide input using the projected control should be confirmed. In one embodiment, either operation 604 or 612 may include the method 1300. In addition, either operation 604 or 612 may include the method 1300 alone or in addition to any one or more of the individual validation operations 618, 620, and 622.
  • The operation 1302 may include examining each pixel within a cell and classifying each pixel as either a skin-colored pixel or a non-skin-colored pixel. In one embodiment, pixels of frame 402 may be received from the camera 400 in a YUV color space or the received frame may be converted to a YUV color space, and a pixel may be classified as skin-colored according to the following test: If

  • 102 < U < 128,
  • 102 < V < 128,
  • 115 < U + 128 < 145,
  • 150 < V + 128 < 170, and
  • 100 < Y < 200,
  • the pixel may be considered skin-colored. The area 1206 of image 1200 shows an area where pixels may be classified as skin-colored pixels. The operation 1302 is not limited to the skin-color test set forth above. In other embodiments, any suitable alternative skin-color test may be used.
  • The operation 1304 may include examining each pixel within a cell and classifying each pixel as either a finger-nail-colored pixel or a non-finger-nail-colored pixel. In one embodiment, pixels of frame 402 may be classified according to the following known test: If

  • 102 < U < 128,
  • 102 < V < 128,
  • 105 < U + 128 < 145,
  • 158 < V + 128 < 170, and
  • 100 < Y < 200,
  • the pixel may be considered finger-nail-colored. The area 1208 of image 1202 shows an area where pixels may be classified as finger-nail-colored pixels. The operation 1304 is not limited to the nail-color test set forth above. In other embodiments, any suitable alternative nail-color test may be used. One advantage of using a nail-color test is that the variation in color of fingernails among humans of various races is believed to be smaller than the variation in color of skin.
  • The operation 1306 may include comparing the number of nail-colored pixels with a minimum threshold. This operation is useful for detecting situations where a condition is present that may interfere with a test described below. For example, if the user is wearing nail polish, which may interfere with such a test, the counted number of nail-colored pixels is not likely to satisfy the minimum threshold.
  • The operation 1308 may include comparing a count of the number of nail-colored pixels for a current frame with a corresponding count from a previous frame. The previous frame may be any frame within a particular preceding time period, e.g., ½ second. The count of fingernail-colored pixels may change as a result of the finger being pressed against the surface 104. As shown in image 1204, an area 1210 of the fingernail turns white when the finger is pressed against the surface 104 due to blood being forced out of part of the tissue under and adjacent to the fingernail. While the area 1210 may take a variety of shapes and sizes depending on the particular user, the area 1210 will be captured by the camera 400 as an area of generally white pixels not satisfying either of the classification tests for skin-colored or finger-nail-colored pixels. Accordingly, when the finger is pressed against the surface 104, the count of finger-nail-colored pixels will be lower than at times when the finger is not pressed against the surface, e.g., the count of finger-nail-colored pixels in image 1202 will be greater than the count of finger-nail-colored pixels in image 1204. The operation 1310 may include determining whether a difference between a count of a current frame and a count of a previous frame is greater than a threshold. As one example, a pixel difference threshold may be 30 pixels. In one alternative embodiment, the operation 1308 may determine if the number of white pixels in the fingernail region or in the cell exceeds a threshold. The presence of white pixels is due, as mentioned, to a portion of the fingernail turning white when the finger is pressed against a surface.
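  • The count comparison of operations 1308 and 1310 may be sketched as follows, assuming a 30 fps frame rate so that roughly fifteen frames span the ½ second window, and using the 30-pixel difference threshold from the example above; comparing the current count against the largest count in the preceding window is a simplification of comparing against a single previous frame.

```python
from collections import deque

class NailBlanchDetector:
    """Operations 1308-1310: compare the count of nail-colored pixels in the
    current frame with counts from the preceding ~0.5 s; a drop larger than a
    pixel-difference threshold is consistent with the fingernail blanching as
    the finger presses against the surface."""
    def __init__(self, history_frames=15, drop_threshold=30):
        self.history = deque(maxlen=history_frames)   # ~0.5 s at 30 fps
        self.drop_threshold = drop_threshold

    def update(self, nail_pixel_count):
        """Call once per frame with the current nail-colored pixel count."""
        pressed = bool(self.history) and (
            max(self.history) - nail_pixel_count > self.drop_threshold)
        self.history.append(nail_pixel_count)
        return pressed
```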
  • A user may wear fingernail polish, which may interfere with the classification of particular pixels as nail-colored and with the comparing of counts of nail-colored pixels. Generally speaking, the hands and fingers of all users have some degree of tremor, tremble, or involuntary shaking movement, i.e., a user's outstretched hand will generally have at least some tremble, the degree of tremble depending on the particular individual. However, when the hand or finger is placed against an object, the tremor or tremble generally stops. The operations 1312 and 1314 may be employed where it is difficult or impossible to compare counts of nail-colored pixels. The operation 1312 evaluates a region of interest 1400 comprising a matrix of skin-colored and non-skin-colored pixels. FIG. 14 illustrates an exemplary region of interest. FIG. 14 also illustrates one exemplary row “R” of pixel locations and one exemplary column “C” of pixel locations of the matrix. In one embodiment, the Y value of non-skin-colored pixels may be set to zero. The operation 1312 may scan and calculate a row sum for each row R of the region of interest. If the Y value for a pixel location is greater than a minimum value, e.g., 50, the column number of the location is added to the row sum. For example, if only the pixels in cells 10, 11, 12, 13, 14, and 15 of a particular row have Y values greater than 50, then the sum for the row is 75. As a second example, if only the pixels in cells 9, 10, 11, 12, 13, and 14 of a particular row have Y values greater than 50, then the sum for the row is 69. Thus, a lateral movement of the area of skin-colored pixels results in a change in a row sum, and the change in row sum is proportional to the amount of movement. Further, the operation 1312 may calculate a sum of the row sums (“grand total−row”). In addition to calculating row sums, the operation 1312 may scan and calculate a column sum for each column C of the region of interest. The column sum may be calculated in a manner analogous to the method for calculating row sums. Further, the operation 1312 may include calculating a sum of the column sums (“grand total−column”). Additionally, the operation 1312 may determine a position metric for a frame by adding the grand total−row and the grand total−column sums. Further, the operation 1312 may include comparing a position metric for a first frame with a position metric for a second frame. Equal-valued position metrics evidence a lack of movement, while a difference between the position metrics of first and second frames indicates movement of an area of skin-colored pixels. A lack of movement of an area of skin-colored pixels within the region of interest is consistent with a user's finger contacting the projection surface. A user's finger in contact with the projection surface is consistent with a user intent to provide an input via the projected control. Conversely, movement of an area of skin-colored pixels within the region of interest is consistent with a user's finger being in a position above the projection surface. A user's finger not contacting the projection surface is not consistent with a user intent to provide an input via the projected control. The operation 1312 may include calculating a difference between the position metrics of first and second frames, and determining whether the position difference is greater than a threshold positional difference amount.
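  • The following is a minimal sketch of the position metric of operations 1312 and 1314, assuming a two-dimensional list of luma (Y) values in which non-skin-colored pixels have already been set to zero; the indices here are zero-based, whereas the cell numbers in the example above need not be, and the movement threshold is an illustrative parameter.

```python
def position_metric(y_values, min_y=50):
    """Operation 1312: for every row, sum the column indices of pixels whose
    Y value exceeds min_y (the row sums); do the same per column with row
    indices (the column sums); the position metric is the sum of the two
    grand totals. y_values is a 2-D list of luma values with non-skin-colored
    pixels already zeroed."""
    grand_total_row = sum(col for row in y_values
                          for col, y in enumerate(row) if y > min_y)
    grand_total_col = sum(r for r, row in enumerate(y_values)
                          for y in row if y > min_y)
    return grand_total_row + grand_total_col

def finger_is_still(metric_now, metric_prev, max_diff=0):
    """Operation 1314: equal (or nearly equal) metrics indicate no lateral
    movement, which is consistent with the finger resting on the surface."""
    return abs(metric_now - metric_prev) <= max_diff
```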
  • In one embodiment, the operations 1312 and 1314 may be performed in addition to operations 1308 and 1310 to provide additional confirmation or confidence. For example, the operation 1312 may be performed subsequent to operation 1310 as shown by the dashed line in FIG. 13.
  • In one embodiment, two or more validation tests may be performed and the results combined to determine whether a tentative conclusion that a user intends to provide input using a projected control should be confirmed. FIG. 15 shows an exemplary method 1500 for combining validation test results. In operation 1502, one or more cells corresponding with a physical location of a projected control may be repeatedly searched. If activity is detected within the cell, it may be tentatively concluded that the user intends to provide input using a projected control. In optional operation 1504, it is determined if activity is also detected in another cell (in cases where two or more cells are searched). If activity is also detected in another cell, the method returns to operation 1502. The operations 1502 and 1504 may be the same as or similar to any corresponding operation of the operations 602-612.
  • If activity is detected within one cell, operations 1506, 1508, 1510, 1512, and 1514 may be invoked. Each of the operations 1506-1514 may independently determine whether a tentative conclusion should be confirmed and may return an indication of its determination. The operations 1506, 1508, and 1510 correspond respectively with the operations 618, 620, and 622 described above. In addition, the operation 1512 corresponds with the operations 1302, 1304, 1306, 1308, and 1310. Further, the operation 1514 corresponds with the operations 1302, 1304, 1306, 1312, and 1314. A decision operation 1516 receives the confirming/non-confirming indications of each of the operations 1506-1514. In one embodiment, each indication is given one vote and a majority of votes determines whether to confirm the tentative conclusion that the user intends to use the projected control. In one embodiment, the operation 1512 may return an “abstaining” indication if the operation is unable to detect a sufficient number of fingernail-colored pixels. In alternative embodiments, the operation 1516 may include a decision based on a weighted polling of validation tests. The method 1500 provides the advantage that the combined result of a group of tests will generally outperform most of the individual tests. A further advantage of the method 1500 is that each of the tests is relatively inexpensive to implement.
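  • The voting of decision operation 1516 may be sketched as follows, with True, False, and None standing for confirming, non-confirming, and abstaining indications respectively; the function name and the example vote pattern are illustrative.

```python
def combine_votes(results):
    """Decision operation 1516: each validation test returns True (confirm),
    False (do not confirm), or None (abstain, e.g. when too few nail-colored
    pixels are found); a simple majority of the non-abstaining votes confirms
    the tentative conclusion that the user intends to use the control."""
    votes = [r for r in results if r is not None]
    return bool(votes) and sum(votes) > len(votes) / 2

# Example: time, shape, and height tests confirm, the nail-count test abstains,
# and the position test does not confirm -> 3 of 4 counted votes confirm.
print(combine_votes([True, True, True, None, False]))  # True
```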
  • The use of projected controls includes advantages such as the appearance of the control not being degraded (e.g., worn down) by repeated physical contact, the appearance of the control being readily modifiable, and the ability to hide the control when it is not needed. While embodiments have been described in terms of detecting a user's intent to provide input to a projected user interface, it is not essential that the image of the control be a projected image. In one embodiment, the one or more projected controls described in the various embodiments above may be replaced with a non-projected image on the surface 104, such as a painted image, an engraved image, an image applied as a decal, label, or sticker, or another non-projected image.
  • The methods and their variations described above may be implemented in hardware, software, or in a combination of hardware and software. Software for implementing all or part of any method described above may be stored in any suitable memory for execution by the control unit 502 or the processing unit 504.
  • It should be understood that the embodiments described above may employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed may be referred to in terms, such as producing, identifying, determining, or comparing.
  • Any of the operations described in this specification that form part of the embodiments are useful machine operations. As described above, some embodiments relate to a device or an apparatus specially constructed for performing these operations. It should be appreciated, however, that the embodiments may be employed in a general purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general purpose computer systems may be used with computer programs written in accordance with the teachings herein. Accordingly, it should be understood that the embodiments may also be embodied as computer readable code on a computer readable medium.
  • A computer readable medium is any data storage device that can store data which can be thereafter read by a computer system. Examples of the computer readable medium include, among other things, flash drives, floppy disks, memory cards, hard drives, RAMs, ROMs, EPROMs, compact disks, and magnetic tapes. In one embodiment, any method described above may be stored as a program of instructions on a computer readable medium.
  • Although the present invention has been fully described by way of the embodiments described in this specification with reference to the accompanying drawings, various changes and modifications will be apparent to those having skill in this field. Therefore, unless these changes and modifications depart from the scope of the present invention, they should be construed as being included in this specification.

Claims (16)

1. An apparatus for determining whether a user intends to provide an input using an image of a control appearing on a surface, comprising:
a first camera to capture two or more images of the surface; and
a unit to determine whether first, second, and third conditions are true, the first condition being that a particular number of pixels classified as skin color are present within one cell of the two or more images, the cell having a location substantially coinciding with the image of the control, the second condition being that the pixels classified as skin color persist for at least a particular time period, the third condition being that the pixels classified as skin color have a first shape, and to provide a signal indicative of an intent of a user to provide an input if each of the first, second, and third conditions is true.
2. The apparatus of claim 1, further comprising a projector to project one or more user controls onto the surface.
3. The apparatus of claim 1, wherein the camera and the surface are in a fixed spatial relationship with one another.
4. The apparatus of claim 1, further comprising:
a second camera to capture two or more images of the surface, the second camera being spaced apart from the first camera; and wherein
the unit determines whether a fourth condition is true, the fourth condition being that the pixels classified as skin color and having the first shape are within a first distance from the surface, and the unit provides a signal indicative of an intent of a user to provide an input if each of the first, second, third, and fourth conditions is true.
5. The apparatus of claim 4, wherein the unit provides a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, and fourth conditions are true.
6. The apparatus of claim 4, wherein the unit determines whether a fifth condition is true, the fifth condition being that a particular number of pixels classified as finger-nail color are present within the cell of the two or more images, and that a first count of the pixels classified as finger-nail color at a first time is greater than a second count of the pixels classified as finger-nail color at a second time, and the unit provides a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, and fifth conditions is true.
7. The apparatus of claim 6, wherein the unit determines whether a sixth condition is true, the sixth condition being that a first position at a first time of the pixels classified as skin color is different from a second position at a second time of the pixels classified as skin color, and the unit provides a signal indicative of an intent of a user to provide an input if each of the first, second, third, fourth, fifth, and sixth conditions is true.
8. The apparatus of claim 7, wherein the unit provides a signal indicative of an intent of a user to provide an input if a majority of the first, second, third, fourth, fifth, and sixth conditions are true.
9. A method for determining whether a user intends to provide an input using an image of a control appearing on a surface, comprising:
capturing two or more images of the surface;
determining whether first, second, and third conditions are true, the first condition being that a particular number of pixels classified as skin color are present within one cell of the two or more images, the cell having a location substantially coinciding with the image of the control, the second condition being that the pixels classified as skin color persist for at least a particular time period, the third condition being that the pixels classified as skin color have a first shape; and
providing a signal indicative of an intent of a user to provide an input if each of the first, second, and third conditions is true.
10. The method of claim 9, further comprising projecting one or more user controls onto the surface.
11. The method of claim 9, further comprising maintaining the camera and the surface in a fixed spatial relationship with one another.
12. The method of claim 9, further comprising:
capturing two or more images of the surface from a second perspective; and
determining whether a fourth condition is true, the fourth condition being that the pixels classified as skin color and having the first shape are within a first distance from the surface;
wherein the signal indicative of an intent of a user to provide an input is provided if each of the first, second, third, and fourth conditions is true.
13. The method of claim 12, wherein the signal indicative of an intent of a user to provide an input is provided if a majority of the first, second, third, and fourth conditions are true.
14. The method of claim 12, further comprising:
determining whether a fifth condition is true, the fifth condition being that a particular number of pixels classified as finger-nail color are present within the cell of the two or more images; and
determining that a first count of the pixels classified as finger-nail color at a first time is greater than a second count of the pixels classified as finger-nail color at a second time;
wherein the signal indicative of an intent of a user to provide an input is provided if each of the first, second, third, fourth, and fifth conditions is true.
15. The method of claim 14, further comprising:
determining whether a sixth condition is true, the sixth condition being that a first position at a first time of the pixels classified as skin color is different from a second position at a second time of the pixels classified as skin color;
wherein the signal indicative of an intent of a user to provide an input is provided if each of the first, second, third, fourth, fifth, and sixth conditions is true.
16. The method of claim 15, wherein the signal indicative of an intent of a user to provide an input is provided if a majority of the first, second, third, fourth, fifth, and sixth conditions are true.