US20010012001A1 - Information input apparatus - Google Patents

Information input apparatus

Info

Publication number
US20010012001A1
Authority
US
United States
Prior art keywords
semi-transparent screen
information
manipulation
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/110,570
Other versions
US6414672B2 (en)
Inventor
Junichi Rekimoto
Nobuyuki Matsushita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MATSUSHITA, NOBUYUKI; REKIMOTO, JUNICHI
Publication of US20010012001A1
Application granted granted Critical
Publication of US6414672B2
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • the present invention relates to an information input apparatus suitably used for, for instance, interactive input/output.
  • Computer apparatuses and the like commonly employ, under various application programs etc., what is called an interactive input/output form, in which the computer apparatus side presents, in the form of a display, a prescribed response to a user's manipulation.
  • The touch panel is commonly known as one of the input devices used for the above type of interactive input/output.
  • a user can perform a desired manipulation by sliding, for instance, his finger in an arbitrary direction while touching the panel.
  • the projection display is also known which functions as a computerized white board.
  • a user can perform a manipulation on the white board by using a dedicated infrared-light-emitting pen.
  • Video Place is known as an apparatus that is intended to provide an interactive effect.
  • Video Place is an artistic apparatus using a video camera, for instance.
  • a viewer of a Video Place apparatus causes the video camera to photograph his hand or some other part of his body as a silhouette.
  • By moving his hand or some other part of his body freely while watching the monitor device, the viewer can enjoy a reaction or a change in the displayed image, which is a combination of the image photographed above and some other image.
  • The pointing manipulation is generally limited to one using a finger. No manipulation can be performed in a space in front of the touch panel; it is necessary to cause a physical manipulation body such as a finger to contact the manipulation surface. Further, being relatively expensive, the touch panel is not appropriate for a large-size manipulation panel.
  • An object of the present invention is to provide a more advanced or enhanced interactive input/output environment.
  • The invention provides an information input apparatus comprising a semi-transparent screen that functions as an operator input manipulation surface; pickup means for picking up an input manipulation of an operator on the semi-transparent screen by capturing only light or electromagnetic waves in a predetermined wavelength range that come through the semi-transparent screen, to thereby produce a pickup signal; and control processing means for generating detection image information corresponding to the input manipulation of the operator based on the pickup signal, and for executing a control process based on input manipulation information that is recognized based on the detection image information.
  • a physical object, for instance, that has approached the semi-transparent screen causes a variation in the state of light or electromagnetic waves entering the pickup means.
  • a state variation in light or electromagnetic waves is picked up as image information.
  • the thus-obtained image information is used as manipulation information, and a necessary control process can be executed in accordance with the manipulation information. That is, interactive input/output can be realized by producing input information by a manipulation in which some physical object capable of causing a variation in the state of light or electromagnetic waves in a predetermined wavelength range to be captured by the pickup means is made close to the semi-transparent screen or moved in its vicinity.
  • what functions as a manipulation panel is merely a semi-transparent screen. Since the semi-transparent screen can be formed, for instance, by combining a material for forming a transparent screen and a material for forming a semi-transparent screen, a large-size semi-transparent screen can easily be formed.
  • The above configuration may further be provided with projection display means so that it can project, onto the semi-transparent screen, an image of visible light in a wavelength range excluding the wavelength range of light or electromagnetic waves to be captured by the pickup means, wherein the control processing means executes, as the above-mentioned control process, a display image generation process for causing the projection display means to project a display image, and a control on the projection display means.
  • Since the semi-transparent screen has a function of a display panel as well as a function of a manipulation panel, an interactive response to a manipulation that has been performed on the semi-transparent screen can be displayed as an image on the same semi-transparent screen.
  • FIG. 1 conceptually shows an example configuration of an interactive display system according to a first embodiment of the present invention
  • FIG. 2 shows an internal configuration of a control device that is provided in the interactive display system according to the first embodiment
  • FIG. 3 is a flowchart showing a process of detecting and holding reference input image levels
  • FIG. 4 is a flowchart showing a process of generating detection image information
  • FIG. 5 illustrates a first application example of the interactive display according to the first embodiment
  • FIG. 6 is a flowchart showing a process for realizing the first application example of FIG. 5;
  • FIG. 7 illustrates a second application example of the interactive display according to the first embodiment
  • FIG. 8 illustrates a third application example of the interactive display according to the first embodiment
  • FIG. 9 illustrates a fourth application example of the interactive display according to the first embodiment
  • FIG. 10 illustrates a fifth application example of the interactive display according to the first embodiment
  • FIG. 11 conceptually shows an example configuration of an interactive display system according to a second embodiment of the invention.
  • FIG. 12 conceptually shows an example configuration of an interactive display system according to a third embodiment of the invention.
  • FIG. 13 shows the internal configuration of a control device provided in the interactive display system according to the third embodiment
  • FIGS. 14A and 14B illustrate operation examples of the interactive display system according to the third embodiment
  • FIG. 15 conceptually shows an example configuration of an interactive display system according to a fourth embodiment of the invention.
  • FIG. 16 conceptually shows an example configuration of an interactive display system according to a fifth embodiment of the invention.
  • FIG. 17 conceptually shows another example configuration of an interactive display system according to the fifth embodiment of the invention.
  • FIG. 18 conceptually shows an example configuration of an interactive display system according to a sixth embodiment of the invention.
  • FIG. 19 shows the internal configuration of a control device provided in the interactive display system according to the sixth embodiment
  • FIG. 20 conceptually shows another example configuration of an interactive display system according to the sixth embodiment of the invention.
  • FIG. 21 conceptually shows an example configuration of an interactive display system according to a seventh embodiment of the invention.
  • FIG. 22 conceptually shows an example configuration of an interactive display system according to an eighth embodiment of the invention.
  • FIG. 23 shows the internal configuration of a control device provided in the interactive display system according to the eighth embodiment.
  • A first embodiment of the invention will be described with reference to FIGS. 1-10.
  • FIG. 1 conceptually shows an example configuration of an interactive display system having an information input apparatus according to the first embodiment of the invention.
  • An interactive display system 1 is composed of a semi-transparent screen 2 , an infrared light-emitting diode (LED) panel 3 , a CCD (charge-coupled device) camera 4 , a projector 5 , and a control device 6 .
  • The infrared LED panel 3, the CCD camera 4, and the projector 5 are provided on the back side of the semi-transparent screen 2.
  • the semi-transparent screen 2 is formed by bonding a semi-transparent film that looks like tracing paper to a transparent glass plate or by using a member having transparency such as a frosted glass. As described later, the semi-transparent screen 2 has functions of both of a manipulation panel and a display panel in the interactive display system 1 .
  • the infrared LED panel 3 is constructed in such a manner that many infrared LEDs are arranged collectively with respect to a panel surface.
  • the infrared LED panel 3 is so disposed that infrared beams emitted from the respective infrared LEDs are applied to the entire back surface of the semi-transparent screen 2 .
  • the infrared LEDs are driven by the control device 6 so as to always emit infrared light.
  • the infrared LEDs of the infrared LED panel 3 may be provided in a number that is enough for infrared light beams emitted therefrom to illuminate the entire semi-transparent screen 2 .
  • Image information reflected from the semi-transparent screen 2 is obtained based on a difference obtained by subtracting an initial infrared image level from a current infrared image level. Therefore, it is not necessary that the quantity of infrared light applied to the semi-transparent screen 2 be uniform over the entire screen 2, and hence the infrared LED panel 3 may be much smaller than the semi-transparent screen 2.
  • the CCD camera 4 is a camera device using a CCD as an imaging device and functions as a pickup means for picking up an input manipulation of an operator on the semi-transparent screen 2 .
  • the CCD camera 4 is provided to recognize, as image information, a manipulation that is performed on the semi-transparent screen 2 by photographing only the infrared component of an image formed on the semi-transparent screen 2 .
  • an infrared transmission filter 4 a that transmits only a light component in an infrared wavelength band is provided in the optical system of the CCD camera 4 .
  • the position of the CCD camera 4 is so set that the entire semi-transparent screen 2 is included in its photographing range.
  • the projector 5 projects visible image light onto the back surface of the semi-transparent screen 2 based on image information that is supplied from the control device 6 . For example, a user can see, from the front side of the semi-transparent screen 2 , an image projected on the semi-transparent screen 2 by the projector 5 .
  • the optical system of the projector 5 includes an infrared cutoff filter 5 a for cutting off an infrared component of light, as a result of which the light coming from an image that is projected on the semi-transparent screen 2 does not include an infrared component. Therefore, the CCD camera 4 does not detect a projection image of the projector 5 .
  • the control device 6 captures image information (video data) from an imaging signal that is supplied from the CCD camera 4 and obtains manipulation information from the image information. Based on the manipulation information, the control device 6 performs a display control for an image to be displayed on the semi-transparent screen 2 by the projector, and other necessary controls. Further, the control device 6 drives, for light emission, the infrared LEDs of the infrared LED panel 3 .
  • the positions of the infrared LED panel 3 , the CCD camera 4 , and the projector 5 may be so set that each of those devices can play its role satisfactorily.
  • FIG. 2 is a block diagram showing an example of an internal configuration of the control device 6 .
  • an LED driving section 10 is to drive, for light emission, the infrared LEDs of the infrared LED panel 3 .
  • An image input section 11 generates a video signal (video information) by performing prescribed signal processing on an imaging signal that has been produced by the CCD camera 4 based on infrared light coming from the semi-transparent screen 2 , and supplies it to an input image processing section 12 .
  • the input image processing section 12 converts the video signal that is supplied from the image input section 11 into video signal data (digital signal).
  • the input image processing section 12 picks up information of a manipulation that has been performed on the semi-transparent screen 2 by executing a necessary analyzing process etc. by using “image information” (for example, frame-by-frame video data) that is obtained based on the video signal data.
  • The manipulation information obtained based on the image information is, for instance, the position (coordinates), in the image, of a manipulation body that is performing a manipulation on the semi-transparent screen 2, or the signal level of the image.
  • the manipulation information is transmitted to a database driving section 14 .
  • the video signal data can be supplied to an image combining section 17 .
  • A threshold value control section 13 sets a threshold value that is necessary for a process to be executed on manipulation information in the input image processing section 12, and transmits it to the input image processing section 12.
  • the input image processing section 12 generates manipulation information by executing a necessary process such as an analysis on image information by using the threshold value that has been set by the threshold value control section 13 .
  • a current image state (detection image information) of the semi-transparent screen 2 is obtained by calculating a frame difference of input image data (described later).
  • Such information as a reference value (reference image input level) to be used in the frame difference calculation is stored in the threshold value control section 13 (described later).
  • the database driving section 14 captures manipulation information generated by the input image processing section 12 , and executes, when necessary, a necessary process based on the manipulation information.
  • Program data necessary for control processes to be executed by the database driving section 14 is stored in a database memory 15 .
  • the database driving section 14 executes the necessary control process based on the program data stored in the database memory 15 .
  • Controlled by the database driving section 14, an image generation section 16 generates necessary image data (video signal data (digital data)) and outputs it to an image combining section 17.
  • the image combining section 17 combines video signal data that is supplied from the input image processing section 12 with video signal data that is supplied from the image generation section 16 , and outputs resulting data to an RGB signals generation section 18 .
  • the RGB signals generation section 18 converts the video signal data that is supplied from the image combining section 17 into, for instance, analog RGB signals, and outputs those to the projector 5 .
  • In this way, image light carrying a video signal that reflects a response to a manipulation performed on the semi-transparent screen 2 is applied to the semi-transparent screen 2 from the projector 5.
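  • The dataflow through these sections can be summarized in a short sketch. The following Python is purely illustrative; the patent describes hardware/signal-processing sections, not code, and every function name here is invented for exposition.

```python
# Illustrative sketch only; all names are invented for exposition.
import numpy as np

def image_input(ccd_frame: np.ndarray) -> np.ndarray:
    """Section 11: condition the CCD imaging signal into video signal data."""
    return ccd_frame.astype(np.float32)

def input_image_processing(frame: np.ndarray, reference: np.ndarray,
                           threshold: float) -> np.ndarray:
    """Section 12: frame difference against the stored reference levels,
    thresholded with the value set by the threshold value control section 13."""
    diff = frame - reference                       # input image level differences L
    return np.where(diff >= threshold, diff, 0.0)  # detection image information

def control_cycle(ccd_frame, reference, threshold, drive_database, combine_images):
    """One pass from camera to projector: sections 11 -> 12 -> 14/16 -> 17 (-> 18)."""
    detection = input_image_processing(image_input(ccd_frame), reference, threshold)
    generated = drive_database(detection)          # sections 14/16: response image data
    return combine_images(detection, generated)    # section 17; RGB conversion follows
```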
  • infrared light is applied from the infrared LED panel 3 to the entire semi-transparent screen 2 (see FIG. 1) from the back side. Because the screen 2 is semi-transparent, not all of the infrared light passes through the semi-transparent screen 2 and a certain part of it is reflected by the semi-transparent screen 2 .
  • The initial levels of video signal data that are obtained by photographing, with the CCD camera 4, infrared light that is reflected by the semi-transparent screen 2 in a state in which no manipulation is performed on the semi-transparent screen 2 are stored as "reference input image levels."
  • the reference input image levels may be obtained by detecting signal levels of the respective pixels of, for instance, one frame by using input video signal data. This detection is performed by the input image processing section 12 .
  • the information of the reference input image levels thus detected is transmitted to the threshold value control section 13 and stored there.
  • FIG. 3 is a flowchart showing an example process of detecting the reference input image level.
  • The input image processing section 12 detects signal levels of the respective pixels by using 1-frame image data that is obtained from a video signal supplied from the image input section 11, and employs the detection results as reference input image levels Lint. Specifically, luminance signal component levels of the respective pixels may be detected and employed as the reference input image levels Lint.
  • the reference input image levels Lint are transmitted to the threshold value control section 13 and stored there.
  • the process of FIG. 3 of detecting the reference input image levels Lint and storing those in the threshold value control section 13 may be executed at the time of turning on the power of the interactive display system, or the reference input image levels Lint may be updated based on a user's instruction when necessary.
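  • As a minimal sketch of the FIG. 3 procedure, assuming the 1-frame image data arrives as a NumPy array of per-pixel luminance values (the names and the optional frame averaging are assumptions, not the patent's disclosure):

```python
import numpy as np

def capture_reference_levels(frames) -> np.ndarray:
    """Detect per-pixel luminance levels to keep as the reference input image
    levels Lint. The patent uses a single undisturbed frame; averaging a few
    frames, as done here, is an extra noise-reduction assumption."""
    stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames])
    return stack.mean(axis=0)  # Lint, stored in the threshold value control section 13
```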
  • FIG. 4 is a flowchart showing a process to be executed by the input image processing section 12 to obtain image information (hereinafter especially referred to as “detection image information”) as a basis of manipulation information.
  • The input image processing section 12 constantly detects current input image levels Lprs at step S201.
  • the input image levels Lprs are information obtained by detecting signal levels of the respective pixels of frame-by-frame image data that is obtained by photographing a current infrared image on the semi-transparent screen 2 with the CCD camera 4 .
  • The input image processing section 12 then generates current detection image information (i.e., frame-by-frame video data including pixel-based level information) based on the input image level differences L, which are obtained by subtracting the reference input image levels Lint from the current input image levels Lprs.
  • the amount of infrared light that is passed through the semi-transparent screen 2 and then reflected by the user's body is small; that is, most of the infrared light that has passed through the semi-transparent screen 2 does not return to the back side of the semi-transparent screen 2 .
  • the current input image levels Lprs are approximately equal to the reference input image levels Lint, and hence the input image level differences L detected by the input image processing section 12 are approximately equal to 0. Therefore, the detection image information that is generated based on the input image level differences L remains approximately the same as in the initial state and has almost no variation component.
  • When a user approaches the semi-transparent screen 2, the input image processing section 12 detects this as a state in which the current input image levels Lprs minus the initial input image levels Lint gradually increase in an image portion corresponding to the user's body. Accordingly, the figure of the user approaching the semi-transparent screen 2 is captured increasingly clearly as the detection image information in accordance with the calculated input image level differences L.
  • In this case, the input image processing section 12 generates such image information that the level is high in the image region corresponding to the user's finger and decreases with the distance from the semi-transparent screen 2 in the image region corresponding to the user's body, which forms part of the background.
  • An image portion corresponding to only the user's finger can easily be separated from the background by comparing the detection image information with the threshold value that is preset in the threshold value control section 13.
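  • The FIG. 4 difference calculation and the finger/background separation just described can be sketched as follows; the threshold would come from the threshold value control section 13, and all names are illustrative:

```python
import numpy as np

def detection_image(Lprs: np.ndarray, Lint: np.ndarray) -> np.ndarray:
    """FIG. 4: per-pixel input image level differences L = Lprs - Lint."""
    return Lprs.astype(np.float32) - Lint.astype(np.float32)

def separate_finger(L: np.ndarray, finger_threshold: float) -> np.ndarray:
    """Pixels well above the preset threshold are treated as the manipulation
    body (e.g. a finger at the screen surface); weaker positive levels belong
    to the more distant body, which forms part of the background."""
    return L >= finger_threshold  # boolean mask of the finger region
```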
  • the semi-transparent screen 2 functions, for instance, as a manipulation panel for an interactive interface.
  • the manipulation body for performing a manipulation need not be a special pointing device and may be of any kind as long as it reflects infrared light. That is, as described above, the entire human body or a part thereof, or some other object may be used as a manipulation body without causing any problem.
  • The semi-transparent screen 2 can be a simple means constructed, for example, by combining a transparent glass plate and a semi-transparent thin film such as tracing paper, or by using such a glass plate as a frosted glass.
  • The semi-transparent screen 2 can easily be increased in size with a small cost increase. This is quite unlike the touch panel, which cannot easily be increased in size.
  • Since manipulation information can be produced based on an image that is obtained from infrared light reflected from the semi-transparent screen 2, a plurality of manipulation bodies can be recognized at the same time to perform necessary controls as long as their images can be recognized. That is, a plurality of different manipulation bodies can be manipulated at the same time. This is very useful when the semi-transparent screen 2 is of a large size, because different kinds of manipulations can be performed at the same time by using various regions of the semi-transparent screen 2.
  • Since the semi-transparent screen 2 also functions as an image display panel, direct manipulation can be realized easily. For example, as described later, a configuration is possible in which a menu picture or the like on which a manipulation is to be performed is displayed and a user is allowed to perform a manipulation on the menu picture with his finger, for instance.
  • the interactive display system provides many possible ways of inputting manipulation information and hence enables easy construction of interactive input/output environments that could not be realized conventionally.
  • FIG. 5 shows a first application example of the interactive display system 1 according to this embodiment in which a menu manipulation is performed.
  • FIG. 5 shows a state in which the semi-transparent screen 2 is viewed from the front side.
  • an indication display indicating the selection of the region of the manipulation item designated by the user is made in the menu picture M (for instance, a cursor is located at the selected region or the selected region is emphasized in some form).
  • a display control for this purpose is realized by detecting the coordinates of the region designated by the user from the detection image information.
  • a lapse of a predetermined time (for instance, several seconds) from the start of an indication display of the above kind is regarded as an enter manipulation. If the user has performed an enter manipulation, that is, if a state that some manipulation item is indication-displayed has lasted for the predetermined time or longer, a control operation corresponding to the designated manipulation item is performed. For example, depending on the designated manipulation item, a menu picture of another layer is displayed or a desired operation is performed on the interactive display system 1 . If the interactive display system 1 is so configured as to be connectable to some other external apparatus and the menu picture M is for performing a manipulation control on the external apparatus, an operation of the external apparatus corresponding to the designated manipulation item is controlled.
  • FIG. 6 is a flowchart showing a process of the control device 6 that corresponds to the application example of FIG. 5. This process is basically executed as the input image processing section 12 of the control device 6 recognizes manipulation information based on detection image information and the database driving section 14 performs a proper operation based on the manipulation information according to a program that is stored in the database memory 15 .
  • At step S301, it is judged whether a "close body" has been detected from the current detection image information.
  • the term “close body” means some subject of detection that is within the predetermined range from the semi-transparent screen 2 (in FIG. 5, the user's body).
  • A "close body" is detected by the input image processing section 12 by comparing the detection image information with a threshold value that has been set for close body detection by the threshold value control section 13. If a value equal to or larger than the threshold value is obtained in some region of the detection image information, it is determined that a close body exists; if there is no such region, it is determined that no close body exists.
  • the threshold value for close body detection may be set based on an image level of a human body (user) that would usually be obtained as detection image information when he approaches the semi-transparent screen 2 to a certain extent (for instance, tens of centimeters from the screen 2 ).
  • If no close body is detected at step S301, the process goes to step S308, where it is judged whether the menu picture M is now displayed. If the menu picture M is not displayed, the process returns to the original routine (i.e., returns to step S301). If the menu picture M is displayed, the process goes to step S309, where a control process for erasing the menu picture M is executed. For example, the process of erasing the menu picture M is realized in such a manner that the database driving section 14 causes the image generation section 16 to stop a process of generating image data of the menu picture M.
  • If a close body is detected at step S301, the process goes to step S302, where the position of the close body on the semi-transparent screen 2 is detected.
  • this process is realized by detecting the coordinates of a region in the detection image information that is occupied by the close body.
  • In the detection of coordinates, there may be detected a prescribed single point of the region of the close body, or a plurality of points that are determined according to a prescribed rule.
  • the point or points to be detected may be set arbitrarily in accordance with an actual application environment or the like.
  • Next, a control for displaying the menu picture M in an area of the semi-transparent screen 2 corresponding to the position of the close body that was detected at step S302 is performed.
  • the database driving section 14 causes the image generation section 16 to generate image data of a menu picture of a proper kind according to a menu picture display program that is stored in the database memory 15 .
  • The database driving section 14 causes display image data to be generated in such a manner that the image data of the menu picture is mapped to a display area corresponding to the position of the close body that was detected at step S302.
  • the menu picture M is finally projected by the projector 5 at the position on the semi-transparent screen 2 where the user's approach was detected.
  • Next, it is judged at step S304 whether a "manipulation body" has been detected in the display regions of the manipulation items of the menu picture M being displayed.
  • The term "manipulation body" means an object (subject of detection) that is very close to the front surface of the semi-transparent screen 2 (distant by about 3-30 cm, though this value depends on the setting of the threshold value). In the case of FIG. 5, the user's finger pointing at the menu picture M is the subject of detection.
  • the process of detecting a “manipulation body” starts from detecting presence/absence of a manipulation body by comparing the image levels of the detection image information with the threshold value that is set for manipulation body detection in the threshold value control section 13 .
  • the threshold value for this purpose is set larger than the above-described threshold value for close body detection, because it is now necessary to detect an object that is very close to the front surface of the semi-transparent screen 2 by discriminating it from the background.
  • Non-detection of a manipulation body in any of the display regions of the manipulation items of the menu picture M at step S 304 occurs in the following cases.
  • A first case is such that no manipulation body is detected in the detection image information (for example, the user is not pointing at the semi-transparent screen 2 from a very close range).
  • a second case is such that a manipulation body is detected in the detection image information but its detection position (coordinates) does not belong to the area in the image information corresponding to the display area of the menu picture M on the semi-transparent screen 2 (for example, the position on the semi-transparent screen 2 pointed by the user in a very close range is out of any of the regions of the manipulation items of the menu picture M).
  • In either case, the process returns to step S301.
  • the manipulation body detecting process of step S 304 may be as follows.
  • the shape of a hand or a finger of a human body that will appear during a manipulation is stored in advance in the database memory 15 . Presence/absence of a manipulation body is detected by comparing the information on the shape of a hand or a finger with an image shape that is obtained as detection image information and then evaluating the degree of their coincidence.
  • Since input information is detected from image information in this way, it can be recognized as manipulation information based on an image shape in the detection image information.
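  • The degree-of-coincidence evaluation could, for instance, compare a binarized detection region against the stored hand/finger silhouette. The patent does not fix a particular measure; the intersection-over-union below is one plausible, assumed choice:

```python
import numpy as np

def shape_coincidence(region: np.ndarray, template: np.ndarray) -> float:
    """Degree of coincidence between a binarized detection region and a stored
    hand/finger silhouette of the same size (both boolean arrays)."""
    inter = np.logical_and(region, template).sum()
    union = np.logical_or(region, template).sum()
    return float(inter) / float(union) if union else 0.0
```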
  • If it is judged at step S304 that a manipulation body is detected in the display region of a manipulation item of the menu picture M, the process goes to step S305, where a control is performed so that an indication display is performed on the manipulation item of the menu picture M corresponding to the position where the manipulation body was detected. The process then goes to step S306.
  • Step S 306 is a process of waiting for an enter manipulation. As described above, when a predetermined time has elapsed from the start of the indication display, a decision is made that an enter manipulation has been performed. Therefore, it is judged at step S 306 whether the state that the manipulation body is detected in the same manner as at step S 304 has lasted for the predetermined time or longer. This judgment is performed in such a manner that the input image processing section 12 monitors occurrence of a state transition in the current detection image.
  • While waiting at step S306, there may occur an event in which the user changes the designating position so as to point at a manipulation item of the menu picture M that is different from the one that has been designated so far. In this case, an indication display etc. will be performed on the newly designated manipulation item of the menu picture M.
  • If it is judged at step S306 that the state in which the manipulation body is detected in the same manner as at step S304 has lasted for the predetermined time or longer, the process goes to step S307 with a judgment that an enter manipulation has been done.
  • At step S307, a control process corresponding to the manipulation item of the menu picture M located at the position where the manipulation body was detected is executed. This process is performed by the database driving section 14 according to a program stored in the database memory 15.
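  • Taken together, steps S301-S309 behave like a small state machine with a dwell timer for the enter manipulation. The sketch below is an assumed restatement: the S301/S304 detection results are taken as inputs, and the two-second dwell is only an example of the "predetermined time":

```python
import time
from dataclasses import dataclass

ENTER_DWELL_SECONDS = 2.0  # example "predetermined time" (the patent: several seconds)

@dataclass
class MenuState:
    menu_shown: bool = False
    dwell_item: object = None
    dwell_start: float = 0.0

def menu_step(state: MenuState, close_body_pos, pointed_item, now=None):
    """One pass of the FIG. 6 loop; returns the entered item, if any.
    close_body_pos / pointed_item are the S301 / S304 detection results
    (None when nothing is detected)."""
    now = time.monotonic() if now is None else now
    if close_body_pos is None:              # S301 "no" -> S308/S309: erase menu
        state.menu_shown, state.dwell_item = False, None
        return None
    state.menu_shown = True                 # S302/S303: display menu at the position
    if pointed_item is None:                # S304 "no": keep waiting
        state.dwell_item = None
        return None
    if pointed_item != state.dwell_item:    # S305: indication display on a new item
        state.dwell_item, state.dwell_start = pointed_item, now
        return None
    if now - state.dwell_start >= ENTER_DWELL_SECONDS:  # S306: dwell elapsed
        return pointed_item                 # S307: execute the item's control process
    return None
```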
  • FIG. 7 shows a second application example of the interactive display system 1 according to this embodiment.
  • a map of the world is displayed on the semi-transparent screen 2 as an image projected by the projector 5 under the control of the control device 6 .
  • the map of the world may be displayed by performing a manipulation on the menu picture M shown in FIG. 5.
  • It may be displayed automatically when a user (explainer) who has approached the semi-transparent screen 2 to a certain extent has been detected as a "close body."
  • As for the display form in this state, it is possible to display the map so that a reference country or area (for instance, Japan) is always located at a position close, in the horizontal direction, to the position in front of the semi-transparent screen 2 where the explainer stands.
  • An explanatory image DT, which gives some explanation of an area designated by the explainer, is superimposed on the map (i.e., the semi-transparent screen 2) at the designated position. This is done in such a manner that the control device 6 detects, as the position of a manipulation body, the position (coordinates) pointed at by the explainer with his finger or the like and performs a control to display the explanatory image DT of the area corresponding to the detected position of the manipulation body.
  • Image data of the map and various explanatory images DT are stored in advance in the database memory 15 .
  • The size of the semi-transparent screen 2 as the display screen (and the manipulation panel) can be increased easily. Therefore, a conference or a demonstration using a large-size semi-transparent screen 2, like the second application example, is an application to which the interactive display system 1 according to this embodiment is satisfactorily applied.
  • FIG. 8 shows a third application example of the interactive display system according to this embodiment.
  • FIG. 8 shows a state in which two menu pictures M1 and M2 are displayed simultaneously and a user is performing manipulations on the menu pictures M1 and M2 at the same time.
  • the manipulation information is obtained from the “detection image information” that is produced based on an infrared image that is photographed by the CCD camera 4 . That is, the manipulation information is obtained by recognizing an image state. Therefore, even if a plurality of manipulation bodies (in this example, hands or fingers of the user) are detected at the same time in the detection image information as in the case of FIG. 8, detection results of the respective manipulation bodies can be handled as different pieces of manipulation information.
  • FIG. 9 shows a fourth application example of the interactive display system 1 according to this embodiment.
  • FIG. 9 shows a state in which parameter adjustment images PC1 and PC2 for adjusting the values of certain parameters are displayed on the semi-transparent screen 2 and a user is performing manipulations on the parameter adjustment images PC1 and PC2 with his hands at the same time.
  • the parameter adjustment images PC 1 and PC 2 are images simulating slide volumes.
  • the user performs manipulations in such a manner that he places his hands on the semi-transparent screen 2 at lever portions (lever images LV) of the respective parameter adjustment images PC 1 and PC 2 and moves, that is, slides, his hands vertically so as to obtain desired parameter values.
  • a display is so made that the lever images LV are moved vertically in accordance with the movements of the hands, and the control device 6 executes a corresponding process to variably control the actual parameter values accordingly.
  • Even when the lever images LV are manipulated at the same time, the respective pieces of manipulation information can be recognized simultaneously and the parameter values can be varied simultaneously in accordance with the respective manipulations.
  • FIG. 10 shows a fifth application example of the interactive display system 1 according to this embodiment.
  • FIG. 10 shows a state in which an adult and a child are performing manipulations on different menu pictures at the same time.
  • the system 1 is used in the following situation.
  • The occupation ratio and the position-related state of a close body (described above in connection with FIG. 6) and the vertical position (height) of a manipulation body on the semi-transparent screen 2 are different for an adult and a child. That is, because of the difference in height, a child appears as a close body in a lower region of the detection image information than an adult. Similarly, a manipulation body (a user's finger or the like) of a child tends to appear in a lower region of the detection image information than that of an adult.
  • a certain threshold value is set for the vertical height in the detection image information.
  • If a close body or a manipulation body whose height exceeds the predetermined threshold value is detected, a menu picture Mad for adults is displayed at the position of the close body or the manipulation body (in this example, the height, which is included in the definition of the display position on the semi-transparent screen 2, is changed as well).
  • If a close body or a manipulation body whose height does not exceed the threshold value is detected, a menu picture Mch for children is displayed at the position (including the height) of the close body or the manipulation body.
  • The threshold values may be set to different proper values for a close body and for a manipulation body.
  • the threshold value for height discrimination may be different for adults and children.
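  • In image coordinates (rows increasing downward), this discrimination amounts to comparing the topmost detected row against a calibrated boundary row. A sketch under assumed names and calibration:

```python
import numpy as np

def choose_menu(detection: np.ndarray, level_threshold: float,
                boundary_row: int) -> str:
    """Display the adults' menu Mad when the detected body reaches above the
    calibrated boundary row, otherwise the children's menu Mch."""
    rows = np.nonzero((detection >= level_threshold).any(axis=1))[0]
    if rows.size and rows.min() < boundary_row:  # body extends high in the image
        return "Mad"
    return "Mch"
```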
  • the fifth application example is the same as the application examples of FIGS. 8 and 9 in that a configuration is possible in which when manipulations are performed on the menu picture Mad for adults and the menu picture Mch for children at the same time, corresponding pieces of manipulation information are recognized simultaneously and control operations responsive thereto are performed.
  • FIG. 11 shows an interactive display system according to a second embodiment of the invention.
  • The components in FIG. 11 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • the control device 6 may have the same internal configuration as that shown in FIG. 2.
  • an infrared transmitter PD that emits infrared light is used as a pointing device.
  • the manipulation information is obtained based on the detection image information that is produced from an infrared image on the semi-transparent screen 2 that is photographed by the CCD camera 4 . Therefore, in this embodiment, a variation in the quantity of infrared light entering the CCD camera 4 from the semi-transparent screen 2 can be recognized as manipulation information.
  • Whereas a finger or the like is used to point at a desired position on the semi-transparent screen 2 in the first embodiment, in this embodiment a user holds the infrared transmitter PD in his hand and illuminates the front surface of the semi-transparent screen 2 at a desired position with an infrared beam emitted from the infrared transmitter PD.
  • the input image processing section 12 of the control device 6 may operate so as to recognize such a level variation in the detection image information as manipulation information.
  • Since an infrared beam is invisible, it is preferable to, for instance, display a spot SP on the semi-transparent screen 2 so that a user can find the current illumination position, on the semi-transparent screen 2, of an infrared beam emitted from the infrared transmitter PD.
  • This spot display can be realized in such a manner that the input image processing section 12 of the control device 6 recognizes the current illumination position (coordinates) of an infrared beam based on the detection image information and a display control is performed so that the projector 5 projects the spot SP at the illumination position thus recognized.
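  • A sketch of this spot tracking (threshold and names assumed): the beam lifts a compact region of the detection image far above reflection levels, and the region's centroid serves as the illumination position for projecting the spot SP:

```python
import numpy as np

def find_pointer_spot(L: np.ndarray, spot_threshold: float):
    """The transmitter's beam drives a small region of the detection image far
    above any level produced by mere reflection; take the centroid of that
    region as the illumination position."""
    ys, xs = np.nonzero(L > spot_threshold)
    if xs.size == 0:
        return None                            # no beam on the screen
    return float(xs.mean()), float(ys.mean())  # (x, y) illumination position
```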
  • FIG. 12 conceptually shows the entire configuration of an interactive display system 1 B according to a third embodiment of the invention
  • FIG. 13 is a block diagram showing the internal configuration of the control device 6 .
  • the components in FIGS. 12 and 13 that are the same as the corresponding components in FIGS. 1 and 2 are given the same reference numerals as the latter and will not be described below.
  • the interactive display system is provided with two CCD cameras, that is, a first CCD camera 4 A and a second CCD camera 4 B.
  • The first CCD camera 4 A has the same role as the CCD camera 4 in the first embodiment. That is, the first CCD camera 4 A is provided on the back side of the semi-transparent screen 2 to photograph an image through infrared light coming from the entire semi-transparent screen 2 as an imaging range.
  • the second CCD camera 4 B is provided to photograph an image in a prescribed region on the semi-transparent screen 2 with enlargement or reduction.
  • the second CCD camera 4 B has a pan/tilt/zoom mechanism 7 .
  • the pan/tilt/zoom mechanism 7 is provided with a mechanism (pan/tilt mechanism) for rotating the second CCD camera 4 B in both horizontal and vertical planes as well as a mechanism (zoom mechanism) for varying the magnification factor of a photographing image by moving a zoom lens that is provided in the second CCD camera 4 B.
  • Controls on the pan/tilt/zoom mechanism 7, that is, a pan/tilt position variable control and a zoom ratio variable control, are performed by the database driving section 14 as shown in FIG. 13. Therefore, when necessary, the database memory 15 stores a program for controlling the pan/tilt/zoom mechanism 7.
  • Each of the CCD cameras 4 A and 4 B is provided with an infrared transmission filter 4 a so as to photograph only an infrared image on the semi-transparent screen 2 .
  • the control device 6 has two image input sections 11 A and 11 B that correspond to the first and second CCD cameras 4 A and 4 B, respectively.
  • the image input section 11 A receives an imaging signal from the first CCD camera 4 A and supplies a corresponding video signal to the input image processing section 12
  • the image input section 11 B receives an imaging signal from the second CCD camera 4 B and supplies a corresponding video signal to the input image processing section 12 . Therefore, in this embodiment, the input image processing section 12 generates two kinds of detection image information based on the video signals of the first and second CCD cameras 4 A and 4 B, and produces manipulation information from the two kinds of detection image information.
  • the interactive display system 1 B according to this embodiment can be used in the following manner.
  • When a user standing in front of the semi-transparent screen 2 holds a hand out toward it, the detection image information shown in FIG. 14A generally includes an image region A where the body is displayed and an image region B where the hand is displayed, and the latter has a larger value (for instance, in the luminance level) than the former.
  • the input image processing section 12 can recognize the image region B as a “manipulation body” by separating it from the background including the image region A.
  • If the manipulation body is limited to a hand or a finger of a human body, a configuration is possible in which a hand or a finger of a user is recognized as a manipulation body based on its shape that is obtained as detection image information.
  • the database driving section 14 performs controls to zoom-photograph the manipulation body, that is, the image region B, with the second CCD camera 4 B based on the information on the position of the manipulation body on the semi-transparent screen 2 .
  • a pan/tilt control is performed on the pan/tilt/zoom mechanism 7 based on the information on the position of the image region B on the semi-transparent screen 2 so that the image region B is located approximately at the center of a photographing image of the second CCD camera 4 B
  • a zoom control is performed on the pan/tilt/zoom mechanism 7 so that the image region B occupies almost all of the photographing image.
  • detection image information obtained based on an imaging signal of the second CCD camera 4 B becomes image information in which the image region B (manipulation body) is enlarged as shown in FIG. 14B.
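  • The pan/tilt/zoom control reduces to centering the bounding box of image region B and scaling it to fill the frame. A sketch under assumed names; in the system these targets would be issued by the database driving section 14 to the mechanism 7:

```python
def pan_tilt_zoom_targets(region_bbox, frame_w, frame_h, fill_ratio=0.9):
    """Center the bounding box of image region B in the second CCD camera's
    frame and pick a zoom factor so the region fills most of the image."""
    x0, y0, x1, y1 = region_bbox
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    pan_error = cx - frame_w / 2.0   # horizontal offset to correct
    tilt_error = cy - frame_h / 2.0  # vertical offset to correct
    zoom = fill_ratio * min(frame_w / max(x1 - x0, 1), frame_h / max(y1 - y0, 1))
    return pan_error, tilt_error, zoom
```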
  • The above application method is just an example, and other various application examples are conceivable that use two CCD cameras. Further, the pan/tilt/zoom mechanism 7 may be provided in both CCD cameras. Still further, a configuration is conceivable in which three or more CCD cameras are used (the pan/tilt/zoom mechanism may be provided in any of those CCD cameras), and pieces of manipulation information are independently obtained from the respective CCD cameras. This configuration also provides various application examples.
  • FIG. 15 conceptually shows an example configuration of an interactive display system 1 C according to a fourth embodiment of the invention.
  • the components in FIG. 15 that are the same as the corresponding components in, for instance, FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • The control device 6 may have the same internal configuration as that shown in FIG. 2.
  • In each of the above embodiments, the semi-transparent screen 2 is a wall-like one.
  • However, consideration of the functions (as a display panel and a manipulation panel) of the semi-transparent screen 2 leads to an understanding that the semi-transparent screen need not be limited to the wall-like one.
  • In this embodiment, a semi-transparent screen 2 A having a curved shape is used.
  • FIG. 15 shows a state in which a semi-spherical semi-transparent screen 2 A is installed.
  • at least the infrared LED panel 3 , the CCD camera 4 , and the projector 5 are provided inside the semi-spherical semi-transparent screen 2 A.
  • Although the infrared transmission filter 4 a provided in the CCD camera 4 and the infrared cutoff filter 5 a provided in the projector 5 are not shown in FIG. 15, they are actually provided in the same manner as in the above embodiments.
  • FIG. 15 shows a state in which a map of the world is projected on the semi-transparent screen 2 A.
  • the interactive display system 1 C can be used in the same manner as described in the first embodiment in connection with FIG. 7.
  • FIG. 16 conceptually shows an example configuration of an interactive display system 1 D according to a fifth embodiment of the invention.
  • the components in FIG. 16 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • Although the infrared transmission filter 4 a and the infrared cutoff filter 5 a are not shown in FIG. 16, they are actually provided in the CCD camera 4 and the projector 5, respectively.
  • The control device 6 is not shown in FIG. 16 either; it is actually provided to control the infrared LED panel 3, the CCD camera 4, and the projector 5.
  • The control device 6 may have the same internal configuration as that shown in FIG. 2.
  • the semi-transparent screen 2 is provided as a wall surface of a passage.
  • the infrared LED panel 3 , the CCD camera 4 , and the projector 5 are provided behind the wall surface (semi-transparent screen 2 ) of the passage. That is, the interactive display system 1 D according to this embodiment serves as part of the wall surface of the passage.
  • the optical paths of light beams emitted from the infrared LED panel 3 and the projector 5 and a light beam to enter the CCD camera 4 may be changed through reflection by using a mirror MR as shown in FIG. 17.
  • Without the mirror MR, a considerably long distance may be needed behind the screen to obtain an imaging range or a projection display range capable of covering a large-size semi-transparent screen 2.
  • the mirror MR By using the mirror MR, a sufficiently wide imaging range or projection display range can be obtained even with a short depth. This relaxes the conditions that should be satisfied by an environment in which to install the interactive display system 1 D according to this embodiment.
  • FIG. 18 conceptually shows an example configuration of an interactive display system 1 E according to a sixth embodiment of the invention.
  • the components in FIG. 18 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • Although the infrared transmission filter 4 a and the infrared cutoff filter 5 a are not shown in FIG. 18, they are actually provided in the CCD camera 4 and the projector 5, respectively.
  • In this embodiment, a semi-transparent screen 2 B is installed as a table. That is, supported by four table legs F, the semi-transparent screen 2 B also functions as the top plate of a table as an ordinary piece of furniture.
  • the infrared LED panel 3 , the CCD camera 4 , the projector 5 (and the control device 6 ) are provided under the semi-transparent screen 2 B.
  • this embodiment is so configured that when a user performs a manipulation on the semi-transparent screen 2 B, a monitor device 30 is controlled in accordance with the manipulation.
  • FIG. 19 is a block diagram showing the internal configuration of the control device 6 of the interactive display system 1 E according to this embodiment.
  • the components in FIG. 19 that are the same as the corresponding components in FIG. 2 are given the same reference numerals and will not be described below.
  • the control device 6 shown in FIG. 19 is provided with an external apparatus control section 20 .
  • the external apparatus control section 20 is a circuit for performing a manipulation control on the monitor device 30 .
  • The external apparatus control section 20 receives manipulation information from the database driving section 14, and transmits, to the monitor device 30, a command signal for a necessary manipulation control on the monitor device 30. Therefore, the database memory 15 stores a program for realizing manipulations on the monitor device 30 in the interactive display system 1 E.
  • a remote controller display RMD is displayed in a particular region on the semi-transparent screen 2 B as shown in FIG. 18.
  • the remote controller display RMD simulates the manipulation panel surface of a remote controller having keys that enable various manipulations on the monitor device 30 .
  • a display control of the remote controller display RMD is realized in such a manner that the database driving section 14 performs a display control by using image data of the remote controller display RMD that is stored in the database memory 15 , to cause the projector 5 to project an image of the remote controller display RMD.
  • the position of the remote controller display RMD may be set arbitrarily.
  • The remote controller display RMD may be displayed at an arbitrary position that is convenient (i.e., easy to use) for a user in response to a prescribed setting manipulation on the interactive display system 1 E.
  • As long as the database driving section 14 recognizes the current display position of the remote controller display RMD, it can always grasp the display positions (coordinates) of the various keys of the remote controller display RMD.
  • Assume that a user selects a desired channel by manipulating a numeral key (i.e., by a manipulation on the remote controller display RMD displayed on the semi-transparent screen 2 B).
  • The user may perform a manipulation on the semi-transparent screen 2 B with a feeling of depressing a desired one of the numeral keys of the remote controller display RMD on the semi-transparent screen 2 B.
  • the user need not always bring a manipulation body such as a finger into contact with the table surface (semi-transparent screen 2 B). It is naturally possible to judge, as a key manipulation, a manipulation that is performed on a space above a desired key.
  • the position (coordinates) of the above manipulation of the user is detected by the input image processing section 12 based on detection image information that is obtained through photographing of the CCD camera 4 .
  • The database driving section 14 judges with which key of the remote controller display RMD the detected coordinates of the manipulation position coincide, and transmits, for instance, information indicating the type of that key to the external apparatus control section 20.
  • Based on the information indicating the type of key that is supplied from the database driving section 14, the external apparatus control section 20 outputs a command signal corresponding to the type of key to the monitor device 30.
  • In this example, the database driving section 14 transmits information to the effect that a numeral key corresponding to some channel number has been manipulated.
  • the external apparatus control section 20 transmits a command signal for making a switch to the channel number corresponding to the designated numeral key.
  • the monitor device 30 operates to make a switch to a picture of the manipulated channel.
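  • The key judgment by the database driving section 14 is essentially a hit test of the detected manipulation coordinates against the known key rectangles of the remote controller display RMD. A sketch with an assumed data layout:

```python
def key_at(manipulation_pos, key_rects):
    """Map detected manipulation coordinates to a key of the remote controller
    display RMD. key_rects maps a command (e.g. a channel number) to its
    on-screen rectangle; the matched command would be handed to the external
    apparatus control section 20, which emits the command signal."""
    x, y = manipulation_pos
    for command, (x0, y0, x1, y1) in key_rects.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None  # manipulation outside every key region
```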
  • the external apparatus that can be manipulated by the interactive display system 1 E is not limited to the monitor device (television receiver).
  • a configuration is possible which allows any of other various electronic apparatuses to be manipulated.
  • the invention enables a configuration in which remote controller displays RMD for plural kinds of apparatuses are displayed simultaneously to allow a user's manipulation thereon. It is also possible to recognize simultaneous manipulations on a plurality of keys of one remote controller display RMD and to control the external apparatus accordingly.
  • where the semi-transparent screen 2B is a table surface, as in this embodiment, the following operation is possible.
  • when an object is placed on the semi-transparent screen 2B, an image reflecting the shape of the object is obtained as detection image information through infrared light that is reflected from the object.
  • this type of variation in image is used as manipulation information.
  • image data of the detection image information can be used as an image to be projected by the projector 5 . That is, for example, a configuration is possible in which an image of an object placed on the semi-transparent screen 2 B is used like its shadow (see shadow displays SHD shown in FIG. 18). In this case, the position of a shadow display SHD on the semi-transparent screen 2 B varies so as to follow the position of the object. And the shape of the shadow display SHD varies in accordance with the distance of the object from the surface of the semi-transparent screen 2 B. Therefore, there can be obtained a visual effect that would be interesting to a user.
  • detection image information that is obtained in the input image processing section 12 of the control device 6 may be supplied to the image combining section 17 as image data.
  • the image combining section 17 combines this image data (the detection image information, i.e., image data for a shadow display SHD) with the image of the remote controller display RMD that has been generated by the image generation section 16 under the control of the database driving section 14, whereby a resulting image is finally projected onto the semi-transparent screen 2B as shown in FIG. 18.
  • an enhanced visual effect can be obtained by applying a special effect to image data of detection image information through proper signal processing, such as multi-colorizing it or changing its shape.
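  • As a rough sketch of the shadow-display compositing described above (the array shapes and the shadow_gain parameter are assumptions, not taken from the patent), the detection image information can be used to darken the projected image before other generated imagery is overlaid:

      import numpy as np

      def compose_projection(detection_img, generated_img, shadow_gain=0.6):
          """Combine detection image information (used as a shadow) with a
          generated image, such as the remote controller display, for
          projection onto the semi-transparent screen.

          detection_img: 2-D float array in [0, 1]; large where an object
                         is close to the table surface.
          generated_img: H x W x 3 float array produced by the image
                         generation section (hypothetical format).
          """
          # Darken each pixel in proportion to the detected reflection level,
          # so the "shadow" follows the object's position and grows sharper
          # as the object approaches the screen.
          dimming = 1.0 - shadow_gain * detection_img[..., np.newaxis]
          return np.clip(generated_img * dimming, 0.0, 1.0)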
  • it is not always necessary to dispose the infrared LED panel 3, the CCD camera 4, and the projector 5 directly under the table.
  • a mirror MR for changing the optical paths through reflection may be provided under the semi-transparent screen 2 B (table surface) as shown in FIG. 20.
  • the infrared LED panel 3 , the CCD camera 4 , the projector 5 , etc. may be provided on the side of the table.
  • a half mirror MR having preset transmittance and reflectance values may be used instead of the mirror MR.
  • in this case, the infrared LED panel 3, for instance, may be provided in the floor portion under the table rather than on the side of the table, which increases the degree of freedom of device installation.
  • the installation method using a half mirror is also applicable to the installation form of FIG. 17 (fifth embodiment).
  • FIG. 21 conceptually shows an example configuration of an interactive display system 1 F according to a seventh embodiment of the invention.
  • the components in FIG. 21 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • the infrared LED panel 3 is not provided in the interactive display system 1 F.
  • infrared light included in natural light is used for detection of manipulation information instead of using the infrared LED panel 3 .
  • the reference input image levels Lint that are necessary to obtain detection image information are detected based on image information that is obtained from an imaging signal produced by the CCD camera 4 based on infrared light that has come from the front side and passed through the semi-transparent screen 2 , for example, in a state that neither a close body nor a manipulation body exists.
  • FIG. 22 conceptually shows the entire configuration of an interactive display system 1G according to an eighth embodiment of the invention.
  • FIG. 23 is a block diagram showing the internal configuration of the control device 6 of the interactive display system 1 G.
  • the components in FIGS. 22 and 23 that are the same as the corresponding components in FIGS. 1 and 2 are given the same reference numerals as the latter and will not be described below.
  • the infrared LED panel 3 and the CCD camera 4 of the above-described embodiments are replaced by a microwave generator 40 and a microwave receiver 41 , respectively.
  • a microwave driving circuit 110 for driving the microwave generator 40 is provided instead of the LED driving section 10 (see FIG. 1). Also provided are a reception signal input section 111 for converting reception microwaves that are supplied from the microwave receiver 41 into data of a predetermined form and outputting it and an input data processing section 112 for executing a prescribed process on microwave reception data that is supplied from the reception signal input section 111 to thereby obtain, for instance, detection image information and for obtaining manipulation information based on the detection image information.
  • the reception signal input section 111 and the input data processing section 112 are functional circuit sections that replace the image input section 11 and the input image processing section 12 (see FIG. 1), respectively.
  • since microwaves are used as a medium for detection of manipulation information, it is not necessary to use the infrared transmission filter 4a and the infrared cutoff filter 5a that were provided in the CCD camera 4 and the projector 5, respectively, in the above embodiments.
  • An information input apparatus can be constructed even by using, for detection of manipulation information, such a medium as microwaves having a feature of being reflected by an object, basically in the same manner as in the above embodiments (in which infrared light is used for detection of manipulation information).
  • the invention provides an information input apparatus comprising a semi-transparent screen that functions as an operator input manipulation surface; pickup means for picking up an input manipulation of an operator on the semi-transparent screen by capturing only light or electromagnetic waves in a predetermined wavelength range that come through the semi-transparent screen, to thereby produce a pickup signal; and control processing means for generating detection image information corresponding to the input manipulation of the operator based on the pickup signal, and for executing a control process based on input manipulation information that is recognized based on the detection image information.
  • any object that causes a variation in the state of light or electromagnetic waves in a predetermined wavelength range by, for instance, reflecting it can serve as a manipulation body with which to perform a manipulation. That is, no special pointing device for a manipulation is needed. Since a manipulation body that is located close to the semi-transparent screen (for instance, in a space in front of the semi-transparent screen) is recognizable, various manipulation methods are possible. For example, a manipulation may be performed in a space in front of the front surface of the semi-transparent screen without bringing the manipulation body into contact with the semi-transparent screen as a manipulation panel, or an object approaching the semi-transparent screen may be recognized. A response process corresponding to a recognized manipulation is executed.
  • since the manipulation panel of the invention may be a mere semi-transparent screen, its size can easily be increased, in contrast to the case of the conventional touch panel.
  • the control processing means may be so configured as to be able to recognize plural pieces of input manipulation information based on image states of the detection image information and to execute different control processes based on the respective pieces of input manipulation information.
  • a plurality of subjects of detection can be recognized at the same time and control processes responsive to the respective detected subjects can be executed independently. For example, it is possible to perform manipulations on a plurality of menu pictures at the same time.
  • the control processing means may be so configured as to recognize the input manipulation information based on a particular image shape that is obtained as an image state of the detection image information. In this case, for example, it is possible to determine whether to employ a subject of detection as manipulation information based on its shape. This technique easily enables employment, as manipulation information, of only manipulations performed with a hand or a finger of a human body.
  • the control processing means may be so configured as to be able to recognize a hand or a finger of a human body as a subject of detection of the input manipulation information.
  • the information input apparatus of the invention may further comprise projection display means provided so as to be able to project, onto the semi-transparent screen, an image of visible light in a wavelength range excluding the predetermined wavelength range of light or electromagnetic waves to be captured by the pickup means, wherein the control processing means executes, as the control process, a display image generation process for causing the projection display means to project a display image and a control on the projection display means.
  • the semi-transparency of the semi-transparent screen is utilized. That is, by projecting an image onto the semi-transparent screen with a projector or the like, the semi-transparent screen can be used as not only a manipulation panel but also a display panel for image display.
  • a menu picture for causing the control processing means to execute a prescribed process may be set as the display image.
  • the menu picture prompts a user to perform various kinds of manipulations.
  • an initial image having a predetermined content may be set as the display image.
  • the control processing means may have attribute information relating to an image content of a particular region in the initial image, wherein when it has been judged that the particular region has been designated as the input manipulation information, the control processing means executes a control process so that the projection display means projects an image indicating the attribute information relating to the designated region.
  • an application is possible in which an initial image such as a map is displayed and designation of some area on the map causes display of its attribute information such as an explanation of that area.
  • a direct input form of manipulation information is realized in which a user performs a manipulation on an image displayed on the semi-transparent screen.
  • the control processing means may be so configured as to be able to generate the display image by using the detection image information.
  • an image can be displayed on the semi-transparent screen by using the detection image information.
  • the control processing means may execute the display image generation process so that the display image is displayed in an area on the semi-transparent screen corresponding to a located position of a physical object on the semi-transparent screen or in a space near the semi-transparent screen.
  • a visual effect can be obtained such as displaying a shadow of an object placed on the semi-transparent screen so as to be associated with the object.
  • the pickup means may comprise irradiating means for always irradiating the semi-transparent screen with light or electromagnetic waves in the predetermined wavelength range that are to be captured by the pickup means.
  • in this case, a medium (for instance, infrared light or microwaves) for detection of manipulation information is always available.
  • This provides high reliability irrespective of the installation environment of a system having the information input apparatus.
  • the semi-transparent screen may be formed by combining a material (such as glass) for forming a transparent screen and a material (some semi-transparent film) for forming a semi-transparent screen.
  • the semi-transparent screen can be formed at a low cost while the above-mentioned capability of providing a large-size screen is maintained.
  • the semi-transparent screen may constitute a wall surface, have a curved surface, or be disposed so as to constitute a table surface. In this manner, the semi-transparent screen may be used in various forms. Accordingly, the application range of the information input apparatus of the invention is increased.
  • the information input apparatus of the invention may further comprise pointing device means capable of causing a state variation in light or electromagnetic waves in the predetermined wavelength range to be captured by the pickup means by irradiating the semi-transparent screen with light or electromagnetic waves.
  • the pickup means may comprise a plurality of imaging means for producing imaging signals through photographing with different magnification factors, wherein the control processing means executes the control process based on the detection image information that is generated based on imaging signals that are supplied from the plurality of imaging means.
  • image information for detection is obtained with different magnification factors by a plurality of imaging means provided as the pickup means.
  • the control processing means may select, according to a predetermined rule, a particular imaging region of an area of detection image information that is obtained based on an imaging signal produced by a predetermined one of the plurality of imaging means, and execute a control so that one of the imaging means that is different from the predetermined imaging means photographs an image in the particular imaging region with a varied magnification factor.
  • for example, detection image information may be obtained by photographing the pointed region with the second imaging means while magnifying it, which enables highly accurate detection of the pointed position (coordinates), as outlined in the sketch below.
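  • In outline, the coarse-to-fine operation might look as follows. The zoom_camera interface and the rule of taking the strongest reflection in the wide view are illustrative assumptions; the patent leaves both the selection rule and the camera control unspecified.

      import numpy as np

      def refine_pointed_region(wide_detection, zoom_camera, box=64):
          """Pick the region of strongest reflection from the wide-angle
          detection image, then photograph it magnified with a second
          imaging means (zoom_camera.capture is a hypothetical API)."""
          y, x = np.unravel_index(np.argmax(wide_detection),
                                  wide_detection.shape)
          half = box // 2
          # Bounding box around the pointed position (edge clamping omitted).
          region = (x - half, y - half, x + half, y + half)
          return zoom_camera.capture(region)  # magnified detection image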
  • the invention provides a high degree of freedom for the input of information such as a manipulation. Further, the invention easily provides a large-size manipulation panel that can also be used as a display panel, thereby providing possibilities of various application forms. That is, the invention provides an advantage that an interactive input/output environment can easily be advanced or enhanced.

Abstract

There are provided an infrared LED panel for always applying infrared light to the back surface of a semi-transparent screen, a CCD camera for capturing only infrared light coming from the semi-transparent screen, and a projector for projecting an image onto the semi-transparent screen (with light not including infrared light). When a user performs a manipulation on the front side of the semi-transparent screen, the quantity of reflected infrared light varies. A control device picks up a variation in reflection light quantity as detection image information based on an imaging signal of the CCD camera. Further, in accordance with manipulation information that is obtained based on the detection image information, the control device executes controls necessary for, for instance, switching of an interface image to be displayed on the semi-transparent screen. Any manipulation method may be used as long as it causes a variation in detection image information through reflection of infrared light. Interactive input/output with various manipulation methods is thus enabled.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to an information input apparatus suitably used for, for instance, interactive input/output. [0001]
  • Computer apparatuses and the like commonly employ, under various application programs etc., what is called an interactive input/output form in which the computer apparatus presents, in the form of a display, a prescribed response to a user's manipulation. [0002]
  • For example, the touch panel is commonly known as one of the input devices that are used for the above type of interactive input/output. With the touch panel, a user can perform a desired manipulation by sliding, for instance, his finger in an arbitrary direction while touching the panel. [0003]
  • The projection display is also known which functions as a computerized white board. In this projection display, for example, a user can perform a manipulation on the white board by using a dedicated infrared-light-emitting pen. [0004]
  • Further, there is known an apparatus called “Video Place” that is intended to provide an interactive effect. Video Place is an artistic apparatus using a video camera, for instance. For example, a viewer of a Video Place apparatus causes the video camera to photograph his hand or some other part of his body as a silhouette. By moving his hand or some other part of his body freely while watching the image on the monitor device, the viewer can enjoy a reaction or a change in the displayed image, which is a combination of the photographed silhouette and some other image. [0005]
  • Incidentally, to realize a more advanced interactive input/output environment, the above-described, currently available input devices have the following limits. [0006]
  • In the case of the touch panel, the pointing manipulation is generally limited to one using a finger. No manipulation can be performed in a space in front of the touch panel; it is necessary to cause a physical manipulation body such as a finger to contact the manipulation surface. Further, being relatively expensive, the touch panel is not appropriate for a large-size manipulation panel. [0007]
  • In the case of the projection display functioning as a computerized white board, although the manipulation screen can easily be increased in size, a special pointing device such as an infrared-light-emitting pen is needed as described above. [0008]
  • In the case of Video Place, since an interactive manipulation is realized by using a silhouette of a hand or a human body, the input/output interface is indirect and the functionality is insufficient to enable a direct manipulation. [0009]
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide a more advanced or enhanced interactive input/output environment. [0010]
  • To attain the above object, the invention provides an information input apparatus comprising a semi-transparent screen that functions as an operator input manipulation surface; pickup means for picking up an input manipulation of an operator on the semi-transparent screen by capturing only light or electromagnetic waves in a predetermined wavelength range that come through the semi-transparent screen, to thereby produce a pickup signal; and control processing means for generating detection image information corresponding to the input manipulation of the operator based on the pickup signal, and for executing a control process based on input manipulation information that is recognized based on the detection image information. [0011]
  • In the information input apparatus having the above basic configuration, a physical object, for instance, that has approached the semi-transparent screen causes a variation in the state of light or electromagnetic waves entering the pickup means. In the invention, such a state variation in light or electromagnetic waves is picked up as image information. The thus-obtained image information is used as manipulation information, and a necessary control process can be executed in accordance with the manipulation information. That is, interactive input/output can be realized by producing input information by a manipulation in which some physical object capable of causing a variation in the state of light or electromagnetic waves in a predetermined wavelength range to be captured by the pickup means is made close to the semi-transparent screen or moved in its vicinity. In the invention, what functions as a manipulation panel is merely a semi-transparent screen. Since the semi-transparent screen can be formed, for instance, by combining a material for forming a transparent screen and a material for forming a semi-transparent screen, a large-size semi-transparent screen can easily be formed. [0012]
  • Where the above configuration is further provided with irradiating means for always irradiating the semi-transparent screen with light or electromagnetic waves to be captured by the pickup means, a medium for detection of information on a manipulation that is performed on the semi-transparent screen can be obtained easily. [0013]
  • The above configuration may further be provided with projection display means so that it can project, onto the semi-transparent screen, an image of visible light in a wavelength range excluding the wavelength range of light or electromagnetic waves to be captured by the pickup means, wherein the control processing means executes, as the above-mentioned control process, a display image generation process for causing the projection display means to project a display image and a control on the projection display means. [0014]
  • In this case, since the semi-transparent screen has a function of a display panel as well as a function of a manipulation panel, an interactive response in response to a manipulation that has been performed on the semi-transparent screen can be displayed as an image on the same semi-transparent screen. [0015]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 conceptually shows an example configuration of an interactive display system according to a first embodiment of the present invention; [0016]
  • FIG. 2 shows an internal configuration of a control device that is provided in the interactive display system according to the first embodiment; [0017]
  • FIG. 3 is a flowchart showing a process of detecting and holding reference input image levels; [0018]
  • FIG. 4 is a flowchart showing a process of generating detection image information; [0019]
  • FIG. 5 illustrates a first application example of the interactive display according to the first embodiment; [0020]
  • FIG. 6 is a flowchart showing a process for realizing the first application example of FIG. 5; [0021]
  • FIG. 7 illustrates a second application example of the interactive display according to the first embodiment; [0022]
  • FIG. 8 illustrates a third application example of the interactive display according to the first embodiment; [0023]
  • FIG. 9 illustrates a fourth application example of the interactive display according to the first embodiment; [0024]
  • FIG. 10 illustrates a fifth application example of the interactive display according to the first embodiment; [0025]
  • FIG. 11 conceptually shows an example configuration of an interactive display system according to a second embodiment of the invention; [0026]
  • FIG. 12 conceptually shows an example configuration of an interactive display system according to a third embodiment of the invention; [0027]
  • FIG. 13 shows the internal configuration of a control device provided in the interactive display system according to the third embodiment; [0028]
  • FIGS. 14A and 14B illustrate operation examples of the interactive display system according to the third embodiment; [0029]
  • FIG. 15 conceptually shows an example configuration of an interactive display system according to a fourth embodiment of the invention; [0030]
  • FIG. 16 conceptually shows an example configuration of an interactive display system according to a fifth embodiment of the invention; [0031]
  • FIG. 17 conceptually shows another example configuration of an interactive display system according to the fifth embodiment of the invention; [0032]
  • FIG. 18 conceptually shows an example configuration of an interactive display system according to a sixth embodiment of the invention; [0033]
  • FIG. 19 shows the internal configuration of a control device provided in the interactive display system according to the sixth embodiment; [0034]
  • FIG. 20 conceptually shows another example configuration of an interactive display system according to the sixth embodiment of the invention; [0035]
  • FIG. 21 conceptually shows an example configuration of an interactive display system according to a seventh embodiment of the invention; [0036]
  • FIG. 22 conceptually shows an example configuration of an interactive display system according to an eighth embodiment of the invention; and [0037]
  • FIG. 23 shows the internal configuration of a control device provided in the interactive display system according to the eighth embodiment. [0038]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Information input apparatuses according to embodiments of the present invention will be hereinafter described. [0039]
  • Embodiment 1
  • A first embodiment of the invention will be described with reference to FIGS. 1-10. [0040]
  • FIG. 1 conceptually shows an example configuration of an interactive display system having an information input apparatus according to the first embodiment of the invention. [0041]
  • An interactive display system 1 according to this embodiment is composed of a semi-transparent screen 2, an infrared light-emitting diode (LED) panel 3, a CCD (charge-coupled device) camera 4, a projector 5, and a control device 6. The infrared LED panel 3, the CCD camera 4, and the projector 5 are provided on the back side of the semi-transparent screen 2. [0042]
  • For example, the semi-transparent screen 2 is formed by bonding a semi-transparent film that looks like tracing paper to a transparent glass plate or by using a member having transparency such as frosted glass. As described later, the semi-transparent screen 2 has the functions of both a manipulation panel and a display panel in the interactive display system 1. [0043]
  • For example, the infrared LED panel 3 is constructed in such a manner that many infrared LEDs are arranged collectively with respect to a panel surface. The infrared LED panel 3 is so disposed that infrared beams emitted from the respective infrared LEDs are applied to the entire back surface of the semi-transparent screen 2. The infrared LEDs are driven by the control device 6 so as to always emit infrared light. [0044]
  • The infrared LEDs of the infrared LED panel 3 may be provided in a number that is enough for infrared light beams emitted therefrom to illuminate the entire semi-transparent screen 2. As described later, image information reflected from the semi-transparent screen 2 is obtained based on a difference obtained by subtracting an initial infrared image level from a current infrared image level. Therefore, it is not necessary that the quantity of infrared light applied to the semi-transparent screen 2 be uniform over the entire screen 2, and the infrared LED panel 3 may be much smaller than the semi-transparent screen 2. [0045]
  • The CCD camera 4 is a camera device using a CCD as an imaging device and functions as a pickup means for picking up an input manipulation of an operator on the semi-transparent screen 2. The CCD camera 4 is provided to recognize, as image information, a manipulation that is performed on the semi-transparent screen 2 by photographing only the infrared component of an image formed on the semi-transparent screen 2. To this end, an infrared transmission filter 4a that transmits only a light component in an infrared wavelength band is provided in the optical system of the CCD camera 4. The position of the CCD camera 4 is so set that the entire semi-transparent screen 2 is included in its photographing range. [0046]
  • The projector 5 projects visible image light onto the back surface of the semi-transparent screen 2 based on image information that is supplied from the control device 6. For example, a user can see, from the front side of the semi-transparent screen 2, an image projected on the semi-transparent screen 2 by the projector 5. The optical system of the projector 5 includes an infrared cutoff filter 5a for cutting off an infrared component of light, as a result of which the light coming from an image that is projected on the semi-transparent screen 2 does not include an infrared component. Therefore, the CCD camera 4 does not detect a projection image of the projector 5. [0047]
  • Incorporating, for instance, a microcomputer, the control device 6 captures image information (video data) from an imaging signal that is supplied from the CCD camera 4 and obtains manipulation information from the image information. Based on the manipulation information, the control device 6 performs a display control for an image to be displayed on the semi-transparent screen 2 by the projector 5, and other necessary controls. Further, the control device 6 drives, for light emission, the infrared LEDs of the infrared LED panel 3. [0048]
  • The positions of the infrared LED panel 3, the CCD camera 4, and the projector 5 may be so set that each of those devices can play its role satisfactorily. [0049]
  • FIG. 2 is a block diagram showing an example of an internal configuration of the control device 6. In the control device 6 shown in FIG. 2, an LED driving section 10 serves to drive, for light emission, the infrared LEDs of the infrared LED panel 3. [0050]
  • An image input section 11 generates a video signal (video information) by performing prescribed signal processing on an imaging signal that has been produced by the CCD camera 4 based on infrared light coming from the semi-transparent screen 2, and supplies it to an input image processing section 12. [0051]
  • For example, the input image processing section 12 converts the video signal that is supplied from the image input section 11 into video signal data (digital signal). The input image processing section 12 picks up information of a manipulation that has been performed on the semi-transparent screen 2 by executing a necessary analyzing process etc. by using “image information” (for example, frame-by-frame video data) that is obtained based on the video signal data. For example, the manipulation information that is obtained based on the image information is the position (coordinates) on an image of a manipulation body that is performing a manipulation on the semi-transparent screen 2, or the signal level of an image. The manipulation information is transmitted to a database driving section 14. The video signal data can also be supplied to an image combining section 17. [0052]
  • A threshold value control section 13 sets a threshold value that is necessary for a process to be executed on manipulation information in the input image processing section 12, and transmits it to the input image processing section 12. The input image processing section 12 generates manipulation information by executing a necessary process such as an analysis on image information by using the threshold value that has been set by the threshold value control section 13. In this embodiment, a current image state (detection image information) of the semi-transparent screen 2 is obtained by calculating a frame difference of input image data (described later). Such information as a reference value (reference image input level) to be used in the frame difference calculation is stored in the threshold value control section 13 (described later). [0053]
  • The database driving section 14 captures manipulation information generated by the input image processing section 12, and executes, when necessary, a necessary process based on the manipulation information. Program data necessary for control processes to be executed by the database driving section 14 is stored in a database memory 15. The database driving section 14 executes the necessary control process based on the program data stored in the database memory 15. [0054]
  • Controlled by the database driving section 14, an image generation section 16 generates necessary image data (video signal data (digital data)) and outputs it to an image combining section 17. [0055]
  • If necessary, the image combining section 17 combines video signal data that is supplied from the input image processing section 12 with video signal data that is supplied from the image generation section 16, and outputs resulting data to an RGB signals generation section 18. [0056]
  • The RGB signals generation section 18 converts the video signal data that is supplied from the image combining section 17 into, for instance, analog RGB signals, and outputs those to the projector 5. As a result, image light carrying a video signal that reflects a response to a manipulation that has been performed on the semi-transparent screen 2 is applied to the semi-transparent screen 2 from the projector 5. [0057]
  • Next, a description will be made of a method of detecting manipulation information in the above-configured interactive display system 1 according to the embodiment. [0058]
  • As described above, infrared light is applied from the infrared LED panel 3 to the entire semi-transparent screen 2 (see FIG. 1) from the back side. Because the screen 2 is semi-transparent, not all of the infrared light passes through the semi-transparent screen 2 and a certain part of it is reflected by the semi-transparent screen 2. [0059]
  • In this embodiment, the initial levels of video signal data that are obtained by photographing infrared light that is reflected by the semi-transparent screen 2 with the CCD camera 4 in a state that no manipulation is performed on the semi-transparent screen 2 are stored as “reference input image levels.” The reference input image levels may be obtained by detecting signal levels of the respective pixels of, for instance, one frame by using input video signal data. This detection is performed by the input image processing section 12. The information of the reference input image levels thus detected is transmitted to the threshold value control section 13 and stored there. [0060]
  • FIG. 3 is a flowchart showing an example process of detecting the reference input image levels. As shown in FIG. 3, first, at step S101, the input image processing section 12 detects signal levels of the respective pixels by using 1-frame image data that is obtained from a video signal that is supplied from the image input section 11, and employs the detection results as reference input image levels Lint. Specifically, luminance signal component levels of the respective pixels may be detected and employed as the reference input image levels Lint. [0061]
  • At the subsequent step S102, the reference input image levels Lint are transmitted to the threshold value control section 13 and stored there. [0062]
  • The process of FIG. 3 of detecting the reference input image levels Lint and storing those in the threshold value control section 13 may be executed at the time of turning on the power of the interactive display system, or the reference input image levels Lint may be updated based on a user's instruction when necessary. [0063]
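  • A minimal sketch of steps S101-S102 follows, assuming a grab_frame() callable that returns one grayscale frame from the image input section. Averaging over several frames is an added noise precaution, not part of the patent's description, which uses a single frame.

      import numpy as np

      def capture_reference_levels(grab_frame, n_frames=8):
          """Detect and hold the reference input image levels Lint from
          frames photographed while no manipulation is performed."""
          frames = [grab_frame().astype(np.float32) for _ in range(n_frames)]
          return np.mean(frames, axis=0)  # per-pixel Lint, stored for later use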
  • In a state that the information of the reference input image levels Lint is held in the above manner, image information to be handled as manipulation information is obtained in the following manner. [0064]
  • FIG. 4 is a flowchart showing a process to be executed by the input image processing section 12 to obtain image information (hereinafter referred to as “detection image information”) as a basis of manipulation information. In this process, the input image processing section 12 always detects current input image levels Lprs at step S201. The input image levels Lprs are information obtained by detecting signal levels of the respective pixels of frame-by-frame image data that is obtained by photographing a current infrared image on the semi-transparent screen 2 with the CCD camera 4. [0065]
  • Then, at step S202, the input image processing section 12 calculates input image level differences L by subtracting the reference input image levels Lint from the current input image levels Lprs (L = Lprs − Lint). Specifically, an input image level difference L is obtained for each pixel by subtracting the reference input image level Lint from the current input image level Lprs at the same pixel position. At step S203, the input image processing section 12 generates current detection image information (i.e., frame-by-frame video data including pixel-based level information) based on the input image level differences L. [0066]
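  • In code, steps S201-S203 reduce to a per-pixel subtraction, as in the sketch below. Clipping negative differences is an assumption (an approaching object only increases the reflected infrared level), and grab_frame() is again a hypothetical frame source.

      import numpy as np

      def detection_image(grab_frame, lint):
          """Generate detection image information from the current frame."""
          lprs = grab_frame().astype(np.float32)  # current input image levels
          diff = lprs - lint                      # L = Lprs - Lint, per pixel
          return np.clip(diff, 0.0, None)         # keep reflection increases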
  • The above operation of obtaining the detection image information will now be described in connection with actual motion of a user on the front side of the semi-transparent screen 2. For example, a user performs a manipulation on the front side of the semi-transparent screen 2 by using some object that reflects infrared light. To simplify the description, it is assumed that the user uses his finger or body. [0067]
  • When the user is located on the front side of the semi-transparent screen 2 at a position far from it (see FIG. 1), the amount of infrared light that is passed through the semi-transparent screen 2 and then reflected by the user's body is small; that is, most of the infrared light that has passed through the semi-transparent screen 2 does not return to the back side of the semi-transparent screen 2. At this time, the current input image levels Lprs are approximately equal to the reference input image levels Lint, and hence the input image level differences L detected by the input image processing section 12 are approximately equal to 0. Therefore, the detection image information that is generated based on the input image level differences L remains approximately the same as in the initial state and has almost no variation component. [0068]
  • If the user slowly approaches the semi-transparent screen 2 starting from the above state, the amount of infrared light that is passed through the semi-transparent screen 2, reflected by the user's body, and again passed through the semi-transparent screen 2 to reach its back side gradually increases. The input image processing section 12 detects this state as a state in which the current input image levels Lprs minus the initial input image levels Lint gradually increase in an image portion corresponding to the user's body. Accordingly, the figure of the user approaching the semi-transparent screen 2 is captured increasingly clearly as the detection image information in accordance with the calculated input image level differences L. [0069]
  • In a state that the user's body is very close to the semi-transparent screen 2 (for instance, distant from the semi-transparent screen 2 by 30 cm or less, though this value depends on the setting of the threshold value), most of the infrared light reflected by the human body passes through the semi-transparent screen 2 and reaches its back side. In this state, detection image information reflecting a clear body shape is generated. [0070]
  • Now assume another case in which the user whose body is somewhat distant from the semi-transparent screen 2 places, for instance, his finger at a position very close to the semi-transparent screen 2. [0071]
  • In this state, the user's finger that is very close to the semi-transparent screen 2 reflects infrared light at a higher rate than the other parts of the body. Therefore, the input image processing section 12 generates such image information that the level is high in an image region corresponding to the user's finger and decreases with the distance from the semi-transparent screen 2 in an image region corresponding to the user's body, which region is part of the background. An image portion corresponding to only the user's finger can easily be separated from the background by comparing the detection image information with the threshold value that is preset in the threshold value control section 13. Similarly, by setting a proper threshold value, it is possible to produce image information in which an image portion corresponding to only the user's body that is distant from the semi-transparent screen 2 is extracted. In this manner, a threshold value that is suitable for an actually necessary condition is set in the threshold value control section 13. [0072]
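  • The two-threshold separation described here can be sketched as follows; the threshold values are installation-dependent and purely illustrative.

      import numpy as np

      def separate_regions(detect_img, finger_thresh=0.6, body_thresh=0.2):
          """Split the detection image into a very-close manipulation body
          (e.g. a fingertip, strong reflection) and the more distant
          background body (weaker reflection)."""
          finger = detect_img >= finger_thresh
          body = (detect_img >= body_thresh) & ~finger
          return finger, body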
  • With the configuration for detecting the state on the front side of the semi-transparent screen 2 in the above manner, the following advantages are obtained when the semi-transparent screen 2 functions, for instance, as a manipulation panel for an interactive interface. [0073]
  • First, since in this embodiment manipulation information is produced based on an image that is obtained from the quantity of infrared light reflected from the semi-transparent screen 2, the manipulation body for performing a manipulation need not be a special pointing device and may be of any kind as long as it reflects infrared light. That is, as described above, the entire human body or a part thereof, or some other object may be used as a manipulation body without causing any problem. [0074]
  • In the case of the touch panel or the like, it is necessary to bring a manipulation body such as a finger into contact with the manipulation panel surface. In contrast, in this embodiment, since the position and the motion of the manipulation body are detected through reflected infrared light, it is not necessary to bring the manipulation body into contact with the semi-transparent screen 2, and a manipulation may be performed in a space in front of the semi-transparent screen 2. [0075]
  • Since the quantity of reflected infrared light varies with the distance of the manipulation body from the semi-transparent screen 2 as described above, it is possible to use the distance of the manipulation body from the semi-transparent screen 2 as manipulation information. [0076]
  • As described above, the semi-transparent screen 2 can be a simple means constructed, for example, by combining a transparent glass plate and a semi-transparent thin film such as tracing paper, or by using frosted glass or the like. In particular, since no driving circuit or the like dedicated to the panel is necessary, the semi-transparent screen 2 can easily be increased in size with a small cost increase. This is much different from the touch panel, which cannot easily be increased in size. [0077]
  • Since manipulation information can be produced based on an image that is obtained from infrared light that is reflected from the semi-transparent screen 2, a plurality of manipulation bodies can be recognized at the same time to perform necessary controls, as long as their images can be recognized. That is, manipulations can be performed with a plurality of different manipulation bodies at the same time. This is very useful when the semi-transparent screen 2 is of a large size, because different kinds of manipulations can be performed at the same time by using various regions of the semi-transparent screen 2. [0078]
  • Further, since the semi-transparent screen 2 also functions as an image display panel, a direct manipulation can be realized easily. For example, as described later, a configuration is possible in which a menu picture or the like on which a manipulation is to be performed is displayed and a user is allowed to perform a manipulation on the menu picture with his finger, for instance. [0079]
  • As described above, the interactive display system according to this embodiment provides many possible ways of inputting manipulation information and hence enables easy construction of interactive input/output environments that could not be realized conventionally. [0080]
  • Next, application examples of the above-configured interactive display system 1 according to this embodiment will be described with reference to FIGS. 5-10. [0081]
  • FIG. 5 shows a first application example of the interactive display system 1 according to this embodiment in which a menu manipulation is performed. FIG. 5 shows a state in which the semi-transparent screen 2 is viewed from the front side. [0082]
  • When a user approaches the front surface of the semi-transparent screen 2 as shown in FIG. 5, first the control device 6 of the interactive display system 1 recognizes the position on the semi-transparent screen 2 corresponding to the user who is located close to the semi-transparent screen 2, based on detection image information that is obtained in this state. Then, as shown in FIG. 5, a menu picture M is displayed at the position on the semi-transparent screen 2 corresponding to the user. Naturally, the menu picture M is projected on the semi-transparent screen 2 by the projector 5. [0083]
  • Assume that in the state that the menu picture M is displayed on the semi-transparent screen 2 at the position close to the user, he designates, for instance, with his finger, an arbitrary region of the menu picture M where a manipulation item is displayed. At this time, the finger tip of the user should be located at a position that is distant from the semi-transparent screen 2 by about 3-30 cm. [0084]
  • As a result, an indication display indicating the selection of the region of the manipulation item designated by the user is made in the menu picture M (for instance, a cursor is located at the selected region or the selected region is emphasized in some form). A display control for this purpose is realized by detecting the coordinates of the region designated by the user from the detection image information. [0085]
  • In this example, a lapse of a predetermined time (for instance, several seconds) from the start of an indication display of the above kind is regarded as an enter manipulation. If the user has performed an enter manipulation, that is, if a state that some manipulation item is indication-displayed has lasted for the predetermined time or longer, a control operation corresponding to the designated manipulation item is performed. For example, depending on the designated manipulation item, a menu picture of another layer is displayed or a desired operation is performed on the interactive display system 1. If the interactive display system 1 is so configured as to be connectable to some other external apparatus and the menu picture M is for performing a manipulation control on the external apparatus, an operation of the external apparatus corresponding to the designated manipulation item is controlled. [0086]
  • If the user goes away from the semi-transparent screen 2 to such an extent that the distance between the user and the semi-transparent screen 2 becomes larger than a certain value, the menu picture M that has been displayed so far is erased automatically. [0087]
  • FIG. 6 is a flowchart showing a process of the control device 6 that corresponds to the application example of FIG. 5. This process is basically executed as the input image processing section 12 of the control device 6 recognizes manipulation information based on detection image information and the database driving section 14 performs a proper operation based on the manipulation information according to a program that is stored in the database memory 15. [0088]
  • In the routine of FIG. 6, first, at step S301, it is judged whether a “close body” has been detected from the current detection image information. The term “close body” means some subject of detection that is within the predetermined range from the semi-transparent screen 2 (in FIG. 5, the user's body). [0089]
  • A “close body” is detected by comparing, in the input image processing section 12, the detection image information with a threshold value that has been set for close body detection by the threshold value control section 13. If a value equal to or larger than the threshold value is obtained in a region of the detection image information, the judgment “a close body exists” is made. If there is no region where a value equal to or larger than the threshold value is obtained, the judgment “no close body exists” is made. For example, the threshold value for close body detection may be set based on an image level of a human body (user) that would usually be obtained as detection image information when he approaches the semi-transparent screen 2 to a certain extent (for instance, tens of centimeters from the screen 2). [0090]
  • If no close body is detected at step S301, the process goes to step S308, where it is judged whether the menu picture M is now displayed. If the menu picture M is not displayed, the process returns to the original routine (i.e., returns to step S301). If the menu picture M is displayed, the process goes to step S309, where a control process for erasing the menu picture M is executed. For example, the process of erasing the menu picture M is realized in such a manner that the database driving section 14 causes the image generation section 16 to stop a process of generating image data of the menu picture M. [0091]
  • On the other hand, if a close body is detected at step S301, the process goes to step S302, where the position of the close body on the semi-transparent screen 2 is detected. For example, this process is realized by detecting the coordinates of a region in the detection image information that is occupied by the close body. As for the detection of coordinates, there may be detected a prescribed one point of the region of the close body or a plurality of points that are determined according to a prescribed rule. The point or points to be detected may be set arbitrarily in accordance with an actual application environment or the like (one possible realization is sketched below). [0092]
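  • One concrete realization of steps S301-S302 is sketched below. The patent leaves the choice of representative point(s) open; the centroid used here is an assumption.

      import numpy as np

      def close_body_position(detect_img, close_thresh):
          """Return a representative (x, y) for a detected close body,
          or None if no region reaches the close-body threshold."""
          mask = detect_img >= close_thresh
          if not mask.any():
              return None                      # "no close body exists"
          ys, xs = np.nonzero(mask)
          return float(xs.mean()), float(ys.mean())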
  • At the subsequent step S303, a control is performed for displaying the menu picture M in an area of the semi-transparent screen 2 corresponding to the position of the close body that was detected at step S302. In this control process, for example, the database driving section 14 causes the image generation section 16 to generate image data of a menu picture of a proper kind according to a menu picture display program that is stored in the database memory 15. [0093]
  • For example, the database driving section 14 causes display image data to be generated in such a manner that image data of the menu picture is mapped to a display area corresponding to the position of the close body that was detected at step S302. As a result, the menu picture M is finally projected by the projector 5 at the position on the semi-transparent screen 2 where the user's approach was detected. [0094]
  • After the execution of step S303, it is judged at step S304 whether a “manipulation body” has been detected in the display regions of the manipulation items of the menu picture M being displayed. The term “manipulation body” refers to an object (subject of detection) that is very close to the front surface of the semi-transparent screen 2 (distant by about 3-30 cm, though this value depends on the setting of the threshold value). In the case of FIG. 5, the user's finger pointing at the menu picture M is the subject of detection. [0095]
  • The process of detecting a “manipulation body” starts from detecting presence/absence of a manipulation body by comparing the image levels of the detection image information with the threshold value that is set for manipulation body detection in the threshold value control section 13. The threshold value for this purpose is set larger than the above-described threshold value for close body detection, because it is now necessary to detect an object that is very close to the front surface of the semi-transparent screen 2 by discriminating it from the background. [0096]
  • If a manipulation body is detected through the comparison with the threshold value, the coordinates of the position in the detection image information where the manipulation body was detected are detected. Then, by judging whether the detection position belongs to the display area of the menu picture M in the image information, presence/absence of the manipulation body in the display area, on the semi-transparent screen 2, of the menu picture M being displayed can be detected. [0097]
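  • A sketch of the step-S304 judgment follows, reusing the centroid helper from the earlier sketch with the larger manipulation-body threshold; the item_regions layout is hypothetical.

      def find_manipulated_item(detect_img, manip_thresh, item_regions):
          """Detect a manipulation body and test whether its position falls
          inside any manipulation-item region (x0, y0, x1, y1) of the menu
          picture M."""
          pos = close_body_position(detect_img, manip_thresh)
          if pos is None:
              return None                      # no manipulation body at all
          x, y = pos
          for item, (x0, y0, x1, y1) in item_regions.items():
              if x0 <= x <= x1 and y0 <= y <= y1:
                  return item
          return None                          # detected, but outside the menu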
  • Non-detection of a manipulation body in any of the display regions of the manipulation items of the menu picture M at step S304 occurs in the following cases. A first case is such that no manipulation body is detected in the detection image information (for example, the user is not pointing at the semi-transparent screen 2 in a very close range). A second case is such that a manipulation body is detected in the detection image information but its detection position (coordinates) does not belong to the area in the image information corresponding to the display area of the menu picture M on the semi-transparent screen 2 (for example, the position on the semi-transparent screen 2 pointed at by the user in a very close range is out of any of the regions of the manipulation items of the menu picture M). In any of the above cases, the process returns to step S301. [0098]
  • Where the manipulation body is limited to a hand or a finger of a human body, the manipulation body detecting process of step S304 may be as follows. The shape of a hand or a finger of a human body that will appear during a manipulation is stored in advance in the database memory 15. Presence/absence of a manipulation body is detected by comparing the information on the shape of a hand or a finger with an image shape that is obtained as detection image information and then evaluating the degree of their coincidence. In the invention, since input information is detected from image information, input information can be recognized as manipulation information based on an image shape in the detection image information. [0099]
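  • The patent does not specify how the degree of coincidence is evaluated; one plausible realization, assumed here, scores the overlap of binary masks.

      import numpy as np

      def shape_coincidence(candidate_mask, template_mask):
          """Intersection-over-union of a detected image shape and a stored
          hand/finger template shape (1.0 = identical, 0.0 = disjoint)."""
          inter = np.logical_and(candidate_mask, template_mask).sum()
          union = np.logical_or(candidate_mask, template_mask).sum()
          return float(inter) / float(union) if union else 0.0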
  • If it is judged at step S304 that the manipulation body is detected in the display region of a manipulation item of the menu picture M, the process goes to step S305, where a control is performed so that an indication display is performed on the manipulation item of the menu picture M corresponding to the position where the manipulation body was detected. The process then goes to step S306. [0100]
  • Step S306 is a process of waiting for an enter manipulation. As described above, when a predetermined time has elapsed from the start of the indication display, a decision is made that an enter manipulation has been performed. Therefore, it is judged at step S306 whether the state that the manipulation body is detected in the same manner as at step S304 has lasted for the predetermined time or longer. This judgment is performed in such a manner that the input image processing section 12 monitors occurrence of a state transition in the current detection image. [0101]
  • If it is detected that the manipulation body has disappeared from the current detection image information, or the detection position of the manipulation body in the current detection image information goes out of the display region of the manipulation item of the menu picture M where the manipulation body was detected at step S304, the process returns from step S306 to step S301. (At step S306, there may occur an event that the user changes the designating position so as to point at a manipulation item of the menu picture M that is different from the one that has been designated so far. In this case, an indication display etc. will be performed on the newly designated manipulation item of the menu picture M.) [0102]
  • On the other hand, if it is judged at step S306 that the state that the manipulation body is detected in the same manner as at step S304 has lasted for the predetermined time or longer, the process goes to step S307 with a judgment that an enter manipulation has been done. [0103]
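  • The wait for an enter manipulation amounts to a dwell timer, sketched below. Here poll_item is an assumed callable that re-runs the step-S304 detection and reports which manipulation item (if any) is currently designated; the dwell and polling intervals are likewise assumptions.

      import time

      def wait_for_enter(item, poll_item, dwell_sec=2.0, poll_sec=0.05):
          """Return True if the same manipulation item stays designated for
          dwell_sec (judged an enter manipulation); return False if the
          manipulation body disappears or moves to a different item first."""
          deadline = time.monotonic() + dwell_sec
          while time.monotonic() < deadline:
              if poll_item() != item:
                  return False                 # state transition: back to S301
              time.sleep(poll_sec)
          return True                          # enter manipulation accepted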
  • At step S307, a control process corresponding to the manipulation item of the menu picture M located at the position where the manipulation body was detected is executed. This process is performed by the database driving section 14 according to a program stored in the database memory 15. [0104]
  • FIG. 7 shows a second application example of the interactive display system 1 according to this embodiment. In this example, a map of the world is displayed on the semi-transparent screen 2 as an image projected by the projector 5 under the control of the control device 6. [0105]
  • For example, the map of the world may be displayed by performing a manipulation on the menu picture M shown in FIG. 5. Alternatively, it may be displayed automatically when a user (explainer) who has approached to the [0106] semi-transparent screen 2 to a certain extent has been detected as a “close body.” As for the display form in this state, it is possible to display a map so that a reference country or area (for instance, Japan) is always located at a position close, in the horizontal direction, to the position in front of the semi-transparent screen 2 where the explainer stands.
  • In this example, an explanatory image DT giving some explanation of the area designated by the explainer is superimposed on the map (i.e., the semi-transparent screen 2) at the designated position. This is done in such a manner that the control device 6 detects, as the position of a manipulation body, the position (coordinates) pointed at by the explainer with his finger or the like, and performs a control to display the explanatory image DT of the area corresponding to the detected position of the manipulation body. Image data of the map and the various explanatory images DT are stored in advance in the database memory 15; a lookup of the kind sketched below suffices.
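  • An illustrative lookup, assuming each area's explanatory image DT is keyed in the database by a screen-space rectangle; all names and coordinates here are hypothetical.

      AREA_TABLE = {
          "japan":  ((620, 180, 700, 260), "dt_japan.png"),
          "brazil": ((240, 420, 340, 540), "dt_brazil.png"),
      }

      def explanatory_image_for(x: int, y: int):
          """Return (area, image file) for the pointed position, or None."""
          for area, ((x0, y0, x1, y1), image) in AREA_TABLE.items():
              if x0 <= x <= x1 and y0 <= y <= y1:
                  return area, image
          return None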
  • In the interactive display system 1 according to this embodiment, as described above, the size of the semi-transparent screen 2 as the display screen (and the manipulation panel) can be increased easily. Therefore, a conference or a demonstration using a large-size semi-transparent screen 2, as in the second application example, is an application to which the interactive display system 1 according to this embodiment is well suited.
  • FIG. 8 shows a third application example of the interactive display system according to this embodiment. FIG. 8 shows a state in which two menu pictures M1 and M2 are displayed simultaneously and a user is performing manipulations on both at the same time.
  • As described above, in this embodiment the manipulation information is obtained from the “detection image information” that is produced from an infrared image photographed by the CCD camera 4. That is, the manipulation information is obtained by recognizing an image state. Therefore, even if a plurality of manipulation bodies (in this example, hands or fingers of the user) are detected at the same time in the detection image information, as in the case of FIG. 8, the detection results of the respective manipulation bodies can be handled as different pieces of manipulation information.
  • Therefore, in this embodiment, a configuration is possible in which, even if the user performs manipulations on a plurality of (in this example, two) menu pictures M1 and M2 at the same time with his hands as shown in FIG. 8, proper operations responsive to the manipulations performed on the menu pictures M1 and M2 are carried out. In the interactive display system 1 according to this embodiment, the size of the semi-transparent screen 2 can be increased easily; a large-size display panel (manipulation panel) can be utilized particularly effectively by displaying a plurality of images as subjects of manipulation simultaneously. One way to separate simultaneous manipulation bodies is sketched below.
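  • A sketch of separating several manipulation bodies in one detection image by connected-component labeling — one possible technique; the embodiment does not prescribe a particular algorithm, and the level threshold is an assumption.

      import numpy as np
      from scipy import ndimage

      def manipulation_positions(detection_image: np.ndarray,
                                 level_threshold: int = 200):
          """Return one centroid (x, y) per bright region, i.e. per body."""
          mask = detection_image >= level_threshold
          labels, count = ndimage.label(mask)
          centers = ndimage.center_of_mass(mask, labels, range(1, count + 1))
          return [(x, y) for (y, x) in centers]  # (row, col) -> (x, y)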
  • FIG. 9 shows a fourth application example of the interactive display system 1 according to this embodiment. FIG. 9 shows a state in which parameter adjustment images PC1 and PC2 for adjusting the values of certain parameters are displayed on the semi-transparent screen 2 and a user is manipulating both at the same time with his hands. For example, the parameter adjustment images PC1 and PC2 are images simulating slide volumes.
  • In this example, the user places his hands on the semi-transparent screen 2 at the lever portions (lever images LV) of the respective parameter adjustment images PC1 and PC2 and slides his hands vertically so as to obtain desired parameter values. The display is made so that the lever images LV move vertically in accordance with the movements of the hands, and the control device 6 executes a corresponding process to vary the actual parameter values accordingly (a mapping of the kind sketched below). As in the case of FIG. 8, even if the lever images LV are manipulated at the same time, the respective pieces of manipulation information can be recognized simultaneously and the parameter values varied simultaneously in accordance with the respective manipulations.
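  • A minimal sketch of mapping a hand's vertical position on a slide-volume image to a parameter value; the travel limits and parameter range are assumptions.

      def lever_to_parameter(y: float, y_top: float, y_bottom: float,
                             p_min: float = 0.0, p_max: float = 100.0) -> float:
          """Linearly map the hand position to a value, clamped to the travel."""
          y = min(max(y, y_top), y_bottom)
          fraction = (y_bottom - y) / (y_bottom - y_top)  # 1.0 at top of travel
          return p_min + fraction * (p_max - p_min)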
  • FIG. 10 shows a fifth application example of the interactive display system 1 according to this embodiment. FIG. 10 shows a state in which an adult and a child are performing manipulations on different menu pictures at the same time.
  • In this example, the system 1 is used in the following situation. Assuming that users perform manipulations on the semi-transparent screen 2 while standing, the occupation ratio and position of a close body (described above in connection with FIG. 6) and the vertical position (height) of a manipulation body on the semi-transparent screen 2 differ between an adult and a child. That is, because of the difference in height, a child appears as a close body in a lower region of the detection image information than an adult. Similarly, a manipulation body (user's finger or the like) of a child tends to appear in a lower region of the detection image information than that of an adult.
  • In view of the above, in the fifth application example, where the application is such that an adult and a child are supposed to perform different kinds of manipulations, a certain threshold value is set for the vertical height in the detection image information. When a close body or a manipulation body whose height exceeds the threshold value is detected, a menu picture Mad for adults is displayed at the position of the close body or the manipulation body (in this example, the height included in the definition of the display position on the semi-transparent screen 2 is also changed). When a close body or a manipulation body whose height does not exceed the threshold value is detected, a menu picture Mch for children is displayed at the position (including the height) of the close body or the manipulation body. Appropriate threshold values may be set separately for a close body and for a manipulation body, and the threshold value used for the height discrimination between adults and children may likewise be chosen as appropriate. The discrimination itself is simple, as the sketch below shows.
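  • A sketch of the height discrimination, remembering that in image coordinates a smaller row index is higher on the screen; the threshold row value is an assumption.

      ADULT_THRESHOLD_ROW = 150  # assumed: bodies reaching above this row are adults

      def menu_for_detected_body(top_row: int) -> str:
          """Choose the menu picture from the topmost row of the detected body."""
          return "menu_adult" if top_row < ADULT_THRESHOLD_ROW else "menu_child"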
  • The fifth application example is the same as the application examples of FIGS. 8 and 9 in that, when manipulations are performed on the menu picture Mad for adults and the menu picture Mch for children at the same time, the corresponding pieces of manipulation information can be recognized simultaneously and control operations responsive thereto performed.
  • For the second to fifth application examples shown in FIGS. 7-10, flowchart-based descriptions for the control device 6 are omitted. However, those application examples are the same as the first application example of FIG. 5 in that the presence/absence and the position (coordinates) of a “close body” or a “manipulation body” are detected and a necessary control operation is performed by recognizing the positional relationship between a detection result and a certain manipulation subject image displayed on the semi-transparent screen 2. A process corresponding to each of the second to fifth application examples can be realized basically in the same manner as in the flowchart of FIG. 6.
  • Embodiment 2
  • FIG. 11 shows an interactive display system according to a second embodiment of the invention. The components in FIG. 11 that are the same as the corresponding components in FIG. 2 are given the same reference numerals as the latter and will not be described below. Further, the control device 6 may have the same internal configuration as that shown in FIG. 2.
  • In the second embodiment, as shown in FIG. 11, an infrared transmitter PD that emits infrared light is used as a pointing device.
  • As in the first embodiment, the manipulation information is obtained based on the detection image information that is produced from an infrared image on the semi-transparent screen 2 photographed by the CCD camera 4. Therefore, in this embodiment, a variation in the quantity of infrared light entering the CCD camera 4 from the semi-transparent screen 2 can be recognized as manipulation information.
  • While in the first embodiment a finger or the like is used to point at a desired position on the semi-transparent screen 2, in the second embodiment the user holds the infrared transmitter PD in his hand and illuminates a desired position on the front surface of the semi-transparent screen 2 with an infrared beam emitted from the infrared transmitter PD.
  • When an infrared beam emitted from the infrared transmitter PD is applied to the semi-transparent screen 2, the resulting detection image information is such that the level at the position (coordinates) illuminated with the infrared beam differs from the levels of the surrounding area. Therefore, the input image processing section 12 of the control device 6 may operate so as to recognize such a level variation in the detection image information as manipulation information, for example as in the sketch below.
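  • A minimal sketch of locating the beam spot as the pixel that stands out from its surroundings; the margin value is an assumption.

      import numpy as np

      def pointer_position(detection_image: np.ndarray, margin: int = 40):
          """Return (x, y) of the beam spot, or None when no spot stands out."""
          row, col = np.unravel_index(np.argmax(detection_image),
                                      detection_image.shape)
          background = np.median(detection_image)
          if detection_image[row, col] - background < margin:
              return None
          return int(col), int(row)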
  • Since an infrared beam is invisible, it is preferable to display, for instance, a spot SP on the semi-transparent screen 2 so that the user can see the current illumination position of the infrared beam emitted from the infrared transmitter PD. This spot display can be realized in such a manner that the input image processing section 12 of the control device 6 recognizes the current illumination position (coordinates) of the infrared beam based on the detection image information, and a display control is performed so that the projector 5 projects the spot SP at the illumination position thus recognized.
  • Embodiment 3
  • FIG. 12 conceptually shows the entire configuration of an interactive display system 1B according to a third embodiment of the invention, and FIG. 13 is a block diagram showing the internal configuration of the control device 6. The components in FIGS. 12 and 13 that are the same as the corresponding components in FIGS. 1 and 2 are given the same reference numerals as the latter and will not be described below.
  • As shown in FIGS. 12 and 13, the interactive display system according to this embodiment is provided with two CCD cameras, that is, a first CCD camera 4A and a second CCD camera 4B. The first CCD camera 4A has the same role as the CCD camera 4 in the first embodiment. That is, the first CCD camera 4A is provided on the back side of the semi-transparent screen 2 to photograph an image through infrared light coming from the entire semi-transparent screen 2 as an imaging range.
  • As described later, the second CCD camera 4B is provided to photograph an image of a prescribed region on the semi-transparent screen 2 with enlargement or reduction. To this end, as shown in FIGS. 12 and 13, the second CCD camera 4B has a pan/tilt/zoom mechanism 7. The pan/tilt/zoom mechanism 7 is provided with a mechanism (pan/tilt mechanism) for rotating the second CCD camera 4B in both horizontal and vertical planes, as well as a mechanism (zoom mechanism) for varying the magnification factor of the photographed image by moving a zoom lens provided in the second CCD camera 4B. Controls on the pan/tilt/zoom mechanism 7, that is, a pan/tilt position variable control and a zoom ratio variable control, are performed by the database driving section 14 as shown in FIG. 13. Therefore, when necessary, the database memory 15 stores a program for controlling the pan/tilt/zoom mechanism 7.
  • Each of the CCD cameras 4A and 4B is provided with an infrared transmission filter 4a so as to photograph only an infrared image on the semi-transparent screen 2.
  • In this embodiment, as shown in FIG. 13, the control device 6 has two image input sections 11A and 11B that correspond to the first and second CCD cameras 4A and 4B, respectively. The image input section 11A receives an imaging signal from the first CCD camera 4A and supplies a corresponding video signal to the input image processing section 12, and the image input section 11B does likewise for the second CCD camera 4B. Therefore, in this embodiment, the input image processing section 12 generates two kinds of detection image information based on the video signals of the first and second CCD cameras 4A and 4B, and produces manipulation information from the two.
  • For example, the interactive display system 1B according to this embodiment can be used in the following manner.
  • Assume a case in which, as shown in FIG. 12, a user is performing a manipulation with his finger at a certain position on the semi-transparent screen 2 while standing in front of its front surface. In this case, the detection image information produced by the input image processing section 12 from the first CCD camera 4A, which photographs the entire semi-transparent screen 2, is as shown in FIG. 14A. In this type of manipulation, usually the finger (or hand) of the user is closest to the semi-transparent screen 2 and the remaining parts of his body are more distant from it than the hand.
  • Therefore, the detection image information shown in FIG. 14A generally includes an image region A where the body appears and an image region B where the hand appears, and the latter has a larger value (for instance, in luminance level) than the former. As described above, the input image processing section 12 can recognize the image region B as a “manipulation body” by separating it from the background including the image region A. As described above, where the manipulation body is limited to a hand or a finger of a human body, a configuration is possible in which a hand or a finger of a user is recognized as a manipulation body based on its shape as obtained in the detection image information.
  • If the input image processing section 12 has detected the image region B as a “manipulation body,” the database driving section 14 performs controls to zoom-photograph the manipulation body, that is, the image region B, with the second CCD camera 4B based on the information on the position of the manipulation body on the semi-transparent screen 2. Specifically, after a pan/tilt control is performed on the pan/tilt/zoom mechanism 7, based on the information on the position of the image region B on the semi-transparent screen 2, so that the image region B is located approximately at the center of the photographed image of the second CCD camera 4B, a zoom control is performed on the pan/tilt/zoom mechanism 7 so that the image region B occupies almost all of the photographed image (a sketch of this computation follows). As a result of these controls, the detection image information obtained based on the imaging signal of the second CCD camera 4B becomes image information in which the image region B (manipulation body) is enlarged as shown in FIG. 14B.
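  • A hedged sketch of deriving pan/tilt angles and a zoom factor from the detected region's bounding box, assuming a simple pinhole model; the field-of-view constants, image size, and fill ratio are illustrative assumptions.

      H_FOV_DEG, V_FOV_DEG = 48.0, 36.0  # assumed field of view of camera 4B
      IMAGE_W, IMAGE_H = 640, 480        # assumed first-camera image size
      FILL_RATIO = 0.9                   # how much of the frame region B should fill

      def pan_tilt_zoom_for(bbox):
          """bbox = (x0, y0, x1, y1) of image region B in first-camera pixels."""
          x0, y0, x1, y1 = bbox
          cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
          pan = (cx / IMAGE_W - 0.5) * H_FOV_DEG   # degrees right of center
          tilt = (0.5 - cy / IMAGE_H) * V_FOV_DEG  # degrees above center
          zoom = FILL_RATIO * min(IMAGE_W / (x1 - x0), IMAGE_H / (y1 - y0))
          return pan, tilt, zoom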
  • Where manipulation information is obtained (in particular, a designated position is detected) based on the detection image information shown in FIG. 14B, the relative resolution is higher than when the designated position is obtained based on, for instance, the detection image information shown in FIG. 14A. This makes it possible to obtain more accurate information on a designated position.
  • The above application method is just an example, and various other application examples that use two CCD cameras are conceivable. Further, the pan/tilt/zoom mechanism 7 may be provided on both CCD cameras. Still further, a configuration is conceivable in which three or more CCD cameras are used (the pan/tilt/zoom mechanism may be provided on any of them) and pieces of manipulation information are obtained independently from the respective CCD cameras. This configuration, too, allows various application examples.
  • Embodiment 4
  • FIG. 15 conceptually shows an example configuration of an interactive display system 1C according to a fourth embodiment of the invention. The components in FIG. 15 that are the same as the corresponding components in, for instance, FIG. 1 are given the same reference numerals as the latter and will not be described below. Further, the control device 6 may have the same internal configuration as that shown in FIG. 2.
  • In the interactive display systems according to the above embodiments, the semi-transparent screen 2 is a wall-like one. However, consideration of the functions (as a display panel and a manipulation panel) of the semi-transparent screen 2 leads to the understanding that it need not be limited to a wall-like one. In the interactive display system 1C according to this embodiment, a semi-transparent screen 2A has a curved shape. FIG. 15 shows a state in which a semi-spherical semi-transparent screen 2A is installed. For example, at least the infrared LED panel 3, the CCD camera 4, and the projector 5 are provided inside the semi-spherical semi-transparent screen 2A, and users perform manipulations from outside it. Although the infrared transmission filter 4a provided in the CCD camera 4 and the infrared cutoff filter 5a provided in the projector 5 are not shown in FIG. 15, they are actually provided in the same manner as in the above embodiments.
  • FIG. 15 shows a state in which a map of the world is projected on the semi-transparent screen 2A. In this case, the interactive display system 1C can be used in the same manner as described in the first embodiment in connection with FIG. 7.
  • Embodiment 5
  • FIG. 16 conceptually shows an example configuration of an interactive display system 1D according to a fifth embodiment of the invention. The components in FIG. 16 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below. Although the infrared transmission filter 4a and the infrared cutoff filter 5a are not shown in FIG. 16, they are actually provided in the CCD camera 4 and the projector 5, respectively. Although the control device 6 is not shown in FIG. 16 either, it is actually provided to control the infrared LED panel 3, the CCD camera 4, and the projector 5, and it may have the same internal configuration as that shown in FIG. 2.
  • In this embodiment, the semi-transparent screen 2 is provided as a wall surface of a passage. For example, the infrared LED panel 3, the CCD camera 4, and the projector 5 are provided behind the wall surface (semi-transparent screen 2) of the passage. That is, the interactive display system 1D according to this embodiment serves as part of the wall surface of the passage.
  • Now assume that a user (pedestrian) is walking along the passage and is just passing by the semi-transparent screen 2 (wall surface) of the interactive display system 1D.
  • As described above, in the invention, even if a user is somewhat distant from the semi-transparent screen 2 that functions as a manipulation panel, he can be detected as a “close body.” Therefore, in this example, when a pedestrian has come to a position beside the semi-transparent screen 2 (wall surface) of the interactive display system 1D, his figure is detected as a close body. For example, an operation is possible in which, based on the detection result of a close body, a guide image GD showing, for instance, the place to which the passage leads is projected on the semi-transparent screen 2 (wall surface) by the projector 5 as shown in FIG. 16.
  • Viewed from the pedestrian's side, the interactive display system 1D operates such that a guide display of the place to which the passage leads appears automatically as he walks along the passage.
  • In the fifth embodiment, where there is not sufficient space to install the infrared LED panel 3, the CCD camera 4, and the projector 5 (and the control device 6) on the back side of the passage, the optical paths of the light beams emitted from the infrared LED panel 3 and the projector 5, and of the light beam entering the CCD camera 4, may be folded by reflection using a mirror MR as shown in FIG. 17. In particular, depending on the angles of view of the CCD camera 4 and the projector 5, a considerably long distance may be needed to obtain an imaging range or a projection display range capable of covering a large-size semi-transparent screen 2. By using the mirror MR, a sufficiently wide imaging range or projection display range can be obtained even with a short depth. This relaxes the conditions that an installation environment of the interactive display system 1D according to this embodiment must satisfy.
  • Embodiment 6
  • FIG. 18 conceptually shows an example configuration of an interactive display system 1E according to a sixth embodiment of the invention. The components in FIG. 18 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below. Although the infrared transmission filter 4a and the infrared cutoff filter 5a are not shown in FIG. 18, they are actually provided in the CCD camera 4 and the projector 5, respectively.
  • In this embodiment, a semi-transparent screen 2B is installed as a table. That is, supported by four table legs F, the semi-transparent screen 2B also functions as the top plate of a table as an ordinary piece of furniture. In this embodiment, as shown in FIG. 18, the infrared LED panel 3, the CCD camera 4, and the projector 5 (and the control device 6) are provided under the semi-transparent screen 2B.
  • As described later, this embodiment is so configured that when a user performs a manipulation on the semi-transparent screen 2B, a monitor device 30 is controlled in accordance with the manipulation.
  • FIG. 19 is a block diagram showing the internal configuration of the control device 6 of the interactive display system 1E according to this embodiment. The components in FIG. 19 that are the same as the corresponding components in FIG. 2 are given the same reference numerals and will not be described below.
  • The control device 6 shown in FIG. 19 is provided with an external apparatus control section 20. In this embodiment, the external apparatus control section 20 is a circuit for performing a manipulation control on the monitor device 30. The external apparatus control section 20 receives manipulation information from the database driving section 14 and transmits, to the monitor device 30, a command signal for the necessary manipulation control. Accordingly, the database memory 15 stores a program for realizing manipulations on the monitor device 30 in the interactive display system 1E.
  • In the interactive display system 1E according to the sixth embodiment, for example, a remote controller display RMD is displayed in a particular region on the semi-transparent screen 2B as shown in FIG. 18. The remote controller display RMD simulates the manipulation panel surface of a remote controller having keys that enable various manipulations on the monitor device 30. Its display is realized in such a manner that the database driving section 14 performs a display control using image data of the remote controller display RMD stored in the database memory 15, causing the projector 5 to project an image of the remote controller display RMD.
  • The position of the remote controller display RMD may be set arbitrarily. For example, the remote controller display RMD may be displayed at an arbitrary position that is convenient (i.e., easy to use) for a user in response to a prescribed setting manipulation on the interactive display system 1E. In this case, as long as the database driving section 14 knows the current display position of the remote controller display RMD, it can always determine the display positions (coordinates) of its various keys.
  • Now assume a case in which a user has selected a desired channel by manipulating a numeral key (i.e., by a manipulation on the remote controller display RMD displayed on the semi-transparent screen 2B). At this time, the user may perform a manipulation on the semi-transparent screen 2B with the feeling of depressing the desired numeral key of the remote controller display RMD. In this case, depending on the setting of the threshold value for key manipulation judgment in the threshold value control section 13, the user need not always bring a manipulation body such as a finger into contact with the table surface (semi-transparent screen 2B); it is naturally possible to judge, as a key manipulation, a manipulation performed in the space above the desired key.
  • The position (coordinates) of the above manipulation is detected by the input image processing section 12 based on detection image information obtained through photographing by the CCD camera 4. The database driving section 14 judges which key of the remote controller display RMD the detected coordinates of the manipulation position coincide with, and transmits, for instance, information indicating the type of that key to the external apparatus control section 20.
  • Based on the information indicating the type of key supplied from the database driving section 14, the external apparatus control section 20 outputs a command signal corresponding to that key to the monitor device 30.
  • Since the manipulation under consideration is a channel switching manipulation using a numeral key, the database driving section 14 transmits information to the effect that the numeral key corresponding to a certain channel number has been manipulated, and the external apparatus control section 20 transmits a command signal for switching to that channel number. As a result, the monitor device 30 switches to the picture of the selected channel. The hit-testing is sketched below.
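  • A sketch of the key judgment, assuming the database driving section knows the remote controller display's origin on the screen and a fixed 3x3 numeral-key layout relative to it; the key sizes and names are illustrative assumptions.

      KEY_W, KEY_H = 40, 30  # assumed key cell size in screen coordinates
      KEY_LAYOUT = {(r, c): str(3 * r + c + 1) for r in range(3) for c in range(3)}

      def key_at(x: int, y: int, rmd_origin):
          """Map a detected manipulation position to the numeral key under it."""
          ox, oy = rmd_origin
          col, row = (x - ox) // KEY_W, (y - oy) // KEY_H
          return KEY_LAYOUT.get((row, col))  # None when outside the key grid

  • For example, a hit resolving to key “5” would make the external apparatus control section send the monitor device 30 a command to switch to channel 5.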
  • It goes without saying that the external apparatus that can be manipulated by the interactive display system 1E is not limited to the monitor device (television receiver); a configuration is possible that allows any of various other electronic apparatuses to be manipulated. The invention also enables a configuration in which remote controller displays RMD for plural kinds of apparatuses are displayed simultaneously for a user to manipulate, and it is likewise possible to recognize simultaneous manipulations on a plurality of keys of one remote controller display RMD and to control the external apparatus accordingly.
  • Where the semi-transparent screen 2B is a table surface as in this embodiment, the following operation is also possible.
  • For example, where some object such as a piece of tableware is placed on the semi-transparent screen 2B as the table surface, an image reflecting the shape of the object is obtained as detection image information through the infrared light reflected from the object. In the previous embodiments, this type of image variation is used as manipulation information. In this embodiment, by contrast, the image data of the detection image information can also be used as an image to be projected by the projector 5. That is, a configuration is possible in which the image of an object placed on the semi-transparent screen 2B is used like its shadow (see the shadow displays SHD shown in FIG. 18). In this case, the position of a shadow display SHD on the semi-transparent screen 2B varies so as to follow the position of the object, and the shape of the shadow display SHD varies in accordance with the distance of the object from the surface of the semi-transparent screen 2B. A visual effect can thus be obtained that would be interesting to a user.
  • To realize this type of shadow display SHD, the detection image information obtained in the input image processing section 12 of the control device 6 may be supplied to the image combining section 17 as image data. The image combining section 17 combines that image data (the detection image information, serving as image data for a shadow display SHD) with the remote controller display RMD generated by the image generation section under the control of the database driving section 14, and the resulting image is finally projected onto the semi-transparent screen 2B as shown in FIG. 18. A simple compositing of this kind is sketched below.
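  • A minimal sketch of mixing the detection image into the projected image so that bright (close) regions darken it as a following “shadow”; the blend weight and array shapes are assumptions.

      import numpy as np

      def compose_output(generated: np.ndarray, detection: np.ndarray,
                         shadow_weight: float = 0.5) -> np.ndarray:
          """generated: HxWx3 uint8 image; detection: HxW uint8 image.
          Darken the generated image where the detection image is bright."""
          shadow = (detection.astype(np.float32) / 255.0) * shadow_weight
          out = generated.astype(np.float32) * (1.0 - shadow[..., None])
          return out.astype(np.uint8)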
  • Although a detailed description is not given here, an enhanced visual effect can be obtained by applying a special effect to the image data of the detection image information through suitable signal processing, such as multi-colorizing it or changing its shape.
  • Incidentally, in the sixth embodiment described above, at least the infrared LED panel 3, the CCD camera 4, and the projector 5 must be provided under the table. There may therefore be cases in which, because of an insufficient distance between the floor and the table surface (semi-transparent screen 2B), it is difficult to secure the straight optical path needed for the CCD camera 4 or the projector 5 to cover the entire semi-transparent screen 2B. In such a case, as already described in the fifth embodiment, a mirror MR for folding the optical paths by reflection may be provided under the semi-transparent screen 2B (table surface) as shown in FIG. 20, and the infrared LED panel 3, the CCD camera 4, the projector 5, etc. may be provided on the side of the table.
  • A half mirror MR having preset transmittance and reflectance values may be used instead of the mirror MR. In this case, the infrared LED panel 3, for instance, may be provided in the floor portion under the table rather than on the side of the table, which increases the degree of freedom of the device installation. The installation method using a half mirror is also applicable to the installation form of FIG. 17 (fifth embodiment).
  • Embodiment 7
  • FIG. 21 conceptually shows an example configuration of an interactive display system 1F according to a seventh embodiment of the invention. The components in FIG. 21 that are the same as the corresponding components in FIG. 1 are given the same reference numerals as the latter and will not be described below.
  • The infrared LED panel 3 is not provided in the interactive display system 1F.
  • For example, where the interactive display system according to the invention is used outdoors, that is, in an environment of strong ambient light, there is a possibility that manipulation information cannot be detected properly (i.e., proper detection image information that enables recognition of manipulation information cannot be obtained) in a configuration such as that of FIG. 1, in which infrared light emitted from the infrared LED panel 3 is used for detection of manipulation information. This is because the intensity of the infrared light emitted from the infrared LED panel 3 is relatively lowered by the presence of strong infrared light in daytime natural light.
  • In view of the above, in this embodiment, infrared light included in natural light is used for detection of manipulation information instead of the light of the infrared LED panel 3.
  • In this embodiment, the reference input image levels Lint needed to obtain detection image information are detected based on image information obtained from an imaging signal produced by the CCD camera 4 from infrared light that has come from the front side and passed through the semi-transparent screen 2, for example in a state in which neither a close body nor a manipulation body exists.
  • When some manipulation is performed on the semi-transparent screen 2, a close body, a manipulation body, or the like is seen from the CCD camera 4 side as an infrared shadow on the semi-transparent screen 2, because it interrupts the infrared light included in natural light. In the control device 6 of this embodiment, image information that varies such that the image level decreases (i.e., the image becomes darker) from the reference input image levels Lint is used as manipulation information, as in the sketch below.
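  • A sketch of this inverted detection: the difference is taken in the opposite direction from the earlier embodiments, flagging pixels that have dropped sufficiently below Lint. The drop threshold is an assumption.

      import numpy as np

      def shadow_mask(current: np.ndarray, reference_lint: np.ndarray,
                      drop_threshold: int = 30) -> np.ndarray:
          """True where the level has fallen far enough below Lint to count as
          a close body or manipulation body interrupting the ambient light."""
          drop = reference_lint.astype(np.int32) - current.astype(np.int32)
          return drop >= drop_threshold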
  • The internal configuration of the control device 6 of this interactive display system is not shown; it is noted, however, that the LED driving section 10 is not provided because of the elimination of the infrared LED panel 3.
  • Embodiment 8
  • FIG. 22 conceptually shows the entire configuration of an interactive display system 1G according to an eighth embodiment of the invention, and FIG. 23 is a block diagram showing the internal configuration of the control device 6 of the interactive display system 1G. The components in FIGS. 22 and 23 that are the same as the corresponding components in FIGS. 1 and 2 are given the same reference numerals as the latter and will not be described below.
  • In the interactive display system 1G, the infrared LED panel 3 and the CCD camera 4 of the above-described embodiments are replaced by a microwave generator 40 and a microwave receiver 41, respectively.
  • Accordingly, in the control device 6, a microwave driving circuit 110 for driving the microwave generator 40 is provided instead of the LED driving section 10 (see FIG. 2). Also provided are a reception signal input section 111, which converts reception microwaves supplied from the microwave receiver 41 into data of a predetermined form and outputs it, and an input data processing section 112, which executes a prescribed process on the microwave reception data supplied from the reception signal input section 111 to obtain, for instance, detection image information, and which obtains manipulation information based on that detection image information. The reception signal input section 111 and the input data processing section 112 are functional circuit sections that replace the image input section 11 and the input image processing section 12 (see FIG. 2), respectively. In this example, since microwaves are used as the medium for detection of manipulation information, the infrared transmission filter 4a and the infrared cutoff filter 5a that were provided in the CCD camera 4 and the projector 5, respectively, in the above embodiments are unnecessary.
  • An information input apparatus according to the invention can thus be constructed, basically in the same manner as in the above embodiments (in which infrared light is used for detection of manipulation information), even by using for detection a medium such as microwaves that has the property of being reflected by an object.
  • Although no specific application example is shown or described, it is naturally possible to construct an information input apparatus according to the invention by omitting the projector 5 provided in the above embodiments, because for the invention it is sufficient to detect information on a manipulation performed on the semi-transparent screen 2. In this case, the semi-transparent screen 2 functions only as a manipulation panel. Where a display means capable of interactive response display is needed while this configuration is maintained, a display device other than the semi-transparent screen 2 may be used.
  • Application examples of the interactive display systems constructed according to the invention are not limited to the embodiments and the application examples described above. Various other manipulation methods and applications are possible that make use of the advantages of the input apparatus of the invention. It is also possible to perform an interactive response by voice in the interactive display systems according to the embodiments of the invention.
  • As is apparent from the above descriptions of the embodiments, the invention provides the following advantages.
  • The invention provides an information input apparatus comprising a semi-transparent screen that functions as an operator input manipulation surface; pickup means for picking up an input manipulation of an operator on the semi-transparent screen by capturing only light or electromagnetic waves in a predetermined wavelength range that come through the semi-transparent screen, to thereby produce a pickup signal; and control processing means for generating detection image information corresponding to the input manipulation of the operator based on the pickup signal, and for executing a control process based on input manipulation information that is recognized from the detection image information.
  • In the information input apparatus having the above basic configuration, any object that causes a variation in the state of light or electromagnetic waves in the predetermined wavelength range by, for instance, reflecting it can serve as a manipulation body with which to perform a manipulation. That is, no special pointing device is needed. Since a manipulation body located close to the semi-transparent screen (for instance, in the space in front of it) is recognizable, various manipulation methods are possible. For example, a manipulation may be performed in the space in front of the front surface of the semi-transparent screen without bringing the manipulation body into contact with the screen as a manipulation panel, or an object approaching the semi-transparent screen may be recognized. A response process corresponding to a recognized manipulation is then executed.
  • Since the manipulation panel of the invention may be a mere semi-transparent screen, its size can easily be increased, in contrast to the case of the conventional touch panel.
  • The control processing means may be so configured as to be able to recognize plural pieces of input manipulation information based on image states of the detection image information and to execute different control processes based on the respective pieces of input manipulation information. In this case, a plurality of subjects of detection can be recognized at the same time and control processes responsive to the respective detected subjects can be executed independently. For example, it is possible to perform manipulations on a plurality of menu pictures at the same time.
  • The control processing means may be so configured as to recognize the input manipulation information based on a particular image shape that is obtained as an image state of the detection image information. In this case, for example, it is possible to decide whether to employ a subject of detection as manipulation information based on its shape. This technique easily enables only manipulations performed with a hand or a finger of a human body to be employed as manipulation information.
  • The control processing means may be so configured as to be able to recognize a hand or a finger of a human body as a subject of detection of the input manipulation information. In this case, for example, it is possible to recognize the motion of a hand or a finger as manipulation information based on an image shape obtained in the detection image information and to prevent other subjects of detection from being recognized as manipulation information.
  • The information input apparatus of the invention may further comprise projection display means provided so as to be able to project, onto the semi-transparent screen, an image of visible light in a wavelength range excluding the predetermined wavelength range of light or electromagnetic waves to be captured by the pickup means, wherein the control processing means executes, as the control process, a display image generation process for causing the projection display means to project a display image and a control on the projection display means. In this case, the semi-transparency of the semi-transparent screen is utilized: by projecting an image onto it with a projector or the like, the semi-transparent screen can be used not only as a manipulation panel but also as a display panel for image display.
  • A menu picture for causing the control processing means to execute a prescribed process may be set as the display image. The menu picture prompts a user to perform various kinds of manipulations. Alternatively, an initial image having a predetermined content may be set as the display image. The control processing means may have attribute information relating to an image content of a particular region in the initial image, wherein when it has been judged that the particular region has been designated as the input manipulation information, the control processing means executes a control process so that the projection display means projects an image indicating the attribute information relating to the designated region. In this case, an application is possible in which an initial image such as a map is displayed and designation of some area on the map causes display of its attribute information, such as an explanation of that area. In each of the above cases, a direct input form of manipulation information is realized in which a user performs a manipulation on an image displayed on the semi-transparent screen.
  • The control processing means may be so configured as to be able to generate the display image by using the detection image information. In this case, an image can be displayed on the semi-transparent screen by using the detection image information. Alternatively, the control processing means may execute the display image generation process so that the display image is displayed in an area on the semi-transparent screen corresponding to a located position of a physical object on the semi-transparent screen or in a space near the semi-transparent screen. In this case, by utilizing the detection image information, a visual effect can be obtained such as displaying a shadow of an object placed on the semi-transparent screen so as to be associated with the object.
  • The pickup means may comprise irradiating means for always irradiating the semi-transparent screen with light or electromagnetic waves in the predetermined wavelength range that are to be captured by the pickup means. In this case, a medium (for instance, infrared light or microwaves) to be captured by the pickup means can be obtained in a stable manner. This provides high reliability irrespective of the installation environment of a system having the information input apparatus.
  • The semi-transparent screen may be formed by combining a material (such as glass) for forming a transparent screen and a material (some semi-transparent film) for forming a semi-transparent screen. In this case, the semi-transparent screen can be formed at a low cost while the above-mentioned capability of providing a large-size screen is maintained.
  • The semi-transparent screen may constitute a wall surface, have a curved surface, or be disposed so as to constitute a table surface. In this manner, the semi-transparent screen may be used in various forms. Accordingly, the application range of the information input apparatus of the invention is increased.
  • The information input apparatus of the invention may further comprise pointing device means capable of causing a state variation in light or electromagnetic waves in the predetermined wavelength range to be captured by the pickup means, by irradiating the semi-transparent screen with light or electromagnetic waves. This is based on the feature of the basic configuration of the invention that any object causing such a state variation can serve as a manipulation body. For example, with the use of a particular pointing device, a user can input manipulation information correctly from a position relatively distant from the semi-transparent screen.
  • The pickup means may comprise a plurality of imaging means for producing imaging signals through photographing with different magnification factors, wherein the control processing means executes the control process based on the detection image information that is generated based on imaging signals supplied from the plurality of imaging means. In this case, image information for detection is obtained with different magnification factors by a plurality of imaging means provided as the pickup means. Further, the control processing means may select, according to a predetermined rule, a particular imaging region of an area of detection image information that is obtained based on an imaging signal produced by a predetermined one of the plurality of imaging means, and execute a control so that one of the imaging means different from the predetermined imaging means photographs an image in the particular imaging region with a varied magnification factor. In this case, for example, in recognizing a position on the semi-transparent screen pointed at by a user, detection image information may be obtained by photographing the pointed region with the second imaging means while magnifying it. This enables highly accurate detection of a pointed position (coordinates).
  • As described above, the invention provides a high degree of freedom for the input of information such as a manipulation. Further, the invention easily provides a large-size manipulation panel that can also be used as a display panel, thereby opening up various application forms. That is, the invention provides the advantage that an interactive input/output environment can easily be advanced or enhanced.

Claims (18)

What is claimed is:
1. An information input apparatus comprising:
a semi-transparent screen that functions as an operator input manipulation surface;
pickup means for picking up an input manipulation of an operator on the semi-transparent screen by capturing only light or electromagnetic waves in a predetermined wavelength range that come through the semi-transparent screen, to thereby produce a pickup signal; and
control processing means for generating detection image information corresponding to the input manipulation of the operator based on the pickup signal, and for executing a control process based on input manipulation information that is recognized based on the detection image information.
2. The information input apparatus according to claim 1, wherein the control processing means is so configured as to be able to recognize plural pieces of input manipulation information based on image states of the detection image information and to execute different control processes based on the respective pieces of input manipulation information.
3. The information input apparatus according to claim 1, wherein the control processing means is so configured as to recognize the input manipulation information based on a particular image shape that is obtained as an image state of the detection image information.
4. The information input apparatus according to claim 1, wherein the control processing means is so configured as to be able to recognize a hand or a finger of a human body as a subject of detection of the input manipulation information.
5. The information input apparatus according to claim 1, further comprising projection display means provided so as to be able to project, onto the semi-transparent screen, an image of visible light in a wavelength range excluding the predetermined wavelength range of light or electromagnetic waves to be captured by the pickup means, wherein the control processing means executes, as the control process, a display image generation process for causing the projection display means to project a display image and a control on the projection display means.
6. The information input apparatus according to claim 5, wherein a menu picture for causing the control processing means to execute a prescribed process is set as the display image.
7. The information input apparatus according to claim 5, wherein an initial image having a predetermined content is set as the display image.
8. The information input apparatus according to claim 7, wherein the control processing means has attribute information relating to an image content of a particular region in the initial image, and wherein when it has been judged that the particular region has been designated as the input manipulation information, the control processing means executes a control process so that the projection display means projects an image indicating the attribute information relating to the designated region.
9. The information input apparatus according to claim 5, wherein the control processing means is so configured as to be able to generate the display image by using the detection image information.
10. The information input apparatus according to claim 5, wherein the control processing means executes the display image generation process so that the display image is displayed in an area on the semi-transparent screen corresponding to a located position of a physical object on the semi-transparent screen or in a space near the semi-transparent screen.
11. The information input apparatus according to claim 1, wherein the pickup means comprises irradiating means for always irradiating the semi-transparent screen with light or electromagnetic waves in the predetermined wavelength range that are to be captured by the pickup means.
12. The information input apparatus according to claim 1, wherein the semi-transparent screen is formed by combining a material for forming a transparent screen and a material for forming a semi-transparent screen.
13. The information input apparatus according to claim 1, wherein the semi-transparent screen constitutes a wall surface.
14. The information input apparatus according to claim 1, wherein the semi-transparent screen has a curved surface.
15. The information input apparatus according to claim 1, wherein the semi-transparent screen is disposed so as to constitute a table surface.
16. The information input apparatus according to claim 1, further comprising pointing device means capable of causing a state variation in light or electromagnetic waves in the predetermined wavelength range to be captured by the pickup means by irradiating the semi-transparent screen with light or electromagnetic waves.
17. The information input apparatus according to claim 1, wherein the pickup means comprises a plurality of imaging means for producing imaging signals through photographing with different magnification factors, and wherein the control processing means executes the control process based on the detection image information that is generated based on imaging signals that are supplied from the plurality of imaging means.
18. The information input apparatus according to claim 17, wherein the control processing means selects, according to a predetermined rule, a particular imaging region of an area of detection image information that is obtained based on an imaging signal produced by a predetermined one of the plurality of imaging means, and executes a control so that one of the imaging means that is different from the predetermined imaging means photographs an image in the particular imaging region with a varied magnification factor.
US09/110,570 1997-07-07 1998-07-06 Information input apparatus Expired - Lifetime US6414672B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP18093697A JP3968477B2 (en) 1997-07-07 1997-07-07 Information input device and information input method
JP9-180936 1997-07-07
JPPO9-180936 1997-07-07

Publications (2)

Publication Number Publication Date
US20010012001A1 true US20010012001A1 (en) 2001-08-09
US6414672B2 US6414672B2 (en) 2002-07-02

Family

ID=16091874

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/110,570 Expired - Lifetime US6414672B2 (en) 1997-07-07 1998-07-06 Information input apparatus

Country Status (2)

Country Link
US (1) US6414672B2 (en)
JP (1) JP3968477B2 (en)

Cited By (178)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020041698A1 (en) * 2000-08-31 2002-04-11 Wataru Ito Object detecting method and object detecting apparatus and intruding object monitoring apparatus employing the object detecting method
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US6545670B1 (en) * 1999-05-11 2003-04-08 Timothy R. Pryor Methods and apparatus for man machine interfaces and related activity
US20030095401A1 (en) * 2001-11-20 2003-05-22 Palm, Inc. Non-visible light display illumination system and method
US6587752B1 (en) * 2001-12-25 2003-07-01 National Institute Of Advanced Industrial Science And Technology Robot operation teaching method and apparatus
DE10228110A1 (en) * 2002-06-24 2004-01-15 Siemens Ag Control unit for motor vehicle components
WO2004042666A2 (en) * 2002-11-05 2004-05-21 Disney Enterprises, Inc. Video actuated interactive environment
US20040183775A1 (en) * 2002-12-13 2004-09-23 Reactrix Systems Interactive directed light/sound system
WO2004102301A2 (en) * 2003-05-15 2004-11-25 Qinetiq Limited Non contact human-computer interface
US20050110964A1 (en) * 2002-05-28 2005-05-26 Matthew Bell Interactive video window display system
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
WO2005057921A2 (en) * 2003-12-09 2005-06-23 Reactrix Systems, Inc. Self-contained interactive video display system
EP1566729A2 (en) * 2004-02-23 2005-08-24 Aruze Corp. Information input device
WO2005091651A2 (en) * 2004-03-18 2005-09-29 Reactrix Systems, Inc. Interactive video display system
US20050227217A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Template matching on interactive surface
US20050226505A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Determining connectedness and offset of 3D objects relative to an interactive surface
US20050270358A1 (en) * 2002-07-19 2005-12-08 Jorg Kuchen Image-recording device, method for recording an image that is visualized on a display unit, arrangement of an image-recording device and a display unit, use of said image-recording device, and use of said arrangement
US20050281475A1 (en) * 2004-06-16 2005-12-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US20060001650A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
EP1615109A2 (en) 2004-06-28 2006-01-11 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060044280A1 (en) * 2004-08-31 2006-03-02 Huddleston Wyatt A Interface
WO2006041834A2 (en) * 2004-10-04 2006-04-20 Disney Enterprises, Inc. Interactive projection system and method
US20060109252A1 (en) * 2004-11-23 2006-05-25 Microsoft Corporation Reducing accidental touch-sensitive device activation
US20060112335A1 (en) * 2004-11-18 2006-05-25 Microsoft Corporation Method and system for providing multiple input connecting user interface
WO2006060095A1 (en) * 2004-12-02 2006-06-08 Hewlett-Packard Development Company, L.P. Display panel
US20060118634A1 (en) * 2004-12-07 2006-06-08 Blythe Michael M Object with symbology
US20060132432A1 (en) * 2002-05-28 2006-06-22 Matthew Bell Interactive video display system
US20060139314A1 (en) * 2002-05-28 2006-06-29 Matthew Bell Interactive video display system
US20060158616A1 (en) * 2005-01-15 2006-07-20 International Business Machines Corporation Apparatus and method for interacting with a subject in an environment
US20060158617A1 (en) * 2005-01-20 2006-07-20 Hewlett-Packard Development Company, L.P. Projector
US20060227099A1 (en) * 2005-03-30 2006-10-12 Microsoft Corporation Responding to change of state of control on device disposed on an interactive display surface
US20070018989A1 (en) * 2005-07-20 2007-01-25 Playmotion, Llc Sensory integration therapy system and associated method of use
WO2007035343A1 (en) * 2005-09-16 2007-03-29 Mega Fun Co. Llc System and method for providing an interactive interface
WO2007060606A1 (en) * 2005-11-25 2007-05-31 Koninklijke Philips Electronics N.V. Touchless manipulation of an image
US20070200970A1 (en) * 2006-02-28 2007-08-30 Microsoft Corporation Uniform illumination of interactive display panel
US20070220444A1 (en) * 2006-03-20 2007-09-20 Microsoft Corporation Variable orientation user interface
US20070236485A1 (en) * 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US20070262965A1 (en) * 2004-09-03 2007-11-15 Takuya Hirai Input Device
WO2007131382A2 (en) * 2006-05-17 2007-11-22 Eidgenössische Technische Hochschule Displaying information interactively
US20070300307A1 (en) * 2006-06-23 2007-12-27 Microsoft Corporation Security Using Physical Objects
US20080001916A1 (en) * 2004-10-05 2008-01-03 Nikon Corporation Electronic Device
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US20080050042A1 (en) * 2006-05-31 2008-02-28 Zhang Guangjun Hardware-in-the-loop simulation system and method for computer vision
US20080062125A1 (en) * 2006-09-08 2008-03-13 Victor Company Of Japan, Limited Electronic appliance
US20080090658A1 (en) * 2004-12-03 2008-04-17 Toshiyuki Kaji Game Machine
US20080143975A1 (en) * 2006-10-25 2008-06-19 International Business Machines Corporation System and method for interacting with a display
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US20080165266A1 (en) * 2007-01-05 2008-07-10 Microsoft Corporation Specular reflection reduction using multiple cameras
WO2008091471A1 (en) * 2007-01-26 2008-07-31 Microsoft Corporation Alternating light sources to reduce specular reflection
US20080246738A1 (en) * 2005-05-04 2008-10-09 Koninklijke Philips Electronics, N.V. System and Method for Projecting Control Graphics
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display using a three-dimensional vision system
US20080267465A1 (en) * 2004-04-30 2008-10-30 Kabushiki Kaisha Dds Operating Input Device and Operating Input Program
WO2009071121A2 (en) * 2007-12-05 2009-06-11 Almeva Ag Interaction arrangement for interaction between a display screen and a pointer object
US20090153476A1 (en) * 2007-12-18 2009-06-18 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US20090189857A1 (en) * 2008-01-25 2009-07-30 Microsoft Corporation Touch sensing for curved displays
US20090231281A1 (en) * 2008-03-11 2009-09-17 Microsoft Corporation Multi-touch virtual keyboard
US20090247234A1 (en) * 2008-03-25 2009-10-01 Lg Electronics Inc. Mobile terminal and method of displaying information therein
US20090309826A1 (en) * 2008-06-17 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and devices
US20100008582A1 (en) * 2008-07-10 2010-01-14 Samsung Electronics Co., Ltd. Method for recognizing and translating characters in camera-based image
US20100036988A1 (en) * 2008-08-07 2010-02-11 Chien-Wei Chang Multimedia playing device
WO2010023348A1 (en) * 2008-08-26 2010-03-04 Multitouch Oy Interactive displays
US20100066689A1 (en) * 2008-06-17 2010-03-18 Jung Edward K Y Devices related to projection input surfaces
US20100079409A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Touch panel for an interactive input system, and interactive input system incorporating the touch panel
US20100083109A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method
US20100079493A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
FR2937753A1 (en) * 2008-10-23 2010-04-30 Idealys Entertainment Virtual reading device for e.g. school, has computer displaying contents on screen by capture software, and casing integrating computer to form assembly activated by firewire camera while being triggered by movements of user
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
US20100209007A1 (en) * 2005-05-20 2010-08-19 Eyeclick Ltd. System and method for detecting changes in an environment
US7787706B2 (en) 2004-06-14 2010-08-31 Microsoft Corporation Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface
US20100231506A1 (en) * 2004-09-07 2010-09-16 Timothy Pryor Control of appliances, kitchen and home
US7809167B2 (en) 2003-10-24 2010-10-05 Matthew Bell Method and system for processing captured image information in an interactive video display system
US20100269072A1 (en) * 2008-09-29 2010-10-21 Kotaro Sakata User interface device, user interface method, and recording medium
US20100295820A1 (en) * 2009-05-19 2010-11-25 Microsoft Corporation Light-induced shape-memory polymer display screen
US20100315491A1 (en) * 2009-06-10 2010-12-16 Disney Enterprises, Inc. Projector systems and methods for producing digitally augmented, interactive cakes and other food products
US20110007227A1 (en) * 2009-07-07 2011-01-13 Canon Kabushiki Kaisha Image projection apparatus and method for controlling the same
US20110032215A1 (en) * 2009-06-15 2011-02-10 Smart Technologies Ulc Interactive input system and components therefor
EP2284667A1 (en) * 2009-08-07 2011-02-16 Sony Corporation Position detection apparatus and position detection method
US20110050650A1 (en) * 2009-09-01 2011-03-03 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (snr) and image capture method
US20110057875A1 (en) * 2009-09-04 2011-03-10 Sony Corporation Display control apparatus, display control method, and display control program
US20110069019A1 (en) * 2009-07-08 2011-03-24 Smart Technologies Ulc Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
US20110089857A1 (en) * 2008-06-10 2011-04-21 Koninklijke Philips Electronics N.V. Programmable user interface device for controlling an electrical power supplied to an electrical consumer
US20110128386A1 (en) * 2008-08-01 2011-06-02 Hilabs Interactive device and method for use
EP2330558A1 (en) * 2008-09-29 2011-06-08 Panasonic Corporation User interface device, user interface method, and recording medium
US20110157047A1 (en) * 2009-12-25 2011-06-30 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
US20110169748A1 (en) * 2010-01-11 2011-07-14 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US20110181551A1 (en) * 2005-08-31 2011-07-28 Microsoft Corporation Input method for surface of interactive display
WO2011098654A1 (en) * 2010-02-09 2011-08-18 Multitouch Oy Interactive display
US20110208979A1 (en) * 2008-09-22 2011-08-25 Envault Corporation Oy Method and Apparatus for Implementing Secure and Selectively Deniable File Storage
US20110227876A1 (en) * 2008-08-26 2011-09-22 Multi Touch Oy Interactive Display Device with Infrared Capture Unit
US8068641B1 (en) * 2008-06-19 2011-11-29 Qualcomm Incorporated Interaction interface for controlling an application
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
CN102289284A (en) * 2011-08-05 2011-12-21 Shanghai Yuanshen Multimedia Co., Ltd. Spherical interactive induction protection device
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US8199108B2 (en) 2002-12-13 2012-06-12 Intellectual Ventures Holding 67 Llc Interactive directed light/sound system
US8230367B2 (en) 2007-09-14 2012-07-24 Intellectual Ventures Holding 67 Llc Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
US8228305B2 (en) 1995-06-29 2012-07-24 Apple Inc. Method for providing human input to a computer
FR2970797A1 (en) * 2011-01-25 2012-07-27 Intui Sense TOUCH AND GESTURE CONTROL DEVICE AND METHOD FOR INTERPRETATION OF THE ASSOCIATED GESTURE
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
EP2487624A1 (en) * 2005-01-07 2012-08-15 Qualcomm Incorporated Detecting and tracking objects in images
US20120206339A1 (en) * 2009-07-07 2012-08-16 Elliptic Laboratories As Control using movements
US20120218179A1 (en) * 2009-09-11 2012-08-30 Sony Corporation Display apparatus and control method
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US8262236B2 (en) 2008-06-17 2012-09-11 The Invention Science Fund I, Llc Systems and methods for transmitting information associated with change of a projection surface
FR2972544A1 (en) * 2011-03-10 2012-09-14 Intui Sense ROBUST IMAGE ACQUISITION AND PROCESSING SYSTEM FOR INTERACTIVE FACADE, FACADE AND INTERACTIVE DEVICE THEREFOR
US8267526B2 (en) 2008-06-17 2012-09-18 The Invention Science Fund I, Llc Methods associated with receiving and transmitting information related to projection
FR2972820A1 (en) * 2011-03-18 2012-09-21 Intui Sense ROBUST INTERACTIVE DEVICE WITH SHADOWS
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US8308304B2 (en) 2008-06-17 2012-11-13 The Invention Science Fund I, Llc Systems associated with receiving and transmitting information related to projection
US8314773B2 (en) 2002-09-09 2012-11-20 Apple Inc. Mouse having an optically-based scrolling feature
US20120293405A1 (en) * 2009-09-15 2012-11-22 Sony Corporation Display device and controlling method
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8376558B2 (en) 2008-06-17 2013-02-19 The Invention Science Fund I, Llc Systems and methods for projecting in response to position change of a projection surface
US8384005B2 (en) 2008-06-17 2013-02-26 The Invention Science Fund I, Llc Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface
US20130057492A1 (en) * 2011-09-06 2013-03-07 Toshiba Tec Kabushiki Kaisha Information display apparatus and method
US20130069899A1 (en) * 2008-03-04 2013-03-21 Jason Clay Beaver Touch Event Model
WO2013052880A1 (en) * 2011-10-07 2013-04-11 Qualcomm Incorporated Vision-based interactive projection system
US20130147736A1 (en) * 2011-12-09 2013-06-13 Ricoh Company, Ltd. Electronic information board apparatus, electronic information board system, and method of controlling electronic information board
US8482535B2 (en) 1999-11-08 2013-07-09 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8487866B2 (en) 2003-10-24 2013-07-16 Intellectual Ventures Holding 67 Llc Method and system for managing an interactive video display system
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
US8566044B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8576199B1 (en) 2000-02-22 2013-11-05 Apple Inc. Computer control systems
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
US8602564B2 (en) 2008-06-17 2013-12-10 The Invention Science Fund I, Llc Methods and systems for projecting in response to position
US8610674B2 (en) 1995-06-29 2013-12-17 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8608321B2 (en) 2008-06-17 2013-12-17 The Invention Science Fund I, Llc Systems and methods for projecting in response to conformation
US8641203B2 (en) 2008-06-17 2014-02-04 The Invention Science Fund I, Llc Methods and systems for receiving and transmitting signals between server and projector apparatuses
US8661363B2 (en) 2007-01-07 2014-02-25 Apple Inc. Application programming interfaces for scrolling operations
US8682602B2 (en) 2009-03-16 2014-03-25 Apple Inc. Event recognition
EP2711807A1 (en) * 2012-09-24 2014-03-26 LG Electronics, Inc. Image display apparatus and method for operating the same
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US8723822B2 (en) 2008-03-04 2014-05-13 Apple Inc. Touch event model programming interface
US8723787B2 (en) 2008-06-17 2014-05-13 The Invention Science Fund I, Llc Methods and systems related to an image capture projection surface
EP2733657A1 (en) * 2012-11-19 2014-05-21 CSS electronic AG Device for entering data and/or control commands
US8733952B2 (en) 2008-06-17 2014-05-27 The Invention Science Fund I, Llc Methods and systems for coordinated use of two or more user responsive projectors
EP2219134A3 (en) * 2009-02-13 2014-07-23 Sony Corporation Information processing apparatus and information processing method
US8820939B2 (en) 2008-06-17 2014-09-02 The Invention Science Fund I, Llc Projection associated methods and systems
US20140282271A1 (en) * 2013-03-15 2014-09-18 Jean Hsiang-Chun Lu User interface responsive to operator position and gestures
US8936367B2 (en) 2008-06-17 2015-01-20 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8944608B2 (en) 2008-06-17 2015-02-03 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US20150052477A1 (en) * 2013-08-19 2015-02-19 Samsung Electronics Co., Ltd. Enlargement and reduction of data with a stylus
EP2849442A1 (en) * 2013-09-16 2015-03-18 ameria GmbH Gesture-controlled rear-projection system
US9014417B1 (en) 2012-10-22 2015-04-21 Google Inc. Method and apparatus for themes using photo-active surface paint
EP2879000A1 (en) * 2013-10-31 2015-06-03 Funai Electric Co., Ltd. Projector device
CN104750244A (en) * 2013-12-27 2015-07-01 索尼公司 Display control device, display control system, display control method, and program
WO2015036852A3 (en) * 2013-08-23 2015-08-20 Lumo Play, Inc. Interactive projection effect and entertainment system
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US9164596B1 (en) * 2012-10-22 2015-10-20 Google Inc. Method and apparatus for gesture interaction with a photo-active painted surface
US9195320B1 (en) 2012-10-22 2015-11-24 Google Inc. Method and apparatus for dynamic signage using a painted surface display system
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US9241124B2 (en) 2013-05-01 2016-01-19 Lumo Play, Inc. Content generation for interactive video projection systems
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
JP2016186677A (en) * 2015-03-27 2016-10-27 Seiko Epson Corporation Interactive projection system, pointing element, and method for controlling interactive projection system
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US9529519B2 (en) 2007-01-07 2016-12-27 Apple Inc. Application programming interfaces for gesture operations
EP2191352A4 (en) * 2007-09-04 2016-12-28 Canon Kk Image projection apparatus and control method for same
WO2017005639A1 (en) * 2015-07-03 2017-01-12 Menger, Christian Gesture-sensing system for visualization devices
US9646562B1 (en) 2012-04-20 2017-05-09 X Development Llc System and method of generating images on photoactive surfaces
EP2546696A4 (en) * 2010-03-08 2017-06-14 Dai Nippon Printing Co., Ltd. Small-form-factor display device with touch-panel functionality, and screen used as the display therein
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
US9993733B2 (en) 2014-07-09 2018-06-12 Lumo Interactive Inc. Infrared reflective device interactive projection effect system
WO2018231213A1 (en) * 2017-06-14 2018-12-20 Hewlett-Packard Development Company, L.P. Display adjustments
US10719697B2 (en) * 2016-09-01 2020-07-21 Mitsubishi Electric Corporation Gesture judgment device, gesture operation device, and gesture judgment method
US10963142B2 (en) 2007-01-07 2021-03-30 Apple Inc. Application programming interfaces for scrolling
US11073949B2 (en) * 2019-02-14 2021-07-27 Seiko Epson Corporation Display method, display device, and interactive projector configured to receive an operation to an operation surface by a hand of a user
US20220066545A1 (en) * 2019-05-14 2022-03-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Interactive control method and apparatus, electronic device and storage medium
CN114173026A (en) * 2020-09-16 2022-03-11 Beijing Desheng Zhike Education Technology Co., Ltd. Teaching video recording system and teaching video recording method
US11325157B2 (en) * 2017-07-06 2022-05-10 Allgaier Werke Gmbh Device and method for capturing movement patterns of tumbler screening machines
US11385742B2 (en) * 2020-02-17 2022-07-12 Seiko Epson Corporation Position detection method, position detection device, and position detection system
CN114816206A (en) * 2022-03-15 2022-07-29 Lenovo (Beijing) Co., Ltd. Data processing method and electronic equipment

Families Citing this family (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7084859B1 (en) 1992-09-18 2006-08-01 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US7489303B1 (en) * 2001-02-22 2009-02-10 Pryor Timothy R Reconfigurable instrument panels
US6526157B2 (en) 1997-08-01 2003-02-25 Sony Corporation Image processing apparatus, image processing method and transmission medium
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
EP1085432B1 (en) * 1999-09-20 2008-12-03 NCR International, Inc. Information retrieval and display
WO2001052230A1 (en) * 2000-01-10 2001-07-19 Ic Tech, Inc. Method and system for interacting with a display
US7138983B2 (en) 2000-01-31 2006-11-21 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
JP4803883B2 (en) * 2000-01-31 2011-10-26 Canon Inc. Position information processing apparatus and method and program thereof
JP2001222375A (en) 2000-02-08 2001-08-17 Seiko Epson Corp Indicated position detection system and method, presentation system and information storage medium
DE10007891C2 (en) * 2000-02-21 2002-11-21 Siemens Ag Method and arrangement for interacting with a representation visible in a shop window
JP3640156B2 (en) 2000-02-22 2005-04-20 セイコーエプソン株式会社 Pointed position detection system and method, presentation system, and information storage medium
US20080122799A1 (en) * 2001-02-22 2008-05-29 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
JP3620397B2 (en) * 2000-03-28 2005-02-16 セイコーエプソン株式会社 Pointed position detection system, presentation system, and information storage medium
GB0027314D0 (en) * 2000-11-09 2000-12-27 Ncr Int Inc Information retrieval and display
US20080024463A1 (en) * 2001-02-22 2008-01-31 Timothy Pryor Reconfigurable tactile control display applications
JP2004014383A (en) * 2002-06-10 2004-01-15 Smk Corp Contact type input device
KR100575906B1 (en) 2002-10-25 2006-05-02 Mitsubishi Fuso Truck and Bus Corporation Hand pattern switching apparatus
US7390092B2 (en) * 2002-11-08 2008-06-24 Belliveau Richard S Image projection lighting devices with visible and infrared imaging
DE10260305A1 (en) * 2002-12-20 2004-07-15 Siemens Ag HMI setup with an optical touch screen
US20050168448A1 (en) * 2004-01-30 2005-08-04 Simpson Zachary B. Interactive touch-screen using infrared illuminators
JP4220408B2 (en) * 2004-02-13 2009-02-04 Hitachi, Ltd. Table type information terminal
JP2005230476A (en) * 2004-02-23 2005-09-02 Aruze Corp Game machine
JP2005242694A (en) * 2004-02-26 2005-09-08 Mitsubishi Fuso Truck & Bus Corp Hand pattern switching apparatus
US20050225473A1 (en) * 2004-04-08 2005-10-13 Alex Hill Infrared emission sensor
WO2006013783A1 (en) * 2004-08-04 2006-02-09 Matsushita Electric Industrial Co., Ltd. Input device
US8560972B2 (en) * 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
US20060044282A1 (en) * 2004-08-27 2006-03-02 International Business Machines Corporation User input apparatus, system, method and computer program for use with a screen having a translucent surface
US7576725B2 (en) * 2004-10-19 2009-08-18 Microsoft Corporation Using clear-coded, see-through objects to manipulate virtual objects
US7359564B2 (en) * 2004-10-29 2008-04-15 Microsoft Corporation Method and system for cancellation of ambient light using light frequency
US7898505B2 (en) * 2004-12-02 2011-03-01 Hewlett-Packard Development Company, L.P. Display system
US20060132459A1 (en) * 2004-12-20 2006-06-22 Huddleston Wyatt A Interpreting an image
EP1849123A2 (en) * 2005-01-07 2007-10-31 GestureTek, Inc. Optical flow based tilt sensor
CN101198964A (en) 2005-01-07 2008-06-11 GestureTek, Inc. Creating 3D images of objects by illuminating with infrared patterns
US20060158437A1 (en) * 2005-01-20 2006-07-20 Blythe Michael M Display device
JP4689684B2 (en) * 2005-01-21 2011-05-25 GestureTek, Inc. Tracking based on movement
US7499027B2 (en) * 2005-04-29 2009-03-03 Microsoft Corporation Using a light pointer for input on an interactive display surface
US7535463B2 (en) * 2005-06-15 2009-05-19 Microsoft Corporation Optical flow-based manipulation of graphical objects
US7525538B2 (en) * 2005-06-28 2009-04-28 Microsoft Corporation Using same optics to image, illuminate, and project
US8847924B2 (en) * 2005-10-03 2014-09-30 Hewlett-Packard Development Company, L.P. Reflecting light
US8060840B2 (en) 2005-12-29 2011-11-15 Microsoft Corporation Orientation free user interface
JP2007223416A (en) * 2006-02-22 2007-09-06 Tokai Rika Co Ltd Vehicular instrument panel
US10026177B2 (en) * 2006-02-28 2018-07-17 Microsoft Technology Licensing, Llc Compact interactive tabletop with projection-vision
US8599133B2 (en) * 2006-07-28 2013-12-03 Koninklijke Philips N.V. Private screens self distributing along the shop window
US20080096651A1 (en) * 2006-07-28 2008-04-24 Aruze Corp. Gaming machine
KR100837166B1 (en) * 2007-01-20 2008-06-11 LG Electronics Inc. Method of displaying an information in electronic device and the electronic device thereof
JP2008264321A (en) * 2007-04-23 2008-11-06 Takahiro Matsuo Performance device and its performance method
JP5344804B2 (en) * 2007-06-07 2013-11-20 Taito Corporation Game device using projected shadow
JP2007299434A (en) * 2007-08-23 2007-11-15 Advanced Telecommunication Research Institute International Large-screen touch panel system, and retrieval/display system
JP5508269B2 (en) * 2007-09-11 2014-05-28 Smart Internet Technology CRC Pty Ltd System and method for manipulating digital images on a computer display
AU2008299578B2 (en) * 2007-09-11 2014-12-04 Cruiser Interactive Pty Ltd A system and method for capturing digital images
EP2201440A4 (en) 2007-09-11 2012-08-29 Smart Internet Technology Crc Pty Ltd An interface element for a computer interface
EP2201448A4 (en) * 2007-09-11 2013-10-16 Smart Internet Technology Crc Systems and methods for remote file transfer
JP4933389B2 (en) * 2007-09-14 2012-05-16 Ricoh Company, Ltd. Image projection display device, image projection display method, image projection display program, and recording medium
US9195886B2 (en) * 2008-01-09 2015-11-24 Cybernet Systems Corporation Rapid prototyping and machine vision for reconfigurable interfaces
JP4954143B2 (en) * 2008-06-02 2012-06-13 Mitsubishi Electric Corporation Position detection device
JP5094566B2 (en) * 2008-06-02 2012-12-12 Mitsubishi Electric Corporation Video display system
JP4609543B2 (en) 2008-07-25 2011-01-12 ソニー株式会社 Information processing apparatus and information processing method
JP4609557B2 (en) 2008-08-29 2011-01-12 ソニー株式会社 Information processing apparatus and information processing method
JP5279646B2 (en) * 2008-09-03 2013-09-04 Canon Inc. Information processing apparatus, operation method thereof, and program
EP2194447A1 (en) 2008-12-08 2010-06-09 IBBT vzw Electronic painting system and apparatus for input to the same
US8384682B2 (en) * 2009-01-08 2013-02-26 Industrial Technology Research Institute Optical interactive panel and display system with optical interactive panel
JP5050153B2 (en) * 2009-01-16 2012-10-17 Advanced Telecommunications Research Institute International (ATR) Website search system
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
JP2011028555A (en) * 2009-07-27 2011-02-10 Sony Corp Information processor and information processing method
US10025321B2 (en) * 2009-12-21 2018-07-17 Ncr Corporation Self-service system with user interface positioning
CN101776836B (en) * 2009-12-28 2013-08-07 Wuhan Quanzhen Optoelectronics Technology Co., Ltd. Projection display system and desktop computer
CN102129151A (en) * 2010-01-20 2011-07-20 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Front projection control system and method
JP5740822B2 (en) 2010-03-04 2015-07-01 Sony Corporation Information processing apparatus, information processing method, and program
JP5605059B2 (en) * 2010-07-29 2014-10-15 Dai Nippon Printing Co., Ltd. Transmission screen and display device
JP5645444B2 (en) * 2010-03-30 2014-12-24 Canon Inc. Image display system and control method thereof
US8818027B2 (en) * 2010-04-01 2014-08-26 Qualcomm Incorporated Computing device interface
JP4968360B2 (en) * 2010-04-05 2012-07-04 Sony Corporation Image display device
JP2011248041A (en) * 2010-05-26 2011-12-08 Seiko Epson Corporation Mounting device and projection type display apparatus
JP6281857B2 (en) * 2012-01-13 2018-02-21 Dwango Co., Ltd. Video system and photographing method
US9395066B2 (en) * 2012-01-13 2016-07-19 Laser Devices, Inc. Adjustable beam illuminator
US8833994B2 (en) * 2012-03-08 2014-09-16 Laser Devices, Inc. Light pointer having optical fiber light source
JP6326714B2 (en) * 2012-11-16 2018-05-23 Casio Computer Co., Ltd. Projection system
JP6202942B2 (en) * 2013-08-26 2017-09-27 Canon Inc. Information processing apparatus and control method thereof, computer program, and storage medium
US9574759B2 (en) 2015-01-16 2017-02-21 Steiner Eoptics, Inc. Adjustable laser illumination pattern
JP6507905B2 (en) * 2015-03-31 2019-05-08 Fujitsu Limited Content display control method, content display control device, and content display control program
WO2017006422A1 (en) * 2015-07-06 2017-01-12 Fujitsu Limited Electronic device
JP6465197B2 (en) * 2017-12-12 2019-02-06 Sony Corporation Information processing apparatus, information processing method, and program
WO2019188046A1 (en) * 2018-03-26 2019-10-03 Fujifilm Corporation Projection system, projection control device, projection control method, and projection control program
CN109597530B (en) * 2018-11-21 2022-04-19 Shenzhen Hongchen Technology Co., Ltd. Display device and screen positioning method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5012284A (en) * 1989-09-29 1991-04-30 Xerox Corporation Magnification adjustment for computer forms
US6008800A (en) * 1992-09-18 1999-12-28 Pryor; Timothy R. Man machine interfaces for entering data into a computer
DE4423005C1 (en) * 1994-06-30 1995-11-30 Siemens Ag Computer data entry stylus with indistinguishable contact surfaces
US5687297A (en) * 1995-06-29 1997-11-11 Xerox Corporation Multifunctional apparatus for appearance tuning and resolution reconstruction of digital images
US5736975A (en) * 1996-02-02 1998-04-07 Interactive Sales System Interactive video display

Cited By (350)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US8228305B2 (en) 1995-06-29 2012-07-24 Apple Inc. Method for providing human input to a computer
US8427449B2 (en) 1995-06-29 2013-04-23 Apple Inc. Method for providing human input to a computer
US9758042B2 (en) 1995-06-29 2017-09-12 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8610674B2 (en) 1995-06-29 2013-12-17 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US6545670B1 (en) * 1999-05-11 2003-04-08 Timothy R. Pryor Methods and apparatus for man machine interfaces and related activity
US8482535B2 (en) 1999-11-08 2013-07-09 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8576199B1 (en) 2000-02-22 2013-11-05 Apple Inc. Computer control systems
US20020041698A1 (en) * 2000-08-31 2002-04-11 Wataru Ito Object detecting method and object detecting apparatus and intruding object monitoring apparatus employing the object detecting method
US7082209B2 (en) * 2000-08-31 2006-07-25 Hitachi Kokusai Electric, Inc. Object detecting method and object detecting apparatus and intruding object monitoring apparatus employing the object detecting method
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7834846B1 (en) 2001-06-05 2010-11-16 Matthew Bell Interactive video display system
US20030095401A1 (en) * 2001-11-20 2003-05-22 Palm, Inc. Non-visible light display illumination system and method
US6587752B1 (en) * 2001-12-25 2003-07-01 National Institute Of Advanced Industrial Science And Technology Robot operation teaching method and apparatus
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20050110964A1 (en) * 2002-05-28 2005-05-26 Matthew Bell Interactive video window display system
US7170492B2 (en) 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US20060139314A1 (en) * 2002-05-28 2006-06-29 Matthew Bell Interactive video display system
US20060132432A1 (en) * 2002-05-28 2006-06-22 Matthew Bell Interactive video display system
US7348963B2 (en) 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US8035624B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Computer vision based touch screen
US8035612B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US20080150890A1 (en) * 2002-05-28 2008-06-26 Matthew Bell Interactive Video Window
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
US8035614B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Interactive video window
DE10228110A1 (en) * 2002-06-24 2004-01-15 Siemens Ag Control unit for motor vehicle components
US7581682B2 (en) * 2002-07-19 2009-09-01 Gavitec Ag Image-recording device, method for recording an image that is visualized on a display unit, arrangement of an image-recording device and a display unit, use of said image-recording device, and use of said arrangement
US20050270358A1 (en) * 2002-07-19 2005-12-08 Jorg Kuchen Image-recording device, method for recording an image that is visualized on a display unit, arrangement of an image-recording device and a display unit, use of said image-recording device, and use of said arrangement
US8314773B2 (en) 2002-09-09 2012-11-20 Apple Inc. Mouse having an optically-based scrolling feature
WO2004042666A3 (en) * 2002-11-05 2005-02-10 Disney Entpr Inc Video actuated interactive environment
US7775883B2 (en) * 2002-11-05 2010-08-17 Disney Enterprises, Inc. Video actuated interactive environment
US20040102247A1 (en) * 2002-11-05 2004-05-27 Smoot Lanny Starkes Video actuated interactive environment
WO2004042666A2 (en) * 2002-11-05 2004-05-21 Disney Enterprises, Inc. Video actuated interactive environment
US8199108B2 (en) 2002-12-13 2012-06-12 Intellectual Ventures Holding 67 Llc Interactive directed light/sound system
US7576727B2 (en) 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
US20040183775A1 (en) * 2002-12-13 2004-09-23 Reactrix Systems Interactive directed light/sound system
CN100409159C (en) * 2003-05-15 2008-08-06 秦内蒂克有限公司 Non contact human-computer interface
WO2004102301A3 (en) * 2003-05-15 2006-06-08 Qinetiq Ltd Non contact human-computer interface
US20060238490A1 (en) * 2003-05-15 2006-10-26 Qinetiq Limited Non contact human-computer interface
WO2004102301A2 (en) * 2003-05-15 2004-11-25 Qinetiq Limited Non contact human-computer interface
US8487866B2 (en) 2003-10-24 2013-07-16 Intellectual Ventures Holding 67 Llc Method and system for managing an interactive video display system
US7809167B2 (en) 2003-10-24 2010-10-05 Matthew Bell Method and system for processing captured image information in an interactive video display system
KR101258587B1 (en) * 2003-12-09 2013-05-02 인텔렉츄얼 벤처스 홀딩 67 엘엘씨 Self-Contained Interactive Video Display System
WO2005057399A3 (en) * 2003-12-09 2005-09-29 Matthew Bell Self-contained interactive video display system
WO2005057399A2 (en) * 2003-12-09 2005-06-23 Reactrix Systems, Inc. Self-contained interactive video display system
WO2005057921A2 (en) * 2003-12-09 2005-06-23 Reactrix Systems, Inc. Self-contained interactive video display system
WO2005057398A2 (en) * 2003-12-09 2005-06-23 Matthew Bell Interactive video window display system
WO2005057921A3 (en) * 2003-12-09 2005-09-29 Matthew Bell Self-contained interactive video display system
WO2005057398A3 (en) * 2003-12-09 2005-09-29 Matthew Bell Interactive video window display system
EP1566729A2 (en) * 2004-02-23 2005-08-24 Aruze Corp. Information input device
EP1566729A3 (en) * 2004-02-23 2009-02-18 Aruze Corp. Information input device
WO2005091651A3 (en) * 2004-03-18 2006-01-12 Reactrix Systems Inc Interactive video display system
WO2005091651A2 (en) * 2004-03-18 2005-09-29 Reactrix Systems, Inc. Interactive video display system
US7379562B2 (en) 2004-03-31 2008-05-27 Microsoft Corporation Determining connectedness and offset of 3D objects relative to an interactive surface
US20050227217A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Template matching on interactive surface
US20050226505A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Determining connectedness and offset of 3D objects relative to an interactive surface
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US20080267465A1 (en) * 2004-04-30 2008-10-30 Kabushiki Kaisha Dds Operating Input Device and Operating Input Program
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US7787706B2 (en) 2004-06-14 2010-08-31 Microsoft Corporation Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface
US20050281475A1 (en) * 2004-06-16 2005-12-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US8165422B2 (en) * 2004-06-16 2012-04-24 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US8670632B2 (en) 2004-06-16 2014-03-11 Microsoft Corporation System for reducing effects of undesired signals in an infrared imaging system
US20090262070A1 (en) * 2004-06-16 2009-10-22 Microsoft Corporation Method and System for Reducing Effects of Undesired Signals in an Infrared Imaging System
US7593593B2 (en) * 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US20080193043A1 (en) * 2004-06-16 2008-08-14 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7613358B2 (en) * 2004-06-16 2009-11-03 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
EP1615109A2 (en) 2004-06-28 2006-01-11 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
EP1615109A3 (en) * 2004-06-28 2006-10-04 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060010400A1 (en) * 2004-06-28 2006-01-12 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060001650A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US7743348B2 (en) * 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US8612856B2 (en) 2004-07-30 2013-12-17 Apple Inc. Proximity detector in handheld device
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
WO2006026012A2 (en) * 2004-08-31 2006-03-09 Hewlett-Packard Development Company, L.P. Touch-screen interface
WO2006026012A3 (en) * 2004-08-31 2006-04-20 Hewlett Packard Development Co Touch-screen interface
US20060044280A1 (en) * 2004-08-31 2006-03-02 Huddleston Wyatt A Interface
US20070262965A1 (en) * 2004-09-03 2007-11-15 Takuya Hirai Input Device
US20100231506A1 (en) * 2004-09-07 2010-09-16 Timothy Pryor Control of appliances, kitchen and home
WO2006041834A2 (en) * 2004-10-04 2006-04-20 Disney Enterprises, Inc. Interactive projection system and method
WO2006041834A3 (en) * 2004-10-04 2007-07-12 Disney Entpr Inc Interactive projection system and method
US8330714B2 (en) * 2004-10-05 2012-12-11 Nikon Corporation Electronic device
US20080001916A1 (en) * 2004-10-05 2008-01-03 Nikon Corporation Electronic Device
US20060112335A1 (en) * 2004-11-18 2006-05-25 Microsoft Corporation Method and system for providing multiple input connecting user interface
US7925996B2 (en) 2004-11-18 2011-04-12 Microsoft Corporation Method and system for providing multiple input connecting user interface
US7847789B2 (en) * 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
US20060109252A1 (en) * 2004-11-23 2006-05-25 Microsoft Corporation Reducing accidental touch-sensitive device activation
CN1797305B (en) * 2004-11-23 2011-06-22 微软公司 Method for distinguishing indicator pen input from non-indicator pen input on touch-sensitive surface
US20060119798A1 (en) * 2004-12-02 2006-06-08 Huddleston Wyatt A Display panel
US8508710B2 (en) 2004-12-02 2013-08-13 Hewlett-Packard Development Company, L.P. Display panel
WO2006060095A1 (en) * 2004-12-02 2006-06-08 Hewlett-Packard Development Company, L.P. Display panel
US20080090658A1 (en) * 2004-12-03 2008-04-17 Toshiyuki Kaji Game Machine
US20060118634A1 (en) * 2004-12-07 2006-06-08 Blythe Michael M Object with symbology
CN102831387A (en) * 2005-01-07 2012-12-19 Qualcomm Incorporated Detecting and tracking objects in images
US8483437B2 (en) 2005-01-07 2013-07-09 Qualcomm Incorporated Detecting and tracking objects in images
EP2487624A1 (en) * 2005-01-07 2012-08-15 Qualcomm Incorporated Detecting and tracking objects in images
US20060158616A1 (en) * 2005-01-15 2006-07-20 International Business Machines Corporation Apparatus and method for interacting with a subject in an environment
US20060158617A1 (en) * 2005-01-20 2006-07-20 Hewlett-Packard Development Company, L.P. Projector
US7503658B2 (en) * 2005-01-20 2009-03-17 Hewlett-Packard Development Company, L.P. Projector
US20060227099A1 (en) * 2005-03-30 2006-10-12 Microsoft Corporation Responding to change of state of control on device disposed on an interactive display surface
US7570249B2 (en) * 2005-03-30 2009-08-04 Microsoft Corporation Responding to change of state of control on device disposed on an interactive display surface
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US20080246738A1 (en) * 2005-05-04 2008-10-09 Koninklijke Philips Electronics, N.V. System and Method for Projecting Control Graphics
US20100209007A1 (en) * 2005-05-20 2010-08-19 Eyeclick Ltd. System and method for detecting changes in an environment
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
US20070018989A1 (en) * 2005-07-20 2007-01-25 Playmotion, Llc Sensory integration therapy system and associated method of use
WO2007019443A1 (en) * 2005-08-05 2007-02-15 Reactrix Systems, Inc. Interactive video display system
US20110181551A1 (en) * 2005-08-31 2011-07-28 Microsoft Corporation Input method for surface of interactive display
US8519952B2 (en) 2005-08-31 2013-08-27 Microsoft Corporation Input method for surface of interactive display
WO2007035343A1 (en) * 2005-09-16 2007-03-29 Mega Fun Co. Llc System and method for providing an interactive interface
WO2007060606A1 (en) * 2005-11-25 2007-05-31 Koninklijke Philips Electronics N.V. Touchless manipulation of an image
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
US20070200970A1 (en) * 2006-02-28 2007-08-30 Microsoft Corporation Uniform illumination of interactive display panel
US7515143B2 (en) * 2006-02-28 2009-04-07 Microsoft Corporation Uniform illumination of interactive display panel
US20070220444A1 (en) * 2006-03-20 2007-09-20 Microsoft Corporation Variable orientation user interface
US8930834B2 (en) 2006-03-20 2015-01-06 Microsoft Corporation Variable orientation user interface
US20070236485A1 (en) * 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US8139059B2 (en) * 2006-03-31 2012-03-20 Microsoft Corporation Object illumination in a virtual environment
WO2007131382A3 (en) * 2006-05-17 2008-06-12 Eidgenoess Tech Hochschule Displaying information interactively
WO2007131382A2 (en) * 2006-05-17 2007-11-22 Eidgenössische Technische Hochschule Displaying information interactively
US20090184943A1 (en) * 2006-05-17 2009-07-23 Eidgenossische Technische Hochschule Displaying Information Interactively
US7768527B2 (en) * 2006-05-31 2010-08-03 Beihang University Hardware-in-the-loop simulation system and method for computer vision
US20080050042A1 (en) * 2006-05-31 2008-02-28 Zhang Guangjun Hardware-in-the-loop simulation system and method for computer vision
US8001613B2 (en) 2006-06-23 2011-08-16 Microsoft Corporation Security using physical objects
US20070300307A1 (en) * 2006-06-23 2007-12-27 Microsoft Corporation Security Using Physical Objects
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US20080062125A1 (en) * 2006-09-08 2008-03-13 Victor Company Of Japan, Limited Electronic appliance
US8179367B2 (en) * 2006-09-08 2012-05-15 JVC Kenwood Corporation Electronic appliance having a display and a detector for generating a detection signal
US8356254B2 (en) * 2006-10-25 2013-01-15 International Business Machines Corporation System and method for interacting with a display
US20080143975A1 (en) * 2006-10-25 2008-06-19 International Business Machines Corporation System and method for interacting with a display
US7630002B2 (en) * 2007-01-05 2009-12-08 Microsoft Corporation Specular reflection reduction using multiple cameras
US20080165266A1 (en) * 2007-01-05 2008-07-10 Microsoft Corporation Specular reflection reduction using multiple cameras
US10817162B2 (en) 2007-01-07 2020-10-27 Apple Inc. Application programming interfaces for scrolling operations
US11449217B2 (en) 2007-01-07 2022-09-20 Apple Inc. Application programming interfaces for gesture operations
US9639260B2 (en) 2007-01-07 2017-05-02 Apple Inc. Application programming interfaces for gesture operations
US9760272B2 (en) 2007-01-07 2017-09-12 Apple Inc. Application programming interfaces for scrolling operations
US10613741B2 (en) 2007-01-07 2020-04-07 Apple Inc. Application programming interface for gesture operations
US10963142B2 (en) 2007-01-07 2021-03-30 Apple Inc. Application programming interfaces for scrolling
US10175876B2 (en) 2007-01-07 2019-01-08 Apple Inc. Application programming interfaces for gesture operations
US11954322B2 (en) 2007-01-07 2024-04-09 Apple Inc. Application programming interface for gesture operations
US9575648B2 (en) 2007-01-07 2017-02-21 Apple Inc. Application programming interfaces for gesture operations
US8661363B2 (en) 2007-01-07 2014-02-25 Apple Inc. Application programming interfaces for scrolling operations
US9665265B2 (en) 2007-01-07 2017-05-30 Apple Inc. Application programming interfaces for gesture operations
US9529519B2 (en) 2007-01-07 2016-12-27 Apple Inc. Application programming interfaces for gesture operations
US9037995B2 (en) 2007-01-07 2015-05-19 Apple Inc. Application programming interfaces for scrolling operations
US9448712B2 (en) 2007-01-07 2016-09-20 Apple Inc. Application programming interfaces for scrolling operations
US10481785B2 (en) 2007-01-07 2019-11-19 Apple Inc. Application programming interfaces for scrolling operations
US8212857B2 (en) 2007-01-26 2012-07-03 Microsoft Corporation Alternating light sources to reduce specular reflection
WO2008091471A1 (en) * 2007-01-26 2008-07-31 Microsoft Corporation Alternating light sources to reduce specular reflection
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display using a three-dimensional vision system
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
EP2191352A4 (en) * 2007-09-04 2016-12-28 Canon Kk Image projection apparatus and control method for same
US10990189B2 (en) 2007-09-14 2021-04-27 Facebook, Inc. Processing of gesture-based user interaction using volumetric zones
US9811166B2 (en) 2007-09-14 2017-11-07 Intellectual Ventures Holding 81 Llc Processing of gesture-based user interactions using volumetric zones
US9058058B2 (en) 2007-09-14 2015-06-16 Intellectual Ventures Holding 67 Llc Processing of gesture-based user interactions activation levels
US10564731B2 (en) 2007-09-14 2020-02-18 Facebook, Inc. Processing of gesture-based user interactions using volumetric zones
US8230367B2 (en) 2007-09-14 2012-07-24 Intellectual Ventures Holding 67 Llc Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US8810803B2 (en) 2007-11-12 2014-08-19 Intellectual Ventures Holding 67 Llc Lens system
WO2009071121A3 (en) * 2007-12-05 2009-08-13 Almeva Ag Interaction arrangement for interaction between a display screen and a pointer object
US20110102320A1 (en) * 2007-12-05 2011-05-05 Rudolf Hauke Interaction arrangement for interaction between a screen and a pointer object
WO2009071121A2 (en) * 2007-12-05 2009-06-11 Almeva Ag Interaction arrangement for interaction between a display screen and a pointer object
US9582115B2 (en) * 2007-12-05 2017-02-28 Almeva Ag Interaction arrangement for interaction between a screen and a pointer object
US8237654B2 (en) * 2007-12-18 2012-08-07 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US20090153476A1 (en) * 2007-12-18 2009-06-18 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US20090189857A1 (en) * 2008-01-25 2009-07-30 Microsoft Corporation Touch sensing for curved displays
US9857915B2 (en) * 2008-01-25 2018-01-02 Microsoft Technology Licensing, Llc Touch sensing for curved displays
US10521109B2 (en) 2008-03-04 2019-12-31 Apple Inc. Touch event model
US9389712B2 (en) 2008-03-04 2016-07-12 Apple Inc. Touch event model
US9323335B2 (en) 2008-03-04 2016-04-26 Apple Inc. Touch event model programming interface
US8836652B2 (en) 2008-03-04 2014-09-16 Apple Inc. Touch event model programming interface
US20130069899A1 (en) * 2008-03-04 2013-03-21 Jason Clay Beaver Touch Event Model
US8560975B2 (en) * 2008-03-04 2013-10-15 Apple Inc. Touch event model
US8723822B2 (en) 2008-03-04 2014-05-13 Apple Inc. Touch event model programming interface
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US9720594B2 (en) 2008-03-04 2017-08-01 Apple Inc. Touch event model
US9690481B2 (en) 2008-03-04 2017-06-27 Apple Inc. Touch event model
US10936190B2 (en) 2008-03-04 2021-03-02 Apple Inc. Devices, methods, and user interfaces for processing touch events
US11740725B2 (en) 2008-03-04 2023-08-29 Apple Inc. Devices, methods, and user interfaces for processing touch events
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US9971502B2 (en) 2008-03-04 2018-05-15 Apple Inc. Touch event model
US9798459B2 (en) 2008-03-04 2017-10-24 Apple Inc. Touch event model for web pages
US10831278B2 (en) 2008-03-07 2020-11-10 Facebook, Inc. Display with built in 3D sensing capability and gesture control of tv
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US20090231281A1 (en) * 2008-03-11 2009-09-17 Microsoft Corporation Multi-touch virtual keyboard
US9939990B2 (en) 2008-03-25 2018-04-10 Lg Electronics Inc. Mobile terminal and method of displaying information therein
US20090247234A1 (en) * 2008-03-25 2009-10-01 Lg Electronics Inc. Mobile terminal and method of displaying information therein
US20110089857A1 (en) * 2008-06-10 2011-04-21 Koninklijke Philips Electronics N.V. Programmable user interface device for controlling an electrical power supplied to an electrical consumer
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
US8308304B2 (en) 2008-06-17 2012-11-13 The Invention Science Fund I, Llc Systems associated with receiving and transmitting information related to projection
US8641203B2 (en) 2008-06-17 2014-02-04 The Invention Science Fund I, Llc Methods and systems for receiving and transmitting signals between server and projector apparatuses
US8267526B2 (en) 2008-06-17 2012-09-18 The Invention Science Fund I, Llc Methods associated with receiving and transmitting information related to projection
US8820939B2 (en) 2008-06-17 2014-09-02 The Invention Science Fund I, Llc Projection associated methods and systems
US20090309826A1 (en) * 2008-06-17 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and devices
US8262236B2 (en) 2008-06-17 2012-09-11 The Invention Science Fund I, Llc Systems and methods for transmitting information associated with change of a projection surface
US8602564B2 (en) 2008-06-17 2013-12-10 The Invention Science Fund I, Llc Methods and systems for projecting in response to position
US8540381B2 (en) 2008-06-17 2013-09-24 The Invention Science Fund I, Llc Systems and methods for receiving information associated with projecting
US8403501B2 (en) 2008-06-17 2013-03-26 The Invention Science Fund I, Llc Motion responsive devices and systems
US8608321B2 (en) 2008-06-17 2013-12-17 The Invention Science Fund I, Llc Systems and methods for projecting in response to conformation
US8384005B2 (en) 2008-06-17 2013-02-26 The Invention Science Fund I, Llc Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface
US8944608B2 (en) 2008-06-17 2015-02-03 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8936367B2 (en) 2008-06-17 2015-01-20 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8376558B2 (en) 2008-06-17 2013-02-19 The Invention Science Fund I, Llc Systems and methods for projecting in response to position change of a projection surface
US8430515B2 (en) 2008-06-17 2013-04-30 The Invention Science Fund I, Llc Systems and methods for projecting
US8857999B2 (en) 2008-06-17 2014-10-14 The Invention Science Fund I, Llc Projection in response to conformation
US8939586B2 (en) 2008-06-17 2015-01-27 The Invention Science Fund I, Llc Systems and methods for projecting in response to position
US8955984B2 (en) 2008-06-17 2015-02-17 The Invention Science Fund I, Llc Projection associated methods and systems
US20100066689A1 (en) * 2008-06-17 2010-03-18 Jung Edward K Y Devices related to projection input surfaces
US8733952B2 (en) 2008-06-17 2014-05-27 The Invention Science Fund I, Llc Methods and systems for coordinated use of two or more user responsive projectors
US8723787B2 (en) 2008-06-17 2014-05-13 The Invention Science Fund I, Llc Methods and systems related to an image capture projection surface
US8068641B1 (en) * 2008-06-19 2011-11-29 Qualcomm Incorporated Interaction interface for controlling an application
US20100008582A1 (en) * 2008-07-10 2010-01-14 Samsung Electronics Co., Ltd. Method for recognizing and translating characters in camera-based image
US20110128386A1 (en) * 2008-08-01 2011-06-02 Hilabs Interactive device and method for use
US20150074617A1 (en) * 2008-08-07 2015-03-12 Digilife Technologies Co., Ltd. Multimedia Playing Device
US9389696B2 (en) * 2008-08-07 2016-07-12 Digilife Technologies Co., Ltd. Multimedia playing device
US20100036988A1 (en) * 2008-08-07 2010-02-11 Chien-Wei Chang Multimedia playing device
USRE45298E1 (en) * 2008-08-07 2014-12-23 Digilife Technologies Co., Ltd Multimedia playing device
US8305345B2 (en) * 2008-08-07 2012-11-06 Life Technologies Co., Ltd. Multimedia playing device
WO2010023348A1 (en) * 2008-08-26 2010-03-04 Multitouch Oy Interactive displays
US20110227876A1 (en) * 2008-08-26 2011-09-22 Multi Touch Oy Interactive Display Device with Infrared Capture Unit
US20110208979A1 (en) * 2008-09-22 2011-08-25 Envault Corporation Oy Method and Apparatus for Implementing Secure and Selectively Deniable File Storage
US8555088B2 (en) 2008-09-22 2013-10-08 Envault Corporation Oy Method and apparatus for implementing secure and selectively deniable file storage
EP2330558A4 (en) * 2008-09-29 2014-04-30 Panasonic Corp User interface device, user interface method, and recording medium
US20100079409A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Touch panel for an interactive input system, and interactive input system incorporating the touch panel
US8464160B2 (en) 2008-09-29 2013-06-11 Panasonic Corporation User interface device, user interface method, and recording medium
US20100269072A1 (en) * 2008-09-29 2010-10-21 Kotaro Sakata User interface device, user interface method, and recording medium
US8810522B2 (en) 2008-09-29 2014-08-19 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
EP2330558A1 (en) * 2008-09-29 2011-06-08 Panasonic Corporation User interface device, user interface method, and recording medium
US20100079493A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
US20100083109A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method
FR2937753A1 (en) * 2008-10-23 2010-04-30 Idealys Entertainment Virtual reading device, e.g. for schools, in which a computer displays content on a screen via capture software and is integrated into a casing, the assembly being driven by a FireWire camera triggered by the user's movements
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
EP2219134A3 (en) * 2009-02-13 2014-07-23 Sony Corporation Information processing apparatus and information processing method
US8566044B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US11163440B2 (en) 2009-03-16 2021-11-02 Apple Inc. Event recognition
US10719225B2 (en) 2009-03-16 2020-07-21 Apple Inc. Event recognition
US9285908B2 (en) 2009-03-16 2016-03-15 Apple Inc. Event recognition
US9965177B2 (en) 2009-03-16 2018-05-08 Apple Inc. Event recognition
US8682602B2 (en) 2009-03-16 2014-03-25 Apple Inc. Event recognition
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US11755196B2 (en) 2009-03-16 2023-09-12 Apple Inc. Event recognition
US9483121B2 (en) 2009-03-16 2016-11-01 Apple Inc. Event recognition
US8279200B2 (en) * 2009-05-19 2012-10-02 Microsoft Corporation Light-induced shape-memory polymer display screen
US20100295820A1 (en) * 2009-05-19 2010-11-25 Microsoft Corporation Light-induced shape-memory polymer display screen
US8223196B2 (en) 2009-06-10 2012-07-17 Disney Enterprises, Inc. Projector systems and methods for producing digitally augmented, interactive cakes and other food products
US20100315491A1 (en) * 2009-06-10 2010-12-16 Disney Enterprises, Inc. Projector systems and methods for producing digitally augmented, interactive cakes and other food products
EP2284668A3 (en) * 2009-06-15 2012-06-27 SMART Technologies ULC Interactive input system and components therefor
US20110032215A1 (en) * 2009-06-15 2011-02-10 Smart Technologies Ulc Interactive input system and components therefor
US9946357B2 (en) 2009-07-07 2018-04-17 Elliptic Laboratories As Control using movements
US20110007227A1 (en) * 2009-07-07 2011-01-13 Canon Kabushiki Kaisha Image projection apparatus and method for controlling the same
US8220936B2 (en) * 2009-07-07 2012-07-17 Canon Kabushiki Kaisha Image projection apparatus with operation image generation based on distance to the projection surface, and method for controlling the same
US8941625B2 (en) * 2009-07-07 2015-01-27 Elliptic Laboratories As Control using movements
US20120206339A1 (en) * 2009-07-07 2012-08-16 Elliptic Laboratories As Control using movements
US8851685B2 (en) * 2009-07-07 2014-10-07 Canon Kabushiki Kaisha Image display apparatus for displaying GUI image and method for controlling the same
US20120249501A1 (en) * 2009-07-07 2012-10-04 Canon Kabushiki Kaisha Image projection apparatus and method for controlling the same
US20110069019A1 (en) * 2009-07-08 2011-03-24 Smart Technologies Ulc Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
US8416206B2 (en) 2009-07-08 2013-04-09 Smart Technologies Ulc Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
EP2284667A1 (en) * 2009-08-07 2011-02-16 Sony Corporation Position detection apparatus and position detection method
US20110050650A1 (en) * 2009-09-01 2011-03-03 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (snr) and image capture method
US8902195B2 (en) 2009-09-01 2014-12-02 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (SNR) and image capture method
US20110057875A1 (en) * 2009-09-04 2011-03-10 Sony Corporation Display control apparatus, display control method, and display control program
US20160224128A1 (en) * 2009-09-04 2016-08-04 Sony Corporation Display control apparatus, display control method, and display control program
US10606380B2 (en) 2009-09-04 2020-03-31 Sony Corporation Display control apparatus, display control method, and display control program
US9342142B2 (en) * 2009-09-04 2016-05-17 Sony Corporation Display control apparatus, display control method, and display control program
US9830004B2 (en) * 2009-09-04 2017-11-28 Sony Corporation Display control apparatus, display control method, and display control program
US8913007B2 (en) * 2009-09-11 2014-12-16 Sony Corporation Display apparatus and control method
US9298258B2 (en) 2009-09-11 2016-03-29 Sony Corporation Display apparatus and control method
US20120218179A1 (en) * 2009-09-11 2012-08-30 Sony Corporation Display apparatus and control method
US9489043B2 (en) 2009-09-15 2016-11-08 Sony Corporation Display device and controlling method
US20120293405A1 (en) * 2009-09-15 2012-11-22 Sony Corporation Display device and controlling method
US8952890B2 (en) * 2009-09-15 2015-02-10 Sony Corporation Display device and controlling method
US20110157047A1 (en) * 2009-12-25 2011-06-30 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
US8810527B2 (en) * 2009-12-25 2014-08-19 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
US8502789B2 (en) 2010-01-11 2013-08-06 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US20110169748A1 (en) * 2010-01-11 2011-07-14 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US10732997B2 (en) 2010-01-26 2020-08-04 Apple Inc. Gesture recognizers with delegates for controlling and modifying gesture recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8654103B2 (en) 2010-02-09 2014-02-18 Multitouch Oy Interactive display
CN102667689A (en) * 2010-02-09 2012-09-12 多点触控有限公司 Interactive display
WO2011098654A1 (en) * 2010-02-09 2011-08-18 Multitouch Oy Interactive display
EP2546696A4 (en) * 2010-03-08 2017-06-14 Dai Nippon Printing Co., Ltd. Small-form-factor display device with touch-panel functionality, and screen used as the display therein
EP2634630A3 (en) * 2010-03-08 2017-06-14 Dai Nippon Printing Co., Ltd. Screens for use as displays of small-sized display devices with touch panel functions, and small-sized display devices with touch panel functions comprising said screens
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
EP2668556A2 (en) * 2011-01-25 2013-12-04 Intui Sense Touch and gesture control device, and related gesture-interpretation method
WO2012101373A3 (en) * 2011-01-25 2014-06-26 Intui Sense Touch and gesture control device, and related gesture-interpretation method
FR2970797A1 (en) * 2011-01-25 2012-07-27 Intui Sense Touch and gesture control device, and related gesture-interpretation method
FR2972544A1 (en) * 2011-03-10 2012-09-14 Intui Sense Robust image acquisition and processing system for an interactive front panel, and associated interactive front panel and device
WO2012120243A3 (en) * 2011-03-10 2014-09-18 Intui Sense Robust image acquisition and processing system for an interactive front panel, and associated interactive front panel and device
WO2012127161A3 (en) * 2011-03-18 2014-09-18 Intui Sense Interactive device robust to cast shadows
FR2972820A1 (en) * 2011-03-18 2012-09-21 Intui Sense Interactive device robust to cast shadows
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
CN102289284A (en) * 2011-08-05 2011-12-21 上海源珅多媒体有限公司 Spherical interactive induction protection device
US20130057492A1 (en) * 2011-09-06 2013-03-07 Toshiba Tec Kabushiki Kaisha Information display apparatus and method
WO2013052880A1 (en) * 2011-10-07 2013-04-11 Qualcomm Incorporated Vision-based interactive projection system
US9030445B2 (en) 2011-10-07 2015-05-12 Qualcomm Incorporated Vision-based interactive projection system
US9626042B2 (en) 2011-10-07 2017-04-18 Qualcomm Incorporated Vision-based interactive projection system
US9395828B2 (en) * 2011-12-09 2016-07-19 Ricoh Company, Ltd. Electronic information board apparatus that displays an image input from an external apparatus
US20130147736A1 (en) * 2011-12-09 2013-06-13 Ricoh Company, Ltd. Electronic information board apparatus, electronic information board system, and method of controlling electronic information board
US9646562B1 (en) 2012-04-20 2017-05-09 X Development Llc System and method of generating images on photoactive surfaces
EP2711807A1 (en) * 2012-09-24 2014-03-26 LG Electronics, Inc. Image display apparatus and method for operating the same
US9250707B2 (en) 2012-09-24 2016-02-02 Lg Electronics Inc. Image display apparatus and method for operating the same
US9804683B1 (en) 2012-10-22 2017-10-31 X Development Llc Method and apparatus for gesture interaction with a photo-active painted surface
US9164596B1 (en) * 2012-10-22 2015-10-20 Google Inc. Method and apparatus for gesture interaction with a photo-active painted surface
US9195320B1 (en) 2012-10-22 2015-11-24 Google Inc. Method and apparatus for dynamic signage using a painted surface display system
US9576551B2 (en) 2012-10-22 2017-02-21 X Development Llc Method and apparatus for gesture interaction with a photo-active painted surface
US9014417B1 (en) 2012-10-22 2015-04-21 Google Inc. Method and apparatus for themes using photo-active surface paint
EP2733657A1 (en) * 2012-11-19 2014-05-21 CSS electronic AG Device for entering data and/or control commands
US20140282271A1 (en) * 2013-03-15 2014-09-18 Jean Hsiang-Chun Lu User interface responsive to operator position and gestures
US10152135B2 (en) * 2013-03-15 2018-12-11 Intel Corporation User interface responsive to operator position and gestures
CN105190481A (en) * 2013-03-15 2015-12-23 英特尔公司 User interface responsive to operator position and gestures
US9241124B2 (en) 2013-05-01 2016-01-19 Lumo Play, Inc. Content generation for interactive video projection systems
US11429190B2 (en) 2013-06-09 2022-08-30 Apple Inc. Proxy gesture recognizer
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
US20150052477A1 (en) * 2013-08-19 2015-02-19 Samsung Electronics Co., Ltd. Enlargement and reduction of data with a stylus
US10037132B2 (en) * 2013-08-19 2018-07-31 Samsung Electronics Co., Ltd. Enlargement and reduction of data with a stylus
WO2015036852A3 (en) * 2013-08-23 2015-08-20 Lumo Play, Inc. Interactive projection effect and entertainment system
WO2015036144A1 (en) * 2013-09-16 2015-03-19 Ameria Gmbh Gesture-controlled rear-projection system
EP2849442A1 (en) * 2013-09-16 2015-03-18 ameria GmbH Gesture-controlled rear-projection system
EP2879000A1 (en) * 2013-10-31 2015-06-03 Funai Electric Co., Ltd. Projector device
CN104750244A (en) * 2013-12-27 2015-07-01 索尼公司 Display control device, display control system, display control method, and program
US20150185824A1 (en) * 2013-12-27 2015-07-02 Sony Corporation Display control device, display control system, display control method, and program
US10013050B2 (en) * 2013-12-27 2018-07-03 Sony Corporation Display control based on user information
US9993733B2 (en) 2014-07-09 2018-06-12 Lumo Interactive Inc. Infrared reflective device interactive projection effect system
JP2016186677A (en) * 2015-03-27 2016-10-27 セイコーエプソン株式会社 Interactive projection system, pointing element, and method for controlling interactive projection system
WO2017005639A1 (en) * 2015-07-03 2017-01-12 Menger, Christian Gesture-sensing system for visualization devices
US10719697B2 (en) * 2016-09-01 2020-07-21 Mitsubishi Electric Corporation Gesture judgment device, gesture operation device, and gesture judgment method
US11480790B2 (en) 2017-06-14 2022-10-25 Hewlett-Packard Development Company, L.P. Display adjustments
WO2018231213A1 (en) * 2017-06-14 2018-12-20 Hewlett-Packard Development Company, L.P. Display adjustments
US11325157B2 (en) * 2017-07-06 2022-05-10 Allgaier Werke Gmbh Device and method for capturing movement patterns of tumbler screening machines
US11073949B2 (en) * 2019-02-14 2021-07-27 Seiko Epson Corporation Display method, display device, and interactive projector configured to receive an operation to an operation surface by a hand of a user
US20220066545A1 (en) * 2019-05-14 2022-03-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Interactive control method and apparatus, electronic device and storage medium
US11385742B2 (en) * 2020-02-17 2022-07-12 Seiko Epson Corporation Position detection method, position detection device, and position detection system
CN114173026A (en) * 2020-09-16 2022-03-11 北京德胜智课教育科技有限公司 Teaching video recording system and teaching video recording method
CN114816206A (en) * 2022-03-15 2022-07-29 联想(北京)有限公司 Data processing method and electronic equipment

Also Published As

Publication number Publication date
US6414672B2 (en) 2002-07-02
JP3968477B2 (en) 2007-08-29
JPH1124839A (en) 1999-01-29

Similar Documents

Publication Title
US6414672B2 (en) Information input apparatus
JP3321053B2 (en) Information input device, information input method, and correction data generation device
KR100588042B1 (en) Interactive presentation system
KR100298240B1 (en) Information input device, information input method and solid state imaging device
JP3257585B2 (en) Imaging device using space mouse
US8693732B2 (en) Computer vision gesture based control of a device
EP2287708B1 (en) Image recognizing apparatus, operation determination method, and program
US8971629B2 (en) User interface system based on pointing device
US20050281475A1 (en) Method and system for reducing effects of undesired signals in an infrared imaging system
JP2012208926A (en) Detection device, input device, projector and electronic apparatus
JPWO2018003861A1 (en) Display device and control device
JP2000222097A (en) Solid state image pickup device
US20090115971A1 (en) Dual-mode projection apparatus and method for locating a light spot in a projected image
WO2012070950A1 (en) Camera-based multi-touch interaction and illumination system and method
JPH03167621A (en) Computer input system for changing a display-visible image generated by a computer
US20140053115A1 (en) Computer vision gesture based control of a device
JPH1157216A (en) Game device
JP2009140498A (en) Information input/output device and information input/output method
JPH08331667A (en) Pointing system
KR101989998B1 (en) Input system for a computer incorporating a virtual touch screen
JP4712754B2 (en) Information processing apparatus and information processing method
JP4687820B2 (en) Information input device and information input method
JP2003067108A (en) Information display device and operation recognition method for the same
JPH1153111A (en) Information input/output device
TW201128455A (en) Signaling device position determination

Legal Events

Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REKIMOTO, JUNICHI;MATSUSHITA, NOBUYUKI;REEL/FRAME:009467/0738;SIGNING DATES FROM 19980903 TO 19980908

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12