WO2005003948A1 - Control system and control method - Google Patents

Control system and control method

Info

Publication number
WO2005003948A1
WO2005003948A1 (PCT/JP2004/006643)
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
input part
shape
instruction
Prior art date
Application number
PCT/JP2004/006643
Other languages
French (fr)
Japanese (ja)
Inventor
Shigeru Enomoto
Yoshinori Washizu
Ryuji Yamamoto
Munetaka Tsuda
Tomonori Shimomura
Junichi Rekimoto
Original Assignee
Sony Computer Entertainment Inc.
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc., Sony Corporation filed Critical Sony Computer Entertainment Inc.
Publication of WO2005003948A1 publication Critical patent/WO2005003948A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to control technology for electronic devices and the like, and particularly to the shape or motion of an input part including a part of a user's body or an object operated by the user, or the distance to the input part.
  • The present invention relates to a control system and a control method having a user interface through which a user's instruction can be input by such a shape, motion, or distance.
  • The present invention has been made in view of such a situation, and its object is to provide a user interface with excellent operability.
  • the control system includes a detection unit that detects a shape or an operation of one or more input parts including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part.
  • The control system also includes an analysis unit that determines the user's instruction by analyzing the shape or motion of the input part detected by the detection unit, or the distance to the input part, and a control unit that executes the function corresponding to the user's instruction determined by the analysis unit.
  • the detection unit may be an imaging device capable of acquiring distance information to an input part.
  • the detection unit may be an input device that includes a plurality of electrodes and detects a change in capacitance between the electrodes due to approach of an input site.
  • FIG. 1 shows a configuration of a control system 10 according to the first embodiment.
  • The control system 10 includes an input device 20 for inputting instructions from a user, a control device 40 for controlling the operation of an application or the like according to the instructions input from the input device 20, and a display device 30 for displaying images output from the control device 40.
  • The control system 10 of the present embodiment provides a user interface (hereinafter referred to as a "gestural interface") that allows the user to input instructions by actions (gestures) using a part of the body such as a finger, hand, foot, or head.
  • In the present embodiment, the input device 20 has the function of a detection unit that detects the shape or motion of an input part including at least a part of the user's body, or the distance to the input part.
  • FIG. 2 shows a configuration example of the input device 20 and the display device 30 according to the embodiment.
  • In the example shown in FIG. 2, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a cathode ray tube (CRT) display device is used as the display device 30, and an imaging device 22 having a distance measuring function is used as the input device 20.
  • the control device 40 analyzes the operation of the user photographed by the imaging device 22 by image processing, determines a gesture performed by the user, and acquires a user instruction.
  • According to this configuration example, a wide range of the user's body can be imaged and its motion determined, so the user can give instructions by gestures using not only a finger but also a hand, a foot, or the like.
  • This method is suitable when the user makes a gesture at a position some distance from the imaging device 22.
  • When adopting a gestural interface that does not depend on the distance to the user, an imaging device without a distance measuring function may be used as the input device 20; however, as described later, to provide a gestural interface that lets the user handle objects on the display screen with a finger or the like, it is preferable to use an imaging device 22 having a distance measuring function.
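  • As a rough illustration of how an image with distance information might be reduced to the position of an input part and the distance to it, the following Python sketch thresholds a depth map and takes the centroid of the nearest surface; the depth format, the 800 mm working range, and the 50 mm depth band are hypothetical assumptions used for illustration only, not details of the embodiment.

    import numpy as np

    def extract_nearest_region(depth_map, max_range_mm=800.0, band_mm=50.0):
        # depth_map: 2-D array of distances in millimetres, one value per pixel.
        valid = (depth_map > 0) & (depth_map < max_range_mm)
        if not valid.any():
            return None, None                            # nothing within working range
        nearest = depth_map[valid].min()
        mask = valid & (depth_map < nearest + band_mm)   # keep only the closest surface
        ys, xs = np.nonzero(mask)
        centroid = (float(xs.mean()), float(ys.mean()))  # image position of the input part
        return centroid, float(nearest)                  # position and distance to it

    frame = np.full((240, 320), 1500.0)            # synthetic frame: background 1.5 m away
    frame[100:140, 150:200] = 400.0                # a "hand" region 400 mm from the camera
    print(extract_nearest_region(frame))           # ((174.5, 119.5), 400.0)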
  • FIG. 3 shows another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • a projector 36 that projects an image on a screen 38 is used as the display device 30, and an imaging device 22 having a distance measuring function is used as the input device 20.
  • In the example of FIG. 3, an image is projected by a projector 36 provided behind and above the user onto a transparent or translucent screen 38 of glass or the like provided in front of the user.
  • A user making gestures toward the image displayed on the screen 38 is photographed by the imaging device 22 provided on the opposite side of the screen 38.
  • Since the imaging device 22 can be placed at a position away from the screen 38, the distance to the part of the user's body can be detected with high accuracy even when the user makes gestures near the screen 38.
  • FIG. 4 shows another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • In the example shown in FIG. 4, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a touch panel 24 provided on the inner side of the display screen 34 of the display device 32 is used as the input device 20.
  • Alternatively, an image may be projected onto the surface of the touch panel 24 by a projector.
  • The touch panel 24 may be of any type, such as a resistive pressure-sensitive type or an infrared detection type. According to this configuration example, the user can input instructions while directly touching objects and the like displayed on the display screen 34.
  • FIG. 5 shows still another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • In the example shown in FIG. 5, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a non-contact input device 26 provided on the inner side of the display screen 34 of the display device 32 is used as the input device 20.
  • Alternatively, an image may be projected onto the surface of the non-contact input device 26 by a projector.
  • The non-contact input device 26 is an input device that can detect, when an object such as the user's fingertip approaches, the shape of the object and the distance to the object; for example, the input device disclosed in Japanese Patent Application Laid-Open No. 2002-342033 can be used.
  • The non-contact input device disclosed in Japanese Patent Application Laid-Open No. 2002-342033 includes a plurality of linear electrodes arranged vertically and horizontally; when a conductive object such as the user's fingertip approaches the electrodes, it detects the change in capacitance corresponding to the degree of approach and obtains three-dimensional position information of the object near the input device.
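  • As a rough sketch of how readings from such an electrode grid might yield three-dimensional position information, the following Python fragment takes a weighted centroid over a grid of capacitance changes; the grid layout, the gain constant, and the peak-to-distance mapping are hypothetical assumptions and not the method of the cited publication.

    import numpy as np

    def estimate_position(cap_delta, gain_mm=40.0):
        # cap_delta: change in capacitance at each row/column electrode crossing.
        total = cap_delta.sum()
        if total <= 0:
            return None                               # nothing near the surface
        ys, xs = np.indices(cap_delta.shape)
        x = float((xs * cap_delta).sum() / total)     # weighted centroid gives X
        y = float((ys * cap_delta).sum() / total)     # weighted centroid gives Y
        z = gain_mm / (1.0 + float(cap_delta.max()))  # stronger peak, smaller distance
        return x, y, z

    grid = np.zeros((16, 16))
    grid[5:8, 9:12] = 3.0                             # fingertip near electrodes around (10, 6)
    print(estimate_position(grid))                    # (10.0, 6.0, 10.0)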
  • According to this configuration example, the non-contact input device 26 can be provided close to the display screen 34, and the user's motion and the shape of body parts near the display screen 34 can be detected with high accuracy, so the user can input instructions with an input part such as a finger near the image while viewing the image displayed on the display screen 34.
  • With the non-contact input device 26, only the part of the user's body that has approached the vicinity of the input device is detected, which eliminates the need to extract a specific part for motion analysis and simplifies the processing.
  • Since shape and distance can be detected by mere approach, instructions can be input without touching the display screen 34, so even users who are reluctant to touch the display screen 34 directly can use it comfortably.
  • Furthermore, since approach can be detected before the user touches the display screen 34, there is no need to press the display screen 34 strongly, and a user interface with good responsiveness and an excellent operational feel can be provided.
  • Which of the plurality of configuration examples described above is adopted may be determined according to the environment of the place where the control system 10 is installed, the type of application or content to be controlled, and the like. For example, in the case of an application in which a plurality of users share one display device 30, or an application that plays back content such as a movie, the user is assumed to input instructions at a relatively large distance from the display device 30, so the configuration example shown in FIG. 2 or FIG. 3 may be adopted. In the case of an application used personally by a single user, for example when editing data such as images or documents, the distance between the user and the display device 30 is assumed to be relatively short, so the configuration example shown in FIG. 4 or FIG. 5 may be adopted. When the control system 10 is constructed by combining a plurality of configuration examples, which type of gestural interface to adopt may be selected for each application or content, and the input device 20 may be switched as appropriate.
  • FIG. 6 shows an internal configuration of the control device 40.
  • In terms of hardware, this configuration can be realized by the CPU, memory, and other LSIs of an arbitrary computer, and in terms of software it is realized by programs loaded into the memory and the like; here, functional blocks realized by their cooperation are drawn.
  • It will therefore be understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the control device 40 includes an acquisition unit 42 that acquires an input signal detected by the input device 20, an analysis unit 44 that analyzes a user operation from the input signal acquired by the acquisition unit 42, and determines a user instruction, And a control unit 46 for executing a function corresponding to the user's instruction determined by the analysis unit 44.
  • When the input device 20 is an imaging device with a distance measuring function, the analysis unit 44 acquires the image with distance information captured by the imaging device and determines the user's motion by image processing.
  • The analysis unit 44 may determine the user's motion by extracting a part of the user's body, such as the head, eyes, hands, fingers, or feet, using a shape recognition technique and analyzing the motion of the extracted body part.
  • When the input device 20 is a touch panel, the analysis unit 44 may determine the user's motion by analyzing the shape of the input signal and its change over time.
  • When the input device 20 is a non-contact input device, the analysis unit 44 may determine the user's motion by analyzing the shape and distance of the input signal and their change over time.
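  • The division of labour among the acquisition unit 42, the analysis unit 44, and the control unit 46 could be organized roughly as in the following Python sketch, with one analyzer per type of input device 20; the class and instruction names are hypothetical and the recognition details are omitted.

    from dataclasses import dataclass

    @dataclass
    class Instruction:
        kind: str        # e.g. "point", "open_fingers", "close_fingers"
        position: tuple  # screen coordinates the instruction applies to

    class DepthCameraAnalyzer:
        def analyze(self, signal):
            # signal: depth image; extract a body part and track it (details omitted)
            return Instruction("point", signal["hand_xy"])

    class TouchPanelAnalyzer:
        def analyze(self, signal):
            # signal: contact shape over time; classify the traced gesture
            return Instruction("point", signal["contact_xy"])

    class Controller:
        # Executes the function mapped to each instruction (role of the control unit 46).
        def __init__(self):
            self.handlers = {"point": lambda pos: print("select object at", pos)}
        def execute(self, instruction):
            self.handlers.get(instruction.kind, lambda pos: None)(instruction.position)

    analyzer, controller = TouchPanelAnalyzer(), Controller()
    controller.execute(analyzer.analyze({"contact_xy": (120, 80)}))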
  • FIG. 7 shows hardware components of the control device 40.
  • The control device 40 includes a CPU 120, an input interface 122, a display interface 124, a memory 130, a hard disk 132, and a drive device 128. These components are electrically connected by a signal transmission path such as a bus 126.
  • the input interface 122 acquires an input signal detected by the input device 20.
  • the display interface 124 outputs an image signal to be displayed on the display device 30.
  • the hard disk 132 is a large-capacity magnetic storage device, and stores various files.
  • the recording medium 140 records a program for causing the CPU 120 to realize the functions of the control device 40 described above. When the recording medium 140 is inserted into the drive device 128, the program is read into the memory 130 or the hard disk 132, and the CPU 120 performs the control processing of the present embodiment by the read program.
  • the recording medium 140 is a computer-readable medium such as a CD-ROM, a DVD, and an FD.
  • Here, an example in which the program is recorded on the recording medium 140 has been described; in another example, the program may be transmitted from an external server, whether wirelessly or by wire.
  • In the hardware configuration shown in FIG. 7, the program need only cause the computer to realize the control functions of the present embodiment, and it will be understood by those skilled in the art that the program may be stored in the hard disk 132 in advance as well as supplied from the outside.
  • The function of the control unit 46 may be realized by an operating system (OS) executed by the CPU or the like of the computer, or by an application for input/output control.
  • For example, in the case of an OS that employs a GUI such as a window system, the user's instruction is notified to the application of the topmost window near the position approached by an input part such as the user's finger, and that application may execute the function associated with the user's instruction.
  • If there is no window at that position, the OS or the input/output control application may execute the function associated with the instruction.
  • Whereas an instruction is conventionally input with a single pointer, in the present embodiment the user can give an instruction using a plurality of input parts, for example fingers, hands, and feet.
  • FIG. 8 and FIG. 9 are diagrams for explaining an example of inputting an instruction by the movement of a plurality of fingers of the user.
  • With the thumb and index finger closed as shown in FIG. 8, the user brings the fingers close to the icon 200 displayed on the screen of the display device 30, and then performs an operation of opening the thumb and index finger as shown in FIG. 9.
  • The acquisition unit 42 sends the input signal detected by the input device 20 to the analysis unit 44, and the analysis unit 44 analyzes the movement of the user's fingers and determines that the user has performed the operation of opening the fingers.
  • For example, the analysis unit 44 extracts the user's hand using a technique such as shape recognition, tracks the movement of the fingers, and determines that the user has performed the operation of opening the fingers.
  • Alternatively, when a set of input parts approaching or touching coordinates near the icon 200 on the display screen divides into two, and the two move away from each other, the analysis unit 44 determines that the user has performed the operation of opening the fingers; it may make this determination only when a group of input parts divides into two and the two input parts move in directions away from each other.
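  • A minimal Python sketch of that last determination follows, assuming each frame supplies the (x, y) coordinates of the detected input parts; the simple two-way split and the 20-pixel growth threshold are hypothetical assumptions used only for illustration.

    import numpy as np

    def split_into_two(points):
        # Split contact points into two groups along their widest axis (tiny 2-way split).
        pts = np.asarray(points, dtype=float)
        axis = pts[:, 0] if np.ptp(pts[:, 0]) >= np.ptp(pts[:, 1]) else pts[:, 1]
        cut = axis.mean()
        a, b = pts[axis <= cut], pts[axis > cut]
        return (a, b) if len(b) else (pts, None)

    def is_open_fingers(prev_points, curr_points, min_growth=20.0):
        # True if one cluster of input parts has divided into two that are moving apart.
        a0, b0 = split_into_two(prev_points)
        a1, b1 = split_into_two(curr_points)
        if b1 is None:
            return False
        d0 = 0.0 if b0 is None else float(np.linalg.norm(a0.mean(0) - b0.mean(0)))
        d1 = float(np.linalg.norm(a1.mean(0) - b1.mean(0)))
        return d1 - d0 > min_growth                # separation grew: fingers opened

    closed = [(100, 100), (104, 102)]              # thumb and index finger together
    opened = [(80, 100), (130, 102)]               # the same fingers after opening
    print(is_open_fingers(closed, opened))         # True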
  • The control unit 46 executes a function associated with the operation of opening the fingers. For example, the control unit 46 may execute a function of activating an application associated with the icon 200 displayed near the user's fingers. If the icon is associated with a file, an application capable of handling the file may be started and a function of opening the file may be executed. By associating the "open fingers" operation with functions such as "open an application" or "open a file" in this way, the user's operation and the application's behavior correspond intuitively, so a more intuitive and approachable user interface can be realized.
  • The operation of opening the fingers may also be associated with functions such as "start" and "determine".
  • each application may implement a function corresponding to the operation of opening the finger. For example, in an image processing application, when a user performs an operation of opening a finger on an image, a function of enlarging the portion or extending the portion in the direction in which the finger is opened may be executed.
  • Conversely, when the user performs an operation of closing the fingers, the control unit 46 may execute the function associated with the finger-closing operation.
  • For example, the control unit 46 may execute a function of terminating the application associated with the icon 200 or the window displayed near the user's fingers, or a function of closing the file associated with the icon 200 or the window.
  • In an image processing application, when the user performs the operation of closing the fingers on an image, a function of reducing that portion, or shrinking it in the direction in which the fingers are closed, may be executed.
  • The control unit 46 may also receive an instruction given with a plurality of input parts such as the user's fingers and hands, and execute the function associated with that operation.
  • For example, in a book browsing application, when the user performs an operation of pinching a corner of a displayed book page, that page may be stored and a "bookmark by pinching" function may be executed.
  • Likewise, when the user performs a pinching operation on a displayed item, a function of "picking up" the item may be executed.
  • FIG. 10 is a diagram for explaining an example of inputting an instruction based on the shape of the user's hand. As shown in FIG. 10, it is assumed that the user places an open hand, palm down, on the display screen. If the input device 20 is a camera with a distance measuring function, the analysis unit 44 determines the shape of the hand by extracting the user's hand using a shape recognition technique. If the input device 20 is a touch panel or a non-contact input device, the analysis unit 44 determines the shape of the object approaching or touching the display screen by extracting feature points from it using a known technique, or by evaluating it with a predetermined evaluation formula.
  • a general hand shape may be stored in a database, and the shape of the hand may be determined by extracting a matching shape from the database.
  • Alternatively, the shape may be determined from the area of the detected object. For example, when the user places a hand on the display screen, the area is largest when the hand is open and smallest when the hand is clenched; the shape of the hand may be determined by utilizing this fact.
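  • A minimal Python sketch of such an area-based decision is shown below; the cell counts used as thresholds are hypothetical and would in practice depend on the resolution of the input device 20.

    import numpy as np

    OPEN_HAND_MIN_CELLS = 900   # hypothetical area thresholds, in sensor cells
    FIST_MAX_CELLS = 300

    def classify_hand_by_area(contact_mask):
        # Rough shape decision using only the area of the detected object.
        area = int(np.count_nonzero(contact_mask))
        if area >= OPEN_HAND_MIN_CELLS:
            return "open_hand"      # largest footprint: palm placed flat on the screen
        if area <= FIST_MAX_CELLS:
            return "fist"           # smallest footprint: hand clenched
        return "other"

    mask = np.zeros((64, 64), dtype=bool)
    mask[10:50, 10:40] = True            # 40 x 30 = 1200 cells approaching or touching
    print(classify_hand_by_area(mask))   # open_hand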
  • The control unit 46 then executes a function corresponding to the shape, for example a function of "quitting" the application corresponding to the window 210 displayed topmost at the position where the palm is placed.
  • FIG. 11 is a diagram illustrating an example of mapping an object to a hand shape.
  • The user creates a specific shape with one hand, here the left hand, and with the other hand, here the right hand, moves a finger from the position of the object 220 on the display screen toward the vicinity of the left hand.
  • the method of determining the shape of the hand is the same as in the example described above.
  • the control unit 46 executes a function of storing a file corresponding to the object 220 moved by the finger of the right hand in a storage location of the hard disk 132 corresponding to the shape of the left hand.
  • The storage location corresponding to the shape of the left hand may be a directory or folder in the file system, or a virtual folder, and a file may also be associated directly with the shape of the hand. It is not necessary to associate only one storage location with one shape; for example, a storage location may be associated with each finger of an open hand.
  • FIG. 12 shows an example of a table for mapping an object to a hand shape.
  • the table 230 is held in the memory 130 or the hard disk 132 of the control device 40.
  • the table 230 has a shape column 232 and a storage location column 234.
  • The shape column 232 holds an image, parameters, or the like representing a hand shape. When the hand shape is recognized as an image, the file name of the image file may be stored; when the hand shape is recognized as parameters using feature points or an evaluation formula, the parameters, or the name of the file in which the parameters are stored, may be held.
  • the storage location column 234 holds the storage location of the object.
  • The hand shape and the storage location corresponding to the shape may be registered in advance, or, if the hand shape is an unregistered shape when the user performs the operation shown in FIG. 11, the control unit 46 may register the shape and a storage location in the table 230. At this time, the user may specify the storage location, or an appropriate storage location may be assigned automatically.
  • When a file is associated directly with a hand shape, the storage location column 234 holds the file name of that file.
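  • In the simplest case the table 230 could be held as a mapping like the following Python sketch; the shape labels, paths, and auto-assignment rule are hypothetical placeholders, and a real implementation would key the table on the recognized image or parameters described above.

    shape_table = {
        "open_hand":  "/home/user/documents/",
        "peace_sign": "/home/user/pictures/",
        "fist":       "memo.txt",       # a shape may also map directly to a file
    }

    def store_object(shape, file_name, table=shape_table):
        location = table.setdefault(shape, "/home/user/unsorted/")  # auto-assign if new
        print(f"store {file_name} -> {location}")
        return location

    def retrieve_objects(shape, table=shape_table):
        return table.get(shape)

    store_object("peace_sign", "photo_001.jpg")
    print(retrieve_objects("peace_sign"))   # /home/user/pictures/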
  • FIG. 13 is a view for explaining an example of retrieving an object stored by the operation shown in FIG. 11.
  • The user creates with one hand, here the left hand, the hand shape corresponding to the storage location of the object to be retrieved, and with the other hand, here the right hand, performs an operation of moving a finger away from the vicinity of the left hand.
  • the control unit 46 specifies the object stored in the storage location corresponding to the shape of the left hand, and displays it on the display screen.
  • the control unit 46 may execute a function of opening a file corresponding to the object.
  • FIG. 14 is a diagram for explaining an example of an instruction input according to a distance.
  • the control unit 46 displays a moving image in which a fish is swimming on the display device.
  • When the user's hand approaches the display screen, the control unit 46 switches the display to a moving image in which the fish flee from the vicinity of the user's hand.
  • The speed at which the user's hand approaches may be calculated; when the speed is high, the threshold distance for switching the display may be increased so that the fish start to flee while the hand is still far away, and when the speed is low, the threshold may be decreased so that the fish do not flee until the hand comes close.
  • Alternatively, when the distance to the user's hand is larger than a predetermined value, the control unit 46 may display a moving image in which the fish swim freely, and when the distance is smaller than the predetermined value, it may display a moving image in which the fish swim while avoiding the position of the user's hand. Since the non-contact input device can accurately detect distance information near the display screen, it is preferable in this example to use the non-contact input device as the input device 20.
  • When an imaging device is used as the input device 20, if the user is too far from the imaging device, the user's instruction may not be determined accurately because of the limits of the distance measuring function and the accuracy of the image processing. When a non-contact input device is used as the input device 20, the capacitance does not change, and nothing can be detected, unless a part such as the user's hand approaches the input device 20. Since the distance at which the user can input an instruction thus has an upper limit that depends on the characteristics of the input device 20, the moving image of swimming fish described above may be used as a means of indicating to the user whether an instruction can be input.
  • That is, while the user is at a distance at which an instruction cannot be input, a moving image of the fish swimming freely may be displayed to indicate to the user that input is not possible; when the user approaches the input device and reaches a distance at which an instruction can be input, a moving image in which the fish swim so as to avoid the user's hand may be displayed to indicate to the user that input is possible.
  • the control unit 46 previously holds a distance threshold value for determining whether or not an instruction input to the input device 20 is possible.
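  • The speed-dependent switching threshold described above might look like the following Python sketch; the base threshold and the speed gain are hypothetical tuning constants, not values from the embodiment.

    def fish_should_flee(distance_mm, approach_speed_mm_s,
                         base_threshold_mm=150.0, speed_gain=0.5):
        # A fast-approaching hand raises the threshold so the fish flee while the hand
        # is still far away; a slowly approaching hand must come much closer.
        threshold = base_threshold_mm + speed_gain * max(approach_speed_mm_s, 0.0)
        return distance_mm < threshold

    print(fish_should_flee(distance_mm=300, approach_speed_mm_s=400))  # True  (fast hand)
    print(fish_should_flee(distance_mm=300, approach_speed_mm_s=20))   # False (slow hand)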
  • FIG. 15 and FIG. 16 are diagrams for explaining the function of moving the object displayed on the screen.
  • the user tries to move the object displayed on the display device 30 by shifting it with a finger.
  • the control unit 46 detects the movement amount, speed, and acceleration of a part such as a user's finger, and determines the movement amount of the object accordingly.
  • The weight of each object, that is, a virtual energy value required to move the object, is set in advance, and the moving state of the object is controlled based on this energy value and on the amount of movement, speed, or acceleration of the finger when the user moves the finger to move the object.
  • As shown in FIG. 15, for a light object, the control unit 46 moves the object so that it follows even a quick movement of the user's finger; as shown in FIG. 16, for a heavy object, the control unit 46 does not move the object when the finger is moved quickly.
  • A heavy object starts to move when the user moves the finger slowly, and gradually follows the movement of the finger. This makes it possible to simulate a moving state reflecting actual static and dynamic frictional resistance, and to give the user an operational feel that reflects the weight of the object.
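  • One possible way to approximate this weight-dependent behaviour is sketched below in Python; the grip and drag constants and the slipping rule are a hypothetical model chosen only to reproduce the behaviour described, not the actual control logic of the control unit 46.

    def object_displacement(finger_speed, finger_displacement, weight,
                            grip=4000.0, drag=2.0):
        # weight: virtual energy value needed to move the object (heavier = larger).
        # grip:   a finger faster than grip / weight "slips off", so a quick motion
        #         fails to move a heavy object (static friction analogue).
        # drag:   heavier objects follow the finger less closely (dynamic friction).
        if finger_speed > grip / weight:
            return 0.0                             # finger too fast for this weight
        follow = min(1.0, drag / weight)           # fraction of the finger motion followed
        return finger_displacement * follow

    print(object_displacement(finger_speed=800, finger_displacement=120, weight=1))   # 120.0
    print(object_displacement(finger_speed=800, finger_displacement=120, weight=50))  # 0.0
    print(object_displacement(finger_speed=40,  finger_displacement=120, weight=50))  # 4.8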
  • the technology described in this example can be used in a user interface using a conventional pointing device.
  • the control unit 46 may determine the magnitude of the force applied to the object by the user and control the moving state of the object according to the magnitude of the force.
  • the magnitude of the force may be determined based on the distance between the input device 20 and the user's hand. For example, the closer the user's hand is to the display screen, the more force may be applied to the object.
  • the magnitude of the force may be determined based on the magnitude of the pressing force.
  • the magnitude of the force may be determined based on the degree of pressing of a pressure-sensitive mouse or the like.
  • FIG. 17 is a diagram for explaining a paper handling function.
  • the control unit 46 displays an image of a book opened on the display device 30.
  • The control unit 46 turns the page gradually, following the movement of the user's finger, and when the page is moved past a predetermined position, displays a moving image in which the page turns over to the next page.
  • When the finger is moved too quickly, the control unit 46 displays an image in which the paper cannot follow the movement of the finger and, after turning partway, falls back. As a result, the user can be given a sense of presence as if actually reading a book.
  • the control unit 46 may hold in advance a threshold value of the speed used to determine whether the paper can follow the movement of the user.
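  • A minimal Python sketch of such a threshold-based page response follows; the turn point and the maximum follow speed are hypothetical constants used only for illustration.

    def page_turn_result(finger_speed, page_position,
                         turn_point=0.5, max_follow_speed=1.2):
        # page_position: how far across the book the page has been dragged (0.0 - 1.0).
        # turn_point:    past this fraction the page falls onto the next page.
        # max_follow_speed: above this finger speed the paper cannot keep up.
        if finger_speed > max_follow_speed:
            return "page turns partway and falls back"
        if page_position >= turn_point:
            return "page turns over to the next page"
        return "page follows the finger"

    print(page_turn_result(finger_speed=0.4, page_position=0.7))  # turns to the next page
    print(page_turn_result(finger_speed=2.5, page_position=0.7))  # cannot follow, falls back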
  • When the user performs an operation such as crumpling, pinching, or tearing the paper, the control unit 46 may display the paper as wrinkled or torn.
  • As another example of instruction input according to the amount of movement, speed, or acceleration, when the user moves an object with a finger or the like and it passes over another object, a predetermined process may be performed according to the moving speed; for example, a function of opening the file corresponding to the moved object with the application corresponding to the object passed over may be executed.
  • FIG. 18 is a diagram for describing an example of handling a three-dimensional object.
  • The control unit 46 rotates the three-dimensional object 240 in the same direction as the direction in which the user turns his or her hand.
  • With the gestural interface of the present embodiment, an operational feel can be given as if a three-dimensional object were actually being handled by hand, and the operability of a user interface for handling three-dimensional space can be dramatically improved.
  • the control system 10 of the present embodiment and the internal configurations of the input device 20, the display device 30, and the control device 40 are the same as those of the first embodiment.
  • a configuration example using the non-contact input device 26 shown in FIG. 5 will be mainly described, but the same applies to a case where another configuration example is used.
  • When the non-contact input device 26 that detects changes in capacitance is used, placing a conductive object on the display screen and having the user touch it changes the capacitance, so the shape of the object is detected.
  • Utilizing this, a user interface can be constructed by associating predetermined functions with the shapes and movements of conductive objects.
  • FIG. 19 is a diagram for explaining an example in which an instruction is input based on the shape and movement of an object.
  • a window 250 of a music reproduction application controlled by the control unit 46 is displayed on the display device 30.
  • When the user operates the knob 262 of a volume controller 260 placed on the display screen, the analysis unit 44 analyzes the shape of the input signal, detects that it is the volume controller 260, analyzes the movement of the knob 262, and notifies the control unit 46.
  • the control unit 46 controls the volume in the music playback application according to the amount of movement of the knob 262.
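  • A rough Python sketch of mapping the detected knob movement to a volume value follows; the marker-based angle estimate and the angle range are hypothetical assumptions rather than details of the embodiment.

    import math

    def knob_angle(marker_offset):
        # marker_offset: (x, y) of a marker on the conductive knob, relative to its
        # centre, taken from the detected shape of the knob 262.
        return math.degrees(math.atan2(marker_offset[1], marker_offset[0]))

    def knob_to_volume(angle_deg, min_angle=-135.0, max_angle=135.0):
        # Map the knob angle onto a 0-100 volume value for the music application.
        clamped = max(min_angle, min(max_angle, angle_deg))
        return round(100 * (clamped - min_angle) / (max_angle - min_angle))

    print(knob_to_volume(knob_angle((3.0, 3.0))))   # 45 degrees of rotation -> volume 67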
  • Since the non-contact input device that detects changes in capacitance detects only conductive objects, for an insulating object a specific shape may be drawn with conductive wire on its bottom surface, and that shape may be associated with a predetermined function.
  • The control system 10 corresponds to an electronic device such as a personal computer.
  • a non-contact input device 26 may be used as the input device 20, and the display device 30 and the input device 20 may be provided on the tabletop of the table so that a game or the like can be enjoyed thereon.
  • Alternatively, the display device 30 and the input device 20 may be provided on the floor of a passage or the like to display the footprints of a walking user, or to guide the user to a destination using images or light.
  • the present invention can be applied to a user interface for controlling an electronic device or the like.
  • FIG. 1 is a diagram showing a configuration of a control system according to an embodiment.
  • FIG. 2 is a diagram showing a configuration example of an input device and a display device according to an embodiment.
  • FIG. 3 is a diagram showing another configuration example of the input device and the display device according to the embodiment.
  • FIG. 4 is a diagram showing still another configuration example of the input device and the display device according to the embodiment.
  • FIG. 5 is a diagram showing still another configuration example of the input device and the display device according to the embodiment.
  • FIG. 6 is a diagram showing an internal configuration of a control device.
  • FIG. 7 is a diagram showing hardware components of the control device.
  • FIG. 8 is a diagram for explaining an example of inputting an instruction by movement of a plurality of fingers of a user.
  • FIG. 9 is a diagram for explaining an example in which an instruction is input by movement of a plurality of fingers of a user.
  • FIG. 10 is a diagram for explaining an example of inputting an instruction according to the shape of a user's hand.
  • FIG. 11 is a diagram illustrating an example of mapping an object to a hand shape.
  • FIG. 12 is a diagram showing an example of a table for mapping objects to hand shapes.
  • FIG. 13 is a diagram for explaining an example of retrieving an object stored by the operation shown in FIG. 11.
  • FIG. 14 is a diagram for explaining an example of an instruction input according to a distance.
  • FIG. 15 is a diagram for explaining a function of moving an object displayed on a screen.
  • FIG. 16 is a diagram for explaining a function of moving an object displayed on a screen.
  • FIG. 17 is a diagram for explaining a function for handling paper.
  • FIG. 18 is a diagram for describing an example of handling a three-dimensional object.
  • FIG. 19 is a diagram for explaining an example of inputting an instruction based on the shape and movement of an object.
  • Control system 10 "Control system, 20" Input device, 22 Image sensor, 24 Touch panel, 26 Non-contact type input device, 30 Display device, 40 Control device, 42 ⁇ Acquisition unit, 44 ⁇ Analysis unit, 46 ⁇ Control unit.

Abstract

A user interface is provided which is excellent in operability. In a control system, a noncontact input device (26) is provided on the inner side of a display screen (34) of a display device (32); it detects the shape or action of an input part including at least a part of the body of a user or at least a part of an object operated by the user, or the distance to the input part. An analyzing part analyzes the shape or action of the input part, or the distance thereto, detected by the noncontact input device (26) to determine the user's instruction. A control part executes a function corresponding to the user's instruction determined by the analyzing part.

Description

Specification
Control system and control method
Technical field
[0001] The present invention relates to control technology for electronic devices and the like, and more particularly to a control system and a control method having a user interface through which a user's instruction can be input by means of the shape or motion of an input part, including a part of the user's body or an object operated by the user, or by the distance to the input part.
Background art
[0002] With the development of information technology, a large number of electronic devices with advanced functions have appeared. As the functions of electronic devices become more sophisticated, the instructions users must give to have those functions executed have also diversified, and the need for a user interface (UI) that is easy to understand and excellent in operability is increasing. For example, on personal computers (PCs), the character-based user interface in which the user types command character strings on a keyboard was formerly the mainstream; today, the graphical user interface (GUI), in which images such as icons and windows are presented on the display screen and the user inputs instructions by operating an on-screen pointer with a pointing device such as a mouse, is dominant.
Disclosure of the invention
Problems to be solved by the invention
[0003] With the advent of the GUI, users were freed from the trouble of memorizing and typing command character strings to make a PC execute a desired function. However, although PCs have spread widely among the general public, some users, particularly children and elderly people, are not accustomed to operating a mouse or the like and feel a sense of resistance. There are also calls for a UI that allows instructions to be input with a feeling closer to the real world. Since electronic devices such as PCs are expected to penetrate further into everyday life and to be used by an even broader range of users, the development of a UI that is more intuitive, easier to understand, and superior in operability is desired.
[0004] The present invention has been made in view of such a situation, and its object is to provide a user interface with excellent operability.
Means for solving the problems
[0005] One aspect of the present invention relates to a control system. This control system includes a detection unit that detects the shape or motion of one or more input parts including at least a part of a user's body or at least a part of an object operated by the user, or the distance to the input part; an analysis unit that determines the user's instruction by analyzing the shape or motion of the input part detected by the detection unit, or the distance to the input part; and a control unit that executes a function corresponding to the user's instruction determined by the analysis unit.
[0006] The detection unit may be an imaging device capable of acquiring distance information to the input part. The detection unit may be an input device that includes a plurality of electrodes and detects a change in capacitance between the electrodes caused by the approach of the input part.
[0007] Any combination of the above components, and any conversion of the expression of the present invention among a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as embodiments of the present invention.
Effect of the invention
[0008] According to the present invention, a user interface with excellent operability can be provided.
Best mode for carrying out the invention
[0009] (First embodiment)
FIG. 1 shows the configuration of a control system 10 according to the first embodiment. The control system 10 includes an input device 20 for inputting instructions from a user, a control device 40 for controlling the operation of an application or the like according to the instructions input from the input device 20, and a display device 30 for displaying images output from the control device 40. The control system 10 of the present embodiment provides a user interface (hereinafter referred to as a "gestural interface") that allows the user to input instructions by actions (gestures) using a part of the body such as a finger, hand, foot, or head. This allows the user to handle objects displayed on the display screen and to input instructions to applications with the same kinds of actions as in the real world, realizing a user interface that is intuitively easy to understand and excellent in operability. In the present embodiment, the input device 20 has the function of a detection unit that detects the shape or motion of an input part including at least a part of the user's body, or the distance to the input part.
[0010] FIG. 2 shows a configuration example of the input device 20 and the display device 30 according to the embodiment. In the example shown in FIG. 2, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a cathode ray tube (CRT) display device is used as the display device 30, and an imaging device 22 having a distance measuring function, provided integrally with or separately from the display device 32, is used as the input device 20. The control device 40 analyzes the user's motion photographed by the imaging device 22 by image processing, determines the gesture performed by the user, and acquires the user's instruction. According to this configuration example, a wide range of the user's body can be imaged and its motion determined, so the user can give instructions by gestures using not only a finger but also a hand, a foot, or the like. This method is suitable when the user makes gestures at some distance from the imaging device 22. When adopting a gestural interface that does not depend on the distance to the user, an imaging device without a distance measuring function may be used as the input device 20; however, as described later, to provide a gestural interface that lets the user handle objects on the display screen with a finger or the like, it is preferable to use an imaging device 22 having a distance measuring function.
[0011] FIG. 3 shows another configuration example of the input device 20 and the display device 30 according to the embodiment. In the example shown in FIG. 3, a projector 36 that projects an image onto a screen 38 is used as the display device 30, and an imaging device 22 having a distance measuring function is used as the input device 20. In the example of FIG. 3, an image is projected by a projector 36 provided behind and above the user onto a transparent or translucent screen 38 of glass or the like provided in front of the user, and a user making gestures toward the image displayed on the screen 38 is photographed by the imaging device 22 provided on the opposite side of the screen 38. According to this configuration example, a wide range of the user's body motion can be acquired, and the user can act while directly touching, or near, the image projected on the screen 38, so the user can be given the operational feel of directly handling the projected objects by hand. Further, since the imaging device 22 can be placed at a position away from the screen 38, the distance to the part of the user's body can be detected with high accuracy even when the user makes gestures near the screen 38.
[0012] FIG. 4 shows yet another configuration example of the input device 20 and the display device 30 according to the embodiment. In the example shown in FIG. 4, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a touch panel 24 provided on the inner side of the display screen 34 of the display device 32 is used as the input device 20. Alternatively, an image may be projected onto the surface of the touch panel 24 by a projector. The touch panel 24 may be of any type, such as a resistive pressure-sensitive type or an infrared detection type. According to this configuration example, the user can input instructions while directly touching objects and the like displayed on the display screen 34.
[0013] FIG. 5 shows yet another configuration example of the input device 20 and the display device 30 according to the embodiment. In the example shown in FIG. 5, an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a non-contact input device 26 provided on the inner side of the display screen 34 of the display device 32 is used as the input device 20. Alternatively, an image may be projected onto the surface of the non-contact input device 26 by a projector. The non-contact input device 26 is an input device that can detect, when an object such as the user's fingertip approaches, the shape of the object and the distance to the object; for example, the input device disclosed in Japanese Patent Application Laid-Open No. 2002-342033 can be used. The non-contact input device disclosed in Japanese Patent Application Laid-Open No. 2002-342033 includes a plurality of linear electrodes arranged vertically and horizontally; when a conductive object such as the user's fingertip approaches the electrodes, it detects the change in capacitance corresponding to the degree of approach and obtains three-dimensional position information of the object near the input device. According to this configuration example, the non-contact input device 26 can be provided close to the display screen 34, and the user's motion and the shape of body parts near the display screen 34 can be detected with high accuracy, so the user can input instructions with an input part such as a finger near the image while viewing the image displayed on the display screen 34. With the non-contact input device 26, only the part of the user's body that has approached the vicinity of the input device is detected, which eliminates the need to extract a specific part for motion analysis and simplifies the processing. In addition, since shape and distance can be detected by mere approach, instructions can be input without touching the display screen 34, so even users who are reluctant to touch the display screen 34 directly can use it comfortably. Furthermore, since approach can be detected before the user touches the display screen 34, there is no need to press the display screen 34 strongly, and a user interface with good responsiveness and an excellent operational feel can be provided.
[0014] Which of the plurality of configuration examples described above is adopted may be determined according to the environment of the place where the control system 10 is installed, the type of application or content to be controlled, and the like. For example, in the case of an application in which a plurality of users share one display device 30, or an application that plays back content such as a movie, the user is assumed to input instructions at a relatively large distance from the display device 30, so the configuration example shown in FIG. 2 or FIG. 3 may be adopted. In the case of an application used personally by a single user, for example when editing data such as images or documents, the distance between the user and the display device 30 is assumed to be relatively short, so the configuration example shown in FIG. 4 or FIG. 5 may be adopted. When the control system 10 is constructed by combining a plurality of configuration examples, which type of gestural interface to adopt may be selected for each application or content, and the input device 20 may be switched as appropriate.
[0015] 図 6は、制御装置 40の内部構成を示す。この構成は、ハードウェア的には、任意の コンピュータの CPU、メモリ、その他の LSIで実現でき、ソフトウェア的にはメモリに口 ードされたプログラムなどによって実現される力 ここではそれらの連携によって実現 される機能ブロックを描いている。したがって、これらの機能ブロックがハードウェアの み、ソフトウェアのみ、またはそれらの組合せによっていろいろな形で実現できること は、当業者には理解されるところである。制御装置 40は、入力装置 20により検知され た入力信号を取得する取得部 42と、取得部 42が取得した入力信号からユーザの動 作を解析してユーザの指示を判別する解析部 44と、解析部 44により判別されたユー ザの指示に対応する機能を実行する制御部 46とを含む。  FIG. 6 shows an internal configuration of the control device 40. In terms of hardware, this configuration can be realized by any computer CPU, memory, or other LSI, and in software, the power realized by programs loaded into the memory, etc. The functional blocks are drawn. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof. The control device 40 includes an acquisition unit 42 that acquires an input signal detected by the input device 20, an analysis unit 44 that analyzes a user operation from the input signal acquired by the acquisition unit 42, and determines a user instruction, And a control unit 46 for executing a function corresponding to the user's instruction determined by the analysis unit 44.
[0016] When the input device 20 is an imaging device with a distance measuring function, the analysis unit 44 acquires an image with distance information captured by the imaging device and determines the user's motion by image processing. The analysis unit 44 may extract a part of the user's body, for example the head, eyes, hands, fingers, or feet, by shape recognition techniques, and determine the user's motion by analyzing the movement of the extracted body part. When the input device 20 is a touch panel, the analysis unit 44 may determine the user's motion by analyzing the shape of the input signal and its change over time. When the input device 20 is a non-contact input device, the analysis unit 44 may determine the user's motion by analyzing the shape and distance of the input signal and their changes over time.
[0017] FIG. 7 shows the hardware components of the control device 40. The control device 40 includes a CPU 120, an input interface 122, a display interface 124, a memory 130, a hard disk 132, and a drive device 128. These components are electrically connected by a signal transmission path such as a bus 126.
[0018] The input interface 122 acquires the input signal detected by the input device 20. The display interface 124 outputs an image signal to be displayed on the display device 30. The hard disk 132 is a large-capacity magnetic storage device and stores various files. The recording medium 140 records a program for causing the CPU 120 to realize the functions of the control device 40 described above. When the recording medium 140 is inserted into the drive device 128, the program is read into the memory 130 or the hard disk 132, and the CPU 120 performs the control processing of the present embodiment according to the read program. The recording medium 140 is a computer-readable medium such as a CD-ROM, a DVD, or an FD.
[0019] Here, an example in which the program is recorded on the recording medium 140 has been described; in another example, the program may be transmitted from an external server, whether over a wireless or a wired connection. In the hardware configuration shown in FIG. 7, it suffices for the program to cause the computer to realize the control functions of the present embodiment, and it will also be understood by those skilled in the art that the program may be stored in the hard disk 132 in advance rather than supplied from the outside.
[0020] The functions of the control unit 46 may be realized by an operating system (OS) executed by the CPU of the computer, by an application for input/output control, or the like. For example, in the case of an OS that employs a GUI such as a window system, the user's instruction is notified to the application of the topmost window at a position near the place approached by the input part such as the user's finger, and that application may execute the function associated with the user's instruction. If there is no window at that position, the OS or the input/output control application may execute the function associated with the instruction.
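As a rough sketch of the dispatch rule just described, assuming a simple window list with z-order; the `Window` class and `dispatch` function are invented for the example and are not part of any real window system API.

```python
from typing import List, Tuple

class Window:
    def __init__(self, app: str, rect: Tuple[int, int, int, int], z: int):
        self.app = app          # name of the owning application
        self.rect = rect        # (left, top, right, bottom)
        self.z = z              # larger z = closer to the top of the stack

    def contains(self, x: int, y: int) -> bool:
        left, top, right, bottom = self.rect
        return left <= x <= right and top <= y <= bottom

def dispatch(windows: List[Window], x: int, y: int, instruction: str) -> str:
    """Send the instruction to the topmost window under (x, y), else to the OS."""
    hits = [w for w in windows if w.contains(x, y)]
    if hits:
        target = max(hits, key=lambda w: w.z)
        return f"{target.app} handles '{instruction}'"
    return f"OS / input-output control application handles '{instruction}'"

windows = [Window("photo viewer", (0, 0, 400, 300), z=1),
           Window("music player", (200, 100, 600, 400), z=2)]
print(dispatch(windows, 250, 150, "open"))   # overlapping area -> topmost window
print(dispatch(windows, 900, 900, "open"))   # empty desktop -> OS
```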
[0021] 1. Example of instruction input using multiple points
In a user interface using a conventional pointing device, instructions are input with a single pointer. According to the gestural interface provided by the control system 10 of the present embodiment, the user can give instructions with a plurality of input parts, using fingers, hands, feet, and so on.
[0022] FIGS. 8 and 9 are diagrams for explaining an example of inputting an instruction by the movement of a plurality of the user's fingers. Suppose that, as shown in FIG. 8, the user brings the fingers close to an icon 200 displayed on the screen of the display device 30 with the thumb and index finger closed, and then performs the motion of opening the thumb and index finger as shown in FIG. 9. At this time, the acquisition unit 42 sends the input signal detected by the input device 20 to the analysis unit 44, and the analysis unit 44 analyzes the movement of the user's fingers and determines that the user has given an instruction to open the fingers.
[0023] When the input device 20 is a camera with a distance measuring function, the analysis unit 44 extracts the user's hand by a technique such as shape recognition, tracks the movement of the fingers, and determines that the user has performed the finger-opening motion. When the input device 20 is a touch panel or a non-contact input device, the analysis unit 44 determines that the user has performed the finger-opening motion when a single cluster of input parts that approached or touched coordinates near the icon 200 on the display screen splits into two and the two parts move away from each other. It may also be determined that the user has performed the finger-opening motion not only when a single cluster of input parts splits into two, but also when two input parts move away from each other.
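The split-and-separate rule for a touch panel or non-contact input device might look like the following sketch. The frame representation and the `min_spread` threshold are assumptions made for the example; a real analysis unit would also track cluster identity over time.

```python
from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]

def is_open_gesture(prev: List[Point], curr: List[Point],
                    min_spread: float = 20.0) -> bool:
    """Detect 'fingers opened': one cluster becomes two, or two points separate.

    prev -- input parts detected in the previous frame
    curr -- input parts detected in the current frame
    min_spread -- hypothetical minimum increase in separation (pixels)
    """
    def spread(points: List[Point]) -> float:
        if len(points) < 2:
            return 0.0
        (x1, y1), (x2, y2) = points[0], points[1]
        return hypot(x2 - x1, y2 - y1)

    # Case 1: a single cluster split into two parts that are clearly apart.
    if len(prev) == 1 and len(curr) == 2 and spread(curr) >= min_spread:
        return True
    # Case 2: two parts moved away from each other.
    if len(prev) == 2 and len(curr) == 2:
        return spread(curr) - spread(prev) >= min_spread
    return False

print(is_open_gesture([(100.0, 100.0)], [(85.0, 100.0), (125.0, 100.0)]))  # True
print(is_open_gesture([(90.0, 100.0), (110.0, 100.0)],
                      [(92.0, 100.0), (108.0, 100.0)]))                    # False
```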
[0024] When the user performs the finger-opening motion, the control unit 46 executes the function associated with the finger-opening motion. For example, the control unit 46 may execute a function of starting the application associated with the icon 200 displayed near the user's fingers. If the icon 200 is associated with a file, it may start an application capable of handling that file and execute a function of opening the file. By associating the "open the fingers" motion with functions such as "open an application" or "open a file" in this way, the user's motion and the application's behavior correspond intuitively, so a more intuitive, easy-to-understand user interface with a high affinity for the user can be realized.
[0025] The finger-opening motion may also be associated with functions such as "start", "decide", or "confirm". In addition, each application may implement its own function corresponding to the finger-opening motion. For example, in an image processing application, when the user performs the finger-opening motion on an image, a function of enlarging that portion, or of stretching that portion in the direction in which the fingers were opened, may be executed.
[0026] Conversely, when the user brings the fingers close to the icon 200 displayed on the screen of the display device 30 with the thumb and index finger open as shown in FIG. 9, and then performs the motion of closing the thumb and index finger as shown in FIG. 8, the analysis unit 44 may determine that the user has performed the finger-closing motion on the icon 200, and the control unit 46 may execute the function associated with the finger-closing motion. For example, the control unit 46 may execute a function of terminating the application associated with the icon 200 or window displayed near the user's fingers, or a function of closing the file associated with the icon 200 or window. In an image processing application, when the user performs the finger-closing motion on an image, a function of reducing that portion, or of shrinking it in the direction in which the fingers were closed, may be executed.
[0027] In the above examples, the motion of opening or closing the fingers has been described; in addition, the control unit 46 may accept instructions using a plurality of input parts including the user's fingers and hands, and execute the function associated with the motion. For example, in a book browsing application, when the user performs the motion of pinching the corner of a displayed page of the book, that page may be stored and a bookmark function executed. In a role-playing game, when the user places fingers near a displayed item and performs the motion of grasping the item, a function of "picking up" the item may be executed.
[0028] 2. Example of instruction input using shape
FIG. 10 is a diagram for explaining an example of inputting an instruction by the shape of the user's hand. Suppose that, as shown in FIG. 10, the user places the palm of an open hand on the display screen. When the input device 20 is a camera with a distance measuring function, the analysis unit 44 extracts the user's hand by a technique such as shape recognition and determines the shape of the hand. When the input device 20 is a touch panel or a non-contact input device, the analysis unit 44 determines the shape of the object that has approached or touched the display screen, either by extracting feature points from it with a known technique or by evaluating it with a predetermined evaluation formula. The hand shape may also be determined by holding common hand shapes in a database and extracting the matching shape by checking against the database. The shape may also be determined based on the area of the detected object; for example, when the user places a hand on the display screen, the area is largest when the hand is open and smallest when the hand is clenched, and this fact may be used to determine the shape of the hand. When the user places the palm of an open hand on the screen, the control unit 46 executes the function corresponding to that shape, for example a function of "terminating" the application corresponding to the window 210 displayed topmost at the position where the palm was placed. By associating the motion of "placing the palm" with functions such as "stop" or "end", an intuitive, easy-to-understand user interface with good operability can be realized.
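The area-based shape determination mentioned above can be sketched as a simple classifier. The threshold values are illustrative assumptions, not values from the patent; a real system would calibrate them per user or combine them with full shape matching.

```python
def classify_hand_by_area(contact_area_cm2: float,
                          open_threshold: float = 80.0,
                          fist_threshold: float = 40.0) -> str:
    """Classify a detected hand shape from its contact area alone."""
    if contact_area_cm2 >= open_threshold:
        return "open palm"   # largest area -> e.g. the "stop" / "terminate" shape
    if contact_area_cm2 <= fist_threshold:
        return "fist"        # smallest area
    return "unknown"

for area in (95.0, 30.0, 60.0):
    print(area, "->", classify_hand_by_area(area))
```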
[0029] FIG. 11 is a diagram for explaining an example of mapping objects to hand shapes. As shown in FIG. 11, suppose that the user forms a specific shape with one hand, here the left hand, and with the other hand, here the right hand, performs the motion of sliding a finger from the position of an object 220 on the display screen toward the left hand. The method of determining the hand shape is the same as in the example described above. At this time, the control unit 46 executes a function of storing the file corresponding to the object 220 moved by the finger of the right hand in the storage location on the hard disk 132 that corresponds to the shape of the left hand. The storage location corresponding to the shape of the left hand may be a directory or folder in the file system, or it may be a virtual folder. The file itself may also be associated with the hand shape. Moreover, not only may a single storage location be associated with a single shape; for example, a storage location may be associated with each finger of an open hand.
[0030] FIG. 12 shows an example of a table for mapping objects to hand shapes. The table 230 is held in the memory 130, the hard disk 132, or the like of the control device 40. The table 230 has a shape column 232 and a storage location column 234. The shape column 232 holds an image representing the hand shape, parameters, and so on. When the hand shape is recognized as an image, the file name of that image file or the like may be held; when the hand shape is recognized by parameterizing it with feature points, an evaluation formula, or the like, the parameters or the file name of the file in which the parameters are stored may be held. The storage location column 234 holds the storage location of the object. The hand shape and the storage location corresponding to the shape may be registered in advance; alternatively, if the hand shape is an unregistered shape when the user performs the operation shown in FIG. 11, the control unit 46 may register that hand shape and a storage location in the table 230. At this time, the user may specify the storage location, or an appropriate storage location may be assigned automatically. When the file itself is associated with the hand shape, the storage location column 234 holds the file name of that file.
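A minimal in-memory stand-in for table 230 might look like this; the shape identifiers, paths, and automatic assignment rule are assumptions made for the example.

```python
from typing import Dict, Optional

# Stand-in for table 230: a shape identifier produced by the analysis unit
# maps to a storage location (directory, folder, or file).
shape_table: Dict[str, str] = {
    "peace_sign": "/home/user/photos",
    "ok_sign": "/home/user/documents",
}

def store(shape_id: str, file_name: str) -> str:
    """Store a file in the location mapped to the detected hand shape,
    registering a new location automatically for an unknown shape."""
    if shape_id not in shape_table:
        shape_table[shape_id] = f"/home/user/shapes/{shape_id}"  # auto-assign
    return f"stored {file_name} in {shape_table[shape_id]}"

def retrieve(shape_id: str) -> Optional[str]:
    """Return the location mapped to the shape, as in the FIG. 13 operation."""
    return shape_table.get(shape_id)

print(store("peace_sign", "report.txt"))
print(store("thumbs_up", "photo.jpg"))   # unregistered shape -> new entry
print(retrieve("thumbs_up"))
```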
[0031] FIG. 13 is a diagram for explaining an example of retrieving an object stored by the operation shown in FIG. 11. As shown in FIG. 13, the user forms, with one hand, here the left hand, the hand shape corresponding to the storage location of the object to be retrieved, and with the other hand, here the right hand, performs the motion of sliding a finger away from a position near the left hand. At this time, the control unit 46 identifies the object stored in the storage location corresponding to the shape of the left hand and displays it on the display screen. The control unit 46 may also execute a function of opening the file corresponding to this object.
[0032] 3. Example of instruction input according to distance
In the configuration examples using the imaging device 22 with a distance measuring function shown in FIGS. 2 and 3, and in the configuration example using the non-contact input device 26 shown in FIG. 5, the distance from the display device 30 to a part of the user's body can be detected. Examples of instruction input according to distance are described below.
[0033] FIG. 14 is a diagram for explaining an example of instruction input according to distance. In the example shown in FIG. 14, the control unit 46 displays on the display device a moving image in which fish are swimming. When the user brings a hand close to the display screen and the distance from the display screen to the user's hand falls below a predetermined threshold, the control unit 46 switches the display to a moving image in which the fish flee from the vicinity of the user's hand. This can give a sense of realism, as if real, living fish were swimming in the display screen. In connection with example 4 described later, the speed at which the user's hand approaches may be calculated: when the speed is high, the threshold for switching the display is increased so that the fish start to flee while the hand is still far away, and when the speed is low, the threshold is decreased so that the fish do not flee until the hand comes close. The control unit 46 may display a moving image in which the fish swim freely when the distance between the user's hand and the display screen is equal to or greater than a predetermined value, and a moving image in which the fish swim while avoiding the position of the user's hand when the distance is below the predetermined value. Since a non-contact input device can accurately detect distance information in the vicinity of the display screen, it is preferable in this example to use a non-contact input device as the input device 20.
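The distance threshold that grows with approach speed could be expressed as follows; the base threshold, the gain, and the function name are illustrative assumptions rather than values from the patent.

```python
def fish_should_flee(distance_mm: float, approach_speed_mm_s: float,
                     base_threshold_mm: float = 100.0,
                     speed_gain: float = 0.2) -> bool:
    """Decide whether to switch to the 'fish flee' animation.

    The threshold grows with the approach speed, so a fast-moving hand makes
    the fish flee while it is still far away; a slow hand must come close.
    """
    threshold = base_threshold_mm + speed_gain * max(approach_speed_mm_s, 0.0)
    return distance_mm < threshold

print(fish_should_flee(distance_mm=150, approach_speed_mm_s=400))  # fast hand -> True
print(fish_should_flee(distance_mm=150, approach_speed_mm_s=20))   # slow hand -> False
```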
[0034] When an imaging device is used as the input device 20, the user's instructions may not be determined accurately if the user is too far from the imaging device, because of the limits of the distance measuring function and of the accuracy of the image processing. When a non-contact input device is used as the input device 20, nothing can be detected unless a part such as the user's hand comes close to the input device 20, since no change in capacitance occurs otherwise. Thus, depending on the characteristics of the input device 20, there is an upper limit to the distance at which the user can input instructions, so the above example of the moving image of swimming fish may also be used as a means of indicating whether instruction input is possible. For example, when the user's hand is so far from the input device 20 that instruction input is impossible, a moving image in which the fish swim around freely may be displayed to indicate to the user that input is not possible; when the user approaches the input device and comes within a distance at which instruction input is possible, a moving image in which the fish swim so as to avoid the area around the user's hand may be displayed to indicate to the user that input is possible. The control unit 46 may hold in advance a distance threshold for determining whether instruction input to the input device 20 is possible.
[0035] 4. Example of instruction input according to movement amount, speed, and acceleration
FIGS. 15 and 16 are diagrams for explaining the function of moving an object displayed on the screen. In the examples of FIGS. 15 and 16, the user tries to move an object displayed on the display device 30 by sliding it with a finger. At this time, the control unit 46 detects the movement amount, speed, and acceleration of the part such as the user's finger and determines the movement amount of the object accordingly. For example, a weight, that is, a virtual energy value required to move the object, is set in advance for each object, and the movement state of the object is controlled based on that energy value and on the movement amount, speed, or acceleration of the finger when the user moves it to move the object. For a light object, as shown in FIG. 15, the control unit 46 moves the object so that it follows the finger even when the user slides the finger quickly; for a heavy object, as shown in FIG. 16, the control unit 46 does not move the object when the user slides the finger quickly. A heavy object begins to move only when the user first slides the finger slowly, and then follows the finger as the finger is gradually moved faster. This makes it possible to reproduce a movement state that imitates real static and dynamic frictional resistance, and to give the user an operational feel that reflects the weight of the object. The technique described in this example can also be used in a user interface with a conventional pointing device.
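One possible way to model the weight-dependent movement is sketched below. The formula and the constants (`grip`, `slip_limit`) are invented for the example and are not taken from the patent; they merely reproduce the qualitative behavior described above (a light object follows a quick flick, a heavy object does not budge unless pulled slowly).

```python
def object_displacement(finger_delta: float, finger_speed: float,
                        weight: float, grip: float = 10.0,
                        slip_limit: float = 0.1) -> float:
    """How far an on-screen object moves for one finger movement step.

    weight plays the role of the virtual energy value assigned to the object.
    """
    # Fraction of the finger movement the object can follow in this step:
    # heavier objects and faster fingers reduce it (dynamic friction), and
    # below slip_limit the object does not move at all (static friction).
    follow = min(1.0, grip / (weight * max(finger_speed, 1e-6)))
    if follow < slip_limit:
        return 0.0
    return finger_delta * follow

light, heavy = 0.3, 20.0
print(object_displacement(30, 30, light))  # light object follows the quick flick fully
print(object_displacement(30, 30, heavy))  # heavy object does not budge
print(object_displacement(3, 3, heavy))    # heavy object creeps when pulled slowly
```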
[0036] The control unit 46 may determine the magnitude of the force the user applies to the object and control the movement state of the object according to the magnitude of the force. When an imaging device or a non-contact input device is used as the input device 20, the magnitude of the force may be determined based on the distance between the input device 20 and the user's hand; for example, the closer the user's hand is to the display screen, the greater the force that can be applied to the object. When a touch panel is used as the input device 20, the magnitude of the force may be determined based on the magnitude of the pressing force. When a conventional pointing device is used, the magnitude of the force may be determined based on the degree of pressing of a pressure-sensitive mouse or the like.
[0037] FIG. 17 is a diagram for explaining a function for handling paper. In FIG. 17, the control unit 46 displays an image of an opened book on the display device 30. When the user places a finger on a page and slides the finger slowly, the control unit 46 displays an image in which the page gradually turns, following the movement of the user's finger, and turns over to the next page once it passes a predetermined position. When the user moves the finger quickly, the control unit 46 displays an image in which the paper cannot follow the movement of the finger and, after lifting partway, falls back. This can give the user a sense of realism, as if actually reading a book. The control unit 46 may hold in advance a speed threshold used to determine whether the paper can follow the user's movement. As other examples of handling paper, the control unit 46 may display the paper wrinkling when the user's finger moves too quickly, or may show creases in the paper or show the paper being torn when the user performs motions such as pinching or tearing it.
[0038] As another example of instruction input according to movement amount, speed, and acceleration, when the user moves an object with a finger or the like and passes it over another object, predetermined processing may be performed on the other object according to the movement speed. For example, a function of opening the file corresponding to the moving object with the application corresponding to the object it passed over may be executed.
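The page-turning behavior can be reduced to two thresholds, as in the following sketch; both threshold values are illustrative assumptions, not values from the patent.

```python
def page_turn_state(finger_speed: float, finger_progress: float,
                    follow_speed_limit: float = 300.0,
                    commit_position: float = 0.5) -> str:
    """Decide what the page does for one drag gesture.

    finger_speed    -- speed of the finger (e.g. pixels per second)
    finger_progress -- how far across the page the finger has dragged (0..1)
    """
    if finger_speed > follow_speed_limit:
        return "page cannot follow: lifts partway and falls back"
    if finger_progress >= commit_position:
        return "page turns over to the next page"
    return "page curls, following the finger"

print(page_turn_state(finger_speed=500, finger_progress=0.3))
print(page_turn_state(finger_speed=100, finger_progress=0.7))
print(page_turn_state(finger_speed=100, finger_progress=0.2))
```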
[0039] 5. Example of instruction input for three-dimensional objects
Conventionally, when a three-dimensional object is displayed on a display screen and operations such as moving and rotating the three-dimensional object are accepted from the user, there has been the problem that three-dimensional instruction input is difficult because the movement of the pointer is two-dimensional. In the configuration examples using the imaging device 22 with a distance measuring function shown in FIGS. 2 and 3, and in the configuration example using the non-contact input device 26 shown in FIG. 5, the movement of a part of the user's body can be acquired three-dimensionally, so a gestural interface suitable for handling three-dimensional space can be provided.
[0040] FIG. 18 is a diagram for explaining an example of handling a three-dimensional object. When, near the three-dimensional object 240 displayed on the display device 30 by the control unit 46, the user makes the gesture of grasping the object 240 and turning it toward the back of the screen, the control unit 46 rotates the three-dimensional object 240 in the same direction as the direction in which the user turned the hand. In this way, according to the gestural interface of the present embodiment, it is possible to give an operational feel as if the three-dimensional object were actually being handled by hand, and the operability of a user interface for handling three-dimensional space can be dramatically improved.
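A minimal sketch of applying the user's hand rotation to the displayed object, assuming the analysis unit has already reduced the gesture to a single rotation angle about the vertical axis. That reduction is a simplification for the example; a full implementation would estimate a general 3D rotation from the hand pose.

```python
import numpy as np

def rotate_object(vertices: np.ndarray, hand_angle_rad: float) -> np.ndarray:
    """Rotate a 3D object about the vertical (y) axis by the same angle the
    user's hand turned. vertices is an (N, 3) array of object-space points."""
    c, s = np.cos(hand_angle_rad), np.sin(hand_angle_rad)
    rot_y = np.array([[  c, 0.0,   s],
                      [0.0, 1.0, 0.0],
                      [ -s, 0.0,   c]])
    return vertices @ rot_y.T

cube_corner = np.array([[1.0, 1.0, 1.0]])
print(rotate_object(cube_corner, np.deg2rad(90)))   # (1, 1, 1) -> approx. (1, 1, -1)
```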
[0041] (Second Embodiment)
In the second embodiment, a technique is described in which an instruction is input by the shape or motion of, or the distance to, an object operated by the user rather than a part of the user's body. The overall configuration of the control system 10 of this embodiment and the internal configurations of the input device 20, the display device 30, and the control device 40 are the same as in the first embodiment. In this embodiment, a configuration example using the non-contact input device 26 shown in FIG. 5 is mainly described, but the same applies when the other configuration examples are used. In the case of the non-contact input device 26 that detects changes in capacitance, when a conductive object is placed on the display screen and the user touches the object, the capacitance changes and the shape of the object is detected. Using this, a user interface can be constructed by associating predetermined functions with the shapes and movements of conductive objects.
[0042] FIG. 19 is a diagram for explaining an example of inputting an instruction by the shape and movement of an object.
In the example shown in FIG. 19, a window 250 of a music playback application controlled by the control unit 46 is displayed on the display device 30. Suppose that the user places a volume control unit 260, which has a shape associated with the volume control function, on the display screen and moves its knob 262 left and right. The analysis unit 44 analyzes the shape of the input signal, detects that it is the volume control unit 260, further analyzes the movement of the knob 262, and reports it to the control unit 46. The control unit 46 controls the volume in the music playback application according to the amount of movement of the knob 262.
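Mapping the detected knob position to a volume value could be as simple as the following sketch; the coordinate values and the 0-100 volume range are assumptions made for the example, not figures from the patent.

```python
def knob_position_to_volume(knob_x: float, track_left: float,
                            track_right: float) -> int:
    """Map the detected horizontal position of the physical knob 262 to a
    0-100 volume value for the music playback application."""
    span = track_right - track_left
    ratio = (knob_x - track_left) / span
    return round(100 * min(max(ratio, 0.0), 1.0))

# The analysis unit reports the knob at x = 320 on a track from 200 to 600.
print(knob_position_to_volume(320, 200, 600))   # -> 30
```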
[0043] Since a non-contact input device that detects changes in capacitance detects only conductive objects, a specific shape may be drawn with conductive wire on the bottom surface of an insulating object, and that shape may be associated with a predetermined function.
[0044] The present invention has been described above based on the embodiments. The embodiments are exemplifications, and it will be understood by those skilled in the art that various modifications of the combinations of their components and processing processes are possible, and that such modifications are also within the scope of the present invention.
[0045] In the embodiments, an example has been described in which the control system 10 corresponds to an electronic device such as a personal computer. In another example, the non-contact input device 26 may be used as the input device 20, and the display device 30 and the input device 20 may be built into the top of a table so that games and the like can be enjoyed on it. The display device 30 and the input device 20 may also be provided in the floor of a passageway or the like, so as to display the footprints of a walking user or to guide the user to a destination with images, light, and so on.
Industrial Applicability
[0046] The present invention can be applied to a user interface for controlling electronic devices and the like.
Brief Description of Drawings
[0047] [FIG. 1] A diagram showing the configuration of the control system according to the embodiment.
[FIG. 2] A diagram showing a configuration example of the input device and the display device according to the embodiment.
[FIG. 3] A diagram showing another configuration example of the input device and the display device according to the embodiment.
[FIG. 4] A diagram showing still another configuration example of the input device and the display device according to the embodiment.
[FIG. 5] A diagram showing still another configuration example of the input device and the display device according to the embodiment.
[FIG. 6] A diagram showing the internal configuration of the control device.
[FIG. 7] A diagram showing the hardware components of the control device.
[FIG. 8] A diagram for explaining an example of inputting an instruction by the movement of a plurality of the user's fingers.
[FIG. 9] A diagram for explaining an example of inputting an instruction by the movement of a plurality of the user's fingers.
[FIG. 10] A diagram for explaining an example of inputting an instruction by the shape of the user's hand.
[FIG. 11] A diagram for explaining an example of mapping objects to hand shapes.
[FIG. 12] A diagram showing an example of a table for mapping objects to hand shapes.
[FIG. 13] A diagram for explaining an example of retrieving an object stored by the operation shown in FIG. 11.
[FIG. 14] A diagram for explaining an example of instruction input according to distance.
[FIG. 15] A diagram for explaining the function of moving an object displayed on the screen.
[FIG. 16] A diagram for explaining the function of moving an object displayed on the screen.
[FIG. 17] A diagram for explaining a function for handling paper.
[FIG. 18] A diagram for explaining an example of handling a three-dimensional object.
[FIG. 19] A diagram for explaining an example of inputting an instruction by the shape and movement of an object.
Explanation of Reference Numerals
10: control system, 20: input device, 22: imaging device, 24: touch panel, 26: non-contact input device, 30: display device, 40: control device, 42: acquisition unit, 44: analysis unit, 46: control unit.

Claims

[1] A control system comprising:
a detection unit that detects motions of a plurality of input parts including at least a part of a user's body or at least a part of an object operated by the user;
an analysis unit that analyzes the motions of the input parts detected by the detection unit and determines a user instruction; and
a control unit that executes a function corresponding to the user instruction determined by the analysis unit.
[2] The control system according to claim 1, wherein, when the user performs a motion of opening or closing fingers, the control unit executes a function associated with that motion.
[3] The control system according to claim 2, further comprising a display unit for presenting an image to the user, wherein, when the user performs a motion of opening or closing fingers near an object displayed on the display unit, the control unit causes an application associated with that object to execute the function associated with the motion of opening or closing fingers.
[4] A control system comprising:
a detection unit that detects a shape or motion of an input part including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part;
an analysis unit that analyzes the shape or motion of the input part detected by the detection unit and the distance to the input part, and determines a user instruction; and
a control unit that executes a function corresponding to the user instruction determined by the analysis unit.
[5] The control system according to claim 1, wherein the control unit associates a shape of the input part with a data storage location and, when a data storage request is received from the user, stores the data in the storage location corresponding to the shape of the input part detected by the detection unit.
[6] The control system according to claim 5, wherein, when a data read request is received from the user, the control unit reads data from the storage location corresponding to the shape of the input part detected by the detection unit.
[7] The control system according to claim 4, further comprising a display unit for presenting an image to the user, wherein, when the distance to the input part is equal to or greater than a predetermined threshold, the control unit displays on the display unit an image indicating that fact, and, when the distance to the input part falls below the predetermined threshold, switches to an image indicating that fact.
[8] The control system according to claim 7, wherein the threshold is determined based on the accuracy with which the analysis unit analyzes the information on the input part detected by the detection unit.
[9] The control system according to claim 1, wherein the control unit calculates a movement amount, a speed, or an acceleration of the input part and controls the function based on at least one of them.
[10] The control system according to claim 9, further comprising a display unit for presenting an image to the user, wherein, when the user brings the input part close to an object displayed on the display unit and moves it in order to move the object, the control unit determines a movement amount of the object according to the movement amount, speed, or acceleration of the input part.
[11] The control system according to claim 10, wherein the control unit sets, for the object displayed on the display unit, a parameter representing the weight of the object, and determines the movement amount of the object further taking the parameter into consideration.
[12] The control system according to any one of claims 1 to 11, wherein the detection unit is an imaging device capable of acquiring distance information to the input part.
[13] The control system according to any one of claims 1 to 11, wherein the detection unit is an input device that includes a plurality of electrodes and detects a change in capacitance between the electrodes caused by the approach of the input part.
[14] A control method comprising the steps of:
detecting a shape or motion of an input part including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part;
analyzing the detected shape or motion of the input part, or the distance to the input part, to determine a user instruction; and
executing a function corresponding to the determined user instruction.
[15] A computer program for causing a computer to realize:
a function of acquiring detected information from a detection unit that detects a shape or motion of an input part including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part, analyzing the information, and determining a user instruction; and
a function of executing the function corresponding to the determined user instruction.
[16] A computer-readable recording medium having recorded thereon a program for causing a computer to realize:
a function of acquiring detected information from a detection unit that detects a shape or motion of an input part including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part, analyzing the information, and determining a user instruction; and
a function of executing the function corresponding to the determined user instruction.
PCT/JP2004/006643 2003-07-08 2004-05-18 Control system and control method WO2005003948A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003-193738 2003-07-08
JP2003193738A JP4723799B2 (en) 2003-07-08 2003-07-08 Control system and control method

Publications (1)

Publication Number Publication Date
WO2005003948A1 true WO2005003948A1 (en) 2005-01-13

Family

ID=33562479

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/006643 WO2005003948A1 (en) 2003-07-08 2004-05-18 Control system and control method

Country Status (2)

Country Link
JP (1) JP4723799B2 (en)
WO (1) WO2005003948A1 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2124138A2 (en) * 2008-05-20 2009-11-25 LG Electronics Inc. Mobile terminal using proximity sensing and wallpaper controlling method thereof
JP2010539590A (en) * 2007-09-14 2010-12-16 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー Gesture-based user interaction processing
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
EP2703949A1 (en) * 2011-04-28 2014-03-05 NEC System Technologies, Ltd. Information processing device, information processing method, and recording medium
US8744137B2 (en) 2010-09-07 2014-06-03 Sony Corporation Information processing device and information processing method
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
WO2016210292A1 (en) 2015-06-25 2016-12-29 Children's Medical Center Corporation Methods and compositions relating to hematopoietic stem cell expansion, enrichment, and maintenance
WO2017161001A1 (en) 2016-03-15 2017-09-21 Children's Medical Center Corporation Methods and compositions relating to hematopoietic stem cell expansion
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN110543234A (en) * 2018-05-29 2019-12-06 富士施乐株式会社 Information processing apparatus and non-transitory computer readable medium
CN111490840A (en) * 2019-12-18 2020-08-04 蔡晓青 Distributed playing equipment management platform
US10782788B2 (en) 2010-09-21 2020-09-22 Saturn Licensing Llc Gesture controlled communication

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4627052B2 (en) * 2006-07-06 2011-02-09 株式会社ソニー・コンピュータエンタテインメント Audio output method and apparatus linked to image
WO2008007372A2 (en) 2006-07-12 2008-01-17 N-Trig Ltd. Hover and touch detection for a digitizer
US8686964B2 (en) 2006-07-13 2014-04-01 N-Trig Ltd. User specific recognition of intended user interaction with a digitizer
JP4730784B2 (en) * 2006-08-09 2011-07-20 アルパイン株式会社 In-vehicle display system
WO2008020446A1 (en) * 2006-08-15 2008-02-21 N-Trig Ltd. Gesture detection for a digitizer
JP2010515170A (en) * 2006-12-29 2010-05-06 ジェスチャー テック,インコーポレイテッド Manipulating virtual objects using an enhanced interactive system
US7855718B2 (en) * 2007-01-03 2010-12-21 Apple Inc. Multi-touch input discrimination
US8933892B2 (en) 2007-11-19 2015-01-13 Cirque Corporation Touchpad combined with a display and having proximity and touch sensing capabilities to enable different functions or interfaces to be displayed
WO2009069392A1 (en) * 2007-11-28 2009-06-04 Nec Corporation Input device, server, display management method, and recording medium
JP4318056B1 (en) * 2008-06-03 2009-08-19 島根県 Image recognition apparatus and operation determination method
US8154524B2 (en) * 2008-06-24 2012-04-10 Microsoft Corporation Physics simulation-based interaction for surface computing
JP2010140300A (en) * 2008-12-12 2010-06-24 Sharp Corp Display, control method, control program and recording medium
KR101609388B1 (en) * 2009-03-04 2016-04-05 엘지전자 주식회사 Mobile terminal for displaying three-dimensional menu and control method using the same
JP5287403B2 (en) 2009-03-19 2013-09-11 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2010244132A (en) * 2009-04-01 2010-10-28 Mitsubishi Electric Corp User interface device with touch panel, method and program for controlling user interface
JP5187280B2 (en) * 2009-06-22 2013-04-24 ソニー株式会社 Operation control device and operation control method
US20110148801A1 (en) * 2009-12-18 2011-06-23 Bateman Steven S Touch panel region of interest reporting scheme
EP2544079A4 (en) * 2010-03-05 2016-07-13 Nec Corp Portable terminal device
JP5118719B2 (en) * 2010-03-31 2013-01-16 株式会社エヌ・ティ・ティ・ドコモ Information terminal and document editing method
US8878821B2 (en) 2010-04-29 2014-11-04 Hewlett-Packard Development Company, L.P. System and method for providing object location information and physical contact information
JP5675196B2 (en) * 2010-07-24 2015-02-25 キヤノン株式会社 Information processing apparatus and control method thereof
JP5625643B2 (en) 2010-09-07 2014-11-19 ソニー株式会社 Information processing apparatus and information processing method
EP2645216B1 (en) * 2010-11-22 2019-10-02 YOSHIDA, Kenji Information input system, program, medium
JP5479414B2 (en) * 2010-11-24 2014-04-23 キヤノン株式会社 Information processing apparatus and control method thereof
JP5724422B2 (en) * 2011-02-07 2015-05-27 富士通株式会社 Operation control device, operation control program, and operation control method
JP5852346B2 (en) * 2011-07-11 2016-02-03 京セラ株式会社 Display device, control system and control program
EP2575007A1 (en) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Scaling of gesture based input
EP2575006B1 (en) 2011-09-27 2018-06-13 Elo Touch Solutions, Inc. Touch and non touch based interaction of a user with a device
JP2013222317A (en) * 2012-04-17 2013-10-28 Toshiba Mach Co Ltd Numerical control device
JP5510529B2 (en) * 2012-11-16 2014-06-04 ソニー株式会社 Information processing apparatus, storage medium, information processing system, information processing method, and program
US9836199B2 (en) 2013-06-26 2017-12-05 Panasonic Intellectual Property Corporation Of America User interface device and display object operating method
KR20150031384A (en) * 2013-09-13 2015-03-24 현대자동차주식회사 System of customized interface and operating method thereof
WO2015045090A1 (en) * 2013-09-27 2015-04-02 株式会社 東芝 Electronic device and method
JP2015191480A (en) * 2014-03-28 2015-11-02 株式会社ソニー・コンピュータエンタテインメント Information processor, operation method of object and operation program of object
US10080963B2 (en) 2014-03-28 2018-09-25 Sony Interactive Entertainment Inc. Object manipulation method, object manipulation program, and information processing apparatus
WO2015159548A1 (en) * 2014-04-18 2015-10-22 日本電気株式会社 Projection control device, projection control method, and recording medium recording projection control program
JP2016042383A (en) * 2015-11-25 2016-03-31 カシオ計算機株式会社 User operation processing apparatus, user operation processing method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0536715A2 (en) * 1991-10-07 1993-04-14 Fujitsu Limited An apparatus for manipulating an object displayed on a display device
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
JPH1138949A (en) * 1997-07-15 1999-02-12 Sony Corp Plotting device, plotting method, and recording medium
JP2000163031A (en) * 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2001216069A (en) * 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
JP2002342033A (en) * 2001-05-21 2002-11-29 Sony Corp Non-contact type user input device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07182101A (en) * 1993-10-26 1995-07-21 Itu Res Inc Apparatus and method for input of graphic, operating method of graphic object and supply method of graphic input signal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0536715A2 (en) * 1991-10-07 1993-04-14 Fujitsu Limited An apparatus for manipulating an object displayed on a display device
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
JPH1138949A (en) * 1997-07-15 1999-02-12 Sony Corp Plotting device, plotting method, and recording medium
JP2000163031A (en) * 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2001216069A (en) * 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
JP2002342033A (en) * 2001-05-21 2002-11-29 Sony Corp Non-contact type user input device

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
US10564731B2 (en) 2007-09-14 2020-02-18 Facebook, Inc. Processing of gesture-based user interactions using volumetric zones
JP2010539590A (en) * 2007-09-14 2010-12-16 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー Gesture-based user interaction processing
US9058058B2 (en) 2007-09-14 2015-06-16 Intellectual Ventures Holding 67 Llc Processing of gesture-based user interactions activation levels
US9811166B2 (en) 2007-09-14 2017-11-07 Intellectual Ventures Holding 81 Llc Processing of gesture-based user interactions using volumetric zones
US10990189B2 (en) 2007-09-14 2021-04-27 Facebook, Inc. Processing of gesture-based user interaction using volumetric zones
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US10831278B2 (en) 2008-03-07 2020-11-10 Facebook, Inc. Display with built in 3D sensing capability and gesture control of tv
EP2124138A2 (en) * 2008-05-20 2009-11-25 LG Electronics Inc. Mobile terminal using proximity sensing and wallpaper controlling method thereof
EP2124138A3 (en) * 2008-05-20 2014-12-24 LG Electronics Inc. Mobile terminal using proximity sensing and wallpaper controlling method thereof
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8824737B2 (en) 2010-05-31 2014-09-02 Primesense Ltd. Identifying components of a humanoid form in three-dimensional scenes
US8781217B2 (en) 2010-05-31 2014-07-15 Primesense Ltd. Analysis of three-dimensional scenes with a surface model
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US8744137B2 (en) 2010-09-07 2014-06-03 Sony Corporation Information processing device and information processing method
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd. Learning-based pose estimation from depth maps
US10782788B2 (en) 2010-09-21 2020-09-22 Saturn Licensing Llc Gesture controlled communication
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9454225B2 (en) 2011-02-09 2016-09-27 Apple Inc. Gaze-based display control
US9342146B2 (en) 2011-02-09 2016-05-17 Apple Inc. Pointing-based display interaction
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
EP2703949A4 (en) * 2011-04-28 2014-10-22 Nec Solution Innovators Ltd Information processing device, information processing method, and recording medium
US9329673B2 (en) 2011-04-28 2016-05-03 Nec Solution Innovators, Ltd. Information processing device, information processing method, and recording medium
EP2703949A1 (en) * 2011-04-28 2014-03-05 NEC System Technologies, Ltd. Information processing device, information processing method, and recording medium
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd. Zoom-based gesture user interface
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US11169611B2 (en) 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
WO2016210292A1 (en) 2015-06-25 2016-12-29 Children's Medical Center Corporation Methods and compositions relating to hematopoietic stem cell expansion, enrichment, and maintenance
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
WO2017161001A1 (en) 2016-03-15 2017-09-21 Children's Medical Center Corporation Methods and compositions relating to hematopoietic stem cell expansion
EP4049665A1 (en) 2016-03-15 2022-08-31 Children's Medical Center Corporation Methods and compositions relating to hematopoietic stem cell expansion
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN110543234A (en) * 2018-05-29 2019-12-06 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium
CN110543234B (en) * 2018-05-29 2024-03-08 FUJIFILM Business Innovation Corp. Information processing apparatus and non-transitory computer readable medium
CN111490840A (en) * 2019-12-18 2020-08-04 Cai Xiaoqing Distributed playback device management platform

Also Published As

Publication number Publication date
JP4723799B2 (en) 2011-07-13
JP2005031799A (en) 2005-02-03

Similar Documents

Publication Publication Date Title
JP4723799B2 (en) Control system and control method
JP5184384B2 (en) Control system and control method
JP6074170B2 (en) Short range motion tracking system and method
US11048333B2 (en) System and method for close-range movement tracking
Hinckley et al. Sensor synaesthesia: touch in motion, and motion in touch
EP2717120B1 (en) Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
KR101270847B1 (en) Gestures for touch sensitive input devices
KR101544364B1 (en) Mobile terminal having dual touch screen and method for controlling contents thereof
Cao et al. ShapeTouch: Leveraging contact shape on interactive surfaces
US8941600B2 (en) Apparatus for providing touch feedback for user input to a touch sensitive surface
US9389722B2 (en) User interface device that zooms image in response to operation that presses screen, image zoom method, and program
US9348458B2 (en) Gestures for touch sensitive input devices
JP2013037675A5 (en)
US9696882B2 (en) Operation processing method, operation processing device, and control method
TW200847001A (en) Gesturing with a multipoint sensing device
GB2509599A (en) Identification and use of gestures in proximity to a sensor
CN104049734A (en) Method and devices for displaying graphical user interfaces based on user contact
KR20140010003A (en) Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
KR102194778B1 (en) Control method of terminal by using spatial interaction
JP2017157079A (en) Information processing apparatus, display control method, and display control program
TW201437844A (en) Input device and method of input mode switching thereof
Shittu et al. A review on interaction techniques on mobile phones
Zhai The Computer Mouse and Related Input Devices
Wu Study and design of interaction techniques to facilitate object selection and manipulation in virtual environments on mobile devices
KR20200143346A (en) Control method of terminal by using spatial interaction

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
122 Ep: PCT application non-entry into the European phase