US20110227825A1 - 3D Pointer Mapping - Google Patents

3D Pointer Mapping

Info

Publication number
US20110227825A1
US20110227825A1 (Application US 13/000,889)
Authority
US
United States
Prior art keywords
frame
cursor
axis
pointing
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/000,889
Inventor
Matthew G. Liberty
Bryan A. Cook
Hua Sheng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IDHL Holdings Inc
Original Assignee
Hillcrest Laboratories Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hillcrest Laboratories Inc filed Critical Hillcrest Laboratories Inc
Priority to US 13/000,889
Assigned to HILLCREST LABORATORIES, INC. reassignment HILLCREST LABORATORIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: COOK, BRYAN A., LIBERTY, MATTHEW G., SHENG, HUA
Publication of US20110227825A1
Assigned to MULTIPLIER CAPITAL, LP reassignment MULTIPLIER CAPITAL, LP SECURITY AGREEMENT Assignors: HILLCREST LABORATORIES, INC.
Assigned to IDHL HOLDINGS, INC. reassignment IDHL HOLDINGS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HILLCREST LABORATORIES, INC.
Assigned to HILLCREST LABORATORIES, INC. reassignment HILLCREST LABORATORIES, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: MULTIPLIER CAPITAL, LP
Current legal status: Abandoned


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383Signal control means within the pointing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention describes mapping techniques, systems, software and devices, which can be used in 3D pointing devices, as well as in other types of devices.
  • the television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as “channel surfing” whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
  • Printed guides are still the most prevalent mechanism for conveying programming information.
  • the multiple button remote control with up and down arrows is still the most prevalent channel/content selection mechanism.
  • the reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects.
  • the number of rows in the printed guides has been increased to accommodate more channels.
  • the number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in FIG. 1 .
  • the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of separate components.
  • An example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue with an end result that potentially all of the communication devices currently found in the household will be packaged together as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who continue to buy separate components will likely desire seamless control of, and interworking between, the separate components. With this increased aggregation comes the potential for more complexity in the user interface.
  • the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding “soft” buttons that can be programmed with the expert commands.
  • these soft buttons sometimes have accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons.
  • in these “moded” universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and it does not provide any simplification to the integration of multiple devices.
  • the most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
  • the terms “3D pointing” and “3D pointing devices” are used in this specification to refer to the ability of an input device to measure motion in three dimensional space.
  • Three dimensional space has six degrees of freedom (6DOF): three axes of linear motion and three axes of angular motion.
  • the position (or pose) of a device may be represented by its linear position and angular position (orientation).
  • the 3D pointing device moves within the six degrees of freedom in the air in front of, e.g., a display, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display.
  • “3D pointing” differs from, e.g., conventional computer mouse pointing techniques which use a surface, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display.
  • An example of a 3D pointing device can be found in U.S. Pat. No. 7,118,518 to Matthew G. Liberty.
  • the motion data is transferred between the 3D pointing device and the host system.
  • the motion data transfer may be performed by any communication link including wired, radio frequency, ultrasonic and infrared.
  • in absolute pointing, the desired cursor location is the location where the forward vector of the device intersects the plane of the display. If the 3D pointer were a laser pointer, this cursor location would be the location of the projected laser dot.
  • in absolute pointing, the pointing resolution for angular motions varies with linear position. The further the user is from the display, the finer the angular motion required to target objects. The angular motion resolution also varies with the off-axis angle. When the user is to the side of the display, smaller angular motions are required than when the user is at the same distance directly in front of the display. This variation in pointing resolution yields an inconsistent user experience.
  • When part of a TV remote control, this variation causes inconsistent behavior between homes and even between seats in the same home. Absolute pointing is, however, normally repeatable and time invariant: if the 3D pointer is placed in the same position, then the cursor will return to the same position. Absolute pointing may also be non-calibrated and referenced to an initial starting position.
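
To make the resolution variation concrete, consider a worked example with illustrative numbers (not taken from the patent). For on-axis absolute pointing, reaching a cursor offset of y on the display from a distance d requires a device yaw of

$$\theta = \tan^{-1}\!\left(\frac{y}{d}\right).$$

Sweeping half of a 1 m-wide display (y = 0.5 m) therefore takes about 26.6° of rotation at d = 1 m but only about 9.5° at d = 3 m, so the same targeting task demands roughly three times finer angular control from across the room.
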
  • Relative pointing allows for non-linear processing including pointer ballistics which can dramatically improve pointing performance.
  • Pointer ballistics are described, for example, at http://www.microsoft.com/whdc/archive/pointer-bal.mspx.
  • Relative pointing often bounds cursor motion to the display bounds and discards any motion beyond the display bounds. While this allows users to relax and find a comfortable position, some applications benefit from a fixed mapping between device position and cursor location.
  • absolute pointing refers to solutions that have characteristics most similar to true absolute pointing
  • relative pointing refers to solutions that have characteristics most similar to true relative pointing.
  • a method for mapping a device's movement into cursor position is described.
  • the device's linear position and angular position are estimated.
  • the estimated linear position and angular position are further processed using both a first mapping algorithm to generate a first cursor location, and a second mapping algorithm to generate a second cursor location.
  • the first cursor location and the second cursor location are combined to generate a final cursor output.
  • Such a technique can be used, for example, to combine the strengths of the two mapping algorithms to provide a more robust user experience associated with, e.g., a user interface in which the cursor is used to interact with various objects.
  • a 3D pointing device includes at least one sensor configured to generate an output which is associated with movement of the 3D pointing device and a processor.
  • the processor is configured to estimate the device's linear position and angular position using the output, to process the estimated linear position and angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location, and to combine the first cursor location and the second cursor location to generate a final cursor output.
  • a system includes a 3D pointing device having at least one sensor configured to generate an output which is associated with movement of said 3D pointing device and a system controller in communication with the 3D pointing device and configured to receive data associated with said output therefrom.
  • At least one of the 3D pointing device and the system controller include a processor for estimating at least one of said device's linear position and angular position using the output, to process at least one of the estimated linear position and angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location, and to combine the first cursor location and the second cursor location to generate a final cursor output.
  • a method for mapping a device's movement into cursor position includes estimating the device's linear position and angular position, and processing the estimated linear position and angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm is an absolute invariant algorithm which has a first characteristic of providing a direct, repeatable mapping from device linear position and angular position into cursor location and a second characteristic that cursor responsiveness to linear motion and angular motion is consistent over a motion range.
  • a method for mapping a device's movement into cursor position includes the steps of estimating at least one of the device's linear position and angular position, and processing at least one of the estimated linear position and angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm creates an intermediate, virtual display that moves to face the device.
  • a method for mapping a device's movement into cursor position includes the steps of estimating the device's angular position, and processing the estimated angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm maps said estimated angular position of the device into cursor coordinates using an angular position spherical projection.
  • FIG. 1 depicts a conventional remote control unit for an entertainment system
  • FIG. 2 depicts an exemplary media system in which exemplary embodiments can be implemented
  • FIG. 3 shows a 3D pointing device according to an exemplary embodiment of the present invention
  • FIG. 4 illustrates a cutaway view of the 3D pointing device in FIG. 3 including angular velocity sensing and linear acceleration sensing;
  • FIG. 5 shows a 3D pointing device according to another exemplary embodiment
  • FIG. 6 depicts the 3D pointing device of FIG. 5 being used as part of a “10 foot” interface according to an exemplary embodiment
  • FIG. 7 shows mapping of device motion into displayed cursor motion according to various exemplary embodiments
  • FIGS. 8-12 illustrate functions associated with mapping of device motion into displayed cursor motion according to exemplary embodiments.
  • FIG. 13 is a flowchart illustrating a method for mapping a device's movement into cursor position according to an exemplary embodiment.
  • an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2 .
  • I/O: input/output
  • the I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components.
  • the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
  • the media system 200 includes a television/monitor 212 , a video cassette recorder (VCR) 214 , digital video disk (DVD) recorder/playback device 216 , audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210 .
  • the VCR 214 , DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together.
  • the media system 200 includes a microphone/speaker system 222 , video camera 224 and a wireless I/O control device 226 .
  • the wireless I/O control device 226 is a 3D pointing device according to one of the exemplary embodiments described below.
  • the wireless I/O control device 226 can communicate with the entertainment system 200 using, e.g., an IR or RF transmitter or transceiver.
  • the I/O control device can be connected to the entertainment system 200 via a wire.
  • the entertainment system 200 also includes a system controller 228 .
  • the system controller 228 operates to store and to display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components.
  • system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210 .
  • system controller 228 in addition to or in place of I/O bus 210 , system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a graphical user interface described below.
  • media system 200 may be configured to receive media items from various media sources and service providers.
  • media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230 , satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 and cable modem 238 (or another source of Internet content).
  • remote devices in accordance with the present invention can be used in conjunction with other systems, for example computer systems including, e.g., a display, a processor and a memory system or with various other systems and applications.
  • remote devices which operate as 3D pointers are of particular interest for the present specification. Such devices enable the translation of movement, e.g., gestures, into commands to a user interface.
  • An exemplary 3D pointing device 400 is depicted in FIG. 3 .
  • user movement of the 3D pointing device can be defined, for example, in terms of a combination of x-axis attitude (roll), y-axis elevation (pitch) and/or z-axis heading (yaw) motion of the 3D pointing device 400 .
  • some exemplary embodiments of the present invention can also measure linear movement of the 3D pointing device 400 along the x, y, and z axes to generate cursor movement or other user interface commands.
  • the 3D pointing device 400 includes two buttons 402 and 404 as well as a scroll wheel 406 , although other exemplary embodiments will include other physical configurations. According to exemplary embodiments of the present invention, it is anticipated that 3D pointing devices 400 will be held by a user in front of a display 408 and that motion of the 3D pointing device 400 will be translated by the 3D pointing device into output which is usable to interact with the information displayed on display 408 , e.g., to move the cursor 410 on the display 408 .
  • rotation of the 3D pointing device 400 about the y-axis can be sensed by the 3D pointing device 400 and translated into an output usable by the system to move cursor 410 along the y 2 axis of the display 408 .
  • rotation of the 3D pointing device 400 about the z-axis can be sensed by the 3D pointing device 400 and translated into an output usable by the system to move cursor 410 along the x 2 axis of the display 408 .
  • the output of 3D pointing device 400 can be used to interact with the display 408 in a number of ways other than (or in addition to) cursor movement, for example it can control cursor fading, volume or media transport (play, pause, fast-forward and rewind).
  • Input commands may include operations in addition to cursor movement, for example, a zoom in or zoom out on a particular region of a display. A cursor may or may not be visible.
  • rotation of the 3D pointing device 400 sensed about the x-axis of 3D pointing device 400 can be used in addition to, or as an alternative to, y-axis and/or z-axis rotation to provide input to a user interface.
  • two axes of angular velocity sensing 420 and 422 and three axes of linear acceleration sensing 424 can be employed as sensors in 3D pointing device 400 as shown in FIG. 4 .
  • this exemplary embodiment employs inertial sensors it will be appreciated that the present invention is not so limited and examples of other types of sensors which can be used in conjunction with other exemplary embodiments are provided below.
  • the rotational sensors 420 , 422 can, for example, be implemented using IDG-500 or IXZ-500 sensors made by Invensense. Alternatively, the embodiment could measure all three axes of angular velocity using elements 420 and 422 implemented as the combination of an IDG-500 and ISZ-500.
  • other types of rotational sensors can be employed as rotational sensors 420 and 422 ; the Invensense sensors are purely used as an illustrative example.
  • these rotational sensors use MEMS technology to provide a resonating mass which is attached to a frame so that it can resonate only along one direction.
  • the resonating mass is displaced when the body to which the sensor is affixed is rotated around the sensor's sensing axis. This displacement can be measured using the Coriolis acceleration effect to determine an angular velocity associated with rotation along the sensing axis.
  • the accelerometer 424 can, for example, be a 3-axis linear accelerometer such as the LIS331DLH produced by STMicroelectronics. However, a 2-axis linear accelerometer could be used by assuming that the device is measuring gravity and mathematically computing the remaining 3rd value. Additionally, the accelerometer(s) and rotational sensor(s) could be packaged together into a single sensor package. Other variations of sensors and sensor packages may also be used in conjunction with these exemplary embodiments.
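
As a concrete illustration of the 2-axis accelerometer option mentioned above, the following sketch recovers the missing axis under the stated gravity-only assumption. It is illustrative only; the function name, the gravity constant and the clamping behavior are assumptions, and the sign of the recovered axis remains ambiguous.

```python
import math

G = 9.81  # assumed gravity magnitude, m/s^2

def third_axis_from_gravity(ax: float, ay: float) -> float:
    """Estimate the magnitude of the missing accelerometer axis, assuming the
    device is at rest so the total measured acceleration equals gravity.
    The sign of the result is ambiguous and must be resolved separately
    (e.g., by assuming the device is held roughly upright)."""
    remainder = G * G - ax * ax - ay * ay
    if remainder < 0.0:
        # Linear motion or noise violated the gravity-only assumption; clamp to zero.
        remainder = 0.0
    return math.sqrt(remainder)

# Example: device tilted so that gravity projects partly onto the x axis.
print(third_axis_from_gravity(3.0, 0.0))  # ~9.34 m/s^2
```
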
  • the 3D pointing device 500 includes a ring-shaped housing 501 , two buttons 502 and 504 as well as a scroll wheel 506 and grip 507 , although other exemplary embodiments may include other physical configurations.
  • the region 508 which includes the two buttons 502 and 504 and scroll wheel 506 is referred to herein as the “control area” 508 , which is disposed on an outer portion of the ring-shaped housing 501 . More details regarding this exemplary embodiment can be found in U.S. patent application Ser. No. 11/480,662, entitled “3D Pointing Devices”, filed on Jul. 3, 2006, the disclosure of which is incorporated here by reference.
  • Such devices have numerous applications including, for example, usage in the so-called “10 foot” interface between a sofa and a television in the typical living room as shown in FIG. 6 .
  • that movement is detected by one or more sensors within 3D pointing device 500 and transmitted to the television 620 (or associated system component, e.g., a set-top box (not shown)).
  • Movement of the 3D pointing device 500 can, for example, be translated or mapped into movement of a cursor 640 displayed on the television 620 (examples of such mappings being provided below) and which is used to interact with a user interface.
  • One challenge faced in implementing exemplary 3D pointing devices 400 in accordance with these exemplary embodiments is to employ components, e.g., rotational sensors 502 and 504 , which are not too costly, while at the same time providing a high degree of correlation between movement of the 3D pointing device 400 , a user's expectation regarding how the user interface will react to that particular movement of the 3D pointing device and actual user interface performance in response to that movement. For example, if the 3D pointing device 400 is not moving, the user will likely expect that the cursor ought not to be drifting across the display.
  • various measurements and calculations are performed, e.g., by the handheld device 400 , which are used to adjust the outputs of one or more of the sensors 420 , 422 and 424 and/or as part of the input used by a processor to determine an appropriate output for the user interface based on the outputs of the sensors 420 , 422 and 424 .
  • a 3D pointing device that a user moves in 6 degrees of freedom can be used to convert that motion into cursor motion.
  • Different applications have different demands and requirements on how the device motion should be mapped into cursor motion.
  • These exemplary embodiments describe novel methods for mapping the device motion into cursor motion which provide improved 3D pointer performance and are configurable to deliver optimal response for each application.
  • exemplary embodiments of the present invention describe new 3D pointer mapping methods and a method for combining alternate mapping methods to provide optimal response for a given application. These can, for example, reduce cursor resolution variation as a function of the user's position relative to the display, which is a primary problem with absolute pointing.
  • the exemplary embodiments can provide a constant mapping between the device position and cursor position, which can be a problem for some relative pointing applications.
  • the device 701 typically contains a collection of sensors, examples of which were described above.
  • the sensing system may consist of one or more sensors including linear accelerometers, angular position sensors (traditional gyroscopes), angular velocity sensors (MEMS gyroscopes), magnetometers, cameras, optical, ultrasonic and RF.
  • the sensing system processes the sensor data to provide an estimate of the device's linear position and angular position.
  • the device's position is then processed by one or more mapping algorithms to yield cursor locations.
  • the cursor locations are then combined to produce a final cursor output.
  • the cursor motion information then drives the cursor on the display 702 .
  • the device 701 may be connected via either wires or wirelessly to the display 702 .
  • the algorithm may run on the device 701 , the display 702 or an intermediate processing unit (not shown in FIG. 7 , e.g., a system console connected to both the device 701 and the display 702 ).
  • the device 701 is a battery-powered, handheld device which contains a 3-axis accelerometer and a 3-axis gyroscope, however it will be appreciated that fewer or other sensors may be included.
  • the device 701 processes the sensor data to estimate its linear position and angular position and further processes the linear position and angular position to produce the cursor motion data.
  • the cursor motion data is communicated to a set-top box, e.g., represented by system controller 228 in FIG. 2 , over a proprietary 2.4 GHz RF link.
  • the data is received by the RF hardware in the set-top box and communicated over a USB bus to the main set-top box processor.
  • the main set-top box processor moves the cursor as specified, in this example, by the device 701 .
  • the set-top box renders the image and sends the image to the display over, e.g., HDMI, component, S-Video, and/or composite outputs.
  • the display receives the image and displays the image to the user.
  • the set-top box or other controller which communicates with the handheld device 701 could perform some or all of the processing, e.g., the estimation of linear position and angular position and/or the mapping of the estimated linear position and angular position into one or more cursor locations.
  • Vectors are assumed to be column vectors (N×1 matrices);
  • x ⁇ y is the cross product of vectors x and y;
  • X y is the matrix multiplication of matrix X and vector y;
  • X T is the matrix transpose
  • x̂ is the unit vector in the direction of x;
  • ⁇ q 0 , v> is the quaternion q with scalar component q 0 and the length 3 vector v;
  • q p is the quaternion multiplication of q and p;
  • b x is the vector x defined in body-frame coordinates
  • u x is the vector x defined in user-frame coordinates
  • Length 2 vector v is assumed to have subcomponents named (v x , v y );
  • Length 3 vector v is assumed to have subcomponents named (v x , v y , v z ).
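
The quaternion notation above can be made concrete with a small helper, used informally by the sketches later in this description. This is a minimal illustration (not code from the patent); the scalar-first convention and the function names are assumptions.

```python
from typing import Tuple

Quat = Tuple[float, float, float, float]   # (q0, qx, qy, qz), scalar component first
Vec3 = Tuple[float, float, float]

def quat_mult(q: Quat, p: Quat) -> Quat:
    """Quaternion product q p (Hamilton product, scalar-first convention)."""
    q0, q1, q2, q3 = q
    p0, p1, p2, p3 = p
    return (q0*p0 - q1*p1 - q2*p2 - q3*p3,
            q0*p1 + q1*p0 + q2*p3 - q3*p2,
            q0*p2 - q1*p3 + q2*p0 + q3*p1,
            q0*p3 + q1*p2 - q2*p1 + q3*p0)

def quat_conj(q: Quat) -> Quat:
    """Quaternion conjugate q*."""
    return (q[0], -q[1], -q[2], -q[3])

def rotate_body_to_user(q: Quat, v: Vec3) -> Vec3:
    """Rotate a body-frame vector v into user-frame coordinates: Vector(q <0, v> q*)."""
    out = quat_mult(quat_mult(q, (0.0,) + v), quat_conj(q))
    return out[1:]
```
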
  • a first coordinate system i.e., the user-frame coordinate system
  • the user-frame coordinate system is arbitrarily chosen in this example to be the center of the display and consists of (x, y, z).
  • the user-frame coordinate system is stationary with respect to the display.
  • the coordinate system has x into the display, y to the right of the display and z down, which corresponds to a typical aerospace coordinate system convention.
  • alternate conventions could be used instead and include, for example, PC display coordinates (x right, y down, z into display) and HID (x out of display, y right, z down).
  • the user-frame axes will be arbitrarily defined as:
  • the second coordinate system in this exemplary embodiment is the device's body-frame coordinate system.
  • the body-frame coordinate system is stationary with respect to the device.
  • the body-frame origin is typically at the center of the device, although that is not required.
  • the body-frame axes are shown as (x°, y°, z°) with x° out the front of the device, y° to the right, and z° down.
  • the body-frame axes are arbitrarily defined as:
  • body-frame coordinate system axes and origin can be chosen without materially altering the present invention.
  • the discussion above assumes a Cartesian coordinate system, but other coordinate systems, such as spherical coordinates, could also be used without affecting the invention.
  • the length 3 vector ou is the origin of the user-frame coordinate system and is defined to be fixed to the display.
  • u ou is defined as (0, 0, 0) for the present embodiment.
  • the length 3 vector ob is the origin of the body-frame coordinate system and is defined to be fixed to the device.
  • b ob is defined as (0, 0, 0) for the present embodiment.
  • the length 3 vector od is the origin of the display coordinate system and is defined to be fixed to the display.
  • u od is defined as (0, c x , c y ) for the present embodiment.
  • the length 3 vector u is defined as the device's linear position in 3-D space; u is equivalent to u ob, the body-frame origin expressed in user-frame coordinates.
  • the quaternion q is defined as the device's angular position with respect to user-frame coordinates.
  • any desired alternate angular position representation could also be used including, for example, Euler angles, direction cosine matrix (DCM), and vector/angle.
  • the length 2 vector p is the pointing device's cursor location on the display given as 2-D display coordinates.
  • the length 2 vector p 0 is the cursor location of the display coordinate system.
  • the length 3 vector ⁇ is the angular velocity.
  • u ⁇ is the angular velocity in user-frame coordinates
  • b ⁇ is the angular velocity in the device's body-frame coordinates.
  • mapping device motion into cursor location using absolute pointing will first be considered.
  • the cursor in an absolute pointing system should be located at the intersection of the display and the line containing the device's x° axis. For example, if a laser pointer were appropriately mounted to the device 701 , the cursor location and the laser pointer's dot on the display 702 should be at the same position.
  • the full mathematical equation for absolute pointing given user-frame coordinates is:
  • Absolute pointing is especially suitable for shooting games where the user aims directly at targets on the display.
  • Absolute pointing is analogous to how humans point our arms, hands and fingers to physical objects in our environment.
  • the ratio of cursor motion to angular device motion changes as a function of distance. The further the device is from the display, the smaller the angular motions required to move the cursor a given distance on the display. Likewise, the further the device is from the x axis, the smaller the angular motions required to move the cursor.
  • an absolute pointing algorithm maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which the cursor is to be displayed.
  • the term u u x (the x component of the device's user-frame position, i.e., its distance along the display normal) corresponds to a distance correction factor.
  • This distance correction factor is described by U.S. Pat. No. 5,627,565 entitled “Space coordinates detecting device and input apparatus using same”, the disclosure of which is incorporated here by reference.
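
The absolute pointing mapping described above (forward axis intersected with the display plane) can be sketched as follows. This is an illustration under assumed conventions (user-frame origin at the display center, x into the display, display plane at x = 0, scalar-first quaternion), not the patent's exact equation.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate body-frame vector v into the user frame with unit quaternion q = (q0, qx, qy, qz)."""
    q0, qv = q[0], np.asarray(q[1:], dtype=float)
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(qv, np.cross(qv, v) + q0 * v)

def absolute_pointing(u, q, p0=(0.0, 0.0)):
    """Cursor location where the device's forward (body x) axis intersects the display plane x = 0.
    u:  device linear position in user-frame coordinates (x into display, y right, z down).
    q:  device angular position as a unit quaternion relative to the user frame.
    p0: cursor coordinates of the user-frame origin on the display (illustrative offset)."""
    fwd = quat_rotate(q, (1.0, 0.0, 0.0))       # device forward axis, user frame
    if fwd[0] <= 0.0:
        return None                             # not pointing toward the display
    t = -u[0] / fwd[0]                          # ray parameter where the ray reaches x = 0
    hit = np.asarray(u, dtype=float) + t * fwd  # intersection point, user-frame coordinates
    return (p0[0] + hit[1], p0[1] + hit[2])     # display y -> cursor x, display z -> cursor y
```
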
  • relative pointing or “body-frame” relative pointing will be considered.
  • Some applications do not require direct pointing at a display or may not have any meaningful frame of reference. For such cases, the application may choose to use the device itself as the primary reference. As long as u x° x (the component of the device's forward axis along the user-frame x axis) is positive, regardless of the position of the display or the position of the device, if the user moves the device linearly along the y° axis or rotates around the z° axis, the cursor will always move to the right.
  • body-frame relative pointing is mathematically defined by:
  • $\Delta p \propto \begin{bmatrix} {}^{b}\omega_{z} \\ -{}^{b}\omega_{y} \end{bmatrix}\Delta t \;-\; \begin{bmatrix} \Delta\,{}^{b}ou_{y} \\ \Delta\,{}^{b}ou_{z} \end{bmatrix}$
  • Relative pointing is illustrated functionally in FIG. 8 .
  • the amount of cursor motion is not affected by the position of the device 800 .
  • the forward direction of the device and the cursor location are not deterministic and may vary over time. For some applications, this decoupling is beneficial.
  • the actual cursor location may be computed using a non-linear function of Δp, referred to as vPointer in FIG. 8 . This non-linear function is often called pointer ballistics 802 .
  • Traditional 2-D computer mice use pointer ballistics 802 to improve their apparent usability and performance, and this function can also be used in 3D pointer mapping.
  • the output of the pointer ballistics 802 , vSys is then used to move the cursor on screen subject to boundary conditions, etc., in block 804 .
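
A minimal sketch of the FIG. 8 pipeline (body-frame relative pointing, then a ballistics curve, then bounding to the display) follows. The gain values and the shape of the ballistics curve are illustrative assumptions, not values from the patent.

```python
import math

def vpointer_body_relative(omega_body, dt, gain=500.0):
    """Body-frame relative pointing delta: yaw rate (about body z, down) moves the cursor
    right; pitch rate (about body y, right; positive pitch is nose-up) moves it up.
    gain is an illustrative pixels-per-radian factor."""
    _, wy, wz = omega_body
    return (gain * wz * dt, -gain * wy * dt)

def ballistics(dp, slow_gain=0.5, fast_gain=2.0, knee=20.0):
    """An illustrative pointer-ballistics curve: slow motions are damped for precision,
    fast motions are amplified for reach. Real curves are tuned per application."""
    mag = math.hypot(dp[0], dp[1])
    if mag == 0.0:
        return (0.0, 0.0)
    g = slow_gain + (fast_gain - slow_gain) * min(mag / knee, 1.0)
    return (dp[0] * g, dp[1] * g)

def step_cursor(p, omega_body, dt, width=1920, height=1080):
    """One update of the pipeline: vPointer -> ballistics -> vSys -> bound to the display."""
    vx, vy = ballistics(vpointer_body_relative(omega_body, dt))
    return (min(max(p[0] + vx, 0.0), width - 1.0),
            min(max(p[1] + vy, 0.0), height - 1.0))
```
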
  • body-frame relative pointing offers some advantages, users often do not care about the position of the device, only the relative movements that they make. Thus, a third type of mapping which is considered here is referred to as “user-frame relative pointing”. In user-frame relative pointing, if the user rotates the device around the user-frame z-axis then the cursor should move to the right. If the user linearly moves the device along the user-frame z-axis, then the cursor should move down.
  • b z (the user-frame z-axis, i.e., the gravity direction, expressed in body-frame coordinates) is measured by an accelerometer.
  • the amount of cursor motion is not affected by the position of the device.
  • the forward direction of the device and the cursor location are not deterministic and may vary over time. For some applications, this decoupling is beneficial.
  • the actual cursor location may be computed using a non-linear function of ⁇ p, often called pointer ballistics.
  • Traditional 2-D computer mice use pointer ballistics to improve their apparent usability and performance.
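
User-frame relative pointing can be sketched by rotating the measured body-frame angular velocity into the user frame with the orientation estimate and then applying the same yaw/pitch-to-cursor mapping. The rotation helper and the gain are illustrative assumptions.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate body-frame vector v into the user frame with unit quaternion q = (q0, qx, qy, qz)."""
    q0, qv = q[0], np.asarray(q[1:], dtype=float)
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(qv, np.cross(qv, v) + q0 * v)

def delta_p_user_frame(q, omega_body, dt, gain=500.0):
    """User-frame relative pointing delta: rotation about the user-frame z axis (down)
    moves the cursor right; rotation about the user-frame y axis (right) moves it up
    (cursor y is taken as down-positive). gain is an illustrative pixels-per-radian factor."""
    w_user = quat_rotate(q, omega_body)   # angular velocity expressed in user-frame coordinates
    return (gain * w_user[2] * dt, -gain * w_user[1] * dt)
```
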
  • Absolute invariant pointing blends many of the relative pointing benefits with the absolute pointing benefits while minimizing the negative factors of each method.
  • Absolute invariant pointing can be mathematically defined by:
  • ${}^{u}x^{\circ} \leftarrow \mathrm{Vector}\!\left(q\,\langle 0,\ {}^{b}x^{\circ}\rangle\, q^{*}\right)$
  • $p \approx c\begin{bmatrix} \tan^{-1}\!\left({}^{u}x^{\circ}_{y} / {}^{u}x^{\circ}_{x}\right) \\ \tan^{-1}\!\left({}^{u}x^{\circ}_{z} / {}^{u}x^{\circ}_{x}\right) \end{bmatrix} + \begin{bmatrix} {}^{u}u_{y} \\ {}^{u}u_{z} \end{bmatrix} + p_{0} \;\approx\; c\begin{bmatrix} \sin^{-1}\!\left({}^{u}x^{\circ}_{y}\right) \\ \sin^{-1}\!\left({}^{u}x^{\circ}_{z}\right) \end{bmatrix} + \begin{bmatrix} {}^{u}u_{y} \\ {}^{u}u_{z} \end{bmatrix} + p_{0}$
  • Absolute invariant pointing, like absolute pointing per se described above, contains a direct, repeatable mapping from device linear position and angular position into cursor location. At the same time, the cursor responsiveness to linear motion and angular motion is consistent over the motion range. If the device is located to the left or right of display center, then users typically attempt to move the device left or right relative to the vector from the device to the display, not the display normal vector, x. This results in an apparent loss of linear motion resolution as the device moves away from the x-axis. Thus, an absolute invariant mapping algorithm according to this exemplary embodiment generates cursor position as a sum of a term of linear position values and a term computed from angular position that is independent of linear position.
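
A sketch of the absolute invariant mapping reconstructed above: the angular term depends only on the forward-axis direction, and the linear term is simply the device's user-frame y and z position, so neither term's sensitivity changes with distance from the display. The constants c and k (and the use of atan2) are illustrative assumptions.

```python
import math
import numpy as np

def quat_rotate(q, v):
    """Rotate body-frame vector v into the user frame with unit quaternion q = (q0, qx, qy, qz)."""
    q0, qv = q[0], np.asarray(q[1:], dtype=float)
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(qv, np.cross(qv, v) + q0 * v)

def absolute_invariant_pointing(u, q, c=500.0, k=1000.0, p0=(0.0, 0.0)):
    """Absolute invariant pointing: cursor = angular term (independent of linear position)
    + linear term + offset.  c (cursor units per radian) and k (cursor units per meter,
    folded in here for unit consistency) are illustrative scale factors."""
    fwd = quat_rotate(q, (1.0, 0.0, 0.0))        # device forward axis in user-frame coordinates
    angular = (c * math.atan2(fwd[1], fwd[0]),   # yaw-like component -> cursor x
               c * math.atan2(fwd[2], fwd[0]))   # pitch-like component -> cursor y
    linear = (k * u[1], k * u[2])                # user-frame y, z of the device position
    return (angular[0] + linear[0] + p0[0], angular[1] + linear[1] + p0[1])
```
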
  • a fifth mapping technique called “virtual display” reduces the loss of apparent linear resolution found with “absolute invariant pointing” as the device moves off-axis.
  • the “virtual display” technique may create an intermediate, virtual display that moves to face the device. By moving to face the device, the virtual display maintains the same resolution as if the device was directly in front of the display. A full virtual display moves to directly face the device.
  • a new virtual display coordinate system is created by construction with axes x̃, ỹ, and z̃. In mathematical terms,
  • the new coordinate system will preferably, but not as a requirement, maintain the y-axis as “horizontal” and the z-axis as “vertical”.
  • the remaining axes will then be:
  • a scaling factor can be applied to construct u x̃ between u x and u û. For example,
  • c is a value between 0 and 1, inclusive.
  • the virtual display can be applied to the absolute pointing algorithm to create a planar virtual display:
  • the virtual display method can also be applied to any absolute or pseudo-absolute mapping method including the absolute pointing and absolute invariant pointing described above. If the cursor resides on a non-planar display, then this method could easily be adapted to create a virtual display of the non-planar display.
  • the intermediate, planar virtual display maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which the cursor is to be displayed rotated to at least partially face the device.
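
The planar virtual display can be sketched as follows for the full virtual display case (c = 1): the display plane is rotated about the display origin so its normal points along the device-to-display direction, its ỹ axis is kept horizontal, and the device's forward ray is intersected with that rotated plane. The basis construction and sign conventions here are assumptions consistent with the description above, not the patent's exact equations.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate body-frame vector v into the user frame with unit quaternion q = (q0, qx, qy, qz)."""
    q0, qv = q[0], np.asarray(q[1:], dtype=float)
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(qv, np.cross(qv, v) + q0 * v)

def virtual_display_pointing(u, q, p0=(0.0, 0.0)):
    """Planar virtual display (fully facing the device): intersect the forward ray with a
    plane through the display origin whose normal points from the device toward the origin,
    keeping the virtual y axis horizontal and the virtual z axis (roughly) down."""
    u = np.asarray(u, dtype=float)
    n = -u / np.linalg.norm(u)             # virtual display normal (device -> display origin)
    z_world = np.array([0.0, 0.0, 1.0])    # user-frame "down"
    y_virt = np.cross(z_world, n)          # horizontal virtual axis (degenerate if the device
    y_virt /= np.linalg.norm(y_virt)       # is directly above/below the display; not handled here)
    z_virt = np.cross(n, y_virt)           # completes a right-handed virtual frame
    fwd = quat_rotate(q, (1.0, 0.0, 0.0))  # device forward axis, user frame
    denom = float(np.dot(fwd, n))
    if denom <= 0.0:
        return None                        # pointing away from the virtual display
    t = -float(np.dot(u, n)) / denom       # ray/plane intersection parameter
    hit = u + t * fwd
    return (p0[0] + float(np.dot(hit, y_virt)), p0[1] + float(np.dot(hit, z_virt)))
```
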
  • angular response is more important than linear response.
  • the virtual display method does not have consistent angular response.
  • the angular responsiveness is similar to absolute pointing, not relative pointing.
  • a sixth mapping technique called “virtual spherical display” maintains constant angular response, unlike the “virtual display”. For example, if a conversion to polar coordinates is first performed, i.e.:
  • the spherical virtual display maps device motion to cursor location based upon spherical coordinates of the estimated angular position being transformed into the virtual display coordinates by a transformation matrix and converted into cursor location.
  • the spherical virtual display method can be applied to relative pointing to create a seventh mapping technique called “relative spherical virtual display”. Instead of controlling the cursor using angles, this method uses the change in the angle to drive a change in the cursor.
  • the intermediate, relative spherical virtual display maps device motion to cursor location based upon body-frame angular velocity modified by a transformation matrix and converted into a change in cursor location.
  • the angular position can also be mapped into cursor coordinates using a spherical projection.
  • This eighth mapping method is known as “angular position spherical projection”.
  • T is a 3×3 general transformation matrix of arbitrary constants.
  • the matrix T may apply any combination of a scale, rotation, translation, shearing, reflection, orthogonal projection, affine transformation or perspective projection.
  • the third (homogeneous) component of p can be scaled so that it becomes one.
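
The angular position spherical projection can be sketched as: quaternion → azimuth/elevation of the forward axis → homogeneous multiplication by T → scaling so the third component becomes one. The example T below (a pure radians-to-pixels scale plus a screen-center offset) is illustrative; in general T may also shear, rotate, reflect or apply a perspective term.

```python
import math
import numpy as np

def azimuth_elevation(q):
    """Azimuth/elevation (radians) of the device's forward axis in the user frame,
    from a unit quaternion q = (q0, qx, qy, qz), scalar first; elevation is down-positive."""
    q0, qx, qy, qz = q
    fx = 1.0 - 2.0 * (qy * qy + qz * qz)   # forward (body x) axis rotated into the user frame
    fy = 2.0 * (qx * qy + q0 * qz)
    fz = 2.0 * (qx * qz - q0 * qy)
    return math.atan2(fy, fx), math.atan2(fz, math.hypot(fx, fy))

def spherical_projection(q, T):
    """Angular position spherical projection: cursor = normalize(T @ [azimuth, elevation, 1])."""
    az, el = azimuth_elevation(q)
    h = np.asarray(T, dtype=float) @ np.array([az, el, 1.0])
    return (h[0] / h[2], h[1] / h[2])      # scale so the homogeneous component becomes one

# Illustrative map: ~600 pixels per radian about the center of a 1920x1080 display.
T_example = np.array([[600.0,   0.0, 960.0],
                      [  0.0, 600.0, 540.0],
                      [  0.0,   0.0,   1.0]])
```
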
  • the outputs of all or some of the absolute methods described above can be joined by simple linear combination or a more complicated non-linear combination.
  • a combination of the above described “absolute pointing mapping” with the “invariant absolute pointing mapping” can be performed.
  • Exemplary embodiments can, for example, use a simple linear combination (or a more complicated non-linear process) to combine any two or more of the described methods. For each method, p i , an application assigns a weighting factor c i . The final resulting p is then $p = \sum_{i} c_{i}\, p_{i}$.
  • the outputs of all or some of the relative methods described above can be joined by simple linear combination or a more complicated non-linear combination.
  • An embodiment could use an equal linear combination of “user-frame relative pointing” with “relative spherical virtual display”, which would reduce the flaws of each method by half.
  • the exemplary embodiment uses a simple linear combination to combine the methods. For each method, Δp i , an application assigns a weighting factor c i . The final resulting Δp is then:
  • $\Delta p = \sum_{i} c_{i}\, \Delta p_{i}$
  • Combining absolute pointing methods and relative pointing methods is also considered according to these exemplary embodiments.
  • One method in which combined absolute and relative pointing mappings can be employed is to have the cursor primarily controlled through relative pointing, but use absolute pointing to adjust the cursor movement so as to avoid long-term drift from the reference. Avoiding long-term drift would eliminate the periodic re-centering of the cursor that is typical of relative pointing solutions.
  • the mapping between 3D pointer position and cursor position is time-varying and dependent upon the position and range of motion, and also the speed of movement.
  • An example of such an embodiment is illustrated in FIG. 9 .
  • the “user-frame relative pointing” mapping is combined with the “angular position spherical projection”, both of which mappings are individually described above.
  • the pointing device 900 outputs, on the lower branch 902 , vPointer data which is the user-frame relative pointing data described above with respect to FIG. 8 .
  • the pointing device 900 outputs angular position which is used by the absolute pointing mapping algorithm as an input.
  • a map is created in block 906 which, in this exemplary embodiment, includes a 3×3 general transformation matrix T which can perform a variety of transformations on the output angular position, e.g., scale (stretch in any of the axes), rotation (preserve orthogonality), shearing (essentially make axes non-orthogonal), translation (offset to account for 2D application of data), reflection, and any other affine transformation or perspective projection.
  • the map also defines an origin value (quaternion). A detailed example of how the map can be created is shown in FIG. 10 .
  • the angular position output from the device is first rotated to map the measured angular position of the device to account for deviations from the nominal origin.
  • the angular position is converted to spherical coordinates at block 1004 .
  • the current sample is evaluated to determine a weight for the sample at block 1006 .
  • the weights capture how important each previous point is to defining the current map between where the cursor is currently located on the screen and the angular position of the pointing device 900 .
  • the weights assist with determining which data points are worth saving and can be used as part of the least squares solution to find the map.
  • the primary weight is applied to each sample based upon the absolute angular position of the pointing device.
  • the full range of motion of the device is divided into a fixed set of small regions.
  • the first data point in each region gets the largest weight, and every future point within that region gets a smaller weight.
  • a secondary weight based on the current angular velocity is applied so that points where the device is at rest are more important than for points where the device is in motion.
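
A sketch of this two-stage weighting: a primary weight that decays as more samples fall into the same angular region, and a secondary weight that favors samples collected while the device is nearly at rest. The region size and decay constants are illustrative assumptions.

```python
import math
from collections import defaultdict

REGION_DEG = 5.0                       # angular region size (illustrative)
_region_counts = defaultdict(int)      # samples seen so far per region

def sample_weight(azimuth, elevation, angular_speed):
    """Weight for one (cursor position, angular position) calibration sample.
    azimuth/elevation in radians, angular_speed in rad/s."""
    region = (int(math.degrees(azimuth) // REGION_DEG),
              int(math.degrees(elevation) // REGION_DEG))
    _region_counts[region] += 1
    primary = 1.0 / _region_counts[region]          # first sample in a region counts most
    secondary = 1.0 / (1.0 + 10.0 * angular_speed)  # samples taken at rest count more
    return primary * secondary
```
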
  • the best N samples of cursor position, angular position, and weighted samples are saved and used for map calculation at blocks 1008 , 1010 and 1012 , respectively.
  • the map is created at block 1014 by first calculating the rotation origin to ensure that the input data remains within a reasonable range for conversion to spherical coordinates.
  • The goal is to find the 3×3 general transformation matrix T that will transform the set of azimuth/elevation pairs onto the set of cursor positions.
  • One method of finding T is to define the error vector v n for each of the N saved samples as:
  • $v_{n} \equiv T\begin{bmatrix} \theta_{n} \\ \phi_{n} \\ 1 \end{bmatrix} - \begin{bmatrix} x_{n} \\ y_{n} \\ 1 \end{bmatrix}$
  • The map T is then chosen to minimize the weighted sum of squared errors, $\sum_{n} w_{n} \lVert v_{n} \rVert^{2}$, where w n is the weight for each sample.
  • many methods exist for finding a solution to this linear least squares problem including inverting the normal equations using the Moore-Penrose pseudoinverse and orthogonal decomposition methods such as QR decomposition or singular value decomposition. QR decomposition is used in the exemplary implementation.
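
A compact sketch of this weighted linear least-squares fit for T using NumPy (np.linalg.lstsq, which internally uses an orthogonal decomposition); the exemplary implementation described above uses QR decomposition, and an embedded implementation might instead solve the problem incrementally.

```python
import numpy as np

def fit_map(az_el, cursor_xy, weights):
    """Fit the 3x3 map T minimizing sum_n w_n * || T [az_n, el_n, 1]^T - [x_n, y_n, 1]^T ||^2.
    az_el: (N, 2) azimuth/elevation samples; cursor_xy: (N, 2) cursor samples; weights: (N,)."""
    n = len(weights)
    A = np.hstack([np.asarray(az_el, dtype=float), np.ones((n, 1))])      # (N, 3) inputs
    B = np.hstack([np.asarray(cursor_xy, dtype=float), np.ones((n, 1))])  # (N, 3) targets
    sw = np.sqrt(np.asarray(weights, dtype=float))[:, None]
    # Weighted least squares: solve (sw*A) X = (sw*B); T is the transpose of X.
    X, *_ = np.linalg.lstsq(sw * A, sw * B, rcond=None)
    return X.T
```
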
  • Once a map is defined by block 906 , it is used to map the current 3D pointer position to display coordinates to generate a reference cursor location pRef in block 908 . More details regarding this exemplary process are illustrated in FIG. 11 .
  • the angular position of the device 900 is rotated in the same way as described above with respect to block 1002 , and the output is translated into spherical coordinates by block 1104 to create the 2×1 vector called Az/El. This is then converted to homogeneous coordinates and then multiplied with the map matrix T at block 1106 , yielding pRef.
  • the pRef value thus represents the desired location for the cursor based upon the absolute pointing system component of this exemplary, combined mapping.
  • the relative pointing value vSys, the absolute pointing value pRef and the current cursor position p 0 are input to a dynamic ballistics function 910 .
  • the dynamic ballistics function 910 takes these inputs and, in this exemplary embodiment, uses the absolute pointing value and the current cursor position to adjust the relative pointing value. More specifically, as shown in FIG. 12 , the current and reference cursor positions are used to adjust the cursor movement before it is applied to the cursor.
  • One method of adjusting the cursor is to perform small adjustments to the scale and angle of the current velocity vector so that the new cursor position will be closer to the reference point.
  • the current point is subtracted from the reference point to get the reference velocity vector vRef at block 1202 .
  • This reference vector is compared to the original velocity v 0 to find the angle between the vectors at block 1204 .
  • This angle is limited to a fixed maximum, at block 1206 , and then used to rotate the vector v 0 to create vRot as shown in block 1208 of FIG. 12 .
  • the reference vector is projected onto vRot to find the scaling of vRot that would get the next point closest to the reference point as shown in block 1210 .
  • This scaling is limited between a maximum and minimum value (block 1212 ) and then applied to vRot at block 1214 .
  • the limits on maximum angle and maximum scale can be tuned to control how much correction will be applied.
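
A sketch of the FIG. 12 dynamic ballistics correction: the relative velocity vector is rotated toward the reference direction by a bounded angle and then rescaled within bounds so that the next cursor position lands closer to pRef. The limit values are illustrative tuning constants.

```python
import math

MAX_ANGLE = math.radians(5.0)     # maximum per-update rotation of the velocity vector
MIN_SCALE, MAX_SCALE = 0.8, 1.2   # bounds on the per-update rescaling

def dynamic_ballistics(p0, p_ref, v0):
    """Adjust the relative cursor velocity v0 so the cursor drifts toward p_ref."""
    v_ref = (p_ref[0] - p0[0], p_ref[1] - p0[1])              # block 1202: reference velocity
    if math.hypot(*v0) == 0.0 or math.hypot(*v_ref) == 0.0:
        return v0
    ang = math.atan2(v_ref[1], v_ref[0]) - math.atan2(v0[1], v0[0])  # block 1204: angle between
    ang = math.atan2(math.sin(ang), math.cos(ang))            # wrap to [-pi, pi]
    ang = max(-MAX_ANGLE, min(MAX_ANGLE, ang))                # block 1206: limit the angle
    c, s = math.cos(ang), math.sin(ang)
    v_rot = (c * v0[0] - s * v0[1], s * v0[0] + c * v0[1])    # block 1208: rotate v0
    scale = (v_ref[0] * v_rot[0] + v_ref[1] * v_rot[1]) / (v_rot[0] ** 2 + v_rot[1] ** 2)  # 1210
    scale = max(MIN_SCALE, min(MAX_SCALE, scale))             # block 1212: limit the scale
    return (v_rot[0] * scale, v_rot[1] * scale)               # block 1214: corrected velocity
```
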
  • Numerous variants and alternatives of the foregoing mapping techniques are contemplated.
  • the combination need not be limited to full mapping methods.
  • Each term of each mapping method could be assigned its own weighting factor.
  • a method for mapping a device's movement into cursor position can include the steps illustrated in the flowchart of FIG. 13 .
  • at step 1300 at least one of a 3D pointing device's linear position and angular position can be estimated (or sensed, measured, detected, etc.).
  • exemplary mapping algorithms described above may use only a device's linear position, only a device's angular position or both its linear position and angular position as inputs to the mapping algorithms.
  • at step 1302 at least one of the estimated linear position and estimated angular position are processed using both a first mapping algorithm, to generate a first cursor location, and a second mapping algorithm, to generate a second cursor location.
  • the results are combined at step 1304 to generate a final cursor output.
  • the mapping algorithms may operate on partial or incomplete motion data. For example, user-frame relative pointing is useful with only one of the two terms. Some applications can utilize user-frame relative pointing while collecting either angular motion or linear motion, but not both.
  • Sensor(s) may collect and estimate motion in either user-frame coordinates, body-frame coordinates or a combination of user-frame and body-frame coordinates.
  • the mapping algorithm may operate in either the body-frame coordinate system, user-frame coordinate system, or any other coordinate system. Motion may be measured in any coordinate system including Cartesian and spherical.
  • the mapping algorithm may use derivatives of linear position including velocity and acceleration.
  • the mapping algorithm may use derivatives of angular position including velocity and acceleration.
  • the mapping combination method may be trivial and only use data from one mapping method. The factors for other algorithms may be 0. Mathematical terms with 0 valued coefficients need not be computed or appear in the final implementation.
  • Systems and methods for processing data according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable mediums such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hard-wire circuitry may be used in place of or in combination with software instructions to implement the present invention.
  • Such software may run on a processor which is housed within the device, e.g., a 3D pointing device or other device, which contains the sensors or the software may run on a processor or computer housed within another device, e.g., a system controller, a game console, a personal computer, etc., which is in communication with the device containing the sensors.
  • data may be transferred via wireline or wirelessly between the device containing the sensors and the device containing the processor which runs the software which performs the pointer mapping as described above.
  • some of the processing described above with respect to pointer mapping may be performed in the device containing the sensors, while the remainder of the processing is performed in a second device after receipt of the partially processed data from the device containing the sensors.
  • While the foregoing pointer mapping techniques have been described with respect to sensing packages including one or more rotational sensors and an accelerometer, pointer mapping techniques are not limited to only these types of sensors. Instead, pointer mapping techniques as described herein can be applied to devices which include, for example, only accelerometer(s); optical and inertial sensors (e.g., a rotational sensor, a gyroscope, an angular velocity sensor or a linear accelerometer); a magnetometer and an inertial sensor (e.g., a rotational sensor, a gyroscope or a linear accelerometer); a magnetometer and an optical sensor; or other sensor combinations.
  • Although the exemplary embodiments described herein relate to cursor mapping in the context of 3D pointing devices and applications, such techniques are not so limited and may be employed in methods and devices associated with other applications, e.g., medical applications, gaming, cameras, military applications, etc.

Abstract

Systems, devices, methods and software are described for mapping movement or motion of a 3D pointing device into cursor position, e.g., for use in rendering the cursor on a display. Absolute and relative type mapping algorithms are described. Mapping algorithms can be combined to obtain beneficial characteristics from different types of mapping.

Description

    RELATED APPLICATION
  • This application is related to, and claims priority from, U.S. Provisional Patent Application Ser. No. 61/077,238, entitled “3D Pointer Mapping”, filed on Jul. 1, 2008, the disclosure of which is incorporated here by reference.
  • BACKGROUND
  • The present invention describes mapping techniques, systems, software and devices, which can be used in 3D pointing devices, as well as in other types of devices.
  • Technologies associated with the communication of information have evolved rapidly over the last several decades. Television, cellular telephony, the Internet and optical communication techniques (to name just a few things) combine to inundate consumers with available information and entertainment options. Taking television as an example, the last three decades have seen the introduction of cable television service, satellite television service, pay-per-view movies and video-on-demand. Whereas television viewers of the 1960s could typically receive perhaps four or five over-the-air TV channels on their television sets, today's TV watchers have the opportunity to select from hundreds, thousands, and potentially millions of channels of shows and information. Video-on-demand technology, currently used primarily in hotels and the like, provides the potential for in-home entertainment selection from among thousands of movie titles.
  • The technological ability to provide so much information and content to end users provides both opportunities and challenges to system designers and service providers. One challenge is that while end users typically prefer having more choices rather than fewer, this preference is counterweighted by their desire that the selection process be both fast and simple. Unfortunately, the development of the systems and interfaces by which end users access media items has resulted in selection processes which are neither fast nor simple. Consider again the example of television programs. When television was in its infancy, determining which program to watch was a relatively simple process primarily due to the small number of choices. One would consult a printed guide which was formatted, for example, as series of columns and rows which showed the correspondence between (1) nearby television channels, (2) programs being transmitted on those channels and (3) date and time. The television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as “channel surfing” whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
  • Despite the fact that the number of channels and amount of viewable content has dramatically increased, the generally available user interface, control device options and frameworks for televisions has not changed much over the last 30 years. Printed guides are still the most prevalent mechanism for conveying programming information. The multiple button remote control with up and down arrows is still the most prevalent channel/content selection mechanism. The reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects. Thus, the number of rows in the printed guides has been increased to accommodate more channels. The number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in FIG. 1. However, this approach has significantly increased both the time required for a viewer to review the available information and the complexity of actions required to implement a selection. Arguably, the cumbersome nature of the existing interface has hampered commercial implementation of some services, e.g., video-on-demand, since consumers are resistant to new services that will add complexity to an interface that they view as already too slow and complex.
  • In addition to increases in bandwidth and content, the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of separate components. An example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue with an end result that potentially all of the communication devices currently found in the household will be packaged together as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who continue to buy separate components will likely desire seamless control of, and interworking between, the separate components. With this increased aggregation comes the potential for more complexity in the user interface. For example, when so-called “universal” remote units were introduced, e.g., to combine the functionality of TV remote units and VCR remote units, the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding “soft” buttons that can be programmed with the expert commands. These soft buttons sometimes have accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons. In these “moded” universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device and forcing the user to look at the remote to make sure that it is in the right mode, and it does not provide any simplification to the integration of multiple devices. The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
  • Some attempts have also been made to modernize the display interface between end users and media systems. However, these attempts typically suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items. For example, interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items. Interfaces which rely on hierarchical navigation (e.g., tree structures) may be speedier to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items. Additionally, users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure. For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies. When selection skipping controls are available such as page up and page down, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist. Accordingly, organizing frameworks, techniques and systems which simplify the control and display interface between users and media systems as well as accelerate the selection process, while at the same time permitting service providers to take advantage of the increases in available bandwidth to end user equipment by facilitating the supply of a large number of media items and new services to the user, have been proposed in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”, the disclosure of which is incorporated here by reference.
  • Of particular interest for this specification are the remote devices usable to interact with such frameworks, as well as other applications and systems. As mentioned in the above-incorporated application, various different types of remote devices can be used with such frameworks including, for example, trackballs, “mouse”-type pointing devices, light pens, etc. However, another category of remote devices which can be used with such frameworks (and other applications) is 3D pointing devices. The phrase “3D pointing” is used in this specification to refer to the ability of an input device to measure motion in three dimensional space. Three dimensional space has six degrees of freedom (6DOF): three axes of linear motion and three axes of angular motion. Although the term 6DOF is commonly used, the seventh dimension of time is automatically included. The position (or pose) of a device may be represented by its linear position and angular position (orientation). A 3D pointing device moves within the six degrees of freedom in the air in front of, e.g., a display, and the user interface translates those motions directly into user interface commands, e.g., movement of a cursor on the display. Thus “3D pointing” differs from, e.g., conventional computer mouse pointing techniques which use a surface, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display. An example of a 3D pointing device can be found in U.S. Pat. No. 7,118,518 to Matthew G. Liberty (hereafter referred to as the '518 patent), the disclosure of which is incorporated here by reference. The motion data is transferred between the 3D pointing device and the host system. The motion data transfer may be performed by any communication link including wired, radio frequency, ultrasonic and infrared.
  • Two primary methods exist for mapping device motion into cursor motion: absolute pointing and relative pointing. With absolute pointing, the desired cursor location is the location where the forward vector of the device intersects the plane of the display. If the 3D pointer were a laser pointer, this cursor location would be the location of the projected laser dot. With absolute pointing, the pointing resolution for angular motions varies with linear position. The further the user is from the display, the finer the angular motion required to target objects. The angular motion resolution also varies with the off-axis angle. When the user is to the side of the display, smaller angular motions are required than when the user is at the same distance directly in front of the display. This variation in pointing resolution yields an inconsistent user experience. When part of a TV remote control, this variation causes inconsistent behavior between homes and even between seats at the same home. Absolute pointing is, however, normally repeatable and time invariant. If the 3D pointer is placed in the same position, then the cursor will return to the same position. Absolute pointing may also be non-calibrated and referenced to an initial starting position.
  • With relative pointing, the pointing resolution for all motion is independent of linear position and angular position from the display. However, the device may not be aligned with the cursor on the display. Relative pointing allows for non-linear processing including pointer ballistics which can dramatically improve pointing performance. Pointer ballistics are described, for example, at http://www.microsoft.com/whdc/archive/pointer-bal.mspx. Relative pointing often bounds cursor motion to the display bounds and discards any motion beyond the display bounds. While this allows users to relax and find a comfortable position, some applications benefit from a fixed mapping between device position and cursor location.
  • However, solutions exist beyond pure absolute pointing and pure relative pointing. Accordingly, there is still room for improvement in the area of mapping of, e.g., device movement to display, handheld device design, generally, and 3D pointer design, more specifically. For the remainder of this specification, absolute pointing refers to solutions that have characteristics most similar to true absolute pointing and relative pointing refers to solutions that have characteristics most similar to true relative pointing.
  • SUMMARY
  • According to one exemplary embodiment, a method for mapping a device's movement into cursor position is described. The device's linear position and angular position are estimated. The estimated linear position and angular position are further processed using both a first mapping algorithm to generate a first cursor location, and a second mapping algorithm to generate a second cursor location. The first cursor location and the second cursor location are combined to generate a final cursor output. Such a technique can be used, for example, to combine the strengths of the two mapping algorithms to provide a more robust user experience associated with, e.g., a user interface in which the cursor is used to interact with various objects.
  • According to another exemplary embodiment, a 3D pointing device includes at least one sensor configured to generate an output which is associated with movement of the 3D pointing device and a processor. The processor is configured to estimate the device's linear position and angular position using the output, to process the estimated linear position and angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location, and to combine the first cursor location and the second cursor location to generate a final cursor output.
  • According to another exemplary embodiment, a system includes a 3D pointing device having at least one sensor configured to generate an output which is associated with movement of said 3D pointing device and a system controller in communication with the 3D pointing device and configured to receive data associated with said output therefrom. At least one of the 3D pointing device and the system controller includes a processor configured to estimate at least one of said device's linear position and angular position using the output, to process at least one of the estimated linear position and angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location, and to combine the first cursor location and the second cursor location to generate a final cursor output.
  • According to another exemplary embodiment, a method for mapping a device's movement into cursor position includes estimating the device's linear position and angular position, and processing the estimated linear position and angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm is an absolute invariant algorithm which has a first characteristic of providing a direct, repeatable mapping from device linear position and angular position into cursor location and a second characteristic that cursor responsiveness to linear motion and angular motion is consistent over a motion range.
  • According to another exemplary embodiment, a method for mapping a device's movement into cursor position includes the steps of estimating at least one of the device's linear position and angular position, and processing at least one of the estimated linear position and angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm creates an intermediate, virtual display that moves to face the device.
  • According to another exemplary embodiment, a method for mapping a device's movement into cursor position includes the steps of estimating the device's angular position, and processing the estimated angular position using a mapping algorithm to generate a cursor location, wherein the mapping algorithm maps said estimated angular position of the device into cursor coordinates using an angular position spherical projection.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate exemplary embodiments, wherein:
  • FIG. 1 depicts a conventional remote control unit for an entertainment system;
  • FIG. 2 depicts an exemplary media system in which exemplary embodiments can be implemented;
  • FIG. 3 shows a 3D pointing device according to an exemplary embodiment of the present invention;
  • FIG. 4 illustrates a cutaway view of the 3D pointing device in FIG. 3 including angular velocity sensing and linear acceleration sensing;
  • FIG. 5 shows a 3D pointing device according to another exemplary embodiment;
  • FIG. 6 depicts the 3D pointing device of FIG. 5 being used as part of a “10 foot” interface according to an exemplary embodiment;
  • FIG. 7 shows mapping of device motion into displayed cursor motion according to various exemplary embodiments;
  • FIGS. 8-12 illustrate functions associated with mapping of device motion into displayed cursor motion according to exemplary embodiments; and
  • FIG. 13 is a flowchart illustrating a method for mapping a device's movement into cursor position according to an exemplary embodiment.
  • DETAILED DESCRIPTION
  • The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
  • In order to provide some context for this discussion, an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2. Those skilled in the art will appreciate, however, that the present invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein. Therein, an input/output (I/O) bus 210 connects the system components in the media system 200 together. The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components. For example, the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
  • In this exemplary embodiment, the media system 200 includes a television/monitor 212, a video cassette recorder (VCR) 214, digital video disk (DVD) recorder/playback device 216, audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210. The VCR 214, DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together. In addition, the media system 200 includes a microphone/speaker system 222, video camera 224 and a wireless I/O control device 226. According to exemplary embodiments of the present invention, the wireless I/O control device 226 is a 3D pointing device according to one of the exemplary embodiments described below. The wireless I/O control device 226 can communicate with the entertainment system 200 using, e.g., an IR or RF transmitter or transceiver. Alternatively, the I/O control device can be connected to the entertainment system 200 via a wire.
  • The entertainment system 200 also includes a system controller 228. According to one exemplary embodiment of the present invention, the system controller 228 operates to store and to display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components. As shown in FIG. 2, system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210. In one exemplary embodiment, in addition to or in place of I/O bus 210, system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a graphical user interface described below.
  • As further illustrated in FIG. 2, media system 200 may be configured to receive media items from various media sources and service providers. In this exemplary embodiment, media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230, satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 and cable modem 238 (or another source of Internet content). Those skilled in the art will appreciate that the media components and media sources illustrated and described with respect to FIG. 2 are purely exemplary and that media system 200 may include more or fewer of both. For example, other types of inputs to the system include AM/FM radio and satellite radio.
  • More details regarding this exemplary entertainment system and frameworks associated therewith can be found in the above-incorporated by reference U.S. patent application “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”. Alternatively, remote devices in accordance with the present invention can be used in conjunction with other systems, for example computer systems including, e.g., a display, a processor and a memory system or with various other systems and applications.
  • As mentioned in the Background section, remote devices which operate as 3D pointers are of particular interest for the present specification. Such devices enable the translation of movement, e.g., gestures, into commands to a user interface. An exemplary 3D pointing device 400 is depicted in FIG. 3. Therein, user movement of the 3D pointing device can be defined, for example, in terms of a combination of x-axis attitude (roll), y-axis elevation (pitch) and/or z-axis heading (yaw) motion of the 3D pointing device 400. In addition, some exemplary embodiments of the present invention can also measure linear movement of the 3D pointing device 400 along the x, y, and z axes to generate cursor movement or other user interface commands. In the exemplary embodiment of FIG. 3, the 3D pointing device 400 includes two buttons 402 and 404 as well as a scroll wheel 406, although other exemplary embodiments will include other physical configurations. According to exemplary embodiments of the present invention, it is anticipated that 3D pointing devices 400 will be held by a user in front of a display 408 and that motion of the 3D pointing device 400 will be translated by the 3D pointing device into output which is usable to interact with the information displayed on display 408, e.g., to move the cursor 410 on the display 408. For example, rotation of the 3D pointing device 400 about the y-axis can be sensed by the 3D pointing device 400 and translated into an output usable by the system to move cursor 410 along the y2 axis of the display 408. Likewise, rotation of the 3D pointing device 400 about the z-axis can be sensed by the 3D pointing device 400 and translated into an output usable by the system to move cursor 410 along the x2 axis of the display 408. It will be appreciated that the output of 3D pointing device 400 can be used to interact with the display 408 in a number of ways other than (or in addition to) cursor movement, for example it can control cursor fading, volume or media transport (play, pause, fast-forward and rewind). Input commands may include operations in addition to cursor movement, for example, a zoom in or zoom out on a particular region of a display. A cursor may or may not be visible. Similarly, rotation of the 3D pointing device 400 sensed about the x-axis of 3D pointing device 400 can be used in addition to, or as an alternative to, y-axis and/or z-axis rotation to provide input to a user interface.
  • According to one purely illustrative exemplary embodiment of the present invention, two axes of angular velocity sensing 420 and 422 and three axes of linear acceleration sensing 424 can be employed as sensors in 3D pointing device 400 as shown in FIG. 4. Although this exemplary embodiment employs inertial sensors it will be appreciated that the present invention is not so limited and examples of other types of sensors which can be used in conjunction with other exemplary embodiments are provided below. The rotational sensors 420, 422 can, for example, be implemented using IDG-500 or IXZ-500 sensors made by Invensense. Alternatively, the embodiment could measure all three axes of angular velocity using elements 420 and 422 implemented as the combination of an IDG-500 and ISZ-500. It will be appreciated by those skilled in the art that other types of rotational sensors can be employed as rotational sensors 420 and 422 and that the Invensense sensors are purely used as an illustrative example. Unlike traditional gyroscopes, these rotational sensors use MEMS technology to provide a resonating mass which is attached to a frame so that it can resonate only along one direction. The resonating mass is displaced when the body to which the sensor is affixed is rotated around the sensor's sensing axis. This displacement can be measured using the Coriolis acceleration effect to determine an angular velocity associated with rotation along the sensing axis. Other sensors/sensor packages may also be used, and the angular velocity sensors 420, 422 can be 1-D, 2-D or 3-D sensors. The accelerometer 424 can, for example, be a 3-axis linear accelerometer such as the LIS331DLH produced by STMicroelectronics. However, a 2-axis linear accelerometer could be used by assuming that the device is measuring gravity and mathematically computing the remaining 3rd value. Additionally, the accelerometer(s) and rotational sensor(s) could be packaged together into a single sensor package. Other variations of sensors and sensor packages may also be used in conjunction with these exemplary embodiments.
  • The exemplary embodiments are not limited to the industrial design illustrated in FIGS. 3 and 4, but can instead be deployed in any industrial form factor, another example of which is illustrated as FIG. 5. In the exemplary embodiment of FIG. 5, the 3D pointing device 500 includes a ring-shaped housing 501, two buttons 502 and 504 as well as a scroll wheel 506 and grip 507, although other exemplary embodiments may include other physical configurations. The region 508 which includes the two buttons 502 and 504 and scroll wheel 506 is referred to herein as the “control area” 508, which is disposed on an outer portion of the ring-shaped housing 501. More details regarding this exemplary embodiment can be found in U.S. patent application Ser. No. 11/480,662, entitled “3D Pointing Devices”, filed on Jul. 3, 2006, the disclosure of which is incorporated here by reference.
  • Such devices have numerous applications including, for example, usage in the so-called “10 foot” interface between a sofa and a television in the typical living room as shown in FIG. 6. Therein, as the 3D pointing device 500 moves between different positions, that movement is detected by one or more sensors within 3D pointing device 500 and transmitted to the television 620 (or associated system component, e.g., a set-top box (not shown)). Movement of the 3D pointing device 500 can, for example, be translated or mapped into movement of a cursor 640 displayed on the television 620 (examples of such mappings being provided below) which is used to interact with a user interface. Details of an exemplary user interface with which the user can interact via 3D pointing device 500 can be found, for example, in the above-incorporated U.S. patent application Ser. No. 10/768,432 as well as U.S. patent application Ser. No. 11/437,215, entitled “Global Navigation Objects in User Interfaces”, filed on May 19, 2006, the disclosure of which is incorporated here by reference. Another exemplary embodiment may contain other sensors including the sensing system which can be found in U.S. patent application Ser. No. 12/424,090 entitled “Tracking Determination Based On Intensity Angular Gradient Of A Wave”, filed on Apr. 15, 2009, the disclosure of which is incorporated here by reference.
  • One challenge faced in implementing exemplary 3D pointing devices 400 in accordance with these exemplary embodiments is to employ components, e.g., rotational sensors 420 and 422, which are not too costly, while at the same time providing a high degree of correlation between movement of the 3D pointing device 400, a user's expectation regarding how the user interface will react to that particular movement of the 3D pointing device and actual user interface performance in response to that movement. For example, if the 3D pointing device 400 is not moving, the user will likely expect that the cursor ought not to be drifting across the display. Likewise, if the user rotates the 3D pointing device 400 purely around the y-axis, she or he would likely not expect to see the resulting cursor movement on display 408 contain any significant x-axis component. To achieve these, and other, aspects of exemplary embodiments of the present invention, various measurements and calculations are performed, e.g., by the handheld device 400, which are used to adjust the outputs of one or more of the sensors 420, 422 and 424 and/or as part of the input used by a processor to determine an appropriate output for the user interface based on the outputs of the sensors 420, 422 and 424. These measurements and calculations are used to compensate for factors which fall broadly into two categories: (1) factors which are intrinsic to the 3D pointing device 400, e.g., errors associated with the particular sensors 420, 422 and 424 used in the device 400 or the way in which the sensors are mounted in the device 400 and (2) factors which are not intrinsic to the 3D pointing device 400, but are instead associated with the manner in which a user is using the 3D pointing device 400, e.g., linear acceleration, tilt and tremor. Some exemplary techniques for handling these effects are described in the above-incorporated by reference '518 patent. However, additional techniques for handling the bias or offset error contributions to sensed motion are described in U.S. patent application Ser. No. 12/163,229 filed on Jun. 27, 2008, entitled “Real-Time Dynamic Tracking Of Bias”, the disclosure of which is incorporated here by reference.
  • Mapping
  • As mentioned above, a 3D pointing device that a user moves in 6 degrees of freedom can be used to convert that motion into cursor motion. Different applications have different demands and requirements on how the device motion should be mapped into cursor motion. These exemplary embodiments describe novel methods for mapping the device motion into cursor motion which provide improved 3D pointer performance and are configurable to deliver optimal response for each application. Among other things, exemplary embodiments of the present invention describe new 3D pointer mapping methods and a method for combining alternate mapping methods to provide optimal response for a given application. These can, for example, reduce cursor resolution variation as a function of the user's position relative to the display, which is a primary problem with absolute pointing. At the same time, the exemplary embodiments can provide a constant mapping between the device position and cursor position, the absence of which can be a problem for some relative pointing applications.
  • One exemplary embodiment includes:
  • 1. A device that measures motion in 6 degrees of freedom
  • 2. One or more algorithms that convert the device motion into cursor motion
  • 3. A method that combines the cursor motion outputs from the algorithms
  • 4. A visual display that displays the cursor
  • In the exemplary embodiment shown in FIG. 7, a user holds the device 701 and makes motions to control the cursor 703 which appears on display 702. The device 701 typically contains a collection of sensors, examples of which were described above. The sensing system may consist of one or more sensors including linear accelerometers, angular position sensors (traditional gyroscopes), angular velocity sensors (MEMS gyroscopes), magnetometers, cameras, and optical, ultrasonic or RF sensors. The sensing system processes the sensor data to provide an estimate of the device's linear position and angular position. The device's position is then processed by one or more mapping algorithms to yield cursor locations. The cursor locations are then combined to produce a final cursor output. The cursor motion information then drives the cursor on the display 702. The device 701 may be connected via either wires or wirelessly to the display 702. The algorithm may run on the device 701, the display 702 or an intermediate processing unit (not shown in FIG. 7, e.g., a system console connected to both the device 701 and the display 702).
  • According to one exemplary embodiment, the device 701 is a battery-powered, handheld device which contains a 3-axis accelerometer and a 3-axis gyroscope, however it will be appreciated that fewer or other sensors may be included. According to this exemplary embodiment, the device 701 processes the sensor data to estimate its linear position and angular position and further processes the linear position and angular position to produce the cursor motion data. The cursor motion data is communicated to a set-top box, e.g., represented by system controller 228 in FIG. 2, over a proprietary 2.4 GHz RF link. The data is received by the RF hardware in the set-top box and communicated over a USB bus to the main set-top box processor. The main set-top box processor moves the cursor as specified, in this example, by the device 701. At 30 Hz or 60 Hz for NTSC (25 Hz or 50 Hz for PAL), the set-top box renders the image and sends the image to the display over, e.g., HDMI, component, S-Video, and/or composite outputs. The display receives the image and displays the image to the user. As mentioned earlier, although the various processing stages are performed in the handheld device 701 according to this exemplary embodiment, the set-top box or other controller which communicates with the handheld device 701 could perform some or all of the processing, e.g., the estimation of linear position and angular position and/or the mapping of the estimated linear position and angular position into one or more cursor locations.
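  • The processing chain just described can be summarized, purely as an illustrative sketch (the function names, the pose estimator interface and the simple linear blend below are assumptions for illustration, not part of the specification), as a per-sample pipeline that estimates the device pose, runs one or more mapping algorithms and combines their cursor outputs:

```python
import numpy as np

def process_sample(sensor_data, estimate_pose, mappings, weights):
    # estimate_pose : hypothetical function returning (linear_position, angular_position)
    #                 from the raw sensor data (accelerometer, gyroscope, ...)
    # mappings      : list of mapping algorithms, each returning a 2-D cursor location
    # weights       : per-mapping combination factors (a simple linear blend)
    u, q = estimate_pose(sensor_data)
    cursor_outputs = [mapping(u, q) for mapping in mappings]
    return sum(c * np.asarray(p) for c, p in zip(weights, cursor_outputs))
```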
  • Prior to discussing the mapping techniques which can be employed according to exemplary embodiments, some mathematical notation is introduced below to guide the discussion:
  • Lower case letters represent scalar variables: x, y, z;
  • Lower case bold letters represent vectors: x, y, z;
  • Upper case bold letters represent matrices: X, Y, Z;
  • Vectors are assumed to be column vectors (N×1 matrix);
  • |v| is the magnitude of vector v;
  • x y=x·y is the dot product of vectors x and y;
  • x×y is the cross product of vectors x and y;
  • X y is the matrix multiplication of matrix X and vector y;
  • XT is the matrix transpose;
  • ŷ = y/√(y·y) is the unit vector in the direction of y;
  • <q0, v> is the quaternion q with scalar component q0 and the length 3 vector v;
  • Vector(q)=v where q is the quaternion <q0, v>;
  • q⊗p is the quaternion multiplication of q and p;
  • q* is the quaternion conjugate of q: <q0, v>*=<q0, −v>;
  • bx is the vector x defined in body-frame coordinates;
  • ux is the vector x defined in user-frame coordinates;
  • Length 2 vector v is assumed to have subcomponents named (vx, vy); and
  • Length 3 vector v is assumed to have subcomponents named (vx, vy, vz).
  • Using this notation, exemplary techniques for processing device motion into cursor movement as shown in FIG. 7 will now be considered. The figure shows two coordinate systems. A first coordinate system, i.e., the user-frame coordinate system, is arbitrarily chosen in this example to be the center of the display and consists of (x, y, z). The user-frame coordinate system is stationary with respect to the display. The coordinate system has x into the display, y to the right of the display and z down, which corresponds to a typical aerospace coordinate system convention. Those skilled in the art will recognize that alternate conventions could be used instead and include, for example, PC display coordinates (x right, y down, z into display) and HID (x out of display, y right, z down). For this discussion, the user-frame axes will be arbitrarily defined as:

  • ux=[1, 0, 0]

  • uy=[0, 1, 0]

  • uz=[0, 0, 1]
  • One skilled in the art will recognize that the user-frame coordinate system axes and origin can be chosen without materially altering the present invention. The discussion above assumes a Cartesian coordinate system, but other coordinate systems, such as spherical coordinates, could also be used without affecting the invention.
  • The second coordinate system in this exemplary embodiment is the device's body-frame coordinate system. The body-frame coordinate system is stationary with respect to the device. The body-frame origin is typically at the center of the device, although that is not required. The body-frame axes are shown as (x°, y°, z°) with x° out the front of the device, y° to the right, and z° down. For this discussion, the body-frame axes are arbitrarily defined as:

  • bx°=[1, 0, 0]

  • by°=[0, 1, 0]

  • bz°=[0, 0, 1]
  • One skilled in the art will recognize that the body-frame coordinate system axes and origin can be chosen without materially altering the present invention. The discussion above assumes a Cartesian coordinate system, but other coordinate systems, such as spherical coordinates, could also be used without affecting the invention.
  • The length 3 vector ou is the origin of the user-frame coordinate system and is defined to be fixed to the display. uou is defined as (0, 0, 0) for the present embodiment. The length 3 vector ob is the origin of the body-frame coordinate system and is defined to be fixed to the device. bob is defined as (0, 0, 0) for the present embodiment. The length 3 vector od is the origin of the display coordinate system and is defined to be fixed to the display. uod is defined as (0, cx, cy) for the present embodiment. The length 3 vector u is defined as the device's linear position in 3-D space. u=ob. The length 3 vector w is the vector from the device's body-frame coordinate system origin, ob, to the user-frame coordinate system origin, ou. Mathematically, w=ou−ob=−u.
  • The quaternion q is defined as the device's angular position with respect to user-frame coordinates. For the present discussion, q is defined as a unit quaternion (q·q=1). Instead of quaternions, any desired alternate angular position representation could also be used including, for example, Euler angles, direction cosine matrix (DCM), and vector/angle. The length 2 vector p is the pointing device's cursor location on the display given as 2-D display coordinates. The length 2 vector p0 is the cursor location of the display coordinate system. The length 3 vector ω is the angular velocity. Following the convention above, uω is the angular velocity in user-frame coordinates and bω is the angular velocity in the device's body-frame coordinates. Given a vector in body-frame coordinates, the vector in user-frame coordinates can be found using the following equation:

  • uv = Vector(q ⊗ <0, bv> ⊗ q*)
  • Given a vector in user-frame coordinates, the vector in body-frame coordinates can be found using the following equation:

  • bv = Vector(q* ⊗ <0, uv> ⊗ q)
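  • As a concrete illustration of the notation above, the following Python sketch (illustrative only; the helper names are not part of the specification) implements the quaternion operations and the body-frame/user-frame conversions, with quaternions stored as arrays [q0, vx, vy, vz]:

```python
import numpy as np

def quat_mult(q, p):
    # Hamilton product q (x) p for quaternions stored as [q0, vx, vy, vz]
    s = q[0] * p[0] - np.dot(q[1:], p[1:])
    v = q[0] * p[1:] + p[0] * q[1:] + np.cross(q[1:], p[1:])
    return np.concatenate(([s], v))

def quat_conj(q):
    # q* = <q0, -v>
    return np.concatenate(([q[0]], -q[1:]))

def body_to_user(q, v_body):
    # uv = Vector(q (x) <0, bv> (x) q*)
    return quat_mult(quat_mult(q, np.concatenate(([0.0], v_body))), quat_conj(q))[1:]

def user_to_body(q, v_user):
    # bv = Vector(q* (x) <0, uv> (x) q)
    return quat_mult(quat_mult(quat_conj(q), np.concatenate(([0.0], v_user))), q)[1:]
```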
  • Given this exemplary mathematical context, mapping device motion into cursor location using absolute pointing will first be considered. The cursor in an absolute pointing system should be located at the intersection of the display and the line containing the device's x° axis. For example, if a laser pointer were appropriately mounted to the device 701, the cursor location and the laser pointer's dot on the display 702 should be at the same position. The full mathematical equation for absolute pointing given user-frame coordinates is:
  • $$\mathbf{p} = -\frac{{}^{u}u_{x}}{{}^{u}x^{\circ}_{x}}\begin{bmatrix} {}^{u}x^{\circ}_{y} \\ {}^{u}x^{\circ}_{z} \end{bmatrix} + \begin{bmatrix} {}^{u}u_{y} \\ {}^{u}u_{z} \end{bmatrix} + \mathbf{p}_{0}$$
  • Absolute pointing is especially suitable for shooting games where the user aims directly at targets on the display. Absolute pointing is analogous to how humans point their arms, hands and fingers at physical objects in their environment. With absolute pointing, the ratio of cursor motion to angular device motion changes as a function of distance. The further the device is from the display, the smaller the angular motions required to move the cursor a given distance on the display. Likewise, the further the device is from the x axis, the smaller the angular motions required to move the cursor. Thus, for example, an absolute pointing algorithm maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which the cursor is to be displayed.
  • The value of uux corresponds to a distance correction factor. This distance correction factor is described by U.S. Pat. No. 5,627,565 entitled “Space coordinates detecting device and input apparatus using same”, the disclosure of which is incorporated here by reference.
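  • As a non-normative sketch of the absolute pointing equation above (the function name and argument layout are assumptions), the cursor is the intersection of the device's forward ray with the display plane x = 0:

```python
import numpy as np

def absolute_pointing(u, x_fwd, p0):
    # u     : device linear position in user-frame coordinates (x into the display)
    # x_fwd : device forward (body-frame x) axis expressed in user-frame coordinates
    # p0    : 2-D cursor offset of the display coordinate system
    # Cursor = intersection of the forward ray u + t*x_fwd with the display plane x = 0.
    t = -u[0] / x_fwd[0]   # the -u_x / x_x ratio acts as the distance correction factor
    return np.array([u[1] + t * x_fwd[1], u[2] + t * x_fwd[2]]) + p0
```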
  • Next, relative pointing or “body-frame” relative pointing will be considered. Some applications do not require direct pointing at a display or may not have any meaningful frame of reference. For such cases, the application may choose to use the device itself as the primary reference. As long as the ux is positive, regardless of the position of the display or the position of the device, if the user moves the device linearly along the y° axis or rotates around the z° axis, the cursor will always move to the right.
  • Assuming discrete time with Euler integration, body-frame relative pointing is mathematically defined by:
  • $$\Delta\mathbf{p} = \begin{bmatrix} {}^{b}\omega_{z} \\ -{}^{b}\omega_{y} \end{bmatrix}\Delta t - \begin{bmatrix} \Delta\,{}^{b}o_{u,y} \\ \Delta\,{}^{b}o_{u,z} \end{bmatrix}$$
  • Note that devices need not implement both terms in this equation. Relative pointing is illustrated functionally in FIG. 8. With relative pointing, unlike absolute pointing, the amount of cursor motion is not affected by the position of the device 800. However, the forward direction of the device and the cursor location are not deterministic and may vary over time. For some applications, this decoupling is beneficial. The actual cursor location may be computed from Δp, referred to as vPointer in FIG. 8, using a non-linear function often called pointer ballistics 802. Traditional 2-D computer mice use pointer ballistics 802 to improve their apparent usability and performance, and this function can also be used in 3D pointer mapping. The output of the pointer ballistics 802, vSys, is then used to move the cursor on screen subject to boundary conditions, etc., in block 804.
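  • A minimal sketch of the body-frame relative pointing update above (names are illustrative; a pointer-ballistics non-linearity such as block 802 would normally be applied to the returned delta before moving the cursor):

```python
import numpy as np

def body_relative_delta(omega_body, delta_ou_body, dt):
    # omega_body    : body-frame angular velocity [wx, wy, wz]
    # delta_ou_body : change in the user-frame origin expressed in body-frame coordinates
    # Delta p = [ b_wz, -b_wy ]^T * dt - [ delta_ou_y, delta_ou_z ]^T
    angular_term = np.array([omega_body[2], -omega_body[1]]) * dt
    linear_term = np.array([delta_ou_body[1], delta_ou_body[2]])
    return angular_term - linear_term
```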
  • Although body-frame relative pointing offers some advantages, users often do not care about the position of the device, only the relative movements that they make. Thus, a third type of mapping which is considered here is referred to as “user-frame relative pointing”. In user-frame relative pointing, if the user rotates the device around the user-frame z-axis then the cursor should move to the right. If the user linearly moves the device along the user-frame z-axis, then the cursor should move down.
  • Assuming discrete time with Euler integration, user-frame relative pointing is mathematically defined by:
  • $${}^{b}z = \mathrm{Vector}\!\left(q^{*} \otimes \langle 0,\, {}^{u}z \rangle \otimes q\right) \qquad \theta = \tan^{-1}\!\left(\frac{{}^{b}z_{y}}{{}^{b}z_{z}}\right) \qquad \Delta\mathbf{p} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}\begin{bmatrix} {}^{b}\omega_{z} \\ -{}^{b}\omega_{y} \end{bmatrix}\Delta t - \begin{bmatrix} \Delta\,{}^{b}o_{u,y} \\ \Delta\,{}^{b}o_{u,z} \end{bmatrix}$$
  • Note that the device need not implement both terms. U.S. Pat. No. 7,158,118, the disclosure of which is incorporated by reference above, contains details and disclosures regarding user-frame relative pointing. In one exemplary embodiment described therein, −bz is measured by an accelerometer. As with body-frame relative pointing, with user-frame relative pointing the amount of cursor motion is not affected by the position of the device. However, the forward direction of the device and the cursor location are not deterministic and may vary over time. For some applications, this decoupling is beneficial. The actual cursor location may be computed using a non-linear function of Δp, often called pointer ballistics. Traditional 2-D computer mice use pointer ballistics to improve their apparent usability and performance.
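  • The user-frame variant differs from the body-frame case only by the roll de-rotation through θ; a hedged sketch (the bz input and names are assumptions consistent with the equations above):

```python
import numpy as np

def user_relative_delta(omega_body, delta_ou_body, z_body, dt):
    # z_body : user-frame z axis measured in body-frame coordinates, e.g.,
    #          Vector(q* (x) <0, uz> (x) q) or an accelerometer-based estimate
    theta = np.arctan2(z_body[1], z_body[2])          # device roll about its x axis
    derotate = np.array([[np.cos(theta),  np.sin(theta)],
                         [-np.sin(theta), np.cos(theta)]])
    angular_term = derotate @ (np.array([omega_body[2], -omega_body[1]]) * dt)
    linear_term = np.array([delta_ou_body[1], delta_ou_body[2]])
    return angular_term - linear_term
```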
  • For many applications, the device responsiveness should be constant regardless of its position within the room but should still be pointing towards the display. According to exemplary embodiments, a fourth mapping technique, referred to herein as “absolute invariant pointing” blends many of the relative pointing benefits with the absolute pointing benefits while minimizing the negative factors of each method. Absolute invariant pointing can be mathematically defined by:
  • $${}^{u}x^{\circ} = \mathrm{Vector}\!\left(q \otimes \langle 0,\, {}^{b}x^{\circ} \rangle \otimes q^{*}\right)$$
  • $$\mathbf{p} = c\begin{bmatrix} \tan^{-1}\!\left({}^{u}x^{\circ}_{y} / {}^{u}x^{\circ}_{x}\right) \\ \tan^{-1}\!\left({}^{u}x^{\circ}_{z} / {}^{u}x^{\circ}_{x}\right) \end{bmatrix} + \begin{bmatrix} {}^{u}u_{y} \\ {}^{u}u_{z} \end{bmatrix} + \mathbf{p}_{0} \;\approx\; c\begin{bmatrix} \sin^{-1}\!\left({}^{u}x^{\circ}_{y}\right) \\ \sin^{-1}\!\left({}^{u}x^{\circ}_{z}\right) \end{bmatrix} + \begin{bmatrix} {}^{u}u_{y} \\ {}^{u}u_{z} \end{bmatrix} + \mathbf{p}_{0}$$
  • where c is a constant.
  • Absolute invariant pointing, like absolute pointing per se described above, contains a direct, repeatable mapping from device linear position and angular position into cursor location. At the same time, the cursor responsiveness to linear motion and angular motion is consistent over the motion range. If the device is located to the left or right of display center, then users typically attempt to move the device left or right relative to the vector from the device to the display, not the display normal vector, x. This results in an apparent loss of linear motion resolution as the device moves away from the x-axis. Thus, an absolute invariant mapping algorithm according to this exemplary embodiment generates cursor position as a sum of a term of linear position values and a term computed from angular position that is independent of linear position.
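  • A short sketch of the absolute invariant mapping above (illustrative names; c is the application-chosen gain), showing that the angular term depends only on the direction of the forward axis, not on the device's distance from the display:

```python
import numpy as np

def absolute_invariant(u, x_fwd, p0, c=1.0):
    # angular term: arctangents of the forward axis, independent of linear position
    angular = c * np.array([np.arctan2(x_fwd[1], x_fwd[0]),
                            np.arctan2(x_fwd[2], x_fwd[0])])
    # linear term: the device's user-frame y/z position added directly
    linear = np.array([u[1], u[2]])
    return angular + linear + p0
```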
  • A fifth mapping technique called “virtual display” reduces the loss of apparent linear resolution found with “absolute invariant pointing” as the device moves off-axis. The “virtual display” technique may create an intermediate, virtual display that moves to face the device. By moving to face the device, the virtual display maintains the same resolution as if the device was directly in front of the display. A full virtual display moves to directly face the device. A new virtual display coordinate system is created by construction with axes xΔ, yΔ, and zΔ. In mathematical terms,

  • uxΔ = −uû
  • Typically, the new coordinate system will preferably, but not as a requirement, maintain the y-axis as “horizontal” and the z-axis as “vertical”. By construction, the remaining axes will then be:

  • b = uz × uxΔ

  • uyΔ = b̂

  • uzΔ = uxΔ × uyΔ
  • By fully rotating the virtual display to face the device, the cursor response becomes highly non-linear as the device points away from the origin. To help minimize these effects, the virtual display coordinate system need not fully rotate towards the device. A scaling factor can be applied to construct uxΔ between ux and uû. For example,

  • uxΔ = c·ux − (1−c)·uû
  • where c is a value between 0 and 1, inclusive.
  • The virtual display can be applied to the absolute pointing algorithm to create a planar virtual display:
  • $$Q = \begin{bmatrix} | & | & | \\ {}^{u}x^{\Delta} & {}^{u}y^{\Delta} & {}^{u}z^{\Delta} \\ | & | & | \end{bmatrix} \qquad s = Q^{T}\,{}^{u}x^{\circ} \qquad w = Q^{T}\,{}^{u}u \qquad \mathbf{p} = -\frac{w_{x}}{s_{x}}\begin{bmatrix} s_{y} \\ s_{z} \end{bmatrix} + \begin{bmatrix} w_{y} \\ w_{z} \end{bmatrix} + \mathbf{p}_{0}$$
  • The virtual display method can also be applied to any absolute or pseudo-absolute mapping method including the absolute pointing and absolute invariant pointing described above. If the cursor resides on a non-planar display, then this method could easily be adapted to create a virtual display of the non-planar display. Thus, according to exemplary embodiments, the intermediate, planar virtual display maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which the cursor is to be displayed rotated to at least partially face the device.
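  • The construction above can be sketched as follows (an illustration under the stated assumptions; here Q is built with the virtual axes as rows so that it rotates user-frame vectors into the virtual-display frame, and c blends between the real display normal and fully facing the device):

```python
import numpy as np

def planar_virtual_display(u, x_fwd, p0, c=0.0):
    # u, x_fwd : device position and forward axis in user-frame coordinates
    u_hat = u / np.linalg.norm(u)
    x_d = c * np.array([1.0, 0.0, 0.0]) - (1.0 - c) * u_hat   # virtual display normal
    x_d /= np.linalg.norm(x_d)
    b = np.cross(np.array([0.0, 0.0, 1.0]), x_d)              # keep virtual y axis horizontal
    y_d = b / np.linalg.norm(b)
    z_d = np.cross(x_d, y_d)
    Q = np.vstack([x_d, y_d, z_d])        # rows are the virtual-display axes
    s = Q @ x_fwd                         # forward direction in virtual-display coordinates
    w = Q @ u                             # device position in virtual-display coordinates
    return -(w[0] / s[0]) * np.array([s[1], s[2]]) + np.array([w[1], w[2]]) + p0
```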
  • For many applications, angular response is more important than linear response. However, the virtual display method does not have consistent angular response. The angular responsiveness is similar to absolute pointing, not relative pointing.
  • A sixth mapping technique called “virtual spherical display” maintains constant angular response, unlike the “virtual display”. For example, if a conversion to polar coordinates is first performed, i.e.:
  • $$b = Q^{T}\,{}^{u}x^{\circ} = \begin{bmatrix} \cos\varphi\cos\theta \\ \cos\varphi\sin\theta \\ -\sin\varphi \end{bmatrix}$$
  • Then, the algorithm can solve for θ and φ and finally p using homogenous coordinates as follows:
  • $$\varphi = \sin^{-1}(-b_{z}) \qquad \theta = \tan^{-1}\!\left(\frac{b_{y}}{b_{x}}\right) \qquad \begin{bmatrix} p_{x} \\ p_{y} \\ 1 \end{bmatrix} = T\begin{bmatrix} \theta \\ -\varphi \\ 1 \end{bmatrix}$$
  • where T is a 3×3 general transformation matrix of arbitrary constants. The matrix T may apply any combination of a scale, rotation, translation, shearing, reflection, orthogonal projection, affine transformation or perspective projection. Thus, the spherical virtual display according to this exemplary embodiment maps device motion to cursor location based upon spherical coordinates of the estimated angular position being transformed into the virtual display coordinates by a transformation matrix and converted into cursor location.
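  • A sketch of the spherical virtual display mapping (names are illustrative; Q is the rotation into the virtual-display frame built as in the planar sketch above, and T is the application's 3×3 transformation matrix):

```python
import numpy as np

def spherical_virtual_display(x_fwd, Q, T):
    b = Q @ x_fwd                      # forward direction in virtual-display coordinates
    phi = np.arcsin(-b[2])
    theta = np.arctan2(b[1], b[0])
    p = T @ np.array([theta, -phi, 1.0])
    return p[:2] / p[2]                # homogeneous coordinates scaled so the third row is one
```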
  • The spherical virtual display method can be applied to relative pointing to create a seventh mapping technique called “relative spherical virtual display”. Instead of controlling the cursor using angles, this method uses the change in the angle to drive a change in the cursor.
  • $$\begin{bmatrix} \Delta p_{x} \\ \Delta p_{y} \\ 1 \end{bmatrix} = T\begin{bmatrix} {}^{u}\omega_{z}\,\Delta t \\ -{}^{u}\omega_{y}\,\Delta t \\ 1 \end{bmatrix}$$
  • where T is a 3×3 general transformation matrix of arbitrary constants. The matrix T may apply any combination of a scale, rotation, translation, shearing, reflection, orthogonal projection, affine transformation or perspective projection. Thus, the intermediate, relative, spherical virtual display according to this exemplary embodiment maps device motion to cursor location based upon body-frame angular velocity modified by a transformation matrix and converted into cursor change in location.
  • The angular position can also be mapped into cursor coordinates using a spherical projection. This eighth mapping method is known as “angular position spherical projection”.
  • $$b = \mathrm{Vector}\!\left(q_{I}^{*} \otimes q^{*} \otimes \langle 0,\, {}^{u}x \rangle \otimes q \otimes q_{I}\right) = \begin{bmatrix} \cos\varphi\cos\theta \\ \cos\varphi\sin\theta \\ -\sin\varphi \end{bmatrix}$$
  • where qI is an arbitrary initial angular position rotation value. Then, the algorithm can solve for θ and φ and finally p using homogenous coordinates as follows:
  • $$\varphi = \sin^{-1}(-b_{z}) \qquad \theta = \tan^{-1}\!\left(\frac{b_{y}}{b_{x}}\right) \qquad \begin{bmatrix} p_{x} \\ p_{y} \\ 1 \end{bmatrix} = T\begin{bmatrix} \theta \\ -\varphi \\ 1 \end{bmatrix}$$
  • where T is a 3×3 general transformation matrix of arbitrary constants. The matrix T may apply any combination of a scale, rotation, translation, shearing, reflection, orthogonal projection, affine transformation or perspective projection. One skilled in the art will note that if the third row of p in homogenous coordinates is not equal to one, then p can be scaled so that it becomes one.
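  • A self-contained sketch of the angular position spherical projection (illustrative names; q_init plays the role of the arbitrary initial rotation qI):

```python
import numpy as np

def quat_mult(q, p):
    s = q[0] * p[0] - np.dot(q[1:], p[1:])
    v = q[0] * p[1:] + p[0] * q[1:] + np.cross(q[1:], p[1:])
    return np.concatenate(([s], v))

def quat_conj(q):
    return np.concatenate(([q[0]], -q[1:]))

def angular_spherical_projection(q, q_init, T):
    x_user = np.array([0.0, 1.0, 0.0, 0.0])     # <0, ux> as a pure quaternion
    b = quat_mult(quat_conj(q_init),
                  quat_mult(quat_conj(q),
                            quat_mult(x_user, quat_mult(q, q_init))))[1:]
    phi = np.arcsin(-b[2])
    theta = np.arctan2(b[1], b[0])
    p = T @ np.array([theta, -phi, 1.0])
    return p[:2] / p[2]
```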
  • The outputs of all or some of the absolute methods described above can be joined by simple linear combination or a more complicated non-linear combination. For example, suppose that an application wanted to be mostly absolute but with improved angular resolution pointing consistency. In this case, a combination of the above described “absolute pointing mapping” with the “invariant absolute pointing mapping” can be performed. Exemplary embodiments can, for example, use a simple linear combination (or a more complicated non-linear process) to combine any two or more of the described methods. For each method, pi, an application assigns a weighting factor ci. The final resulting p is then:
  • $$\mathbf{p} = \sum_{i} c_{i}\,\mathbf{p}_{i}$$
  • Ideally,
  • $$\sum_{i} c_{i} = 1$$
  • to maintain a consistent cursor response, but this constraint is not required.
  • Similarly, the outputs of all or some of the relative methods described above can be joined by simple linear combination or a more complicated non-linear combination. For example, suppose that both the angular position and angular velocity of a device had unique flaws. An embodiment could use an equal linear combination of “user-frame relative pointing” with “relative spherical virtual display” which would reduce the flaws of each method by half. The exemplary embodiment uses a simple linear combination to combine the methods. For each method, pi, an application assigns a weighting factor ci. The final resulting Δp is then:
  • $$\Delta\mathbf{p} = \sum_{i} c_{i}\,\Delta\mathbf{p}_{i}$$
  • Ideally,
  • $$\sum_{i} c_{i} = 1$$
  • to maintain a consistent cursor response, but this constraint is not required.
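  • Either combination (absolute cursor locations or relative deltas) reduces to the same weighted sum; a minimal sketch (the function name and the example weights are assumptions):

```python
import numpy as np

def combine_outputs(outputs, weights):
    # outputs : list of 2-D cursor locations (or cursor deltas) from different mappings
    # weights : factors c_i; ideally sum(weights) == 1 for a consistent cursor response
    return sum(c * np.asarray(p) for c, p in zip(weights, outputs))

# e.g., mostly absolute pointing with a small absolute-invariant correction:
# p = combine_outputs([p_absolute, p_invariant], [0.8, 0.2])
```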
  • Combining absolute pointing methods and relative pointing methods is also considered according to these exemplary embodiments. One method in which combined absolute and relative pointing mappings can be employed is to have the cursor primarily controlled through relative pointing, but use absolute pointing to adjust the cursor movement so as to avoid long-term drift from the reference. Avoiding long-term drift would eliminate the periodic re-centering of the cursor that is typical of relative pointing solutions. When using relative pointing and when using non-linear pointer ballistics according to an exemplary embodiment, the mapping between 3D pointer position and cursor position is time-varying and dependent upon the position and range of motion, and also the speed of movement. By retaining a history of 3D pointer positions and their associated cursor locations, it is possible to define an adaptive map that defines a cursor position based upon a 3D pointer position. This map can be constrained to minimize the difference between the history of cursor positions and the history of 3D pointer positions mapped to cursor positions with the map.
  • An example of such an embodiment is illustrated in FIG. 9. In this particular exemplary embodiment, the “user-frame relative pointing” mapping is combined with the “angular position spherical projection”, both of which mappings are individually described above. The pointing device 900 outputs, on the lower branch 902, vPointer data which is the user-frame relative pointing data described above with respect to FIG. 8. On the upper branch 904, the pointing device 900 outputs angular position which is used by the absolute pointing mapping algorithm as an input. A map is created in block 906 which, in this exemplary embodiment, includes a 3×3 general transformation matrix T which can perform a variety of transformations on the output angular position, e.g., scale (stretch in any of the axes), rotation (preserve orthogonality), shearing (essentially make axes non-orthogonal), translation (offset to account for 2D application of data), reflection, and any other affine transformation or perspective projection. The map also defines an origin value (quaternion). A detailed example of how the map can be created is shown in FIG. 10.
  • Therein, in block 1002, the angular position output from the device is first rotated to map the measured angular position of the device to account for deviations from the nominal origin. After applying the initial rotation, the angular position is converted to spherical coordinates at block 1004. The current sample is evaluated to determine a weight for the sample at block 1006. The weights capture how important each previous point is to defining the current map between where the cursor is currently located on the screen and the angular position of the pointing device 900. The weights assist with determining which data points are worth saving and can be used as part of the least squares solution to find the map. In the exemplary implementation, the primary weight is applied to each sample based upon the absolute angular position of the pointing device. The full range of motion of the device is divided into a fixed set of small regions. The first data point in each region gets the largest weight, and every future point within that region gets a smaller weight. In addition, a secondary weight based on the current angular velocity is applied so that points where the device is at rest are more important than points where the device is in motion. Based on this weight, the best N samples of cursor position, angular position, and weighted samples are saved and used for map calculation at blocks 1008, 1010 and 1012, respectively. The map is created at block 1014 by first calculating the rotation origin to ensure that the input data remains within a reasonable range for conversion to spherical coordinates. If the cursor has moved beyond the display bounds, the origin and the saved state can be adjusted to define a new alignment. Next, a 3×3 general transformation matrix T is created that will transform the set of azimuth/elevation pairs onto the set of cursor positions. One method of finding T is to define the error vector vn for each of the N saved samples as:
  • $$v_{n} = T\begin{bmatrix} \theta_{n} \\ -\varphi_{n} \\ 1 \end{bmatrix} - \begin{bmatrix} x_{n} \\ y_{n} \\ 1 \end{bmatrix}$$
  • and then minimize the least squares error
  • $$\underset{T}{\arg\min}\;\sum_{n=1}^{N} w_{n}\,(v_{n} \cdot v_{n})$$
  • where wn is the weight for each sample. One skilled in the art will recognize that many methods exist for finding a solution to this linear least squares problem including inverting the normal equations using the Moore-Penrose pseudoinverse and orthogonal decomposition methods such as QR decomposition or singular value decomposition. QR decomposition is used in the exemplary implementation.
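  • As one hedged sketch of fitting the map (the exemplary implementation uses QR decomposition; here numpy's SVD-based lstsq stands in for the orthogonal decomposition, and all names are illustrative):

```python
import numpy as np

def fit_map(az_el, cursors, weights):
    # az_el   : N x 2 array of saved device angles (theta_n, phi_n)
    # cursors : N x 2 array of the corresponding cursor positions (x_n, y_n)
    # weights : length-N sample weights w_n
    az_el = np.asarray(az_el, dtype=float)
    cursors = np.asarray(cursors, dtype=float)
    A = np.column_stack([az_el[:, 0], -az_el[:, 1], np.ones(len(az_el))])
    B = np.column_stack([cursors, np.ones(len(cursors))])
    W = np.sqrt(np.asarray(weights, dtype=float))[:, None]
    # weighted linear least squares: minimize sum_n w_n |T a_n - b_n|^2
    T_transposed, *_ = np.linalg.lstsq(W * A, W * B, rcond=None)
    return T_transposed.T     # 3x3 map from homogeneous [theta, -phi, 1] to [x, y, 1]
```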
  • Returning to FIG. 9, once a map is defined by block 906 it is used to map the current 3D pointer position to display coordinates to generate a reference cursor location pRef in block 908. More details regarding this exemplary process are illustrated in FIG. 11. Therein, at block 1102, the angular position of the device 900 is rotated in the same way as described above with respect to block 1002, and the output is translated into spherical coordinates by block 1104 to create the 2×1 vector called Az/El. This is then converted to homogenous coordinates and then multiplied with the map matrix T at block 1106 yielding pRef. The pRef value thus represents the desired location for the cursor based upon the absolute pointing system component of this exemplary, combined mapping.
  • Returning again to FIG. 9, the relative pointing value vSys, the absolute pointing value pRef and the current cursor position p0, are input to a dynamic ballistics function 910. The dynamic ballistics function 910 takes these inputs and, in this exemplary embodiment, uses the absolute pointing value and the current cursor position to adjust the relative pointing value. More specifically, as shown in FIG. 12, the current and reference cursor positions are used to adjust the cursor movement before it is applied to the cursor. One method of adjusting the cursor is to perform small adjustments to the scale and angle of the current velocity vector so that the new cursor position will be closer to the reference point. First, the current point is subtracted from the reference point to get the reference velocity vector vRef at block 1202. This reference vector is compared to the original velocity v0 to find the angle between the vectors at block 1204. This angle is limited to a fixed maximum, at block 1206, and then used to rotate the vector v0 to create vRot as shown in block 1208 of FIG. 12. Next, the reference vector is projected onto vRot to find the scaling of vRot that would get the next point closest to the reference point as shown in block 1210. This scaling is limited between a maximum and minimum value (block 1212) and then applied to vRot at block 1214. The limits on maximum angle and maximum scale can be tuned to control how much correction will be applied.
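  • The adjustment of FIG. 12 can be sketched as below (an illustration only; the names and the limit values are assumptions, and the angle and scale limits would be tuned per application):

```python
import numpy as np

def adjust_velocity(v0, p_current, p_ref, max_angle=0.2, min_scale=0.9, max_scale=1.1):
    # v0        : relative-pointing cursor velocity for this frame (vSys)
    # p_current : current cursor position; p_ref : reference position from the absolute map
    v0 = np.asarray(v0, dtype=float)
    v_ref = np.asarray(p_ref, dtype=float) - np.asarray(p_current, dtype=float)
    if np.linalg.norm(v0) == 0.0 or np.linalg.norm(v_ref) == 0.0:
        return v0
    # signed angle from v0 to v_ref, limited to a fixed maximum
    angle = np.arctan2(v0[0] * v_ref[1] - v0[1] * v_ref[0], np.dot(v0, v_ref))
    angle = np.clip(angle, -max_angle, max_angle)
    rot = np.array([[np.cos(angle), -np.sin(angle)],
                    [np.sin(angle),  np.cos(angle)]])
    v_rot = rot @ v0
    # project v_ref onto v_rot to find the scale that brings the cursor nearest p_ref
    scale = np.clip(np.dot(v_ref, v_rot) / np.dot(v_rot, v_rot), min_scale, max_scale)
    return scale * v_rot
```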
  • By combining absolute pointing methods and relative pointing methods it is possible to retain the benefits of relative pointing while still maintaining a fixed region of 3D device positions that maps to the display. This allows for drift-less relative pointing, which has the invariant that, given a 3D pointer position q and cursor location p1 at time t0, returning the device to position q at a later time t1>t0 will place the cursor at a location p2 such that p1≈p2.
  • Numerous variants and alternatives of the foregoing mapping techniques are contemplated. For example, the combination need not be limited to full mapping methods. Each term of each mapping method could be assigned its own weighting factor. Moreover, the virtual display need not fully track the device. Compromises include:
      • 1. Adding (or subtracting) an arbitrary x-axis value to the actual device position to appropriately scale linear motion.
      • 2. Algorithmically increasing or decreasing the distance to the virtual display from the device.
      • 3. Multiplying the device's actual position by a scale factor to reduce or increase the response to linear motion.
      • 4. A combination of any of the above factors.
        Alternative virtual display surfaces could be used. The above text describes planes and spheres, but the virtual display could be extended to cylinders, ellipses and higher order surfaces.
  • Thus according to one exemplary embodiment, a method for mapping a device's movement into cursor position can include the steps illustrated in the flowchart of FIG. 13. Therein, at step 1300, at least one of a 3D pointing device's linear position and angular position can be estimated (or sensed, measured, detected, etc.). In some cases, exemplary mapping algorithms described above may use only a device's linear position, only a device's angular position or both its linear position and angular position as inputs to the mapping algorithms. Then, at step 1302, at least one of the estimated linear position and estimated angular position are processed using both a first mapping algorithm, to generate a first cursor location, and a second mapping algorithm, to generate a second cursor location. The results are combined at step 1304 to generate a final cursor output.
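  • As a minimal illustration of this flow, the sketch below blends the outputs of two mapping algorithms with a fixed weighting; the exemplary embodiments contemplate richer combination rules (including the dynamic ballistics approach described above), so this is only the simplest case. The function names and the parameter alpha are assumptions.

```python
import numpy as np

def combined_cursor(linear_pos, angular_pos, p0, absolute_map, relative_map, alpha=0.5):
    """Steps 1300-1304: run two mapping algorithms and blend their cursor outputs.

    linear_pos, angular_pos : estimated device state (step 1300)
    p0                      : current cursor position
    absolute_map            : callable returning the first cursor location (step 1302)
    relative_map            : callable returning the second cursor location (step 1302)
    alpha                   : fixed weight on the first mapping's output (step 1304)
    """
    p_first = np.asarray(absolute_map(linear_pos, angular_pos), dtype=float)
    p_second = np.asarray(relative_map(linear_pos, angular_pos, p0), dtype=float)
    return alpha * p_first + (1.0 - alpha) * p_second
```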
  • The mapping algorithms may operate on partial or incomplete motion data. For example, user-frame relative pointing is useful with only one of its two terms; some applications can utilize user-frame relative pointing while collecting either angular motion or linear motion, but not both. The sensor(s) may collect and estimate motion in user-frame coordinates, body-frame coordinates, or a combination of user-frame and body-frame coordinates. The mapping algorithm may operate in the body-frame coordinate system, the user-frame coordinate system, or any other coordinate system. Motion may be measured in any coordinate system, including Cartesian and spherical. The mapping algorithm may use derivatives of linear position, including velocity and acceleration, and derivatives of angular position, including velocity and acceleration. The mapping combination method may be trivial and use data from only one mapping method; the factors for the other algorithms may be 0, and mathematical terms with 0-valued coefficients need not be computed or appear in the final implementation.
  • Systems and methods for processing data according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable media, such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hard-wired circuitry may be used in place of, or in combination with, software instructions to implement the present invention. Such software may run on a processor housed within the device which contains the sensors, e.g., a 3D pointing device or other device, or the software may run on a processor or computer housed within another device, e.g., a system controller, a game console, a personal computer, etc., which is in communication with the device containing the sensors. In such a case, data may be transferred via wireline or wirelessly between the device containing the sensors and the device containing the processor which runs the software that performs the pointer mapping as described above. According to other exemplary embodiments, some of the processing described above with respect to pointer mapping may be performed in the device containing the sensors, while the remainder of the processing is performed in a second device after receipt of the partially processed data from the device containing the sensors.
  • Although the foregoing exemplary embodiments relate to sensing packages including one or more rotational sensors and an accelerometer, pointer mapping techniques according to these exemplary embodiments are not limited to only these types of sensors. Instead, pointer mapping techniques as described herein can be applied to devices which include, for example, only accelerometer(s), optical and inertial sensors (e.g., a rotational sensor, a gyroscope, an angular velocity sensor or a linear accelerometer), a magnetometer and an inertial sensor (e.g., a rotational sensor, a gyroscope or a linear accelerometer), a magnetometer and an optical sensor, or other sensor combinations. Additionally, although exemplary embodiments described herein relate to cursor mapping in the context of 3D pointing devices and applications, such techniques are not so limited and may be employed in methods and devices associated with other applications, e.g., medical applications, gaming, cameras, military applications, etc.
  • The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus, the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. For example, although the foregoing exemplary embodiments describe, among other things, the use of inertial sensors to detect movement of a device, other types of sensors (e.g., ultrasound, magnetic or optical) can be used instead of, or in addition to, inertial sensors in conjunction with the afore-described signal processing. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items.

Claims (25)

1. A method for mapping a device's movement into cursor position, comprising:
estimating at least one of said device's linear position and angular position;
processing at least one of said estimated linear position and estimated angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location; and
combining said first cursor location and said second cursor location to generate a final cursor output.
2. The method of claim 1, wherein said first mapping algorithm is an absolute pointing algorithm and wherein said second mapping algorithm is a relative pointing algorithm.
3. The method of claim 1, wherein at least one of the mapping algorithms is an absolute pointing algorithm which has a characteristic that a ratio of cursor motion to angular device motion changes as a function of distance between said device and a display on which a cursor is displayed.
4. The method of claim 3, wherein said absolute pointing algorithm maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which said cursor is to be displayed.
5. The method of claim 4, wherein said absolute pointing algorithm is defined as:
$p = -\dfrac{u^u_x}{u_{x_x}} \begin{bmatrix} u_{x_y} \\ u_{x_z} \end{bmatrix} + \begin{bmatrix} u^u_y \\ u^u_z \end{bmatrix} + p_0,$
where
uux is a detected value of the linear position x-axis in a user's frame of reference;
uuy is a detected value of the linear position y-axis in a user's frame of reference;
uuz is a detected value of the linear position z-axis in a user's frame of reference;
ux is a detected value of the body-frame x-axis in a user's frame of reference;
uy is a detected value of the body-frame y-axis along a y-axis in a user's frame of reference;
uz is a detected value of the body-frame z-axis along a z-axis in a user's frame of reference; and
p0 is the 2D vector value of the display coordinate system origin in cursor coordinates.
6. The method of claim 1, wherein at least one of the mapping algorithms is an absolute invariant algorithm which has a characteristic of providing a direct, repeatable mapping from device linear position and angular position into cursor location and a second characteristic that cursor responsiveness to linear motion and angular motion is consistent over a motion range.
7. The method of claim 6, wherein said absolute invariant algorithm generates cursor position as a sum of a term of linear position values and a term computed from angular position that is independent of linear position.
8. The method of claim 7, wherein said absolute invariant algorithm is defined as:
$u_x = \operatorname{Vector}\left( q \, (0, b_x) \, q^* \right)$
$p = c \begin{bmatrix} \tan^{-1}\left( u_{x_y} / u_{x_x} \right) \\ \tan^{-1}\left( u_{x_z} / u_{x_x} \right) \end{bmatrix} + \begin{bmatrix} u^u_y \\ u^u_z \end{bmatrix} + p_0$
or approximated by
$p = c \begin{bmatrix} \sin^{-1} u_{x_y} \\ \sin^{-1} u_{x_z} \end{bmatrix} + \begin{bmatrix} u^u_y \\ u^u_z \end{bmatrix} + p_0$
where:
q is said angular position;
uuy is a detected value of the linear position y-axis in a user's frame of reference;
uuz is a detected value of the linear position z-axis in a user's frame of reference;
ux is a detected value of the body-frame x-axis in a user's frame of reference;
uy is a detected value of the body-frame y-axis in a user's frame of reference;
uz is a detected value of the body-frame z-axis in a user's frame of reference; and
p0 is the 2D vector value of the display coordinate system origin in cursor coordinates.
9. The method of claim 1, wherein at least one of the mapping algorithms creates an intermediate, virtual display that moves to face the device.
10. The method of claim 9, wherein said intermediate, planar virtual display maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which said cursor is to be displayed rotated to at least partially face the device.
11. The method of claim 10, wherein said intermediate, planar virtual display is defined as:
$Q = \begin{bmatrix} | & | & | \\ x^u_\Delta & y^u_\Delta & z^u_\Delta \\ | & | & | \end{bmatrix}^{T} \qquad s = Q\,u_x \qquad w = Q\,u^u \qquad p = \dfrac{w_x}{s_x} \begin{bmatrix} s_y \\ s_z \end{bmatrix} + \begin{bmatrix} w_y \\ w_z \end{bmatrix} + p_0$
where:
uxΔ is a detected value of the virtual display x-axis in a user-frame of reference;
uyΔ is a detected value of the virtual display y-axis in a user-frame of reference;
uzΔ is a detected value of the virtual display z-axis in a user-frame of reference;
wy is a detected value of the linear position y-axis in the virtual display frame of reference;
wz is a detected value of the linear position z-axis in the virtual display frame of reference;
sx is a detected value of the body-frame x-axis in the virtual display frame of reference;
sy is a detected value of the body-frame y-axis in the virtual display frame of reference;
sz is a detected value of the body-frame z-axis in the virtual display frame of reference; and
p0 is the 2D vector value of the display coordinate system origin in cursor coordinates.
12. The method of claim 1, wherein at least one of the mapping algorithms creates an intermediate, spherical virtual display that moves to face the device.
13. The method of claim 12, wherein said spherical virtual display maps device motion to cursor location based upon spherical coordinates of the estimated angular position being transformed into the virtual display coordinates by a transformation matrix and converted into cursor location.
14. The method of claim 13, wherein said mapping is performed without using said estimated linear position of said device.
15. The method of claim 14, wherein said spherical virtual display is defined as:
$\phi = \sin^{-1}(-b_z) \qquad \theta = \tan^{-1}\!\left(\dfrac{b_y}{b_x}\right) \qquad \begin{bmatrix} p_x \\ p_y \\ 1 \end{bmatrix} = T \begin{bmatrix} \theta \\ -\phi \\ 1 \end{bmatrix}$
where:
bx is the detected value of the body-frame x-axis in the virtual display frame of reference;
by is the detected value of the body-frame y-axis in the virtual display frame of reference;
bz is the detected value of the body-frame z-axis in the virtual display frame of reference; and
T is a 3×3 transformation matrix of arbitrary constants which may apply any combination of a scale, rotation, translation, shearing, reflection, orthogonal projection, affine transformation and a perspective projection.
16. The method of claim 1, wherein at least one of the mapping algorithms creates an intermediate, relative, spherical virtual display which uses a change in angle between the device and the display to determine a change in cursor location.
17. The method of claim 16, wherein said intermediate, relative, spherical virtual display maps device motion to cursor location based upon body-frame angular velocity modified by a transformation matrix and converted into a change in cursor location.
18-28. (canceled)
29. A 3D pointing device comprising:
at least one sensor configured to generate an output which is associated with movement of said 3D pointing device; and
a processor configured to estimate at least one of said device's linear position and angular position using said output, to process at least one of said estimated linear position and angular position using both a first mapping algorithm to generate a first cursor location and a second mapping algorithm to generate a second cursor location, and to combine said first cursor location and said second cursor location to generate a final cursor output.
30. The 3D pointing device of claim 29, wherein said first mapping algorithm is an absolute pointing algorithm and wherein said second mapping algorithm is a relative pointing algorithm.
31. The 3D pointing device of claim 29, wherein at least one of the mapping algorithms is an absolute pointing algorithm which has a characteristic that a ratio of cursor motion to angular device motion changes as a function of distance between said device and a display on which a cursor is displayed.
32. The 3D pointing device of claim 31, wherein said absolute pointing algorithm maps device motion to cursor location based upon an intersection of a forward pointing direction (body-frame x-axis) of the device and a surface of a display on which said cursor is to be displayed.
33. The 3D pointing device of claim 32, wherein said absolute pointing algorithm is defined as:
$p = -\dfrac{u^u_x}{u_{x_x}} \begin{bmatrix} u_{x_y} \\ u_{x_z} \end{bmatrix} + \begin{bmatrix} u^u_y \\ u^u_z \end{bmatrix} + p_0,$
where
uux is a detected value of the linear position x-axis in a user's frame of reference;
uuy is a detected value of the linear position y-axis in a user's frame of reference;
uuz is a detected value of the linear position z-axis in a user's frame of reference;
ux is a detected value of the body-frame x-axis in a user's frame of reference;
uy is a detected value of the body-frame y-axis along a y-axis in a user's frame of reference;
uz is a detected value of the body-frame z-axis along a z-axis in a user's frame of reference; and
p0 is the 2D vector value of the display coordinate system origin in cursor coordinates.
34. The 3D pointing device of claim 29, wherein at least one of the mapping algorithms is an absolute invariant algorithm which has a characteristic of providing a direct, repeatable mapping from device linear position and angular position into cursor location and a second characteristic that cursor responsiveness to linear motion and angular motion is consistent over a motion range.
35-106. (canceled)
US13/000,889 2008-07-01 2009-07-01 3D Pointer Mapping Abandoned US20110227825A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/000,889 US20110227825A1 (en) 2008-07-01 2009-07-01 3D Pointer Mapping

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US7723808P 2008-07-01 2008-07-01
PCT/US2009/049411 WO2010002997A1 (en) 2008-07-01 2009-07-01 3d pointer mapping
US13/000,889 US20110227825A1 (en) 2008-07-01 2009-07-01 3D Pointer Mapping

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/049411 A-371-Of-International WO2010002997A1 (en) 2008-07-01 2009-07-01 3d pointer mapping

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/010,145 Continuation US10620726B2 (en) 2008-07-01 2018-06-15 3D pointer mapping

Publications (1)

Publication Number Publication Date
US20110227825A1 true US20110227825A1 (en) 2011-09-22

Family

ID=41466324

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/000,889 Abandoned US20110227825A1 (en) 2008-07-01 2009-07-01 3D Pointer Mapping
US16/010,145 Active US10620726B2 (en) 2008-07-01 2018-06-15 3D pointer mapping

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/010,145 Active US10620726B2 (en) 2008-07-01 2018-06-15 3D pointer mapping

Country Status (6)

Country Link
US (2) US20110227825A1 (en)
EP (1) EP2297675A4 (en)
JP (1) JP5866199B2 (en)
KR (2) KR20110039318A (en)
CN (2) CN108664156B (en)
WO (1) WO2010002997A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102321A1 (en) * 2009-11-05 2011-05-05 Lg Electronics Inc. Image display apparatus and method for controlling the image display apparatus
US20120143526A1 (en) * 2010-07-15 2012-06-07 The Cleveland Clinic Foundation Detection and characterization of head impacts
US8384665B1 (en) * 2006-07-14 2013-02-26 Ailive, Inc. Method and system for making a selection in 3D virtual environment
US20130151195A1 (en) * 2011-12-13 2013-06-13 Stmicroelectronics S.R.L. System and method for compensating orientation of a portable device
CN103294177A (en) * 2012-02-29 2013-09-11 株式会社理光 Cursor moving control method and system
CN103488294A (en) * 2013-09-12 2014-01-01 华南理工大学 Non-contact gesture control mapping adjustment method based on user interactive habits
US20150042563A1 (en) * 2012-03-30 2015-02-12 Sony Corporation Control method, control apparatus, and program
EP2853992A1 (en) 2013-09-27 2015-04-01 Movea Air pointer with improved user experience
US9024876B2 (en) 2012-09-06 2015-05-05 Interphase Corporation Absolute and relative positioning sensor fusion in an interactive display system
CN104750442A (en) * 2013-12-31 2015-07-01 冠捷投资有限公司 Indicator display and control method
US20160091992A1 (en) * 2011-10-28 2016-03-31 Esat Yilmaz Executing Gestures with Active Stylus
WO2016174170A1 (en) * 2015-04-28 2016-11-03 Centre National D'etudes Spatiales Method of controlling a calculation device via a mobile element and control system implementing this method
US20170156035A1 (en) * 2013-10-20 2017-06-01 Oahu Group, Llc Method and system for determining object motion by capturing motion data via radio frequency phase and direction of arrival detection
CN107508889A (en) * 2017-08-25 2017-12-22 湖南城市学院 A kind of grain security retroactive method and system
WO2018070750A1 (en) * 2016-10-10 2018-04-19 홍유정 Object controller
US9952684B2 (en) 2013-05-09 2018-04-24 Samsung Electronics Co., Ltd. Input apparatus, pointing apparatus, method for displaying pointer, and recordable medium
CN110088712A (en) * 2016-10-10 2019-08-02 洪侑廷 Object controller
US20190302903A1 (en) * 2018-03-30 2019-10-03 Microsoft Technology Licensing, Llc Six dof input device

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9201516B2 (en) * 2010-06-03 2015-12-01 Hillcrest Laboratories, Inc. Determining forward pointing direction of a handheld device
TWI481871B (en) * 2010-09-28 2015-04-21 J Mex Inc Apparatus and system and method for interacting with target in operation area
KR20120046973A (en) * 2010-11-03 2012-05-11 삼성전자주식회사 Method and apparatus for generating motion information
US8907892B2 (en) 2010-11-22 2014-12-09 Hillcrest Laboratories, Inc. 3D pointing device with up-down-left-right mode switching and integrated swipe detector
CN107943403B (en) * 2010-11-22 2022-11-01 Idhl控股公司 3D pointing device with up-down-left-right mode switching and integrated swipe detector
KR101956173B1 (en) 2012-03-26 2019-03-08 삼성전자주식회사 Apparatus and Method for Calibrating 3D Position/Orientation Tracking System
CN103488312B (en) * 2012-08-22 2016-10-26 上海飞智电子科技有限公司 The method of positioning pointer position, system and equipment
KR20140060818A (en) * 2012-11-12 2014-05-21 삼성전자주식회사 Remote controller and display apparatus, control method thereof
US9740294B2 (en) * 2013-06-28 2017-08-22 Samsung Electronics Co., Ltd. Display apparatus and method for controlling display apparatus thereof
US10063802B2 (en) 2013-08-28 2018-08-28 Lg Electronics Inc. Multimedia device and method for controlling external devices of the same
DE102016005237B4 (en) * 2016-04-29 2020-09-24 Gebrüder Frei GmbH & Co. KG Remote control for motorized industrial trucks and automated guided vehicles
CN106293065A (en) * 2016-07-26 2017-01-04 上海与德通讯技术有限公司 The control method of application program and control system
CN106528517A (en) * 2016-11-01 2017-03-22 深圳市方直科技股份有限公司 Examination question locating and adjusting method and system
CN109395382A (en) * 2018-09-12 2019-03-01 苏州蜗牛数字科技股份有限公司 A kind of linear optimization method for rocking bar
CN110102050B (en) * 2019-04-30 2022-02-18 腾讯科技(深圳)有限公司 Virtual object display method and device, electronic equipment and storage medium

Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4160973A (en) * 1977-10-11 1979-07-10 Massachusetts Institute Of Technology Three-dimensional display
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US5627565A (en) * 1994-05-26 1997-05-06 Alps Electric Co., Ltd. Space coordinates detecting device and input apparatus using same
US6110039A (en) * 1995-02-21 2000-08-29 Konami Co., Ltd. Shooting game machine
US6252579B1 (en) * 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6292174B1 (en) * 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
US20020085097A1 (en) * 2000-12-22 2002-07-04 Colmenarez Antonio J. Computer vision-based wireless pointing system
US6501515B1 (en) * 1998-10-13 2002-12-31 Sony Corporation Remote control system
US20030107551A1 (en) * 2001-12-10 2003-06-12 Dunker Garrett Storm Tilt input device
US20040070564A1 (en) * 2002-10-15 2004-04-15 Dawson Thomas P. Method and system for controlling a display device
US20040095317A1 (en) * 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
US20040123001A1 (en) * 1998-12-28 2004-06-24 Alps Electric Co., Ltd. Dual pointing device used to control a cursor having absolute and relative pointing devices
US20040268393A1 (en) * 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20050174324A1 (en) * 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US20050212767A1 (en) * 2004-03-23 2005-09-29 Marvit David L Context dependent gesture response
US20050243062A1 (en) * 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Free space pointing devices with tilt compensation and improved usability
US20060033713A1 (en) * 1997-08-22 2006-02-16 Pryor Timothy R Interactive video based games using objects sensed by TV cameras
US20060092133A1 (en) * 2004-11-02 2006-05-04 Pierre A. Touma 3D mouse and game controller based on spherical coordinates system and system for use
US20060152489A1 (en) * 2005-01-12 2006-07-13 John Sweetser Handheld vision based absolute pointing system
US20060178212A1 (en) * 2004-11-23 2006-08-10 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US20060262116A1 (en) * 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
US20060277571A1 (en) * 2002-07-27 2006-12-07 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US20070035518A1 (en) * 2005-07-01 2007-02-15 Hillcrest Laboratories, Inc. 3D pointing devices
US7236156B2 (en) * 2004-04-30 2007-06-26 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US7239301B2 (en) * 2004-04-30 2007-07-03 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20070176899A1 (en) * 2006-02-01 2007-08-02 Samsung Electronics Co., Ltd. Pointing device and method and pointer display apparatus and method
US20080070684A1 (en) * 2006-09-14 2008-03-20 Mark Haigh-Hutchinson Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US20080106517A1 (en) * 2006-11-07 2008-05-08 Apple Computer, Inc. 3D remote control system employing absolute and relative position detection
US20080129688A1 (en) * 2005-12-06 2008-06-05 Naturalpoint, Inc. System and Methods for Using a Movable Object to Control a Computer
US20080278445A1 (en) * 2007-05-08 2008-11-13 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US20090259432A1 (en) * 2008-04-15 2009-10-15 Liberty Matthew G Tracking determination based on intensity angular gradient of a wave
US7656395B2 (en) * 2004-07-15 2010-02-02 Microsoft Corporation Methods and apparatuses for compound tracking systems
US7774155B2 (en) * 2006-03-10 2010-08-10 Nintendo Co., Ltd. Accelerometer-based controller
US7860676B2 (en) * 2007-06-28 2010-12-28 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US20110102321A1 (en) * 2009-11-05 2011-05-05 Lg Electronics Inc. Image display apparatus and method for controlling the image display apparatus
US20120194428A1 (en) * 2011-01-30 2012-08-02 Lg Electronics Inc. Image display apparatus and method for operating the same
US20130151195A1 (en) * 2011-12-13 2013-06-13 Stmicroelectronics S.R.L. System and method for compensating orientation of a portable device

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB591019A (en) 1945-04-13 1947-08-05 Charles Stanley Hudson Improvements in or relating to remote indicating compasses
US3931747A (en) 1974-02-06 1976-01-13 Sperry Rand Corporation Gyroscopic stable reference device
US4787051A (en) 1986-05-16 1988-11-22 Tektronix, Inc. Inertial mouse system
IL78889A (en) 1986-05-23 1989-09-28 Elbit Computers Ltd Electronic magnetic compass system
US5440326A (en) 1990-03-21 1995-08-08 Gyration, Inc. Gyroscopic pointer
US5396265A (en) 1990-09-17 1995-03-07 Massachusetts Institute Of Technology Three-dimensional tactile computer input device
US5181181A (en) 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US6069594A (en) 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
US5453758A (en) 1992-07-31 1995-09-26 Sony Corporation Input apparatus
US5598187A (en) 1993-05-13 1997-01-28 Kabushiki Kaisha Toshiba Spatial motion pattern input system and input method
JP3217926B2 (en) * 1994-10-13 2001-10-15 アルプス電気株式会社 Spatial coordinate detector
US5902968A (en) 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US5825350A (en) 1996-03-13 1998-10-20 Gyration, Inc. Electronic pointing apparatus and method
US6078312A (en) * 1997-07-09 2000-06-20 Gateway 2000, Inc. Pointing device with absolute and relative positioning capability
US6369794B1 (en) 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
JP4236783B2 (en) * 1998-12-28 2009-03-11 アルプス電気株式会社 controller
US6737591B1 (en) 1999-05-25 2004-05-18 Silverbrook Research Pty Ltd Orientation sensing device
US7031875B2 (en) 2001-01-24 2006-04-18 Geo Vector Corporation Pointing systems for addressing objects
JP2003179700A (en) * 2002-08-23 2003-06-27 Hitachi Ltd Portable terminal
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US9229540B2 (en) * 2004-01-30 2016-01-05 Electronic Scripting Products, Inc. Deriving input from six degrees of freedom interfaces
US7853193B2 (en) * 2004-03-17 2010-12-14 Leapfrog Enterprises, Inc. Method and device for audibly instructing a user to interact with a function
JP2006137077A (en) 2004-11-12 2006-06-01 Seiko Epson Corp Printer, data conversion system, printing method, and data conversion method
JP4644800B2 (en) * 2005-01-07 2011-03-02 国立大学法人電気通信大学 3D position input device
US8935630B2 (en) * 2005-05-04 2015-01-13 Hillcrest Laboratories, Inc. Methods and systems for scrolling and pointing in user interfaces
WO2006137077A1 (en) * 2005-06-20 2006-12-28 Hewlett-Packard Development Company, L.P. A pointing device with absolute and relative positioning capability
JP5075330B2 (en) * 2005-09-12 2012-11-21 任天堂株式会社 Information processing program
JP4587940B2 (en) * 2005-11-11 2010-11-24 シャープ株式会社 Remote control system and display device
TWI291117B (en) * 2005-12-29 2007-12-11 High Tech Comp Corp A tapping operation method and a mobile electrical apparatus with tapping operation function
KR100855471B1 (en) * 2006-09-19 2008-09-01 삼성전자주식회사 Input device and method for providing movement information of the input device
CN101169831B (en) * 2006-10-25 2011-07-20 原相科技股份有限公司 Pointer positioning device and method
JP5127242B2 (en) * 2007-01-19 2013-01-23 任天堂株式会社 Acceleration data processing program and game program
EP2189879A4 (en) * 2007-09-12 2013-06-05 Sony Corp Input device, control device, control system, and control method
US8341544B2 (en) 2007-12-14 2012-12-25 Apple Inc. Scroll bar with video region in a media system
US8907892B2 (en) * 2010-11-22 2014-12-09 Hillcrest Laboratories, Inc. 3D pointing device with up-down-left-right mode switching and integrated swipe detector
US9542092B2 (en) * 2011-02-12 2017-01-10 Microsoft Technology Licensing, Llc Prediction-based touch contact tracking

Patent Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4160973A (en) * 1977-10-11 1979-07-10 Massachusetts Institute Of Technology Three-dimensional display
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US5627565A (en) * 1994-05-26 1997-05-06 Alps Electric Co., Ltd. Space coordinates detecting device and input apparatus using same
US6110039A (en) * 1995-02-21 2000-08-29 Konami Co., Ltd. Shooting game machine
US20060033713A1 (en) * 1997-08-22 2006-02-16 Pryor Timothy R Interactive video based games using objects sensed by TV cameras
US6252579B1 (en) * 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6292174B1 (en) * 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
US6501515B1 (en) * 1998-10-13 2002-12-31 Sony Corporation Remote control system
US20040123001A1 (en) * 1998-12-28 2004-06-24 Alps Electric Co., Ltd. Dual pointing device used to control a cursor having absolute and relative pointing devices
US6983336B2 (en) * 1998-12-28 2006-01-03 Alps Electric Co., Ltd. Dual pointing device used to control a cursor having absolute and relative pointing devices
US20020085097A1 (en) * 2000-12-22 2002-07-04 Colmenarez Antonio J. Computer vision-based wireless pointing system
US20030107551A1 (en) * 2001-12-10 2003-06-12 Dunker Garrett Storm Tilt input device
US20060277571A1 (en) * 2002-07-27 2006-12-07 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US20040070564A1 (en) * 2002-10-15 2004-04-15 Dawson Thomas P. Method and system for controlling a display device
US20040095317A1 (en) * 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
US20040268393A1 (en) * 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20050174324A1 (en) * 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US7489299B2 (en) * 2003-10-23 2009-02-10 Hillcrest Laboratories, Inc. User interface devices and methods employing accelerometers
US20050212767A1 (en) * 2004-03-23 2005-09-29 Marvit David L Context dependent gesture response
US20050243062A1 (en) * 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Free space pointing devices with tilt compensation and improved usability
US7239301B2 (en) * 2004-04-30 2007-07-03 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US7158118B2 (en) * 2004-04-30 2007-01-02 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20070257885A1 (en) * 2004-04-30 2007-11-08 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US7236156B2 (en) * 2004-04-30 2007-06-26 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US7656395B2 (en) * 2004-07-15 2010-02-02 Microsoft Corporation Methods and apparatuses for compound tracking systems
US20060092133A1 (en) * 2004-11-02 2006-05-04 Pierre A. Touma 3D mouse and game controller based on spherical coordinates system and system for use
US20060178212A1 (en) * 2004-11-23 2006-08-10 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US20060152489A1 (en) * 2005-01-12 2006-07-13 John Sweetser Handheld vision based absolute pointing system
US20060262116A1 (en) * 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
US20070035518A1 (en) * 2005-07-01 2007-02-15 Hillcrest Laboratories, Inc. 3D pointing devices
US20080129688A1 (en) * 2005-12-06 2008-06-05 Naturalpoint, Inc. System and Methods for Using a Movable Object to Control a Computer
US20070176899A1 (en) * 2006-02-01 2007-08-02 Samsung Electronics Co., Ltd. Pointing device and method and pointer display apparatus and method
US7774155B2 (en) * 2006-03-10 2010-08-10 Nintendo Co., Ltd. Accelerometer-based controller
US20080070684A1 (en) * 2006-09-14 2008-03-20 Mark Haigh-Hutchinson Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US20080106517A1 (en) * 2006-11-07 2008-05-08 Apple Computer, Inc. 3D remote control system employing absolute and relative position detection
US20080278445A1 (en) * 2007-05-08 2008-11-13 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US7860676B2 (en) * 2007-06-28 2010-12-28 Hillcrest Laboratories, Inc. Real-time dynamic tracking of bias
US20090259432A1 (en) * 2008-04-15 2009-10-15 Liberty Matthew G Tracking determination based on intensity angular gradient of a wave
US20110102321A1 (en) * 2009-11-05 2011-05-05 Lg Electronics Inc. Image display apparatus and method for controlling the image display apparatus
US20120194428A1 (en) * 2011-01-30 2012-08-02 Lg Electronics Inc. Image display apparatus and method for operating the same
US20130151195A1 (en) * 2011-12-13 2013-06-13 Stmicroelectronics S.R.L. System and method for compensating orientation of a portable device

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8384665B1 (en) * 2006-07-14 2013-02-26 Ailive, Inc. Method and system for making a selection in 3D virtual environment
US9152248B1 (en) * 2006-07-14 2015-10-06 Ailive Inc Method and system for making a selection in 3D virtual environment
US9681112B2 (en) * 2009-11-05 2017-06-13 Lg Electronics Inc. Image display apparatus and method for controlling the image display apparatus
US20110102321A1 (en) * 2009-11-05 2011-05-05 Lg Electronics Inc. Image display apparatus and method for controlling the image display apparatus
US9149227B2 (en) * 2010-07-15 2015-10-06 The Cleveland Clinic Foundation Detection and characterization of head impacts
US20120143526A1 (en) * 2010-07-15 2012-06-07 The Cleveland Clinic Foundation Detection and characterization of head impacts
US10582883B2 (en) * 2010-07-15 2020-03-10 The Cleveland Clinic Foundation Detection and characterization of head impacts
US11399741B2 (en) 2010-07-15 2022-08-02 The Cleveland Clinic Foundation Detection and characterization of head impacts
US20160106346A1 (en) * 2010-07-15 2016-04-21 The Cleveland Clinic Foundation Detection and characterization of head impacts
US11520419B2 (en) 2011-10-28 2022-12-06 Wacom Co., Ltd. Executing gestures with active stylus
US11868548B2 (en) 2011-10-28 2024-01-09 Wacom Co., Ltd. Executing gestures with active stylus
US20160091992A1 (en) * 2011-10-28 2016-03-31 Esat Yilmaz Executing Gestures with Active Stylus
US11269429B2 (en) 2011-10-28 2022-03-08 Wacom Co., Ltd. Executing gestures with active stylus
US10599234B2 (en) * 2011-10-28 2020-03-24 Wacom Co., Ltd. Executing gestures with active stylus
US20130151195A1 (en) * 2011-12-13 2013-06-13 Stmicroelectronics S.R.L. System and method for compensating orientation of a portable device
CN103294177A (en) * 2012-02-29 2013-09-11 株式会社理光 Cursor moving control method and system
US20150042563A1 (en) * 2012-03-30 2015-02-12 Sony Corporation Control method, control apparatus, and program
US10114478B2 (en) * 2012-03-30 2018-10-30 Sony Corporation Control method, control apparatus, and program
US9024876B2 (en) 2012-09-06 2015-05-05 Interphase Corporation Absolute and relative positioning sensor fusion in an interactive display system
US9952684B2 (en) 2013-05-09 2018-04-24 Samsung Electronics Co., Ltd. Input apparatus, pointing apparatus, method for displaying pointer, and recordable medium
CN103488294A (en) * 2013-09-12 2014-01-01 华南理工大学 Non-contact gesture control mapping adjustment method based on user interactive habits
WO2015044345A1 (en) 2013-09-27 2015-04-02 Movea Air pointer with improved user experience
EP2853992A1 (en) 2013-09-27 2015-04-01 Movea Air pointer with improved user experience
US10331240B2 (en) * 2013-09-27 2019-06-25 Movea, Inc. Air pointer with improved user experience
US20160216775A1 (en) * 2013-09-27 2016-07-28 Movea Air pointer with improved user experience
US20170156035A1 (en) * 2013-10-20 2017-06-01 Oahu Group, Llc Method and system for determining object motion by capturing motion data via radio frequency phase and direction of arrival detection
US9867013B2 (en) * 2013-10-20 2018-01-09 Oahu Group, Llc Method and system for determining object motion by capturing motion data via radio frequency phase and direction of arrival detection
CN104750442A (en) * 2013-12-31 2015-07-01 冠捷投资有限公司 Indicator display and control method
WO2016174170A1 (en) * 2015-04-28 2016-11-03 Centre National D'etudes Spatiales Method of controlling a calculation device via a mobile element and control system implementing this method
FR3035718A1 (en) * 2015-04-28 2016-11-04 Centre Nat D'etudes Spatiales (Cnes) METHOD FOR CONTROLLING A CALCULATION DEVICE VIA A MOBILE ELEMENT AND A CONTROL SYSTEM USING THE SAME
US10388148B2 (en) * 2015-04-28 2019-08-20 Centre National D'edutes Spatiales Method of controlling a calculation device via a mobile element and control system implementing this method
WO2018070750A1 (en) * 2016-10-10 2018-04-19 홍유정 Object controller
CN110088712A (en) * 2016-10-10 2019-08-02 洪侑廷 Object controller
CN107508889A (en) * 2017-08-25 2017-12-22 湖南城市学院 A kind of grain security retroactive method and system
US20190302903A1 (en) * 2018-03-30 2019-10-03 Microsoft Technology Licensing, Llc Six dof input device

Also Published As

Publication number Publication date
CN108664156B (en) 2022-02-25
US20180307334A1 (en) 2018-10-25
JP2011527053A (en) 2011-10-20
WO2010002997A1 (en) 2010-01-07
US10620726B2 (en) 2020-04-14
CN102099814A (en) 2011-06-15
EP2297675A4 (en) 2014-04-09
KR20140095574A (en) 2014-08-01
CN102099814B (en) 2018-07-24
JP5866199B2 (en) 2016-02-17
CN108664156A (en) 2018-10-16
EP2297675A1 (en) 2011-03-23
KR101617562B1 (en) 2016-05-02
KR20110039318A (en) 2011-04-15

Similar Documents

Publication Publication Date Title
US10620726B2 (en) 3D pointer mapping
US10782792B2 (en) 3D pointing devices with orientation compensation and improved usability
US10120463B2 (en) Determining forward pointing direction of a handheld device
US8072424B2 (en) 3D pointing devices with orientation compensation and improved usability
US9250716B2 (en) Real-time dynamic tracking of bias
EP2337016B1 (en) Free space pointing devices with tilt compensation and improved usability

Legal Events

Date Code Title Description
AS Assignment

Owner name: HILLCREST LABORATORIES, INC., MARYLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIBERTY, MATTHEW G.;COOK, BRYAN A.;SHENG, HUA;REEL/FRAME:025927/0029

Effective date: 20110131

AS Assignment

Owner name: MULTIPLIER CAPITAL, LP, MARYLAND

Free format text: SECURITY AGREEMENT;ASSIGNOR:HILLCREST LABORATORIES, INC.;REEL/FRAME:037963/0405

Effective date: 20141002

AS Assignment

Owner name: IDHL HOLDINGS, INC., DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HILLCREST LABORATORIES, INC.;REEL/FRAME:042747/0445

Effective date: 20161222

AS Assignment

Owner name: HILLCREST LABORATORIES, INC., DELAWARE

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MULTIPLIER CAPITAL, LP;REEL/FRAME:043339/0214

Effective date: 20170606

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE