WO2003077107A2 - Multi-dimensional object manipulation - Google Patents

Multi-dimensional object manipulation Download PDF

Info

Publication number
WO2003077107A2
WO2003077107A2 (PCT/EP2002/011803)
Authority
WO
WIPO (PCT)
Prior art keywords
manipulation
screen
function
module
freedom
Prior art date
Application number
PCT/EP2002/011803
Other languages
French (fr)
Other versions
WO2003077107A3 (en)
Inventor
Bernd Gombert
Original Assignee
3Dconnexion Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Dconnexion Gmbh filed Critical 3Dconnexion Gmbh
Priority to AU2002350596A priority Critical patent/AU2002350596A1/en
Priority to EP02785274A priority patent/EP1483658A2/en
Publication of WO2003077107A2 publication Critical patent/WO2003077107A2/en
Publication of WO2003077107A3 publication Critical patent/WO2003077107A3/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the manipulation operation module 130 and the multi-dimensional input device control module 140 may functionally be one module or could be separate modules as is illustrated.
  • the manipulation operation module 130 is configured to execute the viewing function selected by the user.
  • the manipulation operation module 130 is also configured to appropriately adjust the speed for the viewing function and a manipulation operation in response to the selected viewing function. For example, a zoom in process may increase speed, while a zoom out process may decrease speed.
  • the multi-dimensional input device control module 140 is configured to carry out manipulation of the object once the selected viewing function is executed.
  • the interface module 150 is optional and provides a screen interface for a user to customize particular parameters as default or event-specific parameters for object manipulation.
  • the screen interface may be, for example, a window dialog box with buttons and/or sliders that allow for adjusting, changing, adding, or deleting particular manipulation modes or parameters associated with particular manipulation modes.
  • a manipulation mode is manipulation of an object (which may include the screen relative to an object) about a particular degree of freedom of movement.
  • the interface module 150 may allow for configuring a default particular manipulation mode such as a zoom or pan function.
  • the interface module 150 provides a tool to adjust parameters such as object movement speed along or about an axis of rotation, or movement resistance (e.g., force feedback), for the example zoom or pan function.
  • once the particular mode and associated parameters are adjusted through the interface module 150, they may be saved as default settings in a configuration file.
  • this configuration file may be saved and recalled through the interface module 150 so that numerous configuration files can be saved and organized in a multitude of ways, including for example, per function, per user, or per device.
  • the present invention may be configured for use with a positioning device that serves as a multi-dimensional control device providing speed and velocity control.
  • the present invention allows for selecting an object, using the selection point as a point of origin, and then allowing speed and/or velocity control about the point of origin through rapidly clicking a second button on the 2D pointing device.
  • with a 2D pointing device that includes a scroll wheel (e.g., a mechanical scroll wheel, an optical scroll wheel, a touch scroll wheel, a pressure-scroll wheel, or other solid-state scroll mechanism), the selection of an object and its point of origin may be made by clicking or tapping a button or the scroll wheel of the 2D pointing device, and then scrolling the scroll wheel for speed and/or velocity control about the point of origin.
  • scrolling the scroll wheel forward can zoom in with regard to the object at the point of origin, and scrolling the scroll wheel backward can zoom out with regard to the object at the point of origin.
  • the modes and parameters of the 2D pointing device can be customized, configured, saved, and recalled through the interface module 150.
  • Figures 2a and 2b illustrate various embodiments for processes to view and manipulate an object on a screen in accordance with the present invention.
  • a first embodiment of a process starts 210 and determines 215 a location of a cursor on a screen.
  • the process then optionally determines 220 a point of origin for a manipulation mode function, e.g., a zoom function, relative to the position of the cursor.
  • the process then applies 225 a command associated with the manipulation mode function, e.g., execution of the zoom function.
  • the process then allows manipulation 230 of the object with regard to the particular degree of freedom of movement, e.g., zoom along a z-axis of six or more degrees of freedom of movement as well as adjusting particular speed and velocity relative to the zoom function being applied.
  • the process then ends 235 when operation of the manipulation function completes.
  • a second embodiment of a process starts 250 and an object is selected 255 for manipulation.
  • the process determines 260 a point of origin for the selected object.
  • the process may also determine the manipulation mode, e.g., a zoom function, at this point.
  • the process then provides 265 control to the multi-dimensional input device, if the multi-dimensional device does not already have control.
  • the process applies 270 a command associated with the manipulation mode function, e.g., execution of the zoom function.
  • the process allows manipulation 275 of the object with regard to the particular degree of freedom of movement, e.g., zoom along a z-axis of six or more degrees of freedom of movement as well as adjusting a particular speed and velocity relative to the zoom function being applied.
  • the process ends 280 when operation of the manipulation function completes.
  • Figure 3 illustrates an example of one embodiment of a manipulation system 305 for executing a manipulation mode, e.g., a zoom function or a pan function, in accordance with the present invention.
  • the manipulation system 305 includes an interface module 310, a trigger module 320, a parameter preparation module 330, a mode selection module 340, a mode lock module 350, and a mode-processing module 360.
  • Each module 310, 320, 330, 340, 350, 360 is coupled to each other module, and each may be implemented in software, hardware, firmware, or a combination of two or more of these.
  • the interface module 310 is configured to provide a screen and device interface for a user to select particular manipulation modes, e.g., a zoom function or a pan function, customize parameter settings for the particular manipulation mode, and save and retrieve device and/or user configuration information for the particular manipulation mode.
  • the trigger module 320 is configured to receive an event or trigger signal from an input device, e.g., a multi-dimensional input device.
  • the mode selection module 340 may be configured to function with the interface module 310 to set or change a manipulation mode and appropriate parameters for that manipulation mode. In addition, the mode selection module 340 also couples with the trigger module 320 to select or identify the execution of the particular manipulation mode.
  • the parameter preparation module 330 may be configured to function with the interface module for retrieving the particular configuration parameters for the particular manipulation mode when the trigger module 320 receives a trigger or event.
  • the mode lock module 350 locks and unlocks a particular manipulation mode for a user once a trigger or event is received, the mode is identified, parameters are set, and the mode is executed.
  • the mode-processing module 360 executes the manipulation mode when the mode lock module 350 locks the manipulation mode and stops executing when the mode lock module 350 unlocks (or de-selects) the manipulation mode.
  • FIG. 4 illustrates an example for a process of performing a manipulation function, specifically, a zoom function, in accordance with the present invention.
  • the process will be referred to as a "fast zoom" mode.
  • the process starts 410, and the fast zoom mode is triggered 420.
  • Triggering 420 can be through, for example, pressing a button on a 2D pointing device, pressing a key or button on a keyboard, a speech input or command, an eye-tracking input or process, or other triggering device, process or event.
  • the process then prepares 430 parameters for the fast zoom mode.
  • the object speed is adjusted, for example, to twice the object movement speed; axes (e.g., x-axis, y-axis, and z-axis) are disabled; a zoom axis is enabled; and a fast zoom mode is enabled (e.g., a true condition).
  • the process allows for the device to function (or process) 450 in the fast zoom mode while 440 the fast zoom mode is enabled.
  • the process toggles the fast zoom mode (e.g., a false condition) and ends 480 the fast zoom mode.
  • the process could also time out the fast zoom mode and end 480 the fast zoom mode. The process then ends 490.
  • Figures 5a through 5f are screen illustrations of objects viewed before executing a desired viewing function (e.g., (1)) and after execution of the desired viewing function (e.g., (2)) in accordance with embodiments of the present invention.
  • the screen illustrations demonstrate an example of a zoom viewing function using differing points of origin.
  • points of origin include a center of a screen, a center of an object, a particular location within an object, a particular location on the screen, and a further zoom of an already zoomed object.
  • Figures 6a and 6b illustrate examples of a computing system environment and a software environment for use and application with embodiments of the present invention as described herein.
  • Figure 6a illustrates a computing system environment 605 that includes a computing system 610; a multi-dimensional (e.g., 3D) control device that is used for speed and/or velocity control 620; and one or more optional position-type input devices (e.g., a keyboard, a pointing device such as a mouse, a trackball, an eraser pointer, or a touch pad) 630a-n.
  • the object for manipulation 640 is preferably on a screen of the computing system 610, although it is not limited to a screen and may be coupled with, but separate from, the computing system 610.
  • Figure 6b illustrates a software environment 608 that includes a device driver 650, an operating system 660, a user interface 670, and an application 680 that allows for manipulation of multi-dimensional, e.g., three or more dimensions, objects.
  • the device driver 650 provides the interface between the computing system through its operating system 660 and a multi-dimensional input device 690.
  • the user interface 670 provides an interface for a user to select manipulation modes and parameters as previously described with respect to the interface module, e.g., 310.
  • the application 680 allows for viewing and manipulating the object 640 through the computer screen or directly.
  • the present invention is advantageously functional with input devices that allow for spontaneous viewing and manipulation in multiple degrees of freedom and that do not require execution of a multitude of steps and layers to execute a particular viewing function.
  • the present invention helps increase user efficiency and helps decrease user appendage and visual fatigue.
  • the principles of the present invention may be applied to environments and situations where a user uses a multi-dimensional control device in accordance with the present invention to directly manipulate a three-dimensional object without the need for a computer screen.
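The "fast zoom" flow described in the bullets above (trigger, parameter preparation, processing while the mode is enabled, toggle off) can be sketched as a small state machine. This is an illustrative Python sketch only; the class, attribute, and axis names are assumptions for illustration, not the patent's actual implementation.

```python
# Illustrative sketch of the "fast zoom" mode flow: trigger -> prepare
# parameters -> process while enabled -> toggle off. All names are
# hypothetical; the patent does not prescribe an implementation.

ALL_AXES = {"x", "y", "z", "roll", "pitch", "yaw", "zoom"}

class FastZoomMode:
    def __init__(self, base_speed=1.0):
        self.base_speed = base_speed
        self.speed = base_speed
        self.enabled_axes = set(ALL_AXES)
        self.fast_zoom = False

    def trigger(self):
        """Prepare parameters: double the speed, disable all axes but zoom."""
        self.speed = 2 * self.base_speed   # object speed doubled, per the example
        self.enabled_axes = {"zoom"}       # only the zoom axis stays active
        self.fast_zoom = True              # mode enabled ("true condition")

    def process(self, axis, delta):
        """Apply an input delta only while fast zoom is enabled on that axis."""
        if self.fast_zoom and axis in self.enabled_axes:
            return delta * self.speed
        return 0.0

    def toggle_off(self):
        """End the mode ("false condition") and restore default parameters."""
        self.fast_zoom = False
        self.speed = self.base_speed
        self.enabled_axes = set(ALL_AXES)

mode = FastZoomMode()
mode.trigger()
assert mode.process("zoom", 1.5) == 3.0   # doubled speed on the zoom axis
assert mode.process("x", 1.5) == 0.0      # other axes are disabled
mode.toggle_off()
```

The same structure would accommodate the timeout variant: a timer callback simply calls `toggle_off()` instead of the user toggling the mode.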

Abstract

A control device for manipulating multi-dimensional objects on a screen comprises: a selection module (110) for activating, in a single step, a defined manipulation function with preset settings and disabling all other degrees of freedom for manipulating the object on the screen; a manipulation operation module (130) for executing the activated viewing function; and a multi-dimensional input device control module (140) for commanding a manipulation of an object once the activated viewing function is executed.

Description

MULTI-DIMENSIONAL OBJECT MANIPULATION
Inventor: Bernd Gombert
Background of the Invention
1. Field of Invention
The present invention relates to control devices, and more particularly, to control devices configured for use to manipulate multi-dimensional objects.
2. Description of the Related Art
A user of a three-dimensional ("3D")/two-dimensional ("2D") computer aided design ("CAD") system often needs to view objects on a computer screen (or display) in a multitude of ways. For example, the user may need to zoom in on an object or zoom out from an object. To view the object, the user must work through a number of individual, specific steps before actually displaying the view the user desires. This process is both time consuming and tedious, which leads to inefficiency, movement fatigue, and viewing fatigue for the user.
For example, a common viewing operation requires the user to zoom in on a specific detail while viewing or working on a 3D/2D object on the screen. When working with a 3D input device, the user would first have to turn off the specific, single degrees of freedom (e.g., an x-dimension, a y-dimension, a roll, a pitch, and a yaw). The user must then adjust the zoom sensitivity settings, e.g., increasing the sensitivity to its extreme. Further, to get a particular desired result, various additional steps may be necessary. In all, the steps necessary for the zoom operation do not support a spontaneous viewing of the object for the user. Moreover, the user's appendages and vision tire quickly because the user is forced to work through positioning and aligning the cursor with the object and various menus, sliders, and/or buttons on the screen (or clicking around the screen) before finally obtaining the view that the user desires.
Therefore, there is a need for a system and process for spontaneously selecting a viewing function to view an object in any one of a plurality of viewing dimensions and also manipulating the object within the selected view.
Brief Description of the Drawings
Figure 1 illustrates a view processing system in accordance with one embodiment of the present invention.
Figures 2a and 2b illustrate various embodiments for processes to view and manipulate an object on a screen in accordance with the present invention.
Figure 3 illustrates an example of one embodiment of a manipulation system for executing a manipulation mode, e.g., a zoom function or a pan function, in accordance with the present invention.
Figure 4 illustrates an example of one embodiment for a process of performing a zoom function in accordance with the present invention.
Figures 5a through 5f illustrate screen shots of objects viewed before executing a desired viewing function and after execution of the desired viewing function in accordance with embodiments of the present invention.
Figures 6a and 6b illustrate examples of a computer system environment and a software environment for use and application with embodiments of the present invention as described herein.
Description of Embodiments of the Present Invention
The present invention provides for spontaneous viewing of an object through an input device configured to allow for multi-dimensional object manipulation. The present invention allows for an input device, e.g., a three-dimensional ("3D") input device, a two-dimensional ("2D") input device, a keyboard, a joystick or any combination thereof, to activate a specified viewing function with regard to an object on a computer screen (or display) without having to step through a series of menus, commands, buttons, and sliders. Moreover, the present invention allows for activation of the specified viewing function through other input devices, for example, speech control, eye movement, or the like.
In one embodiment, the specified viewing function may be a "fast zoom" function that allows for rapidly selecting and executing a zooming function, e.g., zoom in or zoom out, on a multidimensional object displayed on the computer screen. It is noted that the viewing functions are not limited to zooming and include other viewing functions, for example, panning and/or rotational movement along a particular axis. The present invention may also extend to application of viewing functions in any of a multitude of degrees of freedom ("DOF"), for example, six DOF. It is noted that the present invention may be extended to more than six degrees of freedom, for example, ten degrees of freedom, and that extending to these other dimensions allows the present invention to be applied to additional functionality, for example, force feedback control and additional zooming within a zoomed view.
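Restricting a viewing function to one of several degrees of freedom amounts to masking the other axes of an input sample. The sketch below models a six-DOF sample in Python; the type and field names are illustrative assumptions, not the patent's terminology.

```python
# Hypothetical model of a six-DOF input sample. Restricting a viewing
# function (e.g., zoom along z) zeroes every other degree of freedom.
from dataclasses import dataclass

@dataclass
class SixDofSample:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

def restrict(sample, keep):
    """Zero out every degree of freedom not named in `keep`."""
    fields = ("x", "y", "z", "roll", "pitch", "yaw")
    return SixDofSample(**{f: getattr(sample, f) if f in keep else 0.0
                           for f in fields})

s = SixDofSample(x=0.2, y=-0.1, z=0.8, roll=5.0, pitch=1.0, yaw=-3.0)
zoom_only = restrict(s, {"z"})   # e.g., a fast zoom along the z-axis
assert zoom_only == SixDofSample(z=0.8)
```

Extending to more than six degrees of freedom would simply add fields (e.g., a force-feedback channel) to the same structure.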
In one embodiment, a two-handed approach may be used for a process to manipulate a view of an object on a screen. A 3D input device may initiate the two-handed approach such that a user can execute a function, e.g., a specific zoom, pan, or rotational movement around a selected axis, spontaneously and with a minimum of effort while viewing or manipulating objects on the screen.
As an example, consideration is given to a user seeking to view and/or manipulate an object on the screen with a 3D input device. It is noted that the object may be, for example, a graphic object, a graphic model, or may be the viewing area of the screen. If the user seeks to zoom in on a specific detail, the user triggers a process to jump the 3D input device into a fast zoom mode. The trigger may be, for example, switching or pressing one or more specified switches or buttons on the 3D input device, pressing a particular button or wheel on a conventional 2D pointing device (e.g., a center button or wheel on a computer mouse), or pressing a particular button or key on a keyboard. The triggering event activates a zoom axis and appropriately adjusts the speed for object movement. For example, in the fast zoom example, the speed of the object's movement is automatically increased to double speed.
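Because several different trigger sources (a 3D-device button, a mouse center button or wheel, a keyboard key) can all activate the same mode, the trigger handling reduces to a lookup from (device, control) pairs to a mode name. The event and control names below are assumptions for illustration only.

```python
# Sketch: multiple trigger sources map to the same fast-zoom activation.
# Device and control names are hypothetical, not from the patent.

TRIGGERS = {
    ("3d_device", "button_1"): "fast_zoom",
    ("mouse", "middle_button"): "fast_zoom",
    ("keyboard", "F5"): "fast_zoom",
}

def handle_event(device, control, state):
    """Return the manipulation mode to activate for a trigger event, else None."""
    if state == "pressed":
        return TRIGGERS.get((device, control))
    return None

assert handle_event("mouse", "middle_button", "pressed") == "fast_zoom"
assert handle_event("mouse", "left_button", "pressed") is None
```

A speech or eye-tracking front end would feed the same table with its own (device, control) keys, leaving the rest of the flow unchanged.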
In addition, the present invention is also configured to define a point of origin for the viewing operation using a cursor position of an input device relative to an object. The point of origin provides the location around which the viewing operation is executed (or performed). As previously mentioned, the object may be, for example, a graphic object, a graphic model, or may be the viewing area of the screen. This allows the user to execute a viewing or manipulation function (or operation) in a clear, pre-defined manner. In one embodiment, the cursor position of the 2D pointing device is used to define a point of origin. In alternative embodiments, the point of origin may be defined by a cursor position identified through the 3D input device, a keyboard, a joystick, a speech operation, or an eye movement operation.
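Executing a zoom "around" a point of origin means that point stays fixed on screen while every other point scales away from it (zoom in) or toward it (zoom out). A minimal 2D sketch under that assumption, with purely illustrative coordinates:

```python
# Zooming about a point of origin: the origin stays fixed and every
# other screen point is scaled relative to it. Coordinates and the
# factor below are illustrative only.

def zoom_about(point, origin, factor):
    """Scale `point` about `origin` by `factor` (>1 zooms in, <1 zooms out)."""
    px, py = point
    ox, oy = origin
    return (ox + (px - ox) * factor, oy + (py - oy) * factor)

origin = (100.0, 100.0)          # e.g., the cursor position on screen
assert zoom_about(origin, origin, 2.0) == origin         # origin is fixed
assert zoom_about((110.0, 100.0), origin, 2.0) == (120.0, 100.0)
```

The same formula applied with the screen center, an object center, or a center of gravity as `origin` reproduces the differing points of origin shown in Figures 5a through 5f.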
In one embodiment, the present invention is functional in a data processing environment. For example, a data processing system in accordance with the present invention includes a computer system and a multi-dimensional computer input device that are coupled with each other. Preferably, the computer system is a conventional computer system that includes a screen (or display). The multi-dimensional input device may be a three-dimensional ("3D") control device, for example, a 3D pointing device or a computer joystick that allows for speed and/or velocity control. The data processing system may also include a 2D control device, for example, a conventional computer-pointing device, or a keyboard that allows for positioning control.
The computer system of the present invention includes a conventional computer operating system and includes a device driver for the multi-dimensional input device. The computer system may also include device drivers for the other input devices, including the conventional computer-pointing device and the keyboard. The device drivers couple with the operating system and the devices and facilitate device control operations through message, command, and/or instruction passing between the operating system of the computer system and the respective input device. It is noted that the multi-dimensional input device may be a bidirectional input device (i.e., it can transmit information to, as well as receive information from, the computer system). In addition, the other input devices may also be bidirectional input devices.
To further describe the features discussed previously, the Figures illustrate various embodiments of the present invention. Figure 1 illustrates a view processing system 105 in accordance with one embodiment of the present invention. The view processing system 105 includes a selection module 110, a point of origin module 120, a manipulation module 130, a multi-dimensional input device control module 140, and an interface module 150. Each module 110, 120, 130, 140, 150 may be configured in software, hardware, firmware, or a combination of two or more of these. Moreover, the modules 110, 120, 130, 140, 150 may be coupled with each other.
The selection module 110 is configured to identify and/or select where a viewing operation is to occur on an object, which may include the screen itself, as previously described. The point of origin module 120 is configured to identify a point of origin for the execution (or operation) of the viewing function. The point of origin module 120 may also be configured to include a center of gravity sub-module. The center of gravity sub-module may be configured to identify and/or calculate a center of gravity around which the viewing function may occur.
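One plausible computation for the center of gravity sub-module is the centroid of an object's vertices. The following sketch, including the function name and the unit-cube example, is an assumption for illustration only:

```python
# Hypothetical sketch of the center-of-gravity sub-module: for a polygonal
# object, the viewing function can pivot about the centroid of its vertices.

def center_of_gravity(vertices):
    """Return the centroid of a list of (x, y, z) vertices."""
    n = len(vertices)
    return tuple(sum(v[i] for v in vertices) / n for i in range(3))

# A unit cube's vertices average to its geometric center.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
print(center_of_gravity(cube))  # (0.5, 0.5, 0.5)
```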
The manipulation operation module 130 and the multi-dimensional input device control module 140 may functionally be one module or may be separate modules, as illustrated. The manipulation operation module 130 is configured to execute the viewing function selected by the user. The manipulation operation module 130 is also configured to appropriately adjust the speed for the viewing function and a manipulation operation in response to the selected viewing function. For example, a zoom in process may increase speed, while a zoom out process may decrease speed. The multi-dimensional input device control module 140 is configured to carry out manipulation of the object once the selected viewing function is executed.
It is noted that the interface module 150 is optional and provides a screen interface for a user to customize particular parameters as default or event specific parameters for object manipulation. The screen interface may be, for example, a window dialog box with buttons and/or sliders that allow for adjusting, changing, adding, or deleting particular manipulation modes or parameters associated with particular manipulation modes. It is noted that a manipulation mode is manipulation of an object (which may include the screen relative to an object) about a particular degree of freedom of movement. As an example of customization, the interface module 150 may allow for configuring a default particular manipulation mode such as a zoom or pan function. Further, as another example, the interface module 150 provides a tool to adjust parameters such as object movement speed along or about an axis of rotation, or movement resistance (e.g., force feedback) for the example zoom or pan function. Once the particular mode and associated parameters are adjusted through the interface module 150, they may be saved as default settings in a configuration file. Moreover, this configuration file may be saved and recalled through the interface module 150 so that numerous configuration files can be saved and organized in a multitude of ways, including, for example, per function, per user, or per device.
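The saving and recalling of configuration files described above can be sketched as follows. The JSON layout, the file name, and the key names are hypothetical choices, not taken from the description:

```python
# Illustrative sketch of saving and recalling a manipulation-mode
# configuration file, as the interface module (150) allows. The file
# format and key names are assumptions for illustration only.

import json
import os
import tempfile

def save_config(path, config):
    """Persist a manipulation-mode configuration as JSON."""
    with open(path, "w") as f:
        json.dump(config, f)

def load_config(path):
    """Recall a previously saved configuration."""
    with open(path) as f:
        return json.load(f)

# A default configuration for a fast-zoom mode: speed along the zoom axis
# and a movement-resistance (force feedback) parameter.
config = {"mode": "fast_zoom", "speed_multiplier": 2.0, "resistance": 0.3}
path = os.path.join(tempfile.gettempdir(), "fast_zoom_user1.json")
save_config(path, config)
print(load_config(path)["speed_multiplier"])  # 2.0
```

Organizing such files per function, per user, or per device then reduces to a file-naming convention, as the paragraph suggests.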
In addition, it is noted that the present invention may be configured for use with a positioning device that may be configured as a multi-dimensional control device providing speed and velocity control. For example, with a 2D pointing device having only buttons, the present invention allows for selecting an object, using the selection point as a point of origin, and then allowing speed and/or velocity control about the point of origin through rapidly clicking a second button on the 2D pointing device.
In an alternative example, with a 2D pointing device that includes a scroll wheel (e.g., a mechanical scroll wheel, an optical scroll wheel, a touch scroll wheel, a pressure scroll wheel, or other solid-state scroll mechanism), the selection of an object and its point of origin may be through clicking or tapping a button or the scroll wheel of the 2D pointing device, and then scrolling the scroll wheel for speed and/or velocity control about the point of origin. For example, for a zoom operation, scrolling the scroll wheel forward can zoom in with regard to the object at the point of origin and scrolling the scroll wheel backward can zoom out with regard to the object at the point of origin. As with other multi-dimensional control devices, e.g., the 3D control device, the modes and parameters of the 2D pointing device can be customized, configured, saved, and recalled through the interface module 150.
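A wheel-driven zoom about the point of origin can be sketched as a multiplicative zoom level; the 10% step per wheel detent is an assumed, illustrative value, not one taken from the description:

```python
# Hypothetical sketch of wheel-driven zoom about a selected point of origin:
# each forward scroll detent multiplies the zoom level, each backward detent
# divides it, so forward and backward scrolling are symmetric.

ZOOM_STEP = 1.1  # 10% per wheel detent (assumed value)

def apply_scroll(zoom_level, wheel_delta):
    """wheel_delta > 0 scrolls forward (zoom in), < 0 backward (zoom out)."""
    return zoom_level * (ZOOM_STEP ** wheel_delta)

level = 1.0
level = apply_scroll(level, +3)   # three detents forward: zoom in
print(round(level, 3))            # 1.331
level = apply_scroll(level, -3)   # three detents back: return to start
print(round(level, 3))            # 1.0
```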
Figures 2a and 2b illustrate various embodiments for processes to view and manipulate an object on a screen in accordance with the present invention. In Figure 2a, a first embodiment of a process starts 210 and determines 215 a location of a cursor on a screen. The process then optionally determines 220 a point of origin for a manipulation mode function, e.g., a zoom function, relative to the position of the cursor. The process then applies 225 a command associated with the manipulation mode function, e.g., execution of the zoom function. The process then allows manipulation 230 of the object with regard to the particular degree of freedom of movement, e.g., zoom along a z-axis of six or more degrees of freedom of movement as well as adjusting particular speed and velocity relative to the zoom function being applied. The process then ends 235 when operation of the manipulation function completes.
In Figure 2b, a second embodiment of a process starts 250 and an object is selected 255 for manipulation. The process determines 260 a point of origin for the selected object. The process may also determine the manipulation mode, e.g., a zoom function, at this point. The process then provides 265 control to the multi-dimensional input device, if the multi-dimensional device does not already have control. The process then applies 270 a command associated with the manipulation mode function, e.g., execution of the zoom function. The process then allows manipulation 275 of the object with regard to the particular degree of freedom of movement, e.g., zoom along a z-axis of six or more degrees of freedom of movement as well as adjusting a particular speed and velocity relative to the zoom function being applied. The process then ends 280 when operation of the manipulation function completes.
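The Figure 2b flow described above can be rendered as a minimal step trace; the helper names and the returned tuples are stand-ins for illustration, not taken from the patent:

```python
# Illustrative trace of the Figure 2b process; the step comments mirror the
# reference numerals, while the function and tuple names are assumptions.

def manipulate_object(obj, mode="zoom"):
    trace = []
    trace.append(("select", obj))                 # step 255: select the object
    origin = ("origin_of", obj)                   # step 260: determine point of origin
    trace.append(("origin", origin))
    trace.append(("grant_control", "3d_device"))  # step 265: hand control to the device
    trace.append(("apply_command", mode))         # step 270: apply the mode command
    trace.append(("manipulate", mode))            # step 275: manipulate the freed axis
    return trace                                  # step 280: operation completes

for name, _ in manipulate_object("model_a"):
    print(name)
```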
Figure 3 illustrates an example of one embodiment of a manipulation system 305 for executing a manipulation mode, e.g., a zoom function or a pan function, in accordance with the present invention. The manipulation system 305 includes an interface module 310, a trigger module 320, a parameter preparation module 330, a mode selection module 340, a mode lock module 350, and a mode-processing module 360. The modules 310, 320, 330, 340, 350, 360 are coupled with each other, and each may be implemented in software, hardware, firmware, or a combination of two or more of these.
The interface module 310 is configured to provide a screen and device interface for a user to select particular manipulation modes, e.g., a zoom function or a pan function, customize parameter settings for the particular manipulation mode, and save and retrieve device and/or user configuration information for the particular manipulation mode. The trigger module 320 is configured to receive an event or trigger signal from an input device, e.g., a multi-dimensional input device. The mode selection module 340 may be configured to function with the interface module 310 to set or change a manipulation mode and appropriate parameters for that manipulation mode. In addition, the mode selection module 340 also couples with the trigger module 320 to select or identify the execution of the particular manipulation mode.
The parameter preparation module 330 may be configured to function with the interface module for retrieving the particular configuration parameters for the particular manipulation mode when the trigger module 320 receives a trigger or event. The mode lock module 350 locks and unlocks a particular manipulation mode for a user once a trigger or event is received, the mode is identified, parameters are set, and the mode is executed. The mode-processing module 360 executes the manipulation mode when the mode lock module 350 locks the manipulation mode and stops executing when the mode lock module 350 unlocks (or de-selects) the manipulation mode.
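The cooperation between the mode lock module 350 and the mode-processing module 360 can be sketched as follows; the class and method names are hypothetical:

```python
# Hypothetical sketch: a manipulation mode executes only while the mode lock
# module (350) holds it locked; unlocking stops the mode-processing module (360).

class ModeLock:
    def __init__(self):
        self.locked_mode = None

    def lock(self, mode):
        """Lock a manipulation mode once triggered, identified, and parameterized."""
        self.locked_mode = mode

    def unlock(self):
        """Unlock (or de-select) the manipulation mode."""
        self.locked_mode = None

    def process(self, mode):
        """Mode-processing step: returns True only while this mode is locked."""
        return self.locked_mode == mode

lock = ModeLock()
lock.lock("zoom")
print(lock.process("zoom"))   # True: zoom executes while locked
lock.unlock()
print(lock.process("zoom"))   # False: execution stops after unlock
```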
Figure 4 illustrates an example of a process of performing a manipulation function, specifically, a zoom function, in accordance with the present invention. As an example, the process will be referred to as a "fast zoom" mode. Generally, the process starts 410, and the fast zoom mode is triggered 420. Triggering 420 can be through, for example, pressing a button on a 2D pointing device, pressing a key or button on a keyboard, a speech input or command, an eye-tracking input or process, or another triggering device, process, or event. The process then prepares 430 parameters for the fast zoom mode. Specifically, in one embodiment, the object speed is adjusted, for example, to twice the object movement speed; axes (e.g., x-axis, y-axis, and z-axis) are disabled; a zoom axis is enabled; and a fast zoom mode is enabled (e.g., a true condition). The process allows the device to function (or process) 450 in the fast zoom mode while 440 the fast zoom mode is enabled.
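The parameter preparation step 430 enumerated above — twice the movement speed, disabled translation axes, an enabled zoom axis, and an enabled mode flag — can be sketched as a single function; the dictionary field names are assumptions for illustration:

```python
# Illustrative sketch of the parameter preparation step (430) for the
# "fast zoom" mode. Field names are assumptions, not from the patent.

def prepare_fast_zoom(params):
    params = dict(params)                                          # leave input untouched
    params["speed"] = params["speed"] * 2                          # twice the movement speed
    params["axes_enabled"] = {"x": False, "y": False, "z": False}  # disable the axes
    params["zoom_axis_enabled"] = True                             # enable the zoom axis
    params["fast_zoom"] = True                                     # mode enabled (true condition)
    return params

p = prepare_fast_zoom({"speed": 1.0})
print(p["speed"], p["fast_zoom"])  # 2.0 True
```

Turning the mode off (step 460) would then simply restore the saved input parameters and set the flag back to a false condition.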
If the process receives a signal to turn off the fast zoom mode 460, the process toggles the fast zoom mode (e.g., a false condition) and ends 480 the fast zoom mode. Alternatively, if the process does not receive a signal to turn off the fast zoom mode 460, the process could time out the fast zoom mode and end 480 the fast zoom mode. The process then ends 490.
Figures 5a through 5f are screen illustrations of objects viewed before executing a desired viewing function (e.g., (1)) and after execution of the desired viewing function (e.g., (2)) in accordance with embodiments of the present invention. Generally, the screen illustrations demonstrate an example of a zoom viewing function using differing points of origin. For example, points of origin include a center of a screen, a center of an object, a particular location within an object, a particular location on the screen, and a further zoom of an already zoomed object.
Figures 6a and 6b illustrate examples of a computing system environment and a software environment for use and application with embodiments of the present invention as described herein. For example, Figure 6a illustrates a computing system environment 605 that includes a computing system 610; a multi-dimensional (e.g., 3D) control device that is used for speed and/or velocity control 620; and one or more optional position-type input devices (e.g., a keyboard, a pointing device such as a mouse, a trackball, an eraser pointer, or a touch pad) 630a-n. In addition, the object for manipulation 640 is preferably on a screen of the computing system 610, although it is not limited to a screen and may be coupled with, but apart from, the computer system 610.
Also for example, Figure 6b illustrates a software environment 608 that includes a device driver 650, an operating system 660, a user interface 670, and an application 680 that allows for manipulation of multi-dimensional, e.g., three or more dimensions, objects. The device driver 650 provides the interface between the computing system through its operating system 660 and a multi-dimensional input device 690. The user interface 670 provides an interface for a user to select manipulation modes and parameters as previously described with respect to the interface module, e.g., 310. The application 680 allows for viewing and manipulating the object 640 through the computer screen or directly.
The present invention advantageously functions with input devices that allow for spontaneous viewing and manipulation across multiple degrees of freedom and that do not require execution of a multitude of steps and layers to execute a particular viewing function. In turn, the present invention helps increase user efficiency and helps decrease user appendage and visual fatigue. Moreover, in alternative embodiments the principles of the present invention may be applied to environments and situations where a user uses a multi-dimensional control device in accordance with the present invention to directly manipulate a three-dimensional object without the need for a computer screen.

Claims

1. A control device for manipulating multi-dimensional objects on a screen, the device comprising:
- A selection module (110) for activating in a single step a defined manipulation function with preset settings and disabling all other degrees of freedom for manipulating the object on the screen,
- A manipulation operation module (130) for executing the activated viewing function, and
- A multi-dimensional input device control module (140) for commanding a manipulation of an object once the activated viewing function is executed.
2. A control device according to claim 1, furthermore comprising:
- A point of origin module (120) for identifying a point of origin for executing said manipulation of the object.
3. A control device according to claim 1 or 2, furthermore comprising:
- A screen interface module (150) for customizing parameters as default or event specific parameters for the object manipulation.
4. A control device according to any one of the preceding claims, wherein upon activation of the defined manipulation function all other degrees of freedom are disabled.

5. A control device according to any one of the preceding claims, characterized in that the manipulation function is a fast zoom view of an object.
6. A control device according to claim 5, characterized in that the selection module is designed to activate a defined zoom axis and speed of the object's movement.
7. A control device according to any one of the preceding claims, characterized in that it is an input device having at least six degrees of freedom.
8. A data processing system comprising: a computing device, a multi-dimensional input device according to any one of the preceding claims, and a screen or display.
9. A method for controlling multi-dimensional objects on a screen, the method comprising the following steps:
- Activating a manipulation mode function,
- Determining (215) the current location of a cursor on the screen,
- Applying (225) a command associated with the activated manipulation mode function with regard to at least one degree of freedom, and
- Manipulating (230) the object with regard to the at least one degree of freedom and depending on the location of the cursor.

10. A method according to claim 9, comprising the further step of:
- Determining a point of origin for the manipulation mode function relative to the position of the cursor.

11. A method for controlling multi-dimensional objects on a screen, the method comprising the following steps:
- Selecting (255) an object to manipulate on the screen,
- Determining (260) a point of origin,
- Providing control (265) to a multi-dimensional control device,
- Applying (270) a command with regard to at least one degree of freedom, and
- Manipulating (275) the selected object with regard to the at least one degree of freedom.
12. A device driver software product for a multi-dimensional input device, characterized in that it supports a method according to any one of claims 9 to 11 when running on a computing device.

13. A recording medium, having recorded thereon a software product according to claim 12.
PCT/EP2002/011803 2002-03-08 2002-10-22 Multi-dimensional object manipulation WO2003077107A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2002350596A AU2002350596A1 (en) 2002-03-08 2002-10-22 Multi-dimensional object manipulation
EP02785274A EP1483658A2 (en) 2002-03-08 2002-10-22 Multi-dimensional object manipulation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US36349002P 2002-03-08 2002-03-08
US60/363,490 2002-03-08
US36503902P 2002-03-12 2002-03-12
US60/365,039 2002-03-12

Publications (2)

Publication Number Publication Date
WO2003077107A2 true WO2003077107A2 (en) 2003-09-18
WO2003077107A3 WO2003077107A3 (en) 2003-11-20

Family

ID=27807994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2002/011803 WO2003077107A2 (en) 2002-03-08 2002-10-22 Multi-dimensional object manipulation

Country Status (3)

Country Link
EP (1) EP1483658A2 (en)
AU (1) AU2002350596A1 (en)
WO (1) WO2003077107A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0168981A2 (en) * 1984-07-20 1986-01-22 Tektronix, Inc. Method and apparatus for spherical panning
EP1074934A2 (en) * 1999-08-02 2001-02-07 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
US6198472B1 (en) * 1998-09-16 2001-03-06 International Business Machines Corporation System integrated 2-dimensional and 3-dimensional input device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7603917B2 (en) 2004-08-09 2009-10-20 Peratech Limited Full-axis sensor for detecting input force and torque
WO2006058775A1 (en) * 2004-12-02 2006-06-08 3Dconnexion Gmbh Input device with different modes
EP1669842A1 (en) * 2004-12-02 2006-06-14 3Dconnexion GmbH Input device with different modes

Also Published As

Publication number Publication date
EP1483658A2 (en) 2004-12-08
WO2003077107A3 (en) 2003-11-20
AU2002350596A1 (en) 2003-09-22


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2002785274

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2002785274

Country of ref document: EP

NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP