Publication number: US20100066705 A1
Publication type: Application
Application number: US 12/627,275
Publication date: Mar. 18, 2010
Filing date: Nov. 30, 2009
Priority date: Nov. 10, 2000
Also published as: CN1262910C, CN1360249A, EP1205836A2, EP1205836A3, US6897853, US7081889, US7277089, US7626580, US20020056575, US20050088422, US20050088423, US20060033751, US20130293500
Publication numbers: 12627275, 627275, US 2010/0066705 A1, US 2010/066705 A1, US 20100066705 A1, US 20100066705A1, US 2010066705 A1, US 2010066705A1, US-A1-20100066705, US-A1-2010066705, US2010/0066705A1, US2010/066705A1, US20100066705 A1, US20100066705A1, US2010066705 A1, US2010066705A1
Inventors: Leroy B. Keely, Charlton E. Lui, F. David Jones, Ryan Edward Cukierman, Susanne Alysia Clark Cazzanti, Marieke Iwema, Robert Jarrett
Original Assignee: Microsoft Corporation
Export citation: BiBTeX, EndNote, RefMan
External links: USPTO, USPTO Assignment, Espacenet
Highlevel active pen matrix
US 20100066705 A1
Abstract
The present invention relates to a system, method and medium for receiving and acting upon user input. In one embodiment, the user may only have access to a limited input device, like a stylus. Using the present invention, a user is provided with intuitive responses from the system based on inputs from the limited input device.
Images (8)
Claims (20)
1. One or more computer-readable media storing computer-executable instructions that when executed perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold;
determining whether the user input ends before an amount of time; and
responsive to the user input failing to exceed the first threshold and ending before the amount of time, classifying the user input as a tap.
2. The one or more computer-readable media of claim 1 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer, and the user input ends when the digitizer no longer detects the user's finger in contact with the digitizer.
3. The one or more computer-readable media of claim 1 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
4. The one or more computer-readable media of claim 1 wherein the first threshold is not changed based on an object associated with the user input.
5. The one or more computer-readable media of claim 1 wherein the first threshold depends on an object associated with the user input.
6. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
selecting an object within proximity of the tap when the object has not already been selected.
7. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
de-selecting an object within proximity of the tap when the object has already been selected.
8. The one or more computer-readable media of claim 1 wherein the computer-executable instructions perform operations further comprising:
placing an insertion point at a location within proximity of the tap when the location of the tap was within text.
9. One or more computer-readable media storing computer-executable instructions that when executed perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold; and
responsive to the user input moving beyond the first threshold, classifying the user input as a stroke.
10. The one or more computer-readable media of claim 9 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer.
11. The one or more computer-readable media of claim 9 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
12. The one or more computer-readable media of claim 9 wherein the computer-executable instructions perform operations further comprising:
when the stroke started within proximity to a draggable object and a drag threshold had been exceeded, performing a function with the draggable object based on the stroke; and
when the drag threshold has not been exceeded, maintaining the draggable object at a current state.
13. The one or more computer-readable media of claim 9 wherein the computer-executable instructions perform operations further comprising:
when the stroke did not start within proximity to a draggable object, determining whether an area under the stroke is inkable; and
when the area under the stroke is inkable, performing inking based on the stroke.
14. A computing device, comprising:
one or more processors; and
one or more computer-readable media storing computer-executable instructions that when executed by the one or more processors perform operations comprising:
receiving user input at a digitizer;
determining whether the user input moves beyond a first threshold;
responsive to the user input moving beyond the first threshold, classifying the user input as a stroke;
determining whether the user input ends before an amount of time; and
responsive to the user input failing to exceed the first threshold within the amount of time and ending before the amount of time, classifying the user input as a tap.
15. The computing device of claim 14 wherein the user input is caused by the digitizer detecting a user's finger in contact with the digitizer, and the user input ends when the digitizer no longer detects the user's finger in contact with the digitizer.
16. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
selecting a first object within proximity of the tap when the first object has not already been selected; and
de-selecting a second object within proximity of the tap when the second object has already been selected.
17. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
placing an insertion point at a location within proximity of the tap when the location of the tap was within text.
18. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
when the stroke started within proximity to a draggable object and a drag threshold had been exceeded, performing a function with the draggable object based on the stroke; and
when the drag threshold has not been exceeded, maintaining the draggable object at a current state.
19. The computing device of claim 14 wherein the computer-executable instructions perform operations further comprising:
when the stroke did not start within proximity to a draggable object, determining whether an area under the stroke is inkable; and
when the area under the stroke is inkable, performing inking based on the stroke.
20. The computing device of claim 14 wherein the first threshold includes at least one of: a distance of movement, a rate of movement, an acceleration of movement, or any combination thereof.
Description
    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
The present application is a continuation of U.S. patent application Ser. No. 11/202,034, filed Aug. 12, 2005, which is a continuation of U.S. patent application Ser. No. 10/993,357, filed Nov. 22, 2004, now issued as U.S. Pat. No. 7,081,889, which is a continuation of U.S. patent application Ser. No. 09/736,170, filed Dec. 15, 2000, now issued as U.S. Pat. No. 6,897,853, which claims priority to U.S. Provisional Patent Application Ser. No. 60/247,400, filed Nov. 10, 2000, each of which is incorporated by reference herein in its entirety.
  • FIELD OF THE INVENTION
  • [0002]
    Aspects of the present invention are directed generally to apparatus and methods for controlling a graphical user interface (GUI). More particularly, the present invention relates to receiving user input, determining based on the user input what the user wants to do, and performing a function related to the desired input.
  • BACKGROUND
  • [0003]
Typical computer systems, especially computer systems using graphical user interface (GUI) systems such as Microsoft WINDOWS, are optimized for accepting user input from one or more discrete input devices, such as a keyboard for entering text and a pointing device such as a mouse with one or more buttons for driving the user interface. Virtually all software applications designed to run on Microsoft WINDOWS are optimized to accept user input in the same manner. For instance, many applications make extensive use of the right mouse button (a “right click”) to display context-sensitive command menus. The user may generate other gestures using the mouse, such as by clicking the left button of the mouse (a “left click”), or by clicking the left or right button of the mouse and moving the mouse while the button is depressed (either a “left click drag” or a “right click drag”).
  • [0004]
In some environments, a mouse is not usable or desirable. For example, in a digitizer tablet environment, the primary input device may be a stylus. While a stylus attempts to provide a pad-and-paper-like feel to a computing environment, current systems are limited. For example, the use of a stylus in a graphical user interface is limited to tapping on various items for selection. See, for example, the Palm series of products using the Palm OS 3.0 operating system. Further, in stylus-based input environments, a user is continually forced to select tools or operations from a remote tool bar, generally at the top or bottom of a screen. While a user can type in letters or have the digitizer recognize handwriting, these operations require selecting a keyboard input mode and writing in a predefined portion of the digitizer, respectively. In short, requiring a user to tell the computer, for every new input, what the user wants to do makes stylus-based computing difficult for the average user. Accordingly, stylus-based inputs have been relegated to personal data assistants (PDAs), where significant user input is not possible. Mainstream computing still requires the use of at least a keyboard and mouse (or a mouse-based input device, for example, trackballs, touch-pads, and other mouse substitutes).
  • [0005]
    Accordingly, a need exists for permitting a user to perform all operations of a mouse-type device using a stylus.
  • SUMMARY
  • [0006]
As discussed in the various copending patent applications incorporated herein by reference, aspects of the present invention are directed to a tablet-like computer that allows users to directly write on a display surface using a stylus. The display surface may physically, optically, and/or electromagnetically detect the stylus. The computer may allow the user to write and to edit, manipulate, and create objects through the use of the stylus. Many of the features discussed in these copending applications are more easily performed by use of the various aspects of the present invention discussed herein.
  • [0007]
    An aspect of the present invention is directed to methods and apparatus for simulating gestures of a mouse by use of a stylus on a display surface. The present invention determines the operation a user wants to perform based on the user's input. This determination may include reference to other information including the location of the user's input on a digitizer (e.g., location on a screen) and the status of other objects or elements as displayed. By using this information, the system determines what the user wants to do and implements the action.
  • [0008]
    A number of inputs with a stylus are possible. For example, a user may tap a stylus, stroke the stylus, hold the stylus at a given point, or hold then drag the stylus. Other inputs and combinations are possible as noted by the above-identified applications, which are expressly incorporated herein by reference.
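    Purely for illustration, the four input classes above can be modeled as a simple enumeration. The following Python sketch uses hypothetical names that do not appear in this application:

        from enum import Enum, auto

        class PenInput(Enum):
            """Input classes a digitizer event stream may resolve to (hypothetical names)."""
            TAP = auto()            # brief contact with little or no movement
            STROKE = auto()         # movement beyond a first threshold
            HOLD = auto()           # contact held past a time threshold
            HOLD_AND_DRAG = auto()  # a hold followed by movement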
  • [0009]
As to a stroke operation, the system may drag an object, may maintain a current state or operation, or may begin inking. Inking may include writing, drawing, or adding annotations as described in greater detail in U.S. Ser. No. 60/212,825, filed Jun. 21, 2000, entitled “Methods for Classifying, Anchoring, and Transforming Ink Annotations” and incorporated by reference.
  • [0010]
As to a tap operation, the system may add to existing writing, select a new object, insert a cursor or insertion point, or perform an action on a selected object.
  • [0011]
    As to a hold operation, the system may simulate a right mouse button click or other definable event.
  • [0012]
    As to a hold and drag operation, the system may drag a selected object or perform other functions.
  • [0013]
    These and other features of the invention will be apparent upon consideration of the following detailed description of preferred embodiments. Although the invention has been defined using the appended claims, these claims are exemplary in that the invention is intended to include the elements and steps described herein in any combination or subcombination. Accordingly, there are any number of alternative combinations for defining the invention, which incorporate one or more elements from the specification, including the description, claims, and drawings, in various combinations or subcombinations. It will be apparent to those skilled in the relevant technology, in light of the present specification, that alternate combinations of aspects of the invention, either alone or in combination with one or more elements or steps defined herein, may be utilized as modifications or alterations of the invention or as part of the invention. It is intended that the written description of the invention contained herein covers all such modifications and alterations.
  • DESCRIPTION OF THE DRAWINGS
  • [0014]
    The foregoing summary of the invention, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the accompanying drawings, which are included by way of example, and not by way of limitation with regard to the claimed invention. In the accompanying drawings, elements are labeled with three-digit reference numbers, wherein the first digit of a reference number indicates the drawing number in which the element is first illustrated. The same reference number in different drawings refers to the same element.
  • [0015]
    FIG. 1 is a schematic diagram of a general-purpose digital computing environment that can be used to implement various aspects of the invention.
  • [0016]
    FIG. 2 is a plan view of a tablet computer and stylus that can be used in accordance with various aspects of the present invention.
  • [0017]
    FIGS. 3-7 are flowcharts showing a variety of steps for interpreting a user's input in accordance with embodiments of the present invention.
  • DETAILED DESCRIPTION
  • [0018]
    The present invention may be more readily described with reference to FIGS. 1-7. FIG. 1 illustrates a schematic diagram of a conventional general-purpose digital computing environment that can be used to implement various aspects of the present invention. In FIG. 1, a computer 100 includes a processing unit 110, a system memory 120, and a system bus 130 that couples various system components including the system memory to the processing unit 110. The system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150.
  • [0019]
    A basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the computer 100, such as during start-up, is stored in the ROM 140. The computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190, and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media. The hard disk drive 170, magnetic disk drive 180, and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192, a magnetic disk drive interface 193, and an optical disk drive interface 194, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100. It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment.
  • [0020]
    A number of program modules can be stored on the hard disk drive 170, magnetic disk 190, optical disk 192, ROM 140 or RAM 150, including an operating system 195, one or more application programs 196, other program modules 197, and program data 198. A user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner or the like. These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). Further still, these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown). A monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. In a preferred embodiment, a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input. Although a direct connection between the pen digitizer 165 and the processing unit 110 is shown, in practice, the pen digitizer 165 may be coupled to the processing unit 110 via a serial port, parallel port or other interface and the system bus 130 as known in the art. Furthermore, although the digitizer 165 is shown apart from the monitor 107, it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107. Further still, the digitizer 165 may be integrated in the monitor 107, or may exist as a separate device overlaying or otherwise appended to the monitor 107.
  • [0021]
    The computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109. The remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100, although only a memory storage device 111 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • [0022]
When used in a LAN networking environment, the computer 100 is connected to the local network 112 through a network interface or adapter 114. When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet. The modem 115, which may be internal or external, is connected to the system bus 130 via the serial port interface 106. In a networked environment, program modules depicted relative to the personal computer 100, or portions thereof, may be stored in the remote memory storage device.
  • [0023]
    It will be appreciated that the network connections shown are exemplary and other techniques for establishing a communications link between the computers can be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
  • [0024]
    FIG. 2 illustrates a tablet PC 201 that can be used in accordance with various aspects of the present invention. Any or all of the features, subsystems, and functions in the system of FIG. 1 can be included in the computer of FIG. 2. Tablet PC 201 includes a large display surface 202, e.g., a digitizing flat panel display, preferably, a liquid crystal display (LCD) screen, on which a plurality of windows 203 is displayed. Using stylus 204, a user can select, highlight, and write on the digitizing display area. Examples of suitable digitizing display panels include electromagnetic pen digitizers, such as the Mutoh or Wacom pen digitizers. Other types of pen digitizers, e.g., optical digitizers, may also be used. Tablet PC 201 interprets marks made using stylus 204 in order to manipulate data, enter text, and execute conventional computer application tasks such as spreadsheets, word processing programs, and the like.
  • [0025]
    A stylus could be equipped with buttons or other features to augment its selection capabilities. In one embodiment, a stylus could be implemented as a “pencil” or “pen”, in which one end constitutes a writing portion and the other end constitutes an “eraser” end, and which, when moved across the display, indicates portions of the display are to be erased. Other types of input devices, such as a mouse, trackball, or the like could be used. Additionally, a user's own finger could be used for selecting or indicating portions of the displayed image on a touch-sensitive or proximity-sensitive display. Consequently, the term “user input device”, as used herein, is intended to have a broad definition and encompasses many variations on well-known input devices.
  • [0026]
Region 205 shows a feedback region or contact region permitting the user to determine where the stylus has contacted the digitizer. In another embodiment, the region 205 provides visual feedback when the hold status of the present invention has been reached.
  • [0027]
FIGS. 3-7 show various flowcharts for determining what a user wants to do based on a user's interaction with the digitizer. As will be discussed below, the user contacts the digitizer where the user wants to begin writing, tapping, annotating, dragging, etc. In the case where the digitizer is superimposed over a display, the user's contact with the digitizer is directed at operating on the currently displayed information at (or near) the contact point between the user's stylus and the display.
  • [0028]
In step 301, the system senses a contact or other indication of an action. In one embodiment, the contact may be the stylus contacting the surface of the digitizer. In another embodiment, the action may be bringing the tip of the stylus near the digitizer's surface. Further, if the stylus includes another signaling method (for example, a radio transmitter transmitting a signal to the digitizer signaling a user's input), the digitizer (or related input mechanism or mechanisms) interprets the received signal as a user's input. Other methods of starting an operation or writing or contact with a digitizer are known in the art. For purposes of illustration and description, the system and method reference physical contact with the digitizer. All other ways of providing signals to a processor are considered within the scope of the invention and are not described here for simplicity.
  • [0029]
In step 302, the system determines the contact position and what lies beneath the contact position (for example, an object, a drawing, blank space, ink, and the like). In step 303, the system determines if the stylus has moved beyond a first threshold (time, distance, rate, or acceleration, and the like). In one embodiment, the threshold is set to the minimum resolvable movement. In another embodiment, the threshold is set higher to account for shaky hands or vibrations of the digitizer or tablet PC (for example, when trying to use the system while driving in a car over a bumpy road). It is noted that all objects may have the same threshold. Alternatively, objects may have different thresholds. This may depend on the object, the size of the object, the state of the system, the state of the object, and the like.
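    The per-object threshold idea can be sketched as a lookup keyed by the kind of object under the stylus, falling back to a system-wide default. Every name and value below is an illustrative assumption rather than something the specification prescribes:

        # Illustrative per-object movement thresholds, in millimeters.
        DEFAULT_MOVE_THRESHOLD_MM = 0.5

        OBJECT_MOVE_THRESHOLDS_MM = {
            "small_control": 0.3,  # small targets warrant a tighter threshold
            "ink_canvas": 1.0,     # writing areas can tolerate shakier hands
        }

        def move_threshold_mm(object_kind: str) -> float:
            """Return the movement threshold for the object under the stylus."""
            return OBJECT_MOVE_THRESHOLDS_MM.get(object_kind, DEFAULT_MOVE_THRESHOLD_MM)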
  • [0030]
If the first threshold has been exceeded, then the system proceeds to step 304, where the user's input is classified as a stroke, and the system proceeds to point A 305. If the first threshold has not been exceeded, the system determines in step 306 whether the stylus was still in contact with the digitizer when a time threshold expired. If not (meaning that the stylus had left the digitizer surface before the time threshold expired), the system classifies the input as a tap in step 307 and proceeds to point B 308.
  • [0031]
If the stylus was still in contact with the surface after the time threshold in step 306, the system determines if a second move threshold was exceeded in step 309. The first and second move thresholds may be identical or different. For example, both may be 0.25 mm. Or, the first may be 0.5 mm or 1 mm and the second may be 0.3 mm. Further, the first may be 1.2 mm or more and the second may be 0.5 mm or more. In short, any values may be used as long as they are not obtrusive to the user. The second threshold may be evaluated only after the time threshold of step 306 has expired. In this example, the second threshold may be higher than the first threshold (or it may be the same or smaller).
  • [0032]
    If the second move threshold was not exceeded, then the system classifies the input as a hold in step 310 and proceeds to point C 311. If the second move threshold was exceeded, then the system classifies the input as a ‘hold and drag’ in step 312 and moves to point D 313.
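    Steps 301-313 together form a small classifier. The following Python sketch is one simplified reading of FIG. 3; it assumes the digitizer delivers dense (x, y, time, pen-down) samples, and the threshold values are illustrative, since the specification requires only that they be unobtrusive:

        import math

        FIRST_MOVE_THRESHOLD_MM = 0.5   # step 303 (illustrative value)
        SECOND_MOVE_THRESHOLD_MM = 0.3  # step 309 (illustrative value)
        HOLD_TIME_THRESHOLD_S = 0.7     # step 306 (illustrative value)

        def classify(samples):
            """Classify (x_mm, y_mm, t_s, down) samples as a tap, stroke,
            hold, or hold and drag, following the flow of FIG. 3."""
            x0, y0, t0, _ = samples[0]
            for x, y, t, down in samples[1:]:
                moved = math.hypot(x - x0, y - y0)
                if t - t0 >= HOLD_TIME_THRESHOLD_S:
                    # Step 309: past the time threshold, movement decides.
                    if moved > SECOND_MOVE_THRESHOLD_MM:
                        return "hold_and_drag"  # step 312, point D
                    return "hold"               # step 310, point C
                if moved > FIRST_MOVE_THRESHOLD_MM:
                    return "stroke"             # step 304, point A
                if not down:
                    return "tap"                # step 307, point B
            return "tap"  # pen lifted before the time threshold with no movement

        # Example: a brief, nearly motionless contact classifies as a tap.
        assert classify([(0, 0, 0.00, True), (0.1, 0, 0.05, True), (0.1, 0, 0.10, False)]) == "tap"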
  • [0033]
FIG. 4 shows point A as starting point 401. Here, the system has classified the input as a stroke and begins stroke processing in step 402. In step 403, the system determines if the stroke started on a draggable object. If yes, the system determines in step 404 whether a drag threshold was exceeded (for example, 0.25 inches, 0.25 inches per second, and the like). If so, the system classifies the stroke as a drag in step 405 and performs a function that is dependent on the object. For example, the drag may extend a selection as described in greater detail in “Selection Handles in Editing Electronic Documents,” filed concurrently with the present application (attorney docket 03797.00069), and expressly incorporated by reference. Also, the drag may operate a bungee tool as described in Serial No. (Atty docket 3797.00070), entitled “Insertion Point Bungee Space Tool”, filed concurrently with the present application, and expressly incorporated herein.
  • [0034]
    If, in step 404, the drag threshold has not been exceeded, the system maintains the current state (with the object being selected or not) in step 407. If the stroke was not over a draggable object in step 403, the system determines if the area under the contact point is inkable in step 408. For example, inkable may mean an area capable of receiving ink (including drawings, annotations, or writing) as detailed in Ser. No. 60/212,825, filed Jun. 21, 2000, and expressly incorporated herein by reference for essential subject matter. By contrast, a control button (for copy, save, open, etc.) may not be inkable. If inkable in step 408, the system permits inking (drawing, writing, annotating and other related functions) in step 409. If not inkable, the system maintains the current state (objects selected or not) in step 407.
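    The stroke-handling branch of FIG. 4 (steps 402-409) can be restated compactly. The ctx object and its methods (draggable_object_at, is_inkable_at, drag, ink) are hypothetical names introduced here for illustration, not part of the specification:

        def handle_stroke(stroke, ctx):
            """Process an input classified as a stroke, following FIG. 4."""
            target = ctx.draggable_object_at(stroke.start)
            if target is not None:                   # step 403: started on a draggable object
                if stroke.exceeds_drag_threshold():  # step 404
                    ctx.drag(target, stroke)         # step 405: object-dependent drag function
                # else step 407: maintain the current state
            elif ctx.is_inkable_at(stroke.start):    # step 408: area under the stroke inkable?
                ctx.ink(stroke)                      # step 409: draw, write, or annotate
            # else step 407: maintain the current state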
  • [0035]
In FIG. 5A, the system starts at point B 501 and operates on the input as a tap 502. The system determines in step 503 whether the tap was on an area or object that is inkable. If yes, the system determines in step 504 whether any ink was recently added or “wet” (for example, less than 0.5 or 1 second old). If so, the system treats the tap as a dot to be added to the ink in step 505 (and adds the dot). If no wet ink exists, then the system determines if the tap was over a selectable object in step 506. It is noted that steps 503 and 504 may be combined. If the tap was over a selectable object, then the system determines if the object was already selected in step 507. If it was not, then the system selects the tapped object in step 508. If a previous object had been selected, the system cancels the previous selection in step 509. If the object was previously selected as determined by step 507, the system performs an action relevant to the object in step 510. This action may include editing the object or performing a predefined operation (for example, enlarging or shrinking the object). From step 506, if the tap was not on a selectable object, then the system proceeds to point BB 512.
  • [0036]
FIG. 5B shows additional processing to FIG. 5A. At point BB 512, the system determines in step 513 whether the tap was in a space between text (referred to herein as an inline space). If yes, the system places an insertion point at the tap point in step 514. As shown in a broken-lined box, the system may also cancel any old or previous selection in step 515. If no, then the system determines in step 518 whether the tap point has ink nearby. If the system determines that the tap was near ink, then the system adds a dot to the ink in step 516. If there was an old selection, then the system cancels the old selection in step 517 (as shown by a broken-lined box).
  • [0037]
If the tap was not near ink in step 518, the system determines in step 519 whether the tap was on an active object. If the tap was not on an active object, the system places an insertion point at the tap point or performs some other definable action in step 520. Again, if there was an old selection, then the system cancels the old selection in step 521 (as shown by a broken-lined box). If the tap was on an active object as determined by step 519, the system performs an action in step 522. The action may be definable by the user or relate to any desired function. In one embodiment, the action may be to perform a function to operate a selection handle as described in Ser. No. 60/247,973 (Attorney docket 3797.00069), “Selection Handles in Editing Electronic Documents,” filed concurrently with the present application and expressly incorporated by reference. Also, the action may operate a bungee tool as described in Ser. No. 60/247,842 (Atty. docket 3797.00070), entitled “Insertion Point Bungee Space Tool”, filed concurrently with the present application, and expressly incorporated herein. Other operations are known in the art and incorporated herein.
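    Read together, FIGS. 5A and 5B amount to one dispatch routine for taps. As before, the ctx helpers are hypothetical stand-ins for system services that the specification describes only functionally:

        def handle_tap(point, ctx):
            """Process an input classified as a tap, following FIGS. 5A-5B."""
            if ctx.is_inkable_at(point) and ctx.has_wet_ink_near(point):
                ctx.add_dot(point)                  # steps 504-505: extend the fresh ink
                return
            obj = ctx.selectable_object_at(point)
            if obj is not None:                     # step 506
                if ctx.is_selected(obj):
                    ctx.perform_action(obj)         # step 510: act on the selected object
                else:
                    ctx.select(obj)                 # step 508
                    ctx.cancel_old_selection()      # step 509
                return
            # Point BB (FIG. 5B)
            if ctx.in_inline_space(point):          # step 513
                ctx.place_insertion_point(point)    # step 514
                ctx.cancel_old_selection()          # step 515 (shown as optional)
            elif ctx.has_ink_near(point):           # step 518
                ctx.add_dot(point)                  # step 516
                ctx.cancel_old_selection()          # step 517 (shown as optional)
            elif ctx.active_object_at(point) is not None:        # step 519
                ctx.perform_action(ctx.active_object_at(point))  # step 522
            else:
                ctx.place_insertion_point(point)    # step 520 (or another definable action)
                ctx.cancel_old_selection()          # step 521 (shown as optional)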
  • [0038]
    FIG. 6 relates to holding a stylus beyond a time threshold. Starting from point C 601, the system classifies the user input as a hold operation in step 602. Next, the system simulates a right mouse button click or other definable event in step 603. The functions associated with step 603 are described in greater detail in U.S. application Ser. No. 60/247,844 (Atty. docket 3797.00072), entitled “Simulating Gestures of a Mouse Using a Stylus and Providing Feedback Thereto”, filed Nov. 10, 2000, whose contents are expressly incorporated herein by reference.
  • [0039]
    FIG. 7 relates to holding a stylus beyond a time threshold and moving the stylus. Starting from point D 701, the system classifies the user input as a hold and drag operation in step 702. Next, in step 703 the system drags the selected object as directed by the user.
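    Points C and D then reduce to two small handlers; show_context_menu and current_selection are again hypothetical stand-ins:

        def handle_hold(point, ctx):
            """Point C (FIG. 6): simulate a right mouse button click or
            other definable event at the hold location."""
            ctx.show_context_menu(point)

        def handle_hold_and_drag(stroke, ctx):
            """Point D (FIG. 7): drag the selected object as directed."""
            ctx.drag(ctx.current_selection(), stroke)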
  • [0040]
There are a number of alternatives associated with dragging. If the hold and drag relates to an inline space, the system may use this hold-and-drag function to select text. Similarly, one may use this function to select a drawing encountered by the dragged stylus. Further, one may select both text and drawings in this manner. Also, the cursor's point may become a selection tool that leaves a trail behind it. In this regard, the user may loop a number of objects, drawings, or text, and the looping of the objects may result in selecting them.
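    The looping behavior described above is essentially a hit test of object positions against the closed path the stylus leaves behind. A minimal sketch using the standard ray-casting point-in-polygon test follows; the anchor attribute is an assumed convention for an object's reference point:

        def loop_select(loop_points, objects):
            """Return the objects whose anchor point lies inside the loop."""
            def inside(p, poly):
                x, y = p
                hit = False
                for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
                    # Cast a ray to the right of p and count edge crossings.
                    if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                        hit = not hit
                return hit
            return [obj for obj in objects if inside(obj.anchor, loop_points)]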
  • [0041]
An alternate embodiment of the present invention relates to modifying ink drawings or annotations. For example, if one added an annotation (from step 409) to text, one may manipulate the text (for example, by inserting new text) and have the annotation track the manipulation of the text. So, if one circled text and then added text to the circled text, the annotation would expand to include the added text as well. This is described in U.S. Ser. No. 60/212,825, filed Jun. 21, 2000, entitled “Methods for Classifying, Anchoring, and Transforming Ink Annotations” and incorporated by reference.
  • [0042]
    While exemplary systems and methods embodying the present invention are shown by way of example, it will be understood, of course, that the invention is not limited to these embodiments. Modifications may be made by those skilled in the art, particularly in light of the foregoing teachings. For example, each of the elements of the aforementioned embodiments may be utilized alone or in combination with elements of the other embodiment.
Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US2143875 * | Dec. 11, 1934 | Jan. 17, 1939 | Rca Corp | Multiplex facsimile printer system
US4534060 * | Aug. 9, 1983 | Aug. 6, 1985 | Pencept, Inc. | Method and apparatus for removing noise at the ends of a stroke
US4608658 * | Apr. 13, 1984 | Aug. 26, 1986 | Pencept, Inc. | Method and apparatus for removing noise at the ends of a stroke caused by retracing
US4686332 * | Jun. 26, 1986 | Aug. 11, 1987 | International Business Machines Corporation | Combined finger touch and stylus detection system for use on the viewing surface of a visual display device
US4899138 * | Feb. 29, 1988 | Feb. 6, 1990 | Pioneer Electronic Corporation | Touch panel control device with touch time and finger direction discrimination
US4933670 * | Jul. 21, 1988 | Jun. 12, 1990 | Picker International, Inc. | Multi-axis trackball
US4954817 * | May 2, 1988 | Sep. 4, 1990 | Levine Neil A | Finger worn graphic interface device
US4982618 * | Dec. 20, 1989 | Jan. 8, 1991 | Culver Craig F | Multifunction tactile manipulatable control
US4988981 * | Feb. 28, 1989 | Jan. 29, 1991 | Vpl Research, Inc. | Computer data entry and manipulation apparatus and method
US5060135 * | Nov. 1, 1988 | Oct. 22, 1991 | Wang Laboratories, Inc. | Apparatus for manipulating documents in a data processing system utilizing reduced images of sheets of information which are movable
US5147155 * | Nov. 14, 1989 | Sep. 15, 1992 | Molnlycke Ab | Device for achieving uniform distribution of airborne fibres, e.g. cellulose-fibres
US5231578 * | Nov. 13, 1990 | Jul. 27, 1993 | Wang Laboratories, Inc. | Apparatus for document annotation and manipulation using images from a window source
US5280276 * | Jul. 10, 1992 | Jan. 18, 1994 | Quickshot (Bvi) Ltd. | Combination mouse/trackball input device
US5294792 * | Dec. 31, 1991 | Mar. 15, 1994 | Texas Instruments Incorporated | Writing tip position sensing and processing apparatus
US5327161 * | Oct. 21, 1991 | Jul. 5, 1994 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device
US5347295 * | Oct. 31, 1990 | Sep. 13, 1994 | Go Corporation | Control of a computer through a position-sensed stylus
US5404439 * | Apr. 15, 1992 | Apr. 4, 1995 | Xerox Corporation | Time-space object containment for graphical user interface
US5404458 * | Feb. 24, 1994 | Apr. 4, 1995 | International Business Machines Corporation | Recognizing the cessation of motion of a pointing device on a display by comparing a group of signals to an anchor point
US5442795 * | Nov. 19, 1990 | Aug. 15, 1995 | Wang Laboratories, Inc. | System and method for viewing icon contents on a video display
US5463696 * | Jul. 5, 1994 | Oct. 31, 1995 | Apple Computer, Inc. | Recognition system and method for user inputs to a computer system
US5483261 * | Oct. 26, 1993 | Jan. 9, 1996 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection
US5485171 * | May 5, 1994 | Jan. 16, 1996 | Micromed Systems, Inc. | Hand held computer input apparatus and method
US5488204 * | Oct. 17, 1994 | Jan. 30, 1996 | Synaptics, Incorporated | Paintbrush stylus for capacitive touch sensor pad
US5488392 * | Apr. 28, 1994 | Jan. 30, 1996 | Harris; Thomas S. | Precision, absolute mapping computer pointing device and versatile accessories
US5491495 * | Nov. 13, 1990 | Feb. 13, 1996 | Wang Laboratories, Inc. | User interface having simulated devices
US5513309 * | May 8, 1995 | Apr. 30, 1996 | Apple Computer, Inc. | Graphic editor user interface for a pointer-based computer system
US5523775 * | Jun. 8, 1994 | Jun. 4, 1996 | Apple Computer, Inc. | Method for selecting objects on a computer display
US5534893 * | Dec. 15, 1993 | Jul. 9, 1996 | Apple Computer, Inc. | Method and apparatus for using stylus-tablet input in a computer system
US5539427 * | Jan. 27, 1994 | Jul. 23, 1996 | Compaq Computer Corporation | Graphic indexing system
US5543590 * | Sep. 2, 1994 | Aug. 6, 1996 | Synaptics, Incorporated | Object position detector with edge motion feature
US5543591 * | Oct. 7, 1994 | Aug. 6, 1996 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition
US5544295 * | May 27, 1992 | Aug. 6, 1996 | Apple Computer, Inc. | Method and apparatus for indicating a change in status of an object and its disposition using animation
US5546527 * | May 23, 1994 | Aug. 13, 1996 | International Business Machines Corporation | Overriding action defaults in direct manipulation of objects on a user interface by hovering a source object
US5548705 * | Feb. 27, 1995 | Aug. 20, 1996 | Xerox Corporation | Wiping metaphor as a user interface for operating on graphical objects on an interactive graphical display
US5555363 * | Sep. 30, 1993 | Sep. 10, 1996 | Apple Computer, Inc. | Resetting the case of text on a computer display
US5559943 * | Jun. 27, 1994 | Sep. 24, 1996 | Microsoft Corporation | Method and apparatus customizing a dual actuation setting of a computer input device switch
US5590567 * | Mar. 14, 1995 | Jan. 7, 1997 | Delco Electronics Corporation | Snap retainer and retainer system
US5592566 * | Jun. 1, 1995 | Jan. 7, 1997 | Apple Computer, Incorporated | Method and apparatus for computerized recognition
US5594810 * | Jun. 5, 1995 | Jan. 14, 1997 | Apple Computer, Inc. | Method and apparatus for recognizing gestures on a computer system
US5596694 * | Apr. 8, 1996 | Jan. 21, 1997 | Apple Computer, Inc. | Method and apparatus for indicating a change in status of an object and its disposition using animation
US5596698 * | Jan. 30, 1995 | Jan. 21, 1997 | Morgan; Michael W. | Method and apparatus for recognizing handwritten inputs in a computerized teaching system
US5602570 * | May 31, 1995 | Feb. 11, 1997 | Capps; Stephen P. | Method for deleting objects on a computer display
US5612719 * | Apr. 15, 1994 | Mar. 18, 1997 | Apple Computer, Inc. | Gesture sensitive buttons for graphical user interfaces
US5613019 * | Jun. 3, 1994 | Mar. 18, 1997 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings
US5621817 * | Apr. 13, 1995 | Apr. 15, 1997 | Apple Computer, Inc. | Pointer-based computer system capable of aligning geometric figures
US5625377 * | May 26, 1995 | Apr. 29, 1997 | Apple Computer, Inc. | Method for controlling a computerized organizer
US5640178 * | Aug. 21, 1995 | Jun. 17, 1997 | Fujitsu Limited | Pointing device
US5666113 * | Sep. 5, 1995 | Sep. 9, 1997 | Microtouch Systems, Inc. | System for using a touchpad input device for cursor control and keyboard emulation
US5666438 * | Jul. 29, 1994 | Sep. 9, 1997 | Apple Computer, Inc. | Method and apparatus for recognizing handwriting of different users of a pen-based computer system
US5666499 * | Aug. 4, 1995 | Sep. 9, 1997 | Silicon Graphics, Inc. | Clickaround tool-based graphical interface with two cursors
US5670955 * | Jan. 31, 1995 | Sep. 23, 1997 | Microsoft Corporation | Method and apparatus for generating directional and force vector in an input device
US5748926 * | Apr. 18, 1996 | May 5, 1998 | Canon Kabushiki Kaisha | Data processing method and apparatus
US5751260 * | Apr. 3, 1995 | May 12, 1998 | The United States Of America As Represented By The Secretary Of The Navy | Sensory integrated data interface
US5757361 * | Mar. 20, 1996 | May 26, 1998 | International Business Machines Corporation | Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary
US5757368 * | Mar. 27, 1995 | May 26, 1998 | Cirque Corporation | System and method for extending the drag function of a computer pointing device
US5760773 * | Jan. 6, 1995 | Jun. 2, 1998 | Microsoft Corporation | Methods and apparatus for interacting with data objects using action handles
US5764218 * | Jan. 31, 1995 | Jun. 9, 1998 | Apple Computer, Inc. | Method and apparatus for contacting a touch-sensitive cursor-controlling input device to generate button values
US5781181 * | Jul. 16, 1996 | Jul. 14, 1998 | Alps Electric Co., Ltd. | Apparatus and method for changing an operation mode of a coordinate input apparatus
US5805144 * | Jul. 18, 1996 | Sep. 8, 1998 | Dell Usa, L.P. | Mouse pointing device having integrated touchpad
US5812118 * | Jun. 25, 1996 | Sep. 22, 1998 | International Business Machines Corporation | Method, apparatus, and memory for creating at least two virtual pointing devices
US5856822 * | Oct. 27, 1995 | Jan. 5, 1999 | 02 Micro, Inc. | Touch-pad digital computer pointing-device
US5861583 * | Jul. 15, 1996 | Jan. 19, 1999 | Synaptics, Incorporated | Object position detector
US5861886 * | Jun. 26, 1996 | Jan. 19, 1999 | Xerox Corporation | Method and apparatus for grouping graphic objects on a computer based system having a graphical user interface
US5864635 * | Jun. 14, 1996 | Jan. 26, 1999 | International Business Machines Corporation | Distinguishing gestures from handwriting in a pen based computer by stroke analysis
US5880411 * | Mar. 28, 1996 | Mar. 9, 1999 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition
US5880717 * | Mar. 14, 1997 | Mar. 9, 1999 | Tritech Microelectronics International, Ltd. | Automatic cursor motion control for a touchpad mouse
US5883622 * | Jan. 17, 1997 | Mar. 16, 1999 | Tritech Microelectronics International Ltd. | Touchpad pen-input controller
US5898424 * | Sep. 30, 1996 | Apr. 27, 1999 | Gateway 2000, Inc. | Pointing device with differing actuation forces for primary and secondary buttons
US5907327 * | Aug. 15, 1997 | May 25, 1999 | Alps Electric Co., Ltd. | Apparatus and method regarding drag locking with notification
US5910800 * | Jun. 11, 1997 | Jun. 8, 1999 | Microsoft Corporation | Usage tips for on-screen touch-sensitive controls
US5912659 * | Sep. 3, 1997 | Jun. 15, 1999 | International Business Machines Corporation | Graphics display pointer with integrated selection
US5920694 * | Apr. 7, 1997 | Jul. 6, 1999 | Ncr Corporation | Annotation of computer video displays
US5926179 * | Sep. 29, 1997 | Jul. 20, 1999 | Sony Corporation | Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium
US5926567 * | Jun. 26, 1997 | Jul. 20, 1999 | Compaq Computer Corporation | Method and apparatus for storing and rapidly displaying graphic data
US5943043 * | Dec. 5, 1996 | Aug. 24, 1999 | International Business Machines Corporation | Touch panel "double-touch" input method and detection apparatus
US5943044 * | May 15, 1997 | Aug. 24, 1999 | Interlink Electronics | Force sensing semiconductive touchpad
US5945979 * | May 20, 1997 | Aug. 31, 1999 | International Business Machines Corporation | Combined digital and analog cursor control
US6049329 * | Jun. 4, 1996 | Apr. 11, 2000 | International Business Machines Corporation | Method of and system for facilitating user input into a small GUI window using a stylus
US6057830 * | Jan. 17, 1997 | May 2, 2000 | Tritech Microelectronics International Ltd. | Touchpad mouse controller
US6061051 * | Jan. 17, 1997 | May 9, 2000 | Tritech Microelectronics | Command set for touchpad pen-input mouse
US6094197 * | May 17, 1995 | Jul. 25, 2000 | Xerox Corporation | Graphical keyboard
US6115043 * | Jun. 7, 1995 | Sep. 5, 2000 | Kodak Limited | Data processing system with folder means for associating a plurality of reduced size images in a stacked arrangement
US6118427 * | Apr. 18, 1996 | Sep. 12, 2000 | Silicon Graphics, Inc. | Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency
US6173287 * | Mar. 11, 1998 | Jan. 9, 2001 | Digital Equipment Corporation | Technique for ranking multimedia annotations of interest
US6204837 * | Jul. 13, 1998 | Mar. 20, 2001 | Hewlett-Packard Company | Computing apparatus having multiple pointing devices
US6208329 * | Aug. 13, 1996 | Mar. 27, 2001 | Lsi Logic Corporation | Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device
US6262719 * | Oct. 23, 1997 | Jul. 17, 2001 | Packard Bell Nec, Inc. | Mouse emulation with a passive pen
US6266050 * | Aug. 10, 1998 | Jul. 24, 2001 | Samsung Electronics Co., Ltd. | Portable computer having touch pad input control function
US6339431 * | Sep. 14, 1999 | Jan. 15, 2002 | Kabushiki Kaisha Toshiba | Information presentation apparatus and method
US6342906 * | Feb. 2, 1999 | Jan. 29, 2002 | International Business Machines Corporation | Annotation layer for synchronous collaboration
US6414700 * | Jul. 21, 1998 | Jul. 2, 2002 | Silicon Graphics, Inc. | System for accessing a large number of menu items using a zoned menu bar
US6557042 * | Mar. 19, 1999 | Apr. 29, 2003 | Microsoft Corporation | Multimedia summary generation employing user feedback
US6610936 * | Aug. 12, 1997 | Aug. 26, 2003 | Synaptics, Inc. | Object position detector with edge motion feature and gesture recognition
US6677930 * | Mar. 22, 1999 | Jan. 13, 2004 | Fujitsu Takamisawa Component Ltd | Mouse
US6847350 * | Dec. 10, 2002 | Jan. 25, 2005 | Hewlett-Packard Development Company, L.P. | Optical pointing device
US6897853 * | Dec. 15, 2000 | May 24, 2005 | Microsoft Corp. | Highlevel active pen matrix
US6930672 * | Apr. 13, 1999 | Aug. 16, 2005 | Fujitsu Limited | Input processing method and input control apparatus
US7081889 * | Nov. 22, 2004 | Jul. 25, 2006 | Microsoft Corporation | Highlevel active pen matrix
US7626580 * | Aug. 12, 2005 | Dec. 1, 2009 | Microsoft Corporation | Highlevel active pen matrix
US20020015064 * | Nov. 29, 2000 | Feb. 7, 2002 | Robotham John S. | Gesture-based user interface to multi-level and multi-modal sets of bit-maps
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US9015606 | Nov. 25, 2013 | Apr. 21, 2015 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile
US9134849 | Oct. 24, 2012 | Sep. 15, 2015 | Nook Digital, Llc | Pen interface for a touch screen device
US9223472 * | Dec. 22, 2011 | Dec. 29, 2015 | Microsoft Technology Licensing, Llc | Closing applications
US9229918 | Mar. 16, 2015 | Jan. 5, 2016 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile
US9588607 * | Jun. 21, 2013 | Mar. 7, 2017 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof
US9696888 | Dec. 30, 2014 | Jul. 4, 2017 | Microsoft Technology Licensing, Llc | Application-launching interface for multiple modes
US20080313568 * | Nov. 20, 2007 | Dec. 18, 2008 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof
US20120140102 * | Feb. 9, 2012 | Jun. 7, 2012 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof
US20130167058 * | Dec. 22, 2011 | Jun. 27, 2013 | Microsoft Corporation | Closing applications
US20130222301 * | Feb. 5, 2013 | Aug. 29, 2013 | Samsung Electronics Co., Ltd. | Method and apparatus for moving contents in terminal
US20130342485 * | Jun. 21, 2013 | Dec. 26, 2013 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof
CN103513822A * | Jun. 21, 2013 | Jan. 15, 2014 | 三星电子株式会社 | Method for improving touch recognition and electronic device thereof
WO2013063241A1 * | Oct. 25, 2012 | May 2, 2013 | Barnesandnoble.Com Llc | Pen interface for a touch screen device
Classifications
U.S. Classification: 345/179, 382/314, 715/863, 715/769, 382/188
International Classification: G06F3/023, G06K9/62, G06K9/68, G06F1/16, G06F3/041, G10L15/26, G06F3/038
Cooperative Classification: G06F3/0238, G06F3/04883, G10L15/26, G06F3/0412, G06F3/038, G06F3/04886, G06F1/1626, G06K9/6293, G06F3/041
European Classification: G06F3/023P, G06F3/038, G06F3/0488G, G06K9/62F3M, G10L15/26A, G06F3/0488T, G06F3/041D, G06F1/16P3
Legal Events
Date | Code | Event | Description
Dec. 9, 2014 | AS | Assignment
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001
Effective date: 20141014