US20060062461A1 - Chinese character handwriting recognition system - Google Patents

Chinese character handwriting recognition system

Info

Publication number
US20060062461A1
US20060062461A1 (application US11/262,214)
Authority
US
United States
Prior art keywords
stroke
character
characters
list
strokes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/262,214
Inventor
Michael Longe
Jianchao Wu
Lu Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tegic Communications Inc
Original Assignee
Tegic Communications Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tegic Communications Inc filed Critical Tegic Communications Inc
Priority to US11/262,214 priority Critical patent/US20060062461A1/en
Assigned to TEGIC COMMUNICATIONS, INC. reassignment TEGIC COMMUNICATIONS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LONGE, MICHAEL, WU, JIANCHAO, ZHANG, LU
Publication of US20060062461A1 publication Critical patent/US20060062461A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/018Input/output arrangements for oriental characters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/142Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V30/1423Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

A handwritten Chinese character input method and system is provided that allows users to enter Chinese characters into a data processor by adding fewer than three strokes and one selection movement such as a mouse click or a stylus or finger tap. The system is interactive, predictive, and intuitive to use. By adding the one or two strokes used to start writing a Chinese character, or in some cases no strokes at all, users can find a desired character in a list of characters. The list is context sensitive; it varies depending on the prior character entered. Compared to other existing systems, this system can save users considerable time and effort when entering handwritten characters.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 10/205,950, filed Jul. 25, 2002.
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field
  • This invention relates generally to text input technology. More particularly, the invention relates to a method and system that allows users to input handwritten Chinese characters to a data processor by entering the first few strokes required to write a character, so that users can perform characters input tasks in a fast, predictive way.
  • 2. Description of the Prior Art
  • Around the globe, over 1.2 billion people speak Chinese. This includes the People's Republic of China, Taiwan, Singapore, and a large community of overseas Chinese in Asia and North America. Chinese character strokes and symbols are so different and so complicated that they can be sorted and grouped in a wide variety of ways. One can analytically sort out as many as 35-40 strokes of 4-10 symbols or more per Chinese character, depending on how they are grouped. Because of this unique structure of the Chinese language, computer users cannot input Chinese characters with alphabetic keyboards as easily as they can input Western languages.
  • A number of methods and systems for inputting Chinese characters to a screen, such as the Three Corners method, the Goo Coding System, the 5-Stroke method, Changjie's Input scheme, etc., have been developed. However, none of these input methods provides an easy-to-use, standardized input/output scheme that speeds up retrieval and typing by taking full advantage of computer technology.
  • Several other methods and systems for inputting handwritten Chinese characters are also known. For example, Apple Computer and the Institute of System Science in Singapore (Apple-ISS) have developed a system which features an application for dictation and a handwriting input method for Chinese. This system incorporates a dictionary assistance service wherein, when a first character is recognized, the device displays a list of phrases based on the first character and the user may select the proper phrase without inputting any stroke. This technique effectively increases the input speed.
  • Another example is Synaptics' QuickStroke system which incorporates a prediction function based on a highly sophisticated neural network engine. This is not a graphics capture application where the users have to write out the entire character before the software can recognize which character is intended. Instead, it can recognize a character after only three to six strokes of the character have been written. It can be used with a standard mouse, Synaptics TouchPad™, or a Synaptics pen input TouchPad.
  • Another example is Zi Corporation's text input solutions based on an intelligent indexing engine which intuitively predicts and displays desired candidates. The solutions also include powerful personalization and learning capabilities—providing prediction of user-created terms and frequently used vocabulary.
  • It would be advantageous to provide a handwritten Chinese character input method and system to allow users to enter Chinese characters to a data processor by drawing just the first few strokes and one selection movement such as mouse clicking or stylus or finger tapping.
  • SUMMARY OF INVENTION
  • A handwritten Chinese character input method and system is provided to allow users to enter Chinese characters into a data processor by drawing just the first few strokes and making one selection movement such as a mouse click or a stylus or finger tap. The system is interactive, predictive, and intuitive to use. By adding the one or two strokes used to start writing a Chinese character, users can find a desired character in a list of characters. The list is context sensitive, so in some cases no strokes are needed at all; it varies depending on the prior character entered. The system puts the handwritten-stroke-to-category mapping on top of the stroke category matching technology, including an optional "Match any stroke category" key or gesture. Compared to other existing systems, this system can save users considerable time and effort when entering handwritten characters.
  • In one preferred embodiment, the handwritten Chinese character input system includes: (1) recognition means for recognizing a category of handwriting stroke from a list of stroke categories; (2) collection means for organizing a list of characters that commonly start with one or more recognized categories of handwriting strokes, the list of characters being displayed in a predetermined sequence; and (3) selection means for selecting a desired character from the list of characters.
  • In a typical embodiment, the strokes are classified into five basic categories, each having one or more sub-categories. The collection means contains predefined stroke order information. It also contains a display means to display a list of the most frequently used characters when no strokes are entered, while strokes are being entered, and/or after a character is selected. The list of most frequently used characters is context sensitive; it varies depending upon the last Chinese character entered. The predetermined sequence may be based on any of: (1) the number of strokes necessary to write out a character; (2) the use frequency of a character; and (3) contextual relation to the last character entered.
  • The selection means is associated with any of: (1) mouse clicking; (2) stylus tapping; (3) finger tapping; and (4) button/key pressing.
  • The system also contains “stroke entry means,” such as an LCD touchscreen, stylus or finger pad, trackball, data glove, or other touch-sensitive (possibly flexible) surface.
  • The system may further include means for displaying a numeric or iconic representation of each stroke that is entered and a full numeric or iconic representation of the strokes for a Chinese character that is selected.
  • According to the preferred embodiment, a method for inputting handwritten Chinese characters includes the following steps:
      • adding a stroke into the stroke recognition apparatus;
      • categorizing the added stroke into one of a predetermined number of categories;
      • finding characters based on frequency of character use;
      • displaying a list of found characters;
      • if a desired character is in the list, selecting the desired character from the list;
      • if a desired character is not visible in the list, adding another stroke;
      • finding most common characters that appear after a previously selected character based on a present stroke sequence; and
      • displaying another list of found characters.
  • The method may further comprise the steps of:
      • displaying a numeric representation for a stroke that is added; and
      • displaying full stroke numeric representation for a character that is selected.
  • As an alternative, the method may comprise the steps of:
      • displaying an iconic representation for a stroke that is added; and
      • displaying full stroke iconic representation for a character that is selected.
    BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic diagram illustrating an apparatus for inputting handwritten Chinese characters according to one preferred embodiment of the invention;
  • FIG. 2 is a flow diagram illustrating a method for inputting handwritten Chinese characters in a predictive manner according to another preferred embodiment of the invention;
  • FIG. 3 is a diagram illustrating five basic strokes and their numeric representation;
  • FIG. 4A is a pictorial diagram illustrating an overview of the Stroke Recognition Interface prior to any input;
  • FIG. 4B is a pictorial diagram illustrating the Stroke Recognition Interface when a first single horizontal stroke is added;
  • FIG. 4C is a pictorial diagram illustrating the Stroke Recognition Interface when a second horizontal stroke is added;
  • FIG. 4D is a pictorial diagram illustrating the Stroke Recognition Interface when a third horizontal stroke is added;
  • FIG. 4E is a pictorial diagram illustrating the Stroke Recognition Interface when a desired character appears to be the first character in the Selection List;
  • FIG. 4F is a pictorial diagram illustrating the Stroke Recognition Interface when the first character in the selection list is selected;
  • FIG. 4G is a pictorial diagram illustrating the Stroke Recognition Interface when a desired character is not the first character in the selection list;
  • FIG. 4H is a pictorial diagram illustrating the Stroke Recognition Interface when the desired character rather than the first character in the selection list is selected;
  • FIG. 4I is a pictorial diagram illustrating the Stroke Recognition Interface when the first desired character is selected and a stroke is added for another character;
  • FIG. 4J is a pictorial diagram illustrating the Stroke Recognition Interface when two strokes are added;
  • FIG. 4K is a pictorial diagram illustrating the Stroke Recognition Interface when third stroke is added;
  • FIG. 4L is a pictorial diagram illustrating the Stroke Recognition Interface where the desired character is indicated;
  • FIG. 4M is a pictorial diagram illustrating the Stroke Recognition Interface when the second desired character is selected;
  • FIG. 4N is a pictorial diagram illustrating the Stroke Recognition Interface where a third desired character appears in the most frequently used characters;
  • FIG. 4O is a pictorial diagram illustrating the Stroke Recognition Interface when a third desired character is selected without adding any stroke; and
  • FIG. 5 is a schematic diagram illustrating the input interface for touchscreen PDA according to the most preferred embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a schematic diagram illustrating an apparatus for inputting handwritten Chinese characters according to one preferred embodiment of this invention. The apparatus includes three basic components: a Stroke Recognition Interface 20 for recognizing entered stroke patterns, an Input Device 24 for entering strokes, and a Processor 30 for performing data processing tasks.
  • The Stroke Recognition Interface 20 has three basic areas: a Message Display Area 28, a Selection List Area 26, and a Stroke Input Area 22.
  • Message Display Area 28 is the place where the selected characters are displayed. It may represent an email or SMS message, or the content of whatever application intends to use the generated text.
  • Selection List Area 26 is the place to display the most common character choices for the strokes currently entered on the stroke input window. This area may also list common characters that follow the last character in the Message Display Area 28, that also begin with the strokes entered in the Stroke Input Area 22.
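  • The patent describes the selection list functionally rather than structurally. As an illustration only, the sketch below (in Python, with hypothetical characters, stroke sequences, and frequency weights) shows one way a prefix index over stroke-category sequences could back such a list, with candidates ordered by use frequency as the predetermined sequence.

```python
# Illustrative sketch only; the patent does not specify a data structure.
# Stroke categories follow the five numeric categories of FIG. 3; the sample
# characters, stroke sequences, and frequency weights below are hypothetical.
from collections import defaultdict

CHARACTER_DB = {
    "三": ("111", 900),    # three horizontal strokes
    "王": ("1121", 800),   # horizontal, horizontal, vertical, horizontal
    "丰": ("1112", 300),
}

def build_prefix_index(db):
    """Index characters by every prefix of their stroke-category sequence."""
    index = defaultdict(list)
    for char, (strokes, freq) in db.items():
        for i in range(len(strokes) + 1):      # the empty prefix covers "no strokes yet"
            index[strokes[:i]].append((char, freq))
    for prefix in index:                       # order candidates by use frequency
        index[prefix].sort(key=lambda cf: -cf[1])
    return index

INDEX = build_prefix_index(CHARACTER_DB)
print([c for c, _ in INDEX["11"]])             # candidates starting with two horizontal strokes
```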
  • Stroke Input Area 22 is the heart of the Stroke Recognition Interface 20. The user begins drawing a character onscreen in this area, using an Input Device 24 such as a stylus, a finger, or a mouse, depending on the input and display devices used. The display device echoes and retains each stroke (an "ink trail") until the character is selected.
  • Stroke Recognition Interface 20 may further include a Stroke Number Display Area to display the interface's interpretation, either numeric or iconic, of the strokes entered by the user. When a character is selected, the full stroke representation, either by numbers or by icons, is displayed here. This area is optional, but could be useful for helping users learn stroke orders and stroke categories.
  • The system may further include: the capability to match Latin letters, punctuation symbols, and emoticons with user-defined stroke sequences; user-defined gestures for predefined stroke categories, and unique gestures representing entire components, sequences, or symbols; learning of, and adapting to, the user's handwriting style, skew, or cursive; an optional training session with known characters; optional prompting of the user to clarify between ambiguous stroke interpretations, a means to enter explicit strokes (e.g. via stroke category keys), and/or a means to remedy a stroke misinterpretation; an optional indication of the level of confidence of stroke interpretations, e.g. color-coding each "ink trail" or a smiley face that frowns when the system is uncertain; a means to display all strokes that make up a character (e.g. drag and drop from the text editor to the Stroke [Number] Display Area); as well as the ability to delete the last stroke(s), and their ink trail(s), in reverse order by some means.
  • FIG. 2 is a flow diagram illustrating a method for inputting handwritten Chinese characters in a predictive manner according to the preferred embodiment of the invention. The method includes the following steps:
    • Step 50: Adding a stroke into the Stroke Input Area 22;
    • Step 52: Categorizing the added stroke into a stroke category;
    • Step 54: Finding characters based on frequency of character use;
    • Step 56: Displaying a list of found characters. The list of characters is displayed in a predetermined sequence. The predetermined sequence may be based on (1) the number of strokes necessary to write out a Chinese character; (2) the use frequency of a Chinese character; or (3) contextual relation to the prior character entered;
    • Step 58: Checking whether the desired character is in the list;
    • Step 60: If the desired character is not in the list, adding the next stroke in the Stroke Input Area 22;
    • Step 70: If the desired character is in the list, selecting it by clicking a mouse or tapping a stylus or finger, depending on the input and display devices used;
    • Step 72: Putting the selected character in the Message Display Area 28;
    • Step 74: Checking whether the message is complete;
    • Step 76: Adding the next stroke if the message is not complete;
    • Step 62 (continued from Step 60 or Step 76): Finding the most common characters that appear after a previously selected character based on the present stroke sequence. [This also happens before the first stroke, i.e. before Step 50]; and
    • Step 80: Displaying a list of found characters; the process then continues at Step 58.
  • The apparatus may have a function to actively display the interface's interpretation, either numeric or iconic, of the strokes entered by the user. Therefore, the method described above may further comprise the steps of:
      • Displaying a numeric representation for a stroke that is added;
      • Displaying full stroke numeric representation for a character that is selected;
      • Displaying an iconic representation for a stroke that is added; and
      • Displaying full stroke iconic representation for a character that is selected.
  • As an alternative, Step 54 may be replaced by:
      • Finding characters that commonly start with one or more recognized stroke patterns.
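  • A compressed, purely illustrative sketch of the FIG. 2 flow (Steps 50 through 80) follows. The prefix index mirrors the hypothetical one sketched earlier, display and selection are reduced to console prompts, and categorize_stroke() is a stub here (a heuristic version is sketched after FIG. 3 below); none of these names or signatures comes from the patent.

```python
# Compressed, hypothetical rendering of the FIG. 2 flow (Steps 50-80).
def categorize_stroke(ink):
    return "1"                                    # stub; a heuristic classifier is sketched below

def input_character(index):
    """index maps stroke-category prefixes to (character, frequency) lists."""
    strokes = ""                                  # stroke categories entered so far
    while True:
        candidates = index.get(strokes, [])       # Steps 54/62: find matching characters
        print("Selection list:", [c for c, _ in candidates[:10]])   # Steps 56/80: display
        choice = input("Tap a character, or press Enter to draw another stroke: ")
        if choice:                                # Steps 58/70: desired character is shown
            return choice                         # Step 72: caller adds it to the message
        ink = input("Draw the next stroke: ")     # Steps 50/60/76: add another stroke
        strokes += categorize_stroke(ink)         # Step 52: map the stroke to a category
```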
  • FIG. 3 is a diagram showing five basic strokes and their numeric representation. There is a government standard of five stroke categories for simplified Chinese characters. There are other classifications of the stroke categories. The method and system according to this invention apply to any such classification.
  • One of the major advantages of the recognition system according to this invention is the great reduction of ambiguities arising from the subtle distinctions between certain subtypes of the stroke categories. To reduce ambiguities, the subtypes are further defined. For example, a horizontal line with a slight hook upwards is stroke 1; a horizontal line with a slight hook down is stroke 5; a horizontal line angled upwards is stroke 1; a curved line that starts out diagonally and then evens out to horizontal or curves upward is stroke 4; and so on.
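  • The patent defines the categories and subtypes but gives no classification algorithm. The heuristic below, based on the net direction of the drawn points plus a crude hook test, is purely an assumed illustration of how an entered stroke might be mapped to one of the five numeric categories of FIG. 3; the thresholds are not from the patent.

```python
import math

# Assumed heuristic only; thresholds and the hook test are illustrative, not from the patent.
def categorize_stroke(points):
    """points: list of (x, y) screen coordinates, with y increasing downward."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))   # 0 = rightward, 90 = downward
    if ends_with_hook(points):
        return "5"                     # hooks and turning strokes
    if -35 <= angle <= 20:
        return "1"                     # horizontal, including a slight upward slant
    if 70 <= angle <= 110:
        return "2"                     # vertical
    if 110 < angle <= 180:
        return "3"                     # left-falling (pie)
    return "4"                         # dot or right-falling (dian, na)

def ends_with_hook(points, tail=4):
    """Crude test: does the stroke direction reverse sharply near its end?"""
    if len(points) <= tail:
        return False
    main = math.atan2(points[-tail][1] - points[0][1], points[-tail][0] - points[0][0])
    end = math.atan2(points[-1][1] - points[-tail][1], points[-1][0] - points[-tail][0])
    diff = abs(math.degrees(end - main)) % 360
    return min(diff, 360 - diff) > 90              # treat a sharp reversal as a hook
```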
  • One technique for resolving, or at least limiting, ambiguities is the use of limited wildcards. These are stroke keys that match any stroke that fits one type of ambiguity. For example, if a stroke may fit into either stroke category 4 or stroke category 5, the limited wildcard would match both 4 and 5.
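  • For illustration, the sketch below assumes each entered stroke is represented as a set of plausible categories (a limited wildcard when the classifier cannot decide, e.g. between 4 and 5) and filters a small hypothetical character table by that ambiguous prefix; the representation, sample characters, and frequencies are assumptions.

```python
# Hypothetical sample data; stroke sequences use the numeric categories of FIG. 3.
SAMPLE_DB = {
    "主": ("41121", 700),   # dot, horizontal, horizontal, vertical, horizontal
    "义": ("434", 500),
    "三": ("111", 900),
}

def matches_prefix(entered, stroke_sequence):
    """entered: list of sets of candidate categories for each stroke drawn so far."""
    if len(entered) > len(stroke_sequence):
        return False
    return all(cat in allowed for allowed, cat in zip(entered, stroke_sequence))

def find_candidates(entered, db, limit=10):
    hits = [(char, freq) for char, (seq, freq) in db.items() if matches_prefix(entered, seq)]
    return [char for char, _ in sorted(hits, key=lambda cf: -cf[1])[:limit]]

# First stroke ambiguous between categories 4 and 5, second stroke clearly category 1:
print(find_candidates([{"4", "5"}, {"1"}], SAMPLE_DB))   # -> ['主']
```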
  • Often the difference between a stroke of one type and a similar stroke of another type is too subtle for a computer to differentiate. This gets even more confusing when the user is sloppy and curves his straight strokes, straightens his curved strokes, or gets the angle slightly off.
  • To account for all of the variation of an individual user, the system may learn the specific idiosyncrasies of its one user, and adapt to fit that person's handwriting style.
  • The specifics of the exaggeration needed may be determined as appropriate. Key to this aspect of the invention is that the user has to make diagonal strokes very diagonal, straight strokes very straight, curved strokes very curved, and angled strokes very angled.
  • The result on paper is a character that would look somewhat artificial and a caricature of its intended character. However, this greatly simplifies the disambiguation process for finding the strokes, which then helps the disambiguation of characters.
  • The operation process is described in the following paragraphs in conjunction with a series of pictorial diagrams.
  • FIG. 4A illustrates an overview of the Stroke Recognition Interface before any stroke is added. Note that the Character Selection List shows the first ten most frequently used characters. If a user's first desired character is in the list, he just selects the character by clicking the mouse or by tapping a stylus or his finger, without need to add a stroke. If the desired character is not in the list, the user adds a stroke using mouse, stylus, or finger.
  • FIG. 4B illustrates the Stroke Recognition Interface when a first single horizontal stroke is added. The stroke category is determined to be “1”, and is listed in the Stroke Number Area. The Selection List is re-ordered to predict the most likely character to be chosen based on the first stroke.
  • FIG. 4C illustrates the Stroke Recognition Interface when a second horizontal stroke is added. After a second horizontal line is entered, the selection list is re-ordered again, showing only the most likely characters that start with two horizontal lines (stroke category 1). Note that the position and relative lengths of the strokes do not affect the selection list, only the stroke categories.
  • FIG. 4D illustrates the Stroke Recognition Interface when a third horizontal stroke is added. After a third horizontal line is entered, the selection list is re-ordered again, showing only the most likely characters that start with three horizontal lines (stroke category 1).
  • FIG. 4E illustrates the Stroke Recognition Interface when a desired character appears to be the first character in the Selection List. Note that the character drawn so far is identical to the first character listed in the selection list. If this were the character desired, simply click that character from the list.
  • FIG. 4F illustrates the Stroke Recognition Interface when the first character in the selection list is selected. If the user chooses the first character, it is added to the message; at the same time, the stroke numbers are displayed at the bottom, and the input area is cleared, ready for the next character. Note that to select a character, the user has to make one more mouse click (or stylus or finger tap) than there are strokes entered. Novice users may find this annoying until they get used to the system and learn to take advantage of its predictive features.
  • FIG. 4G illustrates the Stroke Recognition Interface when a desired character is not the first character in the selection list. The strength of this system is its predictive ability. If the user desires the very complex, but somewhat common, character pointed to in the illustration, he need not complete the strokes for that character. As soon as it is displayed in the selection list, it can be selected by clicking the mouse (or tapping a stylus or finger) on the character.
  • FIG. 4H illustrates the Stroke Recognition Interface when the desired character rather than the first character in the selection list is selected. Once the complex character is selected, we see that it is a 15-stroke character, added to the message with only three strokes and one additional click. The user gets a 15-stroke character using four movements. The saving of movement and hence time is about four to one. Additionally, the entire stroke order is displayed now, so if the user was used to an alternate stroke order for the character, he can learn the Government Standard stroke order used by this system.
  • FIG. 4I illustrates the Stroke Recognition Interface when the first desired character is selected and a stroke is added for another character. Once the character is entered, the program is ready to accept the strokes for another character. Here the initial stroke is a different category, to enter in a very different character. Notice that the selection list is very different than it was with the first stroke of the previous character.
  • FIG. 4J illustrates the Stroke Recognition Interface when two strokes are added. Note that the strokes entered already form a character that matches the most likely choice in the selection list. The character that we are aiming for in this example is already displayed (see the fifth character from the left) after the second stroke is added. But we want to continue to demonstrate the disambiguation feature of the system.
  • FIG. 4K illustrates the Stroke Recognition Interface when the third stroke is added. After a third stroke is entered, the selection list contains two characters that are only slightly different from each other. In fact, these two characters have exactly the same stroke order, and choosing from the selection list is the only way to disambiguate the two characters. Note that the second character, the one being pointed to, is less commonly used than the first and is also slightly more complex.
  • FIG. 4L illustrates the Stroke Recognition Interface where the desired character is indicated. Note that the desired character first became visible after the second stroke was entered and is still a likely choice in the selection list (see the fourth character from the left). If a desired character disappears from the selection list for some reason, it is an indication that the stroke order entered by the user does not match the Government Standard stroke order used in the system.
  • FIG. 4M illustrates the Stroke Recognition Interface when the second desired character is selected. The character is selected, and added to the message. It is a 9-stroke character. We selected it at three strokes, but could have selected it at two strokes.
  • FIG. 4N illustrates the Stroke Recognition Interface where a third desired character appears in the most frequently used characters. For very common characters, there is no need to enter any strokes. The ten most frequently used characters are displayed even when no strokes are entered. If the user wants to enter one of these common characters, simply selecting it will add it to the message. Note that the selection list of the most frequently used characters is context sensitive. The system displays the ten most frequent characters to follow the last character entered.
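  • A minimal sketch of how such a context-sensitive list might be produced, assuming (purely for illustration) a table of character-pair frequencies alongside global frequencies; the characters, counts, and function names are made up and do not appear in the patent.

```python
# Hypothetical frequency tables; a real system would derive these from a corpus.
GLOBAL_FREQ = {"的": 1000, "一": 900, "是": 850, "不": 800, "了": 780}
BIGRAM_FREQ = {("中", "国"): 500, ("中", "文"): 300, ("中", "心"): 250}

def top_candidates(prev_char, n=10):
    """Context matches (characters that follow prev_char) rank ahead of global fallbacks."""
    context = sorted(
        (c for (p, c) in BIGRAM_FREQ if p == prev_char),
        key=lambda c: -BIGRAM_FREQ[(prev_char, c)],
    )
    fallback = sorted(GLOBAL_FREQ, key=lambda c: -GLOBAL_FREQ[c])
    merged = context + [c for c in fallback if c not in context]
    return merged[:n]

print(top_candidates("中"))   # ['国', '文', '心', '的', '一', '是', '不', '了']
```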
  • FIG. 4O illustrates the Stroke Recognition Interface when a third desired character is selected without adding any stroke. This is a saving of seven to one for the third character.
  • FIG. 5 illustrates a recommended layout of the input interface according to the most preferred embodiment; the message area is omitted because the text goes directly into the active application.
  • In a typical embodiment, the stroke entry means is a handwriting input area displayed on the touchscreen of a PDA. Each entered stroke is recognized as one of a set of stroke categories. Graphical keys, each assigned to a stroke category, are optionally available as an alternative input means to display and enter strokes. One of the graphical keys represents "match any stroke category".
  • The method described above may be carried out by a computer usable medium containing instructions in computer readable form. In other words, the method may be incorporated in a computer program, a logic device, mobile device, or firmware and/or may be downloaded from a network, e.g. a Web site over the Internet. It may be applied in all sorts of text entry.
  • Although the invention is described herein with reference to some preferred embodiments, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the present invention.
  • Accordingly, the invention should only be limited by the claims included below.

Claims (18)

1. A Chinese character handwriting input system, comprising:
recognition means for recognizing one or more categories of handwriting stroke from a predefined number of stroke categories;
collection means for organizing a list of characters that commonly start with said one or more recognized categories of handwriting stroke, said list of characters being displayed in a predefined sequence, wherein said predefined sequence is based on any of:
number of strokes necessary to write out a character;
use frequency of a character; and
contextual relation to the last character entered; and
selection means for selecting a desired character from said list of characters.
2. The system of claim 1, further comprising:
wildcard entry means for matching any stroke category.
3. A Chinese character handwriting input system, comprising:
recognition means for recognizing a category of handwriting stroke from a predefined number of stroke categories; and
collection means for organizing a list of characters that commonly start with one or more recognized categories of handwriting stroke, said list of characters being displayed in a predefined sequence, wherein said predefined sequence is based on any of:
number of strokes necessary to write out a character;
use frequency of a character; and
contextual relation to the last character entered;
selection means for selecting a desired character from said list of characters;
wherein said predefined number of stroke categories comprises more than five basic categories.
4. A method for inputting handwritten Chinese characters, comprising the steps of:
adding a stroke into a pattern recognition system;
categorizing said added stroke into one of a predefined number of stroke categories;
finding characters based on frequency of character use; and
displaying a list of found characters.
5. The method of claim 4, further comprising the steps of:
if a desired character is in said list, selecting said desired character from said list;
if a desired character is not visible in said list, adding another stroke; and
displaying another list of found characters.
6. The method of claim 4, further comprising the steps of:
displaying a numeric or iconic representation for a stroke that is added; and
displaying full stroke numeric or iconic representation for a character that is selected.
7. The method of claim 4, further comprising the steps of:
if a desired character is in said list, either selecting said desired character from said list, or adding another stroke and displaying another list of found characters.
8. The method of claim 4, further comprising the step of:
retaining an ink trail of each stroke that is added until a character is selected.
9. The method of claim 8, further comprising the step of:
color coding each ink trail to indicate either a level of confidence or a differentiation in said categorizing step.
10. The method of claim 4, further comprising the step of:
prompting a user to clarify between ambiguous stroke interpretations and/or to remedy a stroke's misinterpretation.
11. The method of claim 4, further comprising the step of:
providing means for removing one or more strokes of an input stroke sequence in reverse order.
12. The method of claim 4, further comprising the step of:
providing means for matching any of Latin letters, punctuation symbols, and emoticons with predefined or user-defined stroke sequences.
13. The method of claim 4, further comprising the step of:
selecting a character from said list with a user gesture;
wherein said user gesture allows said user to begin entry of strokes for a next character.
14. The method of claim 4, further comprising the step of:
providing user-defined gestures for any of stroke categories, sequences of strokes, and character components.
15. The method of claim 4, further comprising the step of:
providing means for explicit selection of stroke categories.
16. The method of claim 4, further comprising the step of:
displaying character components that start with one or more recognized stroke categories;
wherein selecting a character component results in the display of only the characters containing or starting with said selected component.
17. The method of claim 4, further comprising the step of:
allowing alternative stroke sequences for character or character component entry.
18. The method of claim 4, said step of finding characters based on frequency of use further comprising the step of:
finding said characters based on context.
US11/262,214 2002-07-25 2005-10-27 Chinese character handwriting recognition system Abandoned US20060062461A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/262,214 US20060062461A1 (en) 2002-07-25 2005-10-27 Chinese character handwriting recognition system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/205,950 US6970599B2 (en) 2002-07-25 2002-07-25 Chinese character handwriting recognition system
US11/262,214 US20060062461A1 (en) 2002-07-25 2005-10-27 Chinese character handwriting recognition system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/205,950 Continuation US6970599B2 (en) 1999-05-27 2002-07-25 Chinese character handwriting recognition system

Publications (1)

Publication Number Publication Date
US20060062461A1 true US20060062461A1 (en) 2006-03-23

Family

ID=30770184

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/205,950 Expired - Fee Related US6970599B2 (en) 1999-05-27 2002-07-25 Chinese character handwriting recognition system
US11/262,214 Abandoned US20060062461A1 (en) 2002-07-25 2005-10-27 Chinese character handwriting recognition system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/205,950 Expired - Fee Related US6970599B2 (en) 1999-05-27 2002-07-25 Chinese character handwriting recognition system

Country Status (5)

Country Link
US (2) US6970599B2 (en)
CN (1) CN100550036C (en)
AU (1) AU2003252091A1 (en)
HK (1) HK1082310A1 (en)
WO (1) WO2004012135A1 (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040083198A1 (en) * 2002-07-18 2004-04-29 Bradford Ethan R. Dynamic database reordering system
US20050052406A1 (en) * 2003-04-09 2005-03-10 James Stephanick Selective input system based on tracking of motion parameters of an input device
US20060072824A1 (en) * 2003-09-16 2006-04-06 Van Meurs Pim System and method for Chinese input using a joystick
US20070218430A1 (en) * 2005-11-03 2007-09-20 Tamkang University Calligraphy practicing system
US20080015841A1 (en) * 2000-05-26 2008-01-17 Longe Michael R Directional Input System with Automatic Correction
US20080183472A1 (en) * 2002-03-15 2008-07-31 International Business Machine Corporation Speech recognition system and program thereof
US20090003703A1 (en) * 2007-06-26 2009-01-01 Microsoft Corporation Unifield digital ink recognition
US20090002392A1 (en) * 2007-06-26 2009-01-01 Microsoft Corporation Integrated platform for user input of digital ink
US20090060338A1 (en) * 2007-09-04 2009-03-05 Por-Sen Jaw Method of indexing Chinese characters
US20090213134A1 (en) * 2003-04-09 2009-08-27 James Stephanick Touch screen and graphical user interface
US20090284471A1 (en) * 1999-05-27 2009-11-19 Tegic Communications, Inc. Virtual Keyboard System with Automatic Correction
US20090295737A1 (en) * 2008-05-30 2009-12-03 Deborah Eileen Goldsmith Identification of candidate characters for text input
US20100141597A1 (en) * 2008-12-05 2010-06-10 Nhn Corporation Method, device and computer readable recording medium for preventing input error when information is inputted through touch screen
US7880730B2 (en) 1999-05-27 2011-02-01 Tegic Communications, Inc. Keyboard system with automatic correction
US20110169726A1 (en) * 2010-01-08 2011-07-14 Microsoft Corporation Evolving universal gesture sets
US20110193797A1 (en) * 2007-02-01 2011-08-11 Erland Unruh Spell-check for a keyboard system with automatic correction
US20110310118A1 (en) * 2010-06-22 2011-12-22 Microsoft Corporation Ink Lag Compensation Techniques
US8094939B2 (en) 2007-06-26 2012-01-10 Microsoft Corporation Digital ink-based search
US20120079373A1 (en) * 2007-01-05 2012-03-29 Kenneth Kocienda Method, System, and Graphical User Interface for Providing Word Recommendations
US8201087B2 (en) 2007-02-01 2012-06-12 Tegic Communications, Inc. Spell-check for a keyboard system with automatic correction
US20120242516A1 (en) * 2009-12-02 2012-09-27 Tencent Technology (Shenzhen) Company Limited Wubi input system and method
CN102880400A (en) * 2011-07-13 2013-01-16 阿尔派株式会社 Handwritten character input device and handwritten character input method
US20130212511A1 (en) * 2012-02-09 2013-08-15 Samsung Electronics Co., Ltd. Apparatus and method for guiding handwriting input for handwriting recognition
US8560974B1 (en) * 2011-10-06 2013-10-15 Google Inc. Input method application for a touch-sensitive user interface
JP6054547B2 (en) * 2013-12-09 2016-12-27 株式会社東芝 Electronic device and method for processing handwritten document information
US20170139898A1 (en) * 2015-11-16 2017-05-18 Lenovo (Singapore) Pte, Ltd. Updating hint list based on number of strokes
CN108089727A (en) * 2016-06-12 2018-05-29 苹果公司 Handwriting keyboard for screens
CN108256448A (en) * 2017-12-29 2018-07-06 上海义启信息科技有限公司 Chinese character handwriting recognition method
CN108319896A (en) * 2017-12-29 2018-07-24 上海义启信息科技有限公司 Method for recognizing handwritten Chinese characters
US10346035B2 (en) 2013-06-09 2019-07-09 Apple Inc. Managing real-time handwriting recognition
US11079933B2 (en) 2008-01-09 2021-08-03 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces

Families Citing this family (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6970599B2 (en) * 2002-07-25 2005-11-29 America Online, Inc. Chinese character handwriting recognition system
JP4177335B2 (en) * 2003-05-02 2008-11-05 富士通株式会社 Handwritten character input device and handwritten character input processing method
US7634720B2 (en) * 2003-10-24 2009-12-15 Microsoft Corporation System and method for providing context to an input method
US7848573B2 (en) * 2003-12-03 2010-12-07 Microsoft Corporation Scaled text replacement of ink
WO2005096217A1 (en) * 2004-04-02 2005-10-13 Nokia Corporation Apparatus and method for handwriting recognition
US8411958B2 (en) 2004-05-04 2013-04-02 Nokia Corporation Apparatus and method for handwriting recognition
US20050264584A1 (en) * 2004-05-27 2005-12-01 Zhu-Min Di [Method for fast input of Chinese characters]
US20050276480A1 (en) * 2004-06-10 2005-12-15 Microsoft Corporation Handwritten input for Asian languages
CN100373401C (en) * 2005-04-28 2008-03-05 西门子(中国)有限公司 Chinese character handwriting inputting method based on stroke sequence
EP1717672A1 (en) * 2005-04-29 2006-11-02 Ford Global Technologies, LLC Method for providing feedback to a user of an appliance system in a vehicle
EP1717671A1 (en) * 2005-04-29 2006-11-02 Ford Global Technologies, LLC Method for an appliance system of a vehicle
US8374846B2 (en) * 2005-05-18 2013-02-12 Neuer Wall Treuhand GmbH Text input device and method
US20090193334A1 (en) * 2005-05-18 2009-07-30 ExB Asset Management GmbH Predictive text input system and method involving two concurrent ranking means
US8117540B2 (en) * 2005-05-18 2012-02-14 Neuer Wall Treuhand GmbH Method and device incorporating improved text input mechanism
US9606634B2 (en) * 2005-05-18 2017-03-28 Nokia Technologies Oy Device incorporating improved text input mechanism
US8036878B2 (en) 2005-05-18 2011-10-11 Neuer Wall Treuhand GmbH Device incorporating improved text input mechanism
US8413069B2 (en) * 2005-06-28 2013-04-02 Avaya Inc. Method and apparatus for the automatic completion of composite characters
KR101418128B1 (en) * 2005-10-15 2014-07-09 노키아 코포레이션 Improved text entry into electronic devices
KR20070052118A (en) * 2005-11-16 2007-05-21 한국전자통신연구원 A letter inputting system and method using analog joystick controller
EP1895466A1 (en) * 2006-08-30 2008-03-05 BRITISH TELECOMMUNICATIONS public limited company Providing an image for display
US20080210474A1 (en) * 2006-10-31 2008-09-04 Volkswagen Of America, Inc. Motor vehicle having a touch screen
CN1996219B (en) * 2006-12-15 2010-05-19 许双俊 Quick handwriting input method for small-sized electronic device
US7809719B2 (en) * 2007-02-08 2010-10-05 Microsoft Corporation Predicting textual candidates
US7912700B2 (en) * 2007-02-08 2011-03-22 Microsoft Corporation Context based word prediction
US8341556B2 (en) * 2007-04-30 2012-12-25 Hewlett-Packard Development Company, L.P. Method and system for attention-free user input on a computing device
KR100930802B1 (en) * 2007-06-29 2009-12-09 엔에이치엔(주) Browser control method and system using images
DE102007052622A1 (en) * 2007-11-05 2009-05-07 T-Mobile International Ag Method for image analysis, in particular for a mobile radio device
US8010465B2 (en) * 2008-02-26 2011-08-30 Microsoft Corporation Predicting candidates using input scopes
EP2133772B1 (en) * 2008-06-11 2011-03-09 ExB Asset Management GmbH Device and method incorporating an improved text input mechanism
EP2194443A1 (en) * 2008-12-04 2010-06-09 Research In Motion Limited Stroke based input system for character input
US8648796B2 (en) * 2008-12-04 2014-02-11 Blackberry Limited Stroke based input system for character input
US8977779B2 (en) * 2009-03-31 2015-03-10 Mytalk Llc Augmentative and alternative communication system with personalized user interface and content
TWI411937B (en) * 2009-05-07 2013-10-11 Inventec Appliances Corp Man-machine input system and text editor input method
US8896470B2 (en) * 2009-07-10 2014-11-25 Blackberry Limited System and method for disambiguation of stroke input
TWI412955B (en) * 2009-08-19 2013-10-21 Inventec Appliances Corp Method of prompting stroke order for Chinese character, electronic device, and computer program product thereof
CN102043568A (en) * 2009-10-16 2011-05-04 孙振峰 Control device and method for auxiliary input in handwriting input device
CN101930474A (en) * 2010-09-14 2010-12-29 闫卫 Chinese character simple stroke search method
EP2450773A1 (en) * 2010-10-20 2012-05-09 Research In Motion Limited Character input method
US8810581B2 (en) 2010-10-20 2014-08-19 Blackberry Limited Character input method
CN102156608B (en) * 2010-12-10 2013-07-24 上海合合信息科技发展有限公司 Handwriting input method for writing characters continuously
US8094941B1 (en) * 2011-06-13 2012-01-10 Google Inc. Character recognition for overlapping textual user input
CN103034426B (en) * 2011-09-28 2016-07-06 腾讯科技(深圳)有限公司 Terminal and contact search method therefor
CN102880412A (en) * 2012-08-23 2013-01-16 东莞宇龙通信科技有限公司 Handwriting input method, system and device
EP2711805A1 (en) * 2012-09-25 2014-03-26 Advanced Digital Broadcast S.A. Method for handling a gesture-based user interface
JP5832980B2 (en) * 2012-09-25 2015-12-16 株式会社東芝 Handwriting input support device, method and program
GB2507777A (en) * 2012-11-09 2014-05-14 David Rawcliffe Conversion of combinations of gestures into character input, using restricted gesture set
WO2014200736A1 (en) * 2013-06-09 2014-12-18 Apple Inc. Managing real - time handwriting recognition
US9495620B2 (en) 2013-06-09 2016-11-15 Apple Inc. Multi-script handwriting recognition using a universal recognizer
US20140361983A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Real-time stroke-order and stroke-direction independent handwriting recognition
US10725650B2 (en) * 2014-03-17 2020-07-28 Kabushiki Kaisha Kawai Gakki Seisakusho Handwritten music sign recognition device and program
CN107077206A (en) * 2014-09-30 2017-08-18 皇家飞利浦有限公司 User interface system based on sensing equipment
CN106372563A (en) * 2015-07-22 2017-02-01 深圳市新方码电脑科技有限公司 Chinese character input method and apparatus based on handwritten radical recognition
US10289664B2 (en) * 2015-11-12 2019-05-14 Lenovo (Singapore) Pte. Ltd. Text input method for completing a phrase by inputting a first stroke of each logogram in a plurality of logograms
US10146759B2 (en) * 2016-03-24 2018-12-04 Microsoft Technology Licensing, Llc Controlling digital input
US10579893B2 (en) * 2017-02-28 2020-03-03 Konica Minolta Laboratory U.S.A., Inc. Inferring stroke information from an image
CN107368205B (en) * 2017-07-26 2020-04-07 维沃移动通信有限公司 Handwriting input method and mobile terminal
WO2019022567A2 (en) * 2017-07-27 2019-01-31 Samsung Electronics Co., Ltd. Method for automatically providing gesture-based auto-complete suggestions and electronic device thereof
CN110111648A (en) * 2019-04-17 2019-08-09 吉林大学珠海学院 Programming training system and method
CN111091036B (en) * 2019-07-17 2023-09-26 广东小天才科技有限公司 Dictation content identification method and electronic equipment
US11514696B2 (en) * 2019-12-17 2022-11-29 Ricoh Company, Ltd. Display device, display method, and computer-readable recording medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4573196A (en) * 1983-01-19 1986-02-25 Communications Intelligence Corporation Confusion grouping of strokes in pattern recognition method and system
US5586198A (en) * 1993-08-24 1996-12-17 Lakritz; David Method and apparatus for identifying characters in ideographic alphabet
US5754686A (en) * 1994-02-10 1998-05-19 Canon Kabushiki Kaisha Method of registering a character pattern into a user dictionary and a character recognition apparatus having the user dictionary
US5812697A (en) * 1994-06-10 1998-09-22 Nippon Steel Corporation Method and apparatus for recognizing hand-written characters using a weighting dictionary
US5812696A (en) * 1992-06-25 1998-09-22 Canon Kabushiki Kaisha Character recognizing method and apparatus
US5870492A (en) * 1992-06-04 1999-02-09 Wacom Co., Ltd. Hand-written character entry apparatus
US5926566A (en) * 1996-11-15 1999-07-20 Synaptics, Inc. Incremental ideographic character input method
US6453079B1 (en) * 1997-07-25 2002-09-17 Claritech Corporation Method and apparatus for displaying regions in a document image having a low recognition confidence
US20020168107A1 (en) * 1998-04-16 2002-11-14 International Business Machines Corporation Method and apparatus for recognizing handwritten Chinese characters
US6686907B2 (en) * 2000-12-21 2004-02-03 International Business Machines Corporation Method and apparatus for inputting Chinese characters
US6801659B1 (en) * 1999-01-04 2004-10-05 Zi Technology Corporation Ltd. Text input system for ideographic and nonideographic languages
US6970599B2 (en) * 2002-07-25 2005-11-29 America Online, Inc. Chinese character handwriting recognition system
US7088861B2 (en) * 2003-09-16 2006-08-08 America Online, Inc. System and method for Chinese input using a joystick

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4286329A (en) * 1979-12-17 1981-08-25 International Business Machines Corporation Complex character generator
US6002799A (en) * 1986-07-25 1999-12-14 Ast Research, Inc. Handwritten keyboardless entry computer system
US5187480A (en) * 1988-09-05 1993-02-16 Allan Garnham Symbol definition apparatus
US5224179A (en) * 1988-12-20 1993-06-29 At&T Bell Laboratories Image skeletonization method
JP3143461B2 (en) * 1990-05-29 2001-03-07 キヤノン株式会社 Character recognition method and character recognition device
JP3155577B2 (en) * 1991-10-16 2001-04-09 キヤノン株式会社 Character recognition method and device
US5973676A (en) * 1993-06-30 1999-10-26 Kabushiki Kaisha Toshiba Input apparatus suitable for portable electronic device
EP0769175B9 (en) * 1994-07-01 2005-01-12 Palm Computing, Inc. Multiple pen stroke character set and handwriting recognition system
AU690781B2 (en) 1994-11-14 1998-04-30 Motorola, Inc. Method of splitting handwritten input
JP2845149B2 (en) * 1994-12-28 1999-01-13 日本電気株式会社 Handwritten character input device and handwritten character input method
US6041137A (en) * 1995-08-25 2000-03-21 Microsoft Corporation Radical definition and dictionary creation for a handwriting recognition system
US6278445B1 (en) * 1995-08-31 2001-08-21 Canon Kabushiki Kaisha Coordinate input device and method having first and second sampling devices which sample input data at staggered intervals
US5796867A (en) * 1996-06-12 1998-08-18 Industrial Technology Research Institute Stroke-number-free and stroke-order-free on-line Chinese character recognition method
CN1100300C (en) * 1996-10-16 2003-01-29 夏普公司 Character input apparatus and storage medium in which character input program is stored
US6275611B1 (en) * 1996-10-17 2001-08-14 Motorola, Inc. Handwriting recognition device, method and alphabet, with strokes grouped into stroke sub-structures
JP4098880B2 (en) * 1997-06-06 2008-06-11 松下電器産業株式会社 Information retrieval device
US6144764A (en) * 1997-07-02 2000-11-07 Mitsui High-Tec, Inc. Method and apparatus for on-line handwritten input character recognition and recording medium for executing the method
JP3481136B2 (en) 1998-05-29 2003-12-22 シャープ株式会社 Character font generation method and apparatus therefor, and computer-readable recording medium recording character font generation program
US6075469A (en) * 1998-08-11 2000-06-13 Pong; Gim Yee Three stroke Chinese character word processing techniques and apparatus
US6172625B1 (en) * 1999-07-06 2001-01-09 Motorola, Inc. Disambiguation method and apparatus, and dictionary data compression techniques
FI112978B (en) 1999-09-17 2004-02-13 Nokia Corp Entering Symbols
US7949513B2 (en) * 2002-01-22 2011-05-24 Zi Corporation Of Canada, Inc. Language module and method for use with text processing devices

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4573196A (en) * 1983-01-19 1986-02-25 Communications Intelligence Corporation Confusion grouping of strokes in pattern recognition method and system
US5870492A (en) * 1992-06-04 1999-02-09 Wacom Co., Ltd. Hand-written character entry apparatus
US5812696A (en) * 1992-06-25 1998-09-22 Canon Kabushiki Kaisha Character recognizing method and apparatus
US5586198A (en) * 1993-08-24 1996-12-17 Lakritz; David Method and apparatus for identifying characters in ideographic alphabet
US5754686A (en) * 1994-02-10 1998-05-19 Canon Kabushiki Kaisha Method of registering a character pattern into a user dictionary and a character recognition apparatus having the user dictionary
US5812697A (en) * 1994-06-10 1998-09-22 Nippon Steel Corporation Method and apparatus for recognizing hand-written characters using a weighting dictionary
US5926566A (en) * 1996-11-15 1999-07-20 Synaptics, Inc. Incremental ideographic character input method
US6453079B1 (en) * 1997-07-25 2002-09-17 Claritech Corporation Method and apparatus for displaying regions in a document image having a low recognition confidence
US20020168107A1 (en) * 1998-04-16 2002-11-14 International Business Machines Corporation Method and apparatus for recognizing handwritten Chinese characters
US6801659B1 (en) * 1999-01-04 2004-10-05 Zi Technology Corporation Ltd. Text input system for ideographic and nonideographic languages
US6956968B1 (en) * 1999-01-04 2005-10-18 Zi Technology Corporation, Ltd. Database engines for processing ideographic characters and methods therefor
US6686907B2 (en) * 2000-12-21 2004-02-03 International Business Machines Corporation Method and apparatus for inputting Chinese characters
US6970599B2 (en) * 2002-07-25 2005-11-29 America Online, Inc. Chinese character handwriting recognition system
US7088861B2 (en) * 2003-09-16 2006-08-08 America Online, Inc. System and method for Chinese input using a joystick

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7880730B2 (en) 1999-05-27 2011-02-01 Tegic Communications, Inc. Keyboard system with automatic correction
US8576167B2 (en) 1999-05-27 2013-11-05 Tegic Communications, Inc. Directional input system with automatic correction
US9557916B2 (en) 1999-05-27 2017-01-31 Nuance Communications, Inc. Keyboard system with automatic correction
US8466896B2 (en) 1999-05-27 2013-06-18 Tegic Communications, Inc. System and apparatus for selectable input with a touch screen
US9400782B2 (en) 1999-05-27 2016-07-26 Nuance Communications, Inc. Virtual keyboard system with automatic correction
US20100277416A1 (en) * 1999-05-27 2010-11-04 Tegic Communications, Inc. Directional input system with automatic correction
US20090284471A1 (en) * 1999-05-27 2009-11-19 Tegic Communications, Inc. Virtual Keyboard System with Automatic Correction
US8441454B2 (en) 1999-05-27 2013-05-14 Tegic Communications, Inc. Virtual keyboard system with automatic correction
US8294667B2 (en) 1999-05-27 2012-10-23 Tegic Communications, Inc. Directional input system with automatic correction
US8976115B2 (en) 2000-05-26 2015-03-10 Nuance Communications, Inc. Directional input system with automatic correction
US20080015841A1 (en) * 2000-05-26 2008-01-17 Longe Michael R Directional Input System with Automatic Correction
US7778818B2 (en) 2000-05-26 2010-08-17 Tegic Communications, Inc. Directional input system with automatic correction
US20080183472A1 (en) * 2002-03-15 2008-07-31 International Business Machines Corporation Speech recognition system and program thereof
US20040083198A1 (en) * 2002-07-18 2004-04-29 Bradford Ethan R. Dynamic database reordering system
US8237682B2 (en) 2003-04-09 2012-08-07 Tegic Communications, Inc. System and process for selectable input with a touch screen
US7821503B2 (en) 2003-04-09 2010-10-26 Tegic Communications, Inc. Touch screen and graphical user interface
US20050052406A1 (en) * 2003-04-09 2005-03-10 James Stephanick Selective input system based on tracking of motion parameters of an input device
US20090213134A1 (en) * 2003-04-09 2009-08-27 James Stephanick Touch screen and graphical user interface
US8237681B2 (en) 2003-04-09 2012-08-07 Tegic Communications, Inc. Selective input system and process based on tracking of motion parameters of an input object
US8456441B2 (en) 2003-04-09 2013-06-04 Tegic Communications, Inc. Selective input system and process based on tracking of motion parameters of an input object
US7750891B2 (en) 2003-04-09 2010-07-06 Tegic Communications, Inc. Selective input system based on tracking of motion parameters of an input device
US20060072824A1 (en) * 2003-09-16 2006-04-06 Van Meurs Pim System and method for Chinese input using a joystick
US7218781B2 (en) * 2003-09-16 2007-05-15 Tegic Communications, Inc. System and method for Chinese input using a joystick
US8570292B2 (en) 2003-12-22 2013-10-29 Tegic Communications, Inc. Virtual keyboard system with automatic correction
US20070218430A1 (en) * 2005-11-03 2007-09-20 Tamkang University Calligraphy practicing system
US11416141B2 (en) 2007-01-05 2022-08-16 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US20120079373A1 (en) * 2007-01-05 2012-03-29 Kenneth Kocienda Method, System, and Graphical User Interface for Providing Word Recommendations
US9189079B2 (en) * 2007-01-05 2015-11-17 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US9244536B2 (en) 2007-01-05 2016-01-26 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US10592100B2 (en) 2007-01-05 2020-03-17 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US11112968B2 (en) 2007-01-05 2021-09-07 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US20110193797A1 (en) * 2007-02-01 2011-08-11 Erland Unruh Spell-check for a keyboard system with automatic correction
US9092419B2 (en) 2007-02-01 2015-07-28 Nuance Communications, Inc. Spell-check for a keyboard system with automatic correction
US8225203B2 (en) 2007-02-01 2012-07-17 Nuance Communications, Inc. Spell-check for a keyboard system with automatic correction
US8201087B2 (en) 2007-02-01 2012-06-12 Tegic Communications, Inc. Spell-check for a keyboard system with automatic correction
US8892996B2 (en) 2007-02-01 2014-11-18 Nuance Communications, Inc. Spell-check for a keyboard system with automatic correction
US8315482B2 (en) 2007-06-26 2012-11-20 Microsoft Corporation Integrated platform for user input of digital ink
US20090003703A1 (en) * 2007-06-26 2009-01-01 Microsoft Corporation Unified digital ink recognition
US20090002392A1 (en) * 2007-06-26 2009-01-01 Microsoft Corporation Integrated platform for user input of digital ink
US8094939B2 (en) 2007-06-26 2012-01-10 Microsoft Corporation Digital ink-based search
US8041120B2 (en) 2007-06-26 2011-10-18 Microsoft Corporation Unified digital ink recognition
US20090060338A1 (en) * 2007-09-04 2009-03-05 Por-Sen Jaw Method of indexing Chinese characters
US11079933B2 (en) 2008-01-09 2021-08-03 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US11474695B2 (en) 2008-01-09 2022-10-18 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US10152225B2 (en) 2008-05-30 2018-12-11 Apple Inc. Identification of candidate characters for text input
US10871897B2 (en) 2008-05-30 2020-12-22 Apple Inc. Identification of candidate characters for text input
US20090295737A1 (en) * 2008-05-30 2009-12-03 Deborah Eileen Goldsmith Identification of candidate characters for text input
US9355090B2 (en) * 2008-05-30 2016-05-31 Apple Inc. Identification of candidate characters for text input
US9372847B2 (en) * 2008-12-05 2016-06-21 Nhn Corporation Method, device and computer readable recording medium for preventing input error when information is inputted through touch screen
US20100141597A1 (en) * 2008-12-05 2010-06-10 Nhn Corporation Method, device and computer readable recording medium for preventing input error when information is inputted through touch screen
US20120242516A1 (en) * 2009-12-02 2012-09-27 Tencent Technology (Shenzhen) Company Limited Wubi input system and method
US20110169726A1 (en) * 2010-01-08 2011-07-14 Microsoft Corporation Evolving universal gesture sets
US9019201B2 (en) * 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US20110310118A1 (en) * 2010-06-22 2011-12-22 Microsoft Corporation Ink Lag Compensation Techniques
US9189147B2 (en) * 2010-06-22 2015-11-17 Microsoft Technology Licensing, Llc Ink lag compensation techniques
CN102880400A (en) * 2011-07-13 2013-01-16 阿尔派株式会社 Handwritten character input device and handwritten character input method
US8560974B1 (en) * 2011-10-06 2013-10-15 Google Inc. Input method application for a touch-sensitive user interface
US20130212511A1 (en) * 2012-02-09 2013-08-15 Samsung Electronics Co., Ltd. Apparatus and method for guiding handwriting input for handwriting recognition
US10346035B2 (en) 2013-06-09 2019-07-09 Apple Inc. Managing real-time handwriting recognition
US11816326B2 (en) 2013-06-09 2023-11-14 Apple Inc. Managing real-time handwriting recognition
US11182069B2 (en) 2013-06-09 2021-11-23 Apple Inc. Managing real-time handwriting recognition
US11016658B2 (en) 2013-06-09 2021-05-25 Apple Inc. Managing real-time handwriting recognition
JP6054547B2 (en) * 2013-12-09 2016-12-27 株式会社東芝 Electronic device and method for processing handwritten document information
US9916300B2 (en) * 2015-11-16 2018-03-13 Lenovo (Singapore) Pte. Ltd. Updating hint list based on number of strokes
US20170139898A1 (en) * 2015-11-16 2017-05-18 Lenovo (Singapore) Pte, Ltd. Updating hint list based on number of strokes
US10884617B2 (en) 2016-06-12 2021-01-05 Apple Inc. Handwriting keyboard for screens
CN108089727A (en) * 2016-06-12 2018-05-29 苹果公司 Handwriting keyboard for screens
US10228846B2 (en) 2016-06-12 2019-03-12 Apple Inc. Handwriting keyboard for screens
US11640237B2 (en) 2016-06-12 2023-05-02 Apple Inc. Handwriting keyboard for screens
US10466895B2 (en) 2016-06-12 2019-11-05 Apple Inc. Handwriting keyboard for screens
US11941243B2 (en) 2016-06-12 2024-03-26 Apple Inc. Handwriting keyboard for screens
CN108256448A (en) * 2017-12-29 2018-07-06 上海义启信息科技有限公司 Chinese character handwriting recognition method
CN108319896A (en) * 2017-12-29 2018-07-24 上海义启信息科技有限公司 Method for recognizing handwritten Chinese characters
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
US11620046B2 (en) 2019-06-01 2023-04-04 Apple Inc. Keyboard management user interfaces
US11842044B2 (en) 2019-06-01 2023-12-12 Apple Inc. Keyboard management user interfaces

Also Published As

Publication number Publication date
AU2003252091A1 (en) 2004-02-16
CN1606753A (en) 2005-04-13
US6970599B2 (en) 2005-11-29
HK1082310A1 (en) 2006-06-02
US20040017946A1 (en) 2004-01-29
CN100550036C (en) 2009-10-14
WO2004012135A1 (en) 2004-02-05

Similar Documents

Publication Publication Date Title
US6970599B2 (en) Chinese character handwriting recognition system
US6795579B2 (en) Method and apparatus for recognizing handwritten chinese characters
US7088861B2 (en) System and method for chinese input using a joystick
JP4527731B2 (en) Virtual keyboard system with automatic correction function
CN1324436C (en) System and method for improved user input on personal computing devices
US6493464B1 (en) Multiple pen stroke character set and handwriting recognition system with immediate response
US6567549B1 (en) Method and apparatus for immediate response handwriting recognition system that handles multiple character sets
RU2206118C2 (en) Disambiguating system with reduced keyboard
US7158678B2 (en) Text input method for personal digital assistants and the like
US20030007018A1 (en) Handwriting user interface for personal digital assistants and the like
US20060119588A1 (en) Apparatus and method of processing information input using a touchpad
US20030231167A1 (en) System and method for providing gesture suggestions to enhance interpretation of user input
WO2007121673A1 (en) Method and device for improving inputting speed of characters
EP1513053A2 (en) Apparatus and method for character recognition
JPH0991424A (en) Retrieval device and method thereof
US7562314B2 (en) Data processing apparatus and method
KR20180115699A (en) System and method for multi-input management
KR100651396B1 (en) Alphabet recognition apparatus and method
CN101601050B (en) System and method for previewing and selecting words
CN104834392B (en) Chinese character input method with dynamic stroke-based word composition
JP2003005902A (en) Character inputting device, information processor, method for controlling character inputting device, and storage medium
CN101551701A (en) Multidimensional control method and device, optimal or relatively favorable display input method and device
CN107608533A (en) Lightweight embedded input method
JP3153704B2 (en) Character recognition device
CN115917469A (en) Apparatus and method for inputting logograms into electronic device

Legal Events

Date Code Title Description
AS Assignment

Owner name: TEGIC COMMUNICATIONS, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LONGE, MICHAEL;WU, JIANCHAO;ZHANG, LU;REEL/FRAME:017078/0654

Effective date: 20050926

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION