US20090073137A1 - Mobile phone and method - Google Patents

Mobile phone and method

Info

Publication number
US20090073137A1
US20090073137A1 (Application No. US 12/211,914)
Authority
US
United States
Prior art keywords
input
character
characters
display
candidate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/211,914
Inventor
Yipu Gao
Ying Y. Liu
Yanming Zou
Kongqiao Wang
Jari A. Kangas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Priority to US12/211,914
Publication of US20090073137A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 Character input methods
    • G06F3/0237 Character input methods using prediction or retrieval techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/23 Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof

Abstract

A method for inputting characters to a mobile communication apparatus is disclosed. The method includes enabling input of strokes representing a first character in a first area of a touch sensitive display; determining the first character from the strokes; determining one or more candidates of characters for word association with the first character; presenting the one or more candidates on the display; and enabling selection among the one or more candidates. A mobile communication apparatus including means for inputting characters is also disclosed.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of, and claims the benefit of and priority to, U.S. patent application Ser. No. 10/978,954, filed Nov. 1, 2004, now allowed, the disclosure of which is incorporated by reference herein in its entirety.
  • BACKGROUND
  • 1. Field
  • The disclosed embodiments relate to a method for inputting characters to a mobile communication apparatus, and such a mobile communication apparatus.
  • 2. Brief Description of Related Developments
  • There are several ways to input characters in a mobile communication apparatus. In a mobile communication apparatus comprising a touch sensitive display, it is possible to virtually write characters on the screen with a stylus. A processor of the apparatus then interprets the written character, and the interpreted character is input.
  • The input of information to a mobile communication apparatus by virtually writing on the screen of the mobile communication apparatus is a very feasible way to input information, since most users are familiar with normal writing. However, the interpretation of an input character is not 100% accurate, e.g. due to personal handwriting. Further, some characters may be relatively complex to write, and also complex to interpret. An example of this is some Chinese characters. Therefore, the user desires a facilitated way of inputting information.
  • SUMMARY
  • The disclosed embodiments provide an improved input method and an improved mobile communication apparatus.
  • According to a first aspect of the disclosed embodiments a method for inputting characters to a mobile communication apparatus, comprises enabling input of strokes representing a first character in a first area of a touch sensitive display; determining the first character from the strokes; and detecting if further characters are input within a predetermined time period, and if no further characters are detected within the time period performing the steps of: determining one or more candidates of characters for word association with the first character; presenting the one or more candidates on the display; and enabling selection among the one or more candidates.
  • An advantage of this is that a user who writes characters fast will be able to do so, until the user no longer manages to write fast. Then, the user will get help through the presentation of likely characters to follow. Thus, the user will experience an input method that adapts to the input skills of the user. A further advantage of this is that computing power is saved when a user does not need help with likely characters to follow.
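  • For illustration only, the flow of this first aspect can be sketched as follows. This is a minimal sketch, assuming hypothetical helpers (recognizer, lexicon, display, wait_for_strokes) that are not named in the disclosure; it is not the actual implementation.

```python
# Minimal sketch of the first-aspect method; recognizer, lexicon, display and
# wait_for_strokes are hypothetical placeholders, not names from the disclosure.
PREDETERMINED_PERIOD = 0.5  # seconds; the disclosure leaves the delay user selectable


def input_character_with_association(strokes, wait_for_strokes, recognizer, lexicon, display):
    # Determine the first character from the strokes written in the first area.
    first_character = recognizer.recognize(strokes)
    display.show(first_character)

    # Detect whether further characters are input within the predetermined time period.
    further_strokes = wait_for_strokes(timeout=PREDETERMINED_PERIOD)
    if further_strokes is None:
        # No further input: determine and present word-association candidates
        # and enable selection among them.
        candidates = lexicon.word_association_candidates(first_character)
        display.present_candidates(candidates)
        return display.await_selection(candidates)

    # The user kept writing, so candidates are neither computed nor shown.
    return recognizer.recognize(further_strokes)
```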
  • The input area may comprise a first and a second input area, wherein the input method may comprise:
  • enabling input of further characters in the first and second areas alternately; and
  • enabling correction of a character by inputting a character in the same of the first and second areas as the preceding character. An advantage of this is that the user is enabled to change her mind, or to correct an incorrect input or an incorrect interpretation of an input character.
  • The input of a character may be performed in one of the first and second areas and the presentation of candidates is performed in the other of the first and second areas.
  • An advantage of this is that space is saved on the display.
  • An input may be determined to be a stroke if the input is within the input area, and determined to be a selection if the input is outside the input area.
  • An advantage of this is that the mobile communication apparatus can easily determine which kind of input it is, and it is clearer for a user how to choose between writing and selection.
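  • For illustration only, such an area test can be expressed as a simple hit test; the rectangle representation and names below are assumptions, not taken from the disclosure.

```python
# Sketch of classifying a pen-down event as a stroke or a selection, assuming the
# input area is given as a rectangle (left, top, right, bottom) in display coordinates.
def classify_pen_down(point, input_area):
    x, y = point
    left, top, right, bottom = input_area
    inside = left <= x <= right and top <= y <= bottom
    return "stroke" if inside else "selection"


# Example: with an input area occupying the lower half of a 240x320 display,
# a pen-down inside it is a stroke and a pen-down outside it is a selection.
print(classify_pen_down((120, 250), (0, 160, 240, 320)))  # -> "stroke"
print(classify_pen_down((120, 50), (0, 160, 240, 320)))   # -> "selection"
```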
  • The step of determining the first character from the strokes may comprise: determining one or more candidates of characters being likely to be the input character; presenting the one or more candidates on the display; and enabling selection among the one or more candidates.
  • An advantage of this is that the writing is facilitated for the user.
  • The presenting of candidates of likely characters may be performed in a presentation area of the display, and the selection may be performed by pointing at a character to be selected in the presentation area.
  • A second aspect of the disclosed embodiments is directed to a mobile communication apparatus comprising a touch sensitive display; a receiver structured and arranged to input strokes representing a first character in an input area of the touch sensitive display; a processor structured and adapted to determine the first character from the strokes; a first detector structured and arranged to produce a first signal in dependence on presence of input in the input area; a timer adapted to start every time the first signal switches from indicating presence of input to indicating no input, and time out after a predetermined time; and a second detector structured and adapted to produce a second signal in dependence on if further characters are input before the timer times out, wherein the processor is structured and arranged to, when the second signal indicates no further input character, determine one or more candidates of characters for word association with the first character; present the one or more candidates on the display; and receive a selection among the one or more candidates.
  • A third aspect of the disclosed embodiments is directed to a computer program arranged to perform the method according to the first aspect of the disclosed embodiments when downloaded into and run on a mobile communication apparatus.
  • The advantages of the second and third aspects of the disclosed embodiments are essentially similar to those of the first aspect of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The novel features of the disclosed embodiments are set forth in the appended claims. The invention itself, however, as well as preferred mode of use, further aspects and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
  • FIG. 1 shows a mobile communication apparatus according to an embodiment;
  • FIG. 2 shows a general display image of a mobile communication apparatus with a touch sensitive display for character input;
  • FIG. 3 shows a display image of a display screen according to an embodiment;
  • FIGS. 4 a and 4 b illustrate a part of a screen view according to an embodiment;
  • FIG. 5 shows a display image of a display screen according to a further embodiment;
  • FIGS. 6 a-6 d are screen views of an example of input of characters to a mobile communication apparatus;
  • FIGS. 7 a-7 d are screen views of a further example of input of characters to a mobile communication apparatus;
  • FIG. 8 is a flow chart showing a method according to an embodiment; and
  • FIG. 9 is a flow chart showing an embodiment of determining an input character.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
  • FIG. 1 shows a mobile communication apparatus 100, according to an embodiment of the disclosed embodiments, comprising a main body 102 on which an antenna 104, a speaker 106, a microphone 108, a plurality of keys 110, and a display screen 112 are present, and a stylus 114 used for input on the screen 112, which is touch sensitive. The stylus 114 can normally be put in a groove (not shown) of the main body 102 when not used. The mobile communication apparatus comprises any features known in the art, such as messaging, browsing, calendar, etc. Further, the mobile communication apparatus comprises functions for input by virtual writing on the touch sensitive screen 112 with the stylus 114. The input features of the mobile communication apparatus will be described in more detail below.
  • To illustrate the basic features of virtual writing on a touch sensitive display, FIG. 2 shows a display image of a display screen 200, with a designated field 202 for input of characters with a stylus. A character is input by a user using the stylus in the field 202 as would have been done with a normal pen on paper. Then, the mobile communication apparatus interprets the character from the strokes detected on the touch sensitive display, and preferably displays the character on the screen 200. The user can then input further characters in turn in the field 202, and the interpretations of the characters are displayed in turn as characters on the screen 200.
  • FIG. 3 shows a display image of a display screen 300 according to an embodiment of the disclosed embodiments, with a designated first field 302 for input of characters. A second field 304 associated with the first field 302, and preferably placed within the first field 302, is designated for selection of alternatives presented in connection with the input. The second field can be placed in the lower right corner, as depicted in FIG. 3, as a character in e.g. Chinese normally begins with a stroke from top to bottom or from left to right. For other types of characters, and their way of being written, the second field 304 can be placed accordingly within the first field 302. The placing of the second field 304 within the first field 302 is due to the normally limited available space on a screen of a mobile communication apparatus. However, if this is not a constraint, the fields 302, 304 can be placed arbitrarily.
  • When the first pen-down is detected in the first field 302, but out of the second field 304, a character input by strokes should be interpreted. When an input character is recognized, the character is displayed on the screen and characters associated with the recognized character are displayed in the second field 304. Associated characters can be characters forming a word together with the first input character. If the next pen-down is within the second field 304, the displayed associated character at the pen-down point is selected accordingly, and displayed together with the character inputted by virtual writing. If the next pen-down is in the first field 302, but out of the second field 304, a second character input by strokes should be interpreted, and displayed on the screen, e.g. in a text editor, next to the preceding input character. Thereby, the user is able to continue virtual writing until the user wants to use the feature of selecting among the associated characters. To save computing power, or simply to avoid disturbing the user, a predetermined, and preferably user selectable, delay can be applied before associated characters are determined and displayed. The user selectable delay can be anything from no delay to infinity, i.e. the function is disabled. It is feasible that after an associated character is selected, further associated characters can be presented for further selection. The further associated characters can be characters forming a word with the preceding characters input by virtual writing and selection.
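  • The pen-down handling for the layout of FIG. 3 can be sketched roughly as below; the field objects and helper methods are illustrative assumptions only, not part of the disclosure.

```python
# Rough sketch of dispatching a pen-down event between the first field 302 and the
# second field 304; the field, editor, recognizer and lexicon objects are hypothetical.
def on_pen_down(point, first_field, second_field, editor, recognizer, lexicon):
    if second_field.contains(point) and second_field.has_candidates():
        # Pen-down in the second field: select the associated character shown at the
        # pen-down point and display it together with the written character.
        editor.append(second_field.candidate_at(point))
    elif first_field.contains(point):
        # Pen-down in the first field but outside the second field: interpret a new
        # character from the following strokes and show its associated characters.
        strokes = first_field.collect_strokes(starting_at=point)
        character = recognizer.recognize(strokes)
        editor.append(character)
        second_field.show(lexicon.word_association_candidates(character))
```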
  • The interpretation of the character input by virtual writing is not 100% accurate. Therefore it is preferable that two or more likely recognized characters are presented to the user, with the most likely as a default. Associated characters displayed in the second field 304 are thus associated with the default character. If another character than the default character is selected among the most likely characters, a new set of associated characters is presented, being associated with the selected character. FIGS. 4 a and 4 b illustrate a part of a screen view 400 comprising first and second fields 402, 404. The view further comprises a third field 406, which presents a plurality of likely characters for the strokes inputted by virtual writing. A default character 408 is marked in FIG. 4 a as being the most likely. Field 404 in FIG. 4 a presents a set 410 of characters associated with the marked default character 408. The user may find that the character she intended to input is another among the presented likely characters. The user can then select it, e.g. by tapping with the stylus on the correct character 412 in the third field 406. The correct character then becomes marked, as depicted in FIG. 4 b. A new set 414 of associated characters, associated with the correct character 412, is presented in the second field 404, as depicted in FIG. 4 b. The user is then able to select among the characters associated with the character she really intended to input.
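  • A sketch of the re-selection behaviour of FIGS. 4 a and 4 b is given below; the object names are assumptions for illustration only.

```python
# When the user taps another likely character in the third field 406, it replaces the
# default character 408 and the associated set shown in the second field 404 is refreshed.
def on_likely_character_tapped(tapped_character, lexicon, display):
    display.mark_as_default(tapped_character)                         # cf. character 412 in FIG. 4b
    new_set = lexicon.word_association_candidates(tapped_character)   # cf. the new set 414
    display.show_in_second_field(new_set)
```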
  • FIG. 5 shows a display image of a display screen 500 according to a further embodiment of the disclosed embodiments, with designated first and second fields 502, 504 for input of characters. A third field 506 associated with the first and second fields 502, 504 is designated for selection of alternatives presented in connection with the input. The use of the fields 502, 504, 506 will be illustratively described by examples in FIGS. 6 a-6 d and 7 a-7 d.
  • FIGS. 6 a-6 d are screen views of an example of input of characters to a mobile communication apparatus having a touch sensitive display by virtual writing with a stylus, using input fields similar to those presented in FIG. 5.
  • In FIG. 6 a, a user writes a character 601 in the first field. The writing is interpreted and in FIG. 6 b, a plurality of likely characters 603 are presented in the third field, with the most likely character 605 marked as a default character. The user recognizes the character 605 as the correct character and in FIG. 6 c, the correct character 605 is presented on the screen, e.g. for a text editing application or a messaging application. The default character is either presented directly, or after a delay, or upon selection by the user. In FIG. 6 c, a plurality of associated characters are presented in the second field. Associated characters are characters likely to follow a preceding character, and can e.g. form a word with the preceding character. In FIG. 6 d, the user selects one of the characters presented in the second field, and the selected character 607 is inserted in the presented text on the display.
  • FIGS. 7 a-7 d are screen views of a further example of input of characters to a mobile communication apparatus having a touch sensitive display by virtual writing with a stylus, using input fields similar to those presented in FIG. 5.
  • In FIG. 7 a, a user writes a first character 701 in the first field. The user is a skilled writer, and immediately writes a second character 703 in the second field, as depicted in FIG. 7 b. The first character 701 is then immediately presented in the text on the display, as being an accepted default character. The user then does not immediately write a third character in the first field, as would be possible for a fast-writing user. Instead, the likely characters are presented in the third field, with the most likely character marked as a default character. The user recognizes the marked character as the correct character and in FIG. 7 c, the correct character is presented on the screen, e.g. for a text editing application or a messaging application. Further, in FIG. 7 c, a plurality of associated characters are presented in the first field. In FIG. 7 d, the user selects one of the characters presented in the first field, and the selected character is inserted in the presented text on the display. In this way, the user is able to write fast by alternately writing in the first field and the second field. When the user needs or wants help, suggestions of likely associated characters turn up in whichever of the first and second fields was not used for input.
  • To save computing power, or simply to avoid disturbing the user, a predetermined, and preferably user selectable, delay can be applied before associated characters are determined and displayed. Thus, the fast writer can go on writing in the alternate fields, and the mobile communication apparatus does not attempt to help the user until it seems to be needed. The user selectable delay can be anything from no delay to infinity, i.e. the function is disabled. A normal delay set by a normal user is between a fraction of a second and a second. It is feasible that after an associated character is selected, further associated characters can be presented, preferably alternately between the first and second fields, for further selection. The further associated characters can be characters forming a word with the preceding characters input by virtual writing and selection.
  • The delay can be implemented by a timer that starts every time no touch is detected on the display, stops when touch is detected, and re-starts the next time no touch is detected. The determination and presentation of associated characters remains disabled as long as the timer has not timed out after the predetermined time.
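  • One possible realisation of such a restartable timer is sketched here with Python's threading.Timer purely as an illustration; the disclosure does not prescribe any particular implementation, and the class and method names are assumptions.

```python
import threading


class AssociationDelay:
    """Restartable delay: associated characters are only determined and shown
    when no touch has been detected for the predetermined time."""

    def __init__(self, delay_seconds, present_candidates):
        self.delay = delay_seconds                    # user-selectable predetermined time
        self.present_candidates = present_candidates  # called when the timer times out
        self._timer = None

    def touch_detected(self):
        # Touch stops the timer, keeping the candidate presentation disabled.
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None

    def no_touch_detected(self):
        # No touch on the display: (re)start the timer; a time-out triggers
        # determination and presentation of associated characters.
        self.touch_detected()
        self._timer = threading.Timer(self.delay, self.present_candidates)
        self._timer.start()
```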
  • FIG. 8 is a flow chart showing a method according to an embodiment of the disclosed embodiments. In a stroke input step 800, input strokes performed by virtual writing by a stylus on a touch sensitive display screen are received and gathered by a processor of the mobile communication apparatus. A character likely to correspond with the input strokes is determined in a character determination step 802. As described above with reference to FIGS. 4 a, 4 b, 6 a-6 d and 7 a-7 d, this step can also comprise presenting a plurality of likely characters, among which one is selected. In a timer setting step 804, a timer is set and started, preferably starting after the last stroke is input, as described above. In a determination step 806 of new input strokes, it is determined if new input of strokes is present before the timer times out. If no new input is determined within the predetermined time-out period, candidates of characters associated with the previously input character are determined in a candidate determination step 808. For example, the associated characters can form a word association with the previously input character. The candidates can also be determined before or in parallel with the input determination step 806, as the candidate determination step 808 will require some processing. In case the candidate determination step is performed before or in parallel with the input determination step 806, the result of candidate determination is achieved earlier, but on the other hand, processing power may have been wasted if the user does not need help from the determined candidates. The candidates are presented in a candidate presentation step 810. A character among the candidates is selected in a character selection step 812. As characters are input and selected, they are used in an application in the mobile communication apparatus, e.g. a messaging application. Based on previously input and selected characters, further characters can be associated to e.g. form a word. In a determination step 814 for further characters associated with the previous characters, it is determined whether more candidates are present. If more candidates are present, the method continues with candidate determination step 808; otherwise, the method continues with stroke input step 800.
  • Returning to determination step 806 of new input strokes, if it is determined that new input of strokes is present before the timer times out, the method branches off to a determination step 816 of overwriting previous character. Overwriting can be determined by determining in which field the new input strokes are present. For example, if a first character is input in the first field, according to FIG. 5, and then a second character is input in the second field, overwriting should not be performed. If a first character is input in the first field and then a second character is input in the first field, the first character should be replaced by the second character, i.e. overwriting is present. Other ways of determining overwriting are possible to provide a feasible input method. If overwriting is determined, it is set in a replacement step 818 that the new input character should replace the previous character, and then the method continues to the stroke input step 800. Otherwise, if no overwriting is determined in determination step 816, the method continues to a step 820 for addition of characters to e.g. an editor, and then returns to the stroke input step 800.
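  • The overwrite decision of steps 816-820 can be sketched as below, assuming each recognized character records the field it was written in; the function and variable names are illustrative only.

```python
# Steps 816-820 in brief: a new character written in the same field as the previous one
# overwrites it (replacement step 818); a new character in the other field is added (step 820).
def handle_new_character(previous_field, new_field, new_character, text):
    if previous_field == new_field:
        text[-1] = new_character      # overwrite the previous, still-pending character
    else:
        text.append(new_character)    # accept and add the character, e.g. to an editor
    return text


# Example with the fields of FIG. 5: alternating fields adds, repeating a field overwrites.
print(handle_new_character("first", "second", "B", ["A"]))  # -> ['A', 'B']
print(handle_new_character("first", "first", "C", ["A"]))   # -> ['C']
```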
  • It should be noted that the nature of the technology, and thus also of the method, is that real-time constraints are rather strict in order to provide a feasible user interface. Thus, the sequential description of the method is more or less only for descriptive purposes. In practice, the steps are performed in any order, in different orders from time to time, and sometimes in parallel, with the only demand being that the data required for a step is available for the step to work with. Further, the method is running as long as the operation of input of characters is running.
  • FIG. 9 is a flow chart showing an embodiment of determining an input character, as of the character determination step 802 in FIG. 8. Candidates likely to be the character input by strokes are determined in a candidate determination step 900. The determined candidates are then presented in a candidate presentation step 902. The candidates can be presented on the screen, as illustrated in FIGS. 4 a, 4 b, 6 b-6 d, 7 c or 7 d, or in any other way to achieve a feasible user interface. A character among the likely candidates can then be selected in a character selection step 904, and the selected character is thereby determined to be the input character.
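  • A compact sketch of the character determination of FIG. 9 (steps 900-904) follows; the recognizer and display interfaces are assumptions for illustration, not the disclosed implementation.

```python
# Steps 900-904 in brief: rank likely candidates for the written strokes, present them,
# and let the user select one, which then becomes the input character.
def determine_input_character(strokes, recognizer, display, top_n=5):
    candidates = recognizer.rank_candidates(strokes)[:top_n]    # candidate determination step 900
    display.present_candidates(candidates)                      # candidate presentation step 902
    return display.await_selection(default=candidates[0])       # character selection step 904
```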

Claims (20)

1. A method for inputting characters to a mobile communication apparatus, comprising:
detecting an input of strokes representing a first character in an input area of a touch sensitive display;
determining the first character from the strokes; and
detecting if further characters are input within a predetermined time period, and if no further characters are detected within the time period:
determining one or more candidates of associated characters for word association with the first character;
presenting the one or more associated candidates on the display; and
enabling selection among the one or more associated candidates.
2. The method according to claim 1, wherein the input area comprises a first and a second input area, wherein the input method further comprises:
enabling input of further characters in the first and second areas alternately; and
enabling correction of a character by inputting a character in the same of the first and second areas as the preceding character.
3. The method according to claim 2, further comprising enabling each of the first and second input areas to receive a new character after detecting a selection from the one or more associated candidates corresponding to a respective one of the first and second input areas.
4. The method according to claim 1, wherein an input is determined to be a stroke if the input is within the input area, and determined to be a selection if the input is outside the input area.
5. The method according to claim 1, wherein determining the first character from the strokes comprises:
determining one or more candidates of characters likely to be the input character;
presenting the one or more candidates on the display; and
enabling selection among the one or more candidates.
6. The method according to claim 5, wherein the presenting of candidates of likely characters is performed in a presentation area of the display, and wherein the selection is performed by detecting a pointing at a character to be selected in the presentation area.
7. The method of claim 1 further comprising:
detecting a selection of an associated candidate; and
displaying the first character and the selected associated candidate.
8. The method of claim 7 further comprising that the selected associated candidate is a word.
9. The method of claim 7 further comprising that the one or more associated candidates are presented in a second input area of the display.
10. The method of claim 9 further comprising that the second input area is a separate display field within the input area.
11. The method of claim 10 further comprising providing a third input field associated with the input area and second input area, the third input field enabling a selection of associated candidates.
12. The method of claim 7 further comprising, after detecting a selection of an associated candidate, presenting one or more additional associated candidates on the display, the additional associated candidates corresponding to a combination of the first character and the selected associated candidate.
13. A method comprising:
detecting an input of a character in a first input field of a display;
detecting an input of another character in a second input field of the display;
presenting a default character in a display area of the display corresponding to the character inputted in the first input field;
determining if a further character is inputted in the first input field prior to expiration of a pre-determined time period from detection of the input of the another character; and
presenting one or more candidate characters in a third input field of the display, the one or more candidate characters corresponding to characters associated with the another character.
14. The method of claim 13 further comprising highlighting a candidate character from the one or more candidate characters that is most closely associated with the another character as a default candidate selection.
15. The method of claim 13 further comprising detecting a selection of the one or more candidate characters in the third input field and presenting the selected candidate in the display area with the presented default character.
16. The method of claim 13 further comprising, after detecting the input of the character, detecting an expiration of a pre-determined time period prior to detecting an input of the another character, and presenting one or more candidate characters in a third input field of the display, the one or more candidate characters corresponding to characters associated with the character.
17. The method of claim 13 further comprising enabling each of the first input field and the second input field to receive a new character input after a selection of a candidate character corresponding to a character inputted in each field, respectively.
18. A computer program configured to execute the method of claim 13 when downloaded and run on a mobile communication apparatus.
19. An apparatus comprising:
a display including at least one input area and display area; and
at least one processor coupled to the display, the at least one processor being configured to:
detect an input of a character in a first input field of the display;
detect an input of another character in a second input field of the display;
present a default character in a display area of the display corresponding to the character inputted in the first input field;
determine if a further character is inputted in the first input field prior to expiration of a pre-determined time period from detection of the input of the another character; and
present one or more candidate characters in a third input field of the display, the one or more candidate characters corresponding to characters associated with the another character.
20. The apparatus of claim 19 wherein the processor is further configured to, after detecting the input of the character, detect an expiration of a pre-determined time period prior to detecting an input of the another character, and present one or more candidate characters in a third input field of the display, the one or more candidate characters corresponding to characters associated with the character.
US12/211,914 2004-11-01 2008-09-17 Mobile phone and method Abandoned US20090073137A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/211,914 US20090073137A1 (en) 2004-11-01 2008-09-17 Mobile phone and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/978,954 US7443386B2 (en) 2004-11-01 2004-11-01 Mobile phone and method
US12/211,914 US20090073137A1 (en) 2004-11-01 2008-09-17 Mobile phone and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/978,954 Continuation US7443386B2 (en) 2004-11-01 2004-11-01 Mobile phone and method

Publications (1)

Publication Number Publication Date
US20090073137A1 (en) 2009-03-19

Family

ID=36261228

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/978,954 Expired - Fee Related US7443386B2 (en) 2004-11-01 2004-11-01 Mobile phone and method
US12/211,914 Abandoned US20090073137A1 (en) 2004-11-01 2008-09-17 Mobile phone and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/978,954 Expired - Fee Related US7443386B2 (en) 2004-11-01 2004-11-01 Mobile phone and method

Country Status (4)

Country Link
US (2) US7443386B2 (en)
KR (1) KR101203446B1 (en)
CN (1) CN1782974B (en)
HK (1) HK1090444A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080025613A1 (en) * 2006-07-28 2008-01-31 Manish Kumar Compact Stylus-Based Input Technique For Indic Scripts
US20080180403A1 (en) * 2007-01-30 2008-07-31 Samsung Electronics Co., Ltd. Apparatus and method for inputting characters on touch screen of a terminal
US20120262488A1 (en) * 2009-12-23 2012-10-18 Nokia Corporation Method and Apparatus for Facilitating Text Editing and Related Computer Program Product and Computer Readable Medium
US20140317569A1 (en) * 2011-01-12 2014-10-23 Motorola Mobility Llc Methods and Devices for Chinese Language Input to a Touch Screen
USRE45694E1 (en) 2007-06-11 2015-09-29 Samsung Electronics Co., Ltd. Character input apparatus and method for automatically switching input mode in terminal having touch screen
US10474245B2 (en) 2014-09-30 2019-11-12 Lenovo (Beijing) Co., Ltd. Input method and electronic device for improving character recognition rate

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7443386B2 (en) * 2004-11-01 2008-10-28 Nokia Corporation Mobile phone and method
US20080122806A1 (en) * 2005-01-05 2008-05-29 Jaewoo Ahn Method and Apparatus for Inputting Character Through Pointing Device
KR100686159B1 (en) * 2005-06-13 2007-02-26 엘지전자 주식회사 A mobile communication device and the data processing method therefor
US8352323B2 (en) * 2007-11-30 2013-01-08 Blaze Mobile, Inc. Conducting an online payment transaction using an NFC enabled mobile communication device
US8026904B2 (en) * 2007-01-03 2011-09-27 Apple Inc. Periodic sensor panel baseline adjustment
US8054296B2 (en) 2007-01-03 2011-11-08 Apple Inc. Storing baseline information in EEPROM
CN101178633A (en) * 2007-12-13 2008-05-14 深圳华为通信技术有限公司 Method, system and device for correcting hand-written screen error
CN102214011B (en) * 2010-04-09 2015-09-09 北京搜狗科技发展有限公司 A kind of method of initiating input method remote calculation request and device
BR112012029421A2 (en) * 2010-05-24 2017-02-21 John Temple Will multidirectional button, key and keyboard
US9607505B2 (en) 2010-09-22 2017-03-28 Apple Inc. Closed loop universal remote control
TW201216124A (en) * 2010-10-12 2012-04-16 Inventec Corp Multi-block handwriting system and method thereof
US9501161B2 (en) 2010-10-22 2016-11-22 Hewlett-Packard Development Company, L.P. User interface for facilitating character input
US8286104B1 (en) * 2011-10-06 2012-10-09 Google Inc. Input method application for a touch-sensitive user interface
CN103064531B (en) * 2013-01-18 2016-04-06 Dongguan Yulong Telecommunication Technology Co., Ltd. Terminal and input method
US9898187B2 (en) 2013-06-09 2018-02-20 Apple Inc. Managing real-time handwriting recognition
US9495620B2 (en) 2013-06-09 2016-11-15 Apple Inc. Multi-script handwriting recognition using a universal recognizer
US20140361983A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Real-time stroke-order and stroke-direction independent handwriting recognition
US20150106764A1 (en) * 2013-10-15 2015-04-16 Apple Inc. Enhanced Input Selection
DK179374B1 (en) 2016-06-12 2018-05-28 Apple Inc Handwriting keyboard for monitors
US10997362B2 (en) * 2016-09-01 2021-05-04 Wacom Co., Ltd. Method and system for input areas in documents for handwriting devices
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6052482A (en) * 1996-01-12 2000-04-18 Canon Kabushiki Kaisha Character recognition apparatus and method
US6212298B1 (en) * 1995-09-08 2001-04-03 Canon Kabushiki Kaisha Character recognition apparatus, method and computer readable memory
US20020168107A1 (en) * 1998-04-16 2002-11-14 International Business Machines Corporation Method and apparatus for recognizing handwritten Chinese characters
US6694056B1 (en) * 1999-10-15 2004-02-17 Matsushita Electric Industrial Co., Ltd. Character input apparatus/method and computer-readable storage medium
US6847734B2 (en) * 2000-01-28 2005-01-25 Kabushiki Kaisha Toshiba Word recognition method and storage medium that stores word recognition program
US6931153B2 (en) * 2000-04-20 2005-08-16 Matsushita Electric Industrial Co., Ltd. Handwritten character recognition apparatus
US20060055669A1 (en) * 2004-09-13 2006-03-16 Mita Das Fluent user interface for text entry on touch-sensitive display
US7030863B2 (en) * 2000-05-26 2006-04-18 America Online, Incorporated Virtual keyboard system with automatic correction
US20060119582A1 (en) * 2003-03-03 2006-06-08 Edwin Ng Unambiguous text input method for touch screens and reduced keyboard systems
US7443386B2 (en) * 2004-11-01 2008-10-28 Nokia Corporation Mobile phone and method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0769175B9 (en) * 1994-07-01 2005-01-12 Palm Computing, Inc. Multiple pen stroke character set and handwriting recognition system
GB9701793D0 (en) * 1997-01-29 1997-03-19 Gay Geoffrey N W Means for inputting characters or commands into a computer
KR100356037B1 (en) * 1999-11-15 2002-10-18 김영식 Apparatus And Method For Recognition Of Multiple Characters In Handwriting Recognition
CN1271883A (en) * 2000-02-16 2000-11-01 康艳 Chinese-character intelligent handwriting input method and handwriting input board
JP2002304250A (en) * 2001-04-05 2002-10-18 Isao Nagaoka Telephone having character input function, character inputting method, and character input program
US7168046B2 (en) * 2001-04-26 2007-01-23 Lg Electronics Inc. Method and apparatus for assisting data input to a portable information terminal
CN1512803A (en) * 2002-12-30 2004-07-14 Beijing Hanwang Technology Co., Ltd. Wireless handwriting input device for mobile phone

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6212298B1 (en) * 1995-09-08 2001-04-03 Canon Kabushiki Kaisha Character recognition apparatus, method and computer readable memory
US6052482A (en) * 1996-01-12 2000-04-18 Canon Kabushiki Kaisha Character recognition apparatus and method
US20020168107A1 (en) * 1998-04-16 2002-11-14 International Business Machines Corporation Method and apparatus for recognizing handwritten Chinese characters
US6694056B1 (en) * 1999-10-15 2004-02-17 Matsushita Electric Industrial Co., Ltd. Character input apparatus/method and computer-readable storage medium
US6847734B2 (en) * 2000-01-28 2005-01-25 Kabushiki Kaisha Toshiba Word recognition method and storage medium that stores word recognition program
US6931153B2 (en) * 2000-04-20 2005-08-16 Matsushita Electric Industrial Co., Ltd. Handwritten character recognition apparatus
US7030863B2 (en) * 2000-05-26 2006-04-18 America Online, Incorporated Virtual keyboard system with automatic correction
US20060119582A1 (en) * 2003-03-03 2006-06-08 Edwin Ng Unambiguous text input method for touch screens and reduced keyboard systems
US20060055669A1 (en) * 2004-09-13 2006-03-16 Mita Das Fluent user interface for text entry on touch-sensitive display
US7443386B2 (en) * 2004-11-01 2008-10-28 Nokia Corporation Mobile phone and method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080025613A1 (en) * 2006-07-28 2008-01-31 Manish Kumar Compact Stylus-Based Input Technique For Indic Scripts
US8077974B2 (en) * 2006-07-28 2011-12-13 Hewlett-Packard Development Company, L.P. Compact stylus-based input technique for Indic scripts
US20080180403A1 (en) * 2007-01-30 2008-07-31 Samsung Electronics Co., Ltd. Apparatus and method for inputting characters on touch screen of a terminal
US9141283B2 (en) 2007-01-30 2015-09-22 Samsung Electronics Co., Ltd Apparatus and method for inputting characters on touch screen of a terminal
US9389700B2 (en) 2007-01-30 2016-07-12 Samsung Electronics Co., Ltd Apparatus and method for inputting characters on touch screen of a terminal
USRE45694E1 (en) 2007-06-11 2015-09-29 Samsung Electronics Co., Ltd. Character input apparatus and method for automatically switching input mode in terminal having touch screen
US20120262488A1 (en) * 2009-12-23 2012-10-18 Nokia Corporation Method and Apparatus for Facilitating Text Editing and Related Computer Program Product and Computer Readable Medium
US20140317569A1 (en) * 2011-01-12 2014-10-23 Motorola Mobility Llc Methods and Devices for Chinese Language Input to a Touch Screen
US10048771B2 (en) * 2011-01-12 2018-08-14 Google Technology Holdings LLC Methods and devices for chinese language input to a touch screen
US10474245B2 (en) 2014-09-30 2019-11-12 Lenovo (Beijing) Co., Ltd. Input method and electronic device for improving character recognition rate

Also Published As

Publication number Publication date
CN1782974B (en) 2010-06-16
KR20060052386A (en) 2006-05-19
CN1782974A (en) 2006-06-07
KR101203446B1 (en) 2012-11-21
US7443386B2 (en) 2008-10-28
HK1090444A1 (en) 2006-12-22
US20060092128A1 (en) 2006-05-04

Similar Documents

Publication Publication Date Title
US20090073137A1 (en) Mobile phone and method
JP7153810B2 (en) Handwriting input on electronic devices
US8487879B2 (en) Systems and methods for interacting with a computer through handwriting to a screen
US7023428B2 (en) Using touchscreen by pointing means
CN100465867C (en) Handwritten information input apparatus
US6160555A (en) Method for providing a cue in a computer system
US9448722B2 (en) Text entry into electronic devices
US20020057260A1 (en) In-air gestures for electromagnetic coordinate digitizers
US9612697B2 (en) Touch control method of capacitive and electromagnetic dual-mode touch screen and handheld electronic device
KR20140081793A (en) Explicit touch selection and cursor placement
US9507516B2 (en) Method for presenting different keypad configurations for data input and a portable device utilizing same
US10453425B2 (en) Information displaying apparatus and information displaying method
US20160147436A1 (en) Electronic apparatus and method
US8378980B2 (en) Input method using a touchscreen of an electronic device
US20150378443A1 (en) Input for portable computing device based on predicted input
KR101447886B1 (en) Method and apparatus for selecting contents through a touch-screen display
JP3075882B2 (en) Document creation and editing device
JP2018049319A (en) Document browsing device and program
US9323431B2 (en) User interface for drawing with electronic devices
KR100506231B1 (en) Apparatus and method for inputting character in terminal having touch screen
US20020085772A1 (en) Intelligent correction key
KR101444202B1 (en) Method and apparatus for applying a document format through touch-screen
KR20100033658A (en) Text input method and apparatus
KR101021099B1 (en) Method, processing device and computer-readable recording medium for preventing incorrect input for touch screen
CN101727290A (en) Handwriting input method and handwriting input device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION