WO2004021166A1 - Guidance method and device - Google Patents

Guidance method and device

Info

Publication number
WO2004021166A1
WO2004021166A1 (PCT/FI2003/000598)
Authority
WO
WIPO (PCT)
Prior art keywords
scrolling
text
display
motion
text lines
Prior art date
Application number
PCT/FI2003/000598
Other languages
French (fr)
Inventor
Jukka-Pekka METSÄVAINIO
Manne Hannula
Original Assignee
Myorigo Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Myorigo Oy filed Critical Myorigo Oy
Priority to EP03790973A priority Critical patent/EP1535142A1/en
Priority to AU2003249135A priority patent/AU2003249135A1/en
Publication of WO2004021166A1 publication Critical patent/WO2004021166A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer

Definitions

  • When curve 41 and vector 42 are in the same direction, the scrolling direction is guided to follow accurately the reading direction of the text lines by filtering from the scrolling motion those components which differ from the direction of the text lines. Otherwise, curve 41 and vector 42 are in different directions and free scrolling remains allowed.
  • FIG. 5a illustrates how the view shown on the display looks at time t1. It is seen that the middle point of the current view 500 is at point P1 at time t1. Within the time interval from t1 to t1+Δt, the display on the screen has moved, i.e. the middle point of the previous display has moved on the full-size image from point P1 to point P2, so that the previous view 500 has become the view 501 shown in FIG. 5b.
  • FIG. 6a and 6b illustrate a situation similar to what was explained above with reference to FIG. 5a-5b, but here the filtering is in action.
  • the vector a from the initial point P1 to the terminal point P2 represents the displacement of the full-size image within a given time.
  • the vector b is a unit vector (i.e. |b| = 1) in the direction of the x-axis.
  • the initial point of vector b is at point P1.
  • FIG. 6b illustrates said determination.
  • p is the product of the magnitudes of a and b and the cosine of the angle θ between them, i.e. p = |a||b|cos θ.
  • is ⁇ /4.
  • the initial point of the vector a_x is at point P1 and the terminal point at point P3.
  • the vector a_x represents the displacement of the full-size image in the x-direction within the time Δt.
  • the calculation described above is repeated correspondingly for successive displacements as long as the filtering is active. The time period between displacements is the sample period 1/FS.
  • the above-described method can be used in a hand-held terminal having a motion controlled scrolling operation by adapting the terminal to: read coordinates of the scrolling display at a fixed reading point of the screen at predetermined time intervals, wherein a set of motion vectors is obtained; determine the scrolling direction based on the set of motion vectors; examine whether the scrolling direction is roughly the same as the direction of a text line; and if so, reinforce the displayed scrolling in the direction of the text line as long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value.
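The filtering described in the bullets above (displacement vector a, unit vector b along the text line, and the projection p = |a||b|cos θ) amounts to a vector projection. A minimal sketch, with function and variable names chosen for illustration only:

```python
def filter_displacement(a, text_dir):
    """Project a raw displacement onto the text-line direction.

    a: (dx, dy) raw displacement of the view within one sample interval
    (vector a in the description).
    text_dir: unit vector along the text lines (vector b), e.g.
    (1.0, 0.0) for horizontal text lines.
    Returns the filtered displacement a_x: the component of a along the
    text line; the perpendicular (jitter) component is discarded.
    """
    # Scalar projection p = |a||b|cos(theta); since |b| = 1 this is the dot product.
    p = a[0] * text_dir[0] + a[1] * text_dir[1]
    return (p * text_dir[0], p * text_dir[1])

# Horizontal text lines: b points along the x-axis.
b = (1.0, 0.0)
# A displacement at angle theta = pi/4, as in the example above.
a = (3.0, 3.0)
print(filter_displacement(a, b))  # (3.0, 0.0)
```

Discarding the perpendicular component is what removes jitter across the text line while preserving progress along it.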

Abstract

This invention relates to the scrolling and reading of a larger document on the display of small hand-held electronic equipment, such as mobile phones or personal digital assistant devices, where the view on the display screen is based on a tilt angle of the equipment. An objective is to provide a guidance method that enables reading text lines in a document whose actual size is larger than the screen of the display, without unwanted effects such as jitter or bouncing. Thus the guidance method is adapted to follow accurately the reading direction of the text lines from the beginning to the end of the text lines.

Description

Guidance method and device
Field of the invention
The present invention generally relates to the display of electronic equipment such as mobile phones and more particularly to the browsing and reading of a large document on a display screen.
Background of the invention
Today portable electronic equipment, such as mobile phones and personal digital assistant (PDA) devices, is getting ever smaller. However, when the size of the portable equipment is reduced, the size of the equipment's display is reduced as well. On the other hand, there is an increasing need to provide services with text and pictures to subscribers. Thus the screen of the display is usually smaller in size than the actual document, i.e. only a portion of the document is visible at a time. It is clear that reading a large document from a small display screen is rather difficult.
Different kinds of mechanisms have been developed to ease reading from a small display screen. Some of them are considered briefly in the following. In some small-sized, hand-held devices which can be used while being carried, scrolling the view on the display screen is based on the location (at least one of the x, y or z coordinates) and/or orientation (at least one of azimuth, elevation and roll) of the device. In such a motion-control environment, the portion displayed on the display screen of the hand-held device depends constantly on the movements of the hand. Because the hand is inaccurate, some unwanted effects arise, such as jitter or bouncing, which make reading even more difficult. The same kind of unwanted effects may also arise when the hand-held device is used in a moving vehicle.
In order to avoid the above drawbacks, a method has been devised whereby the direction of scrolling the data can be locked. However, one drawback with this method is that it allows browsing of the data in one direction only, i.e. it does not take into account the direction of the text lines in the data.
Methods with scroll bars permitting movements of the text either horizontally or vertically do not suffer from jitter. That is, scroll bars can be used to permit the user to change the portion of the view on the display screen horizontally by sliding the horizontal scroll bar and vertically by sliding the vertical scroll bar. However, a drawback with such methods is that scrolling is always either horizontal or vertical, i.e. no other direction is possible.
At the moment there is no method available that reduces the errors arising from the bouncing movements around the text lines when reading a document on the display screen of a portable device assisted by a motion-control method.
Summary of the invention
An objective of the present invention is to provide a guidance method for scrolling the display on a screen of a hand-held device having motion-controlled scrolling, and a hand-held device performing the method. The view shown on the display is a portion of larger content comprising text lines that are too long to be displayed entirely on the small display. The guidance method should be adapted to follow accurately the reading direction of the text lines from the beginning of the text lines to the end of the text lines. The reading direction should remain as uniform as possible during the entire reading event, i.e. unwanted effects such as jitter or bouncing of the text lines should not arise during the reading event. A typical situation may be when a user is browsing a large full-size image or a document by scrolling the view shown on a mobile phone's display using motion-controlled scrolling.
The guidance method discovers when the user stops browsing and starts to read text lines. Then the need for guidance is examined. If the result of examination indicates that the user is reading the text lines, the scrolling direction is guided to follow the reading direction of the text lines. The guidance is invisible to the user.
The objective is achieved in the manner described in the independent claims. The guidance method comprises three main steps: 1) the direction of the text lines shown on the display screen is periodically examined; 2) the movements of the mobile phone are periodically examined and, based on the movements, it is concluded whether the user is reading the text lines; 3) if the result of examination in step 2 indicates that the user is reading the text lines, the scrolling direction is reinforced in the direction of the text line so long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value.
Brief description of the drawings
The invention is described more closely with reference to the accompanying drawings, in which
FIG. 1a-b illustrate a page of the document to be read on a display of a small portable device,
FIG. 2 is a flowchart illustrating definition of the direction of text lines on the display,
FIG. 3 is a flowchart illustrating the method according to the invention,
FIG. 4 illustrates the inventive idea with the help of a full-size image to be read on a display of a small portable device,
FIG. 5a-b show an example of the determination of coordinates on the full-size image, and
FIG. 6a-b demonstrate how filtering is carried out.
Detailed description of the invention
The invention can be applied to a terminal having a small display, at least one sensor such as a magnetic pulse sensor, an acceleration transducer or a gyroscope mounted in the back of the display, and a character recognition method or, alternatively, some other method for recognizing the direction of the text lines. The principle of the invention and the embodiments are described using a mobile phone as an example, but the same principle can, of course, be applied to any kind of terminal.
FIG. 1a shows a typical situation where a user is browsing a large full-size image 10 on the display of a mobile phone. In this context the full-size image refers to a whole document or page, of which only a part can be displayed on the actual screen of the terminal. The actual display of the mobile phone is illustrated by a dotted square 16 on the document page for visualizing the proportions of the display relative to the document. Thus the user is reading a document whose actual size is larger than the physical size of the display. The document page has been divided into several columns consisting of text sections 11 - 13 and non-text sections 14 - 15, such as pictures. Lines in the text column 13 run in a different direction than in the other two text columns 11 and 12.
FIG. 1b is an enlargement of the circled area in FIG. 1a. As seen, only a portion of the data is seen on the display at a time, i.e. a line of the document does not fit from beginning to end on the display. Scrolling directions are shown by double-ended arrows. Other scrolling directions are also possible, e.g. in the diagonal direction.
It is assumed that motion-controlled scrolling is used. In order to browse the document, the data is displayed proportionally to the orientation or location of the mobile phone. That is, the data scrolls in the vertical direction when the mobile phone is tilted vertically, and correspondingly, the data scrolls horizontally when the mobile phone is tilted horizontally.
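The proportional tilt-to-scroll mapping can be sketched as below; the linear mapping and the gain constant are illustrative assumptions, not specified in the text:

```python
def scroll_delta(tilt_x_rad, tilt_y_rad, gain=200.0):
    """Map device tilt to a scroll displacement per update.

    tilt_x_rad / tilt_y_rad: tilt around the display's vertical and
    horizontal axes, in radians; gain is a hypothetical sensitivity
    constant (pixels per radian).  Tilting the phone vertically scrolls
    the data vertically, and tilting it horizontally scrolls the data
    horizontally, as described above.
    """
    return (gain * tilt_x_rad, gain * tilt_y_rad)

# A slight vertical tilt produces a purely vertical scroll step.
print(scroll_delta(0.0, 0.1))
```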
When a piece of interesting text 17 appears on the display and the user wants to go into the matter in greater detail, i.e. to read through the whole document or a chapter concerning the matter, the portion of the view displayed is moved to the beginning of the document or appropriate chapter by tilting the mobile phone in a suitable way. For example, a movable cursor is used to indicate a position of interest on the display surface by tilting the mobile phone. Thus the cursor moves horizontally when the mobile phone is tilted towards the vertical axis of the display, and correspondingly, the cursor moves vertically when the mobile phone is tilted towards the horizontal axis of the display.
Thus, the display shown on the screen depends on the motion of the hand. However, as stated above, hand movements are inaccurate. Without accurate guidance, unwanted jitter arises, making reading uncomfortable and slow. This drawback is eliminated by the guidance method described in more detail in the following.
The guidance method comprises three main steps: 1) the direction of the text lines shown on the display screen is periodically examined; 2) the movements of the mobile phone are periodically examined and, based on the movements, it is concluded whether the user is reading the text lines; 3) if the result of the examination in step 2 indicates that the user is reading the text lines, the scrolling direction is reinforced in the direction of the text line as long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value. FIG. 2 shows, as a flowchart, an example of how the examination of the direction of the text lines can be implemented.
It is assumed here that a user scrolls documents on the mobile phone display screen in a way that is described in association with FIG. 1a and 1b.
At stage 20, the first task is to analyze roughly whether there is text in the current portion on display. If no text is detected, free scrolling of the data is allowed at stage 21, and control is returned to stage 20.
There are numerous implementation alternatives for the analysis. One alternative is that the portion of the data that is seen on the display screen is first analyzed by defining its major components, i.e. whether each component represents text or non-text. Some character recognition method, such as optical character recognition (OCR), can be used. In some cases information about the type and size of the document, and the type and size of the font, may already be included in the document, in which case no character recognition is needed.
At the next stage 22, the direction of the text lines is defined. In FIG. 1a most of the page is covered with text. When this page is analyzed by the character recognition method, the direction of the text lines is defined to be horizontal. However, if the column 13 had been shown on the display, the text lines would have been defined to be vertical.
One alternative is to define the direction of the text lines on the basis of the middle point of the current portion of the view displayed. This method will be considered later in greater detail. Another alternative utilizes image analysis and character recognition, in which the direction is defined on the basis of features of the analyzed data.
The definition results concerning the direction of the text lines are saved at stage 23. The information also includes the current time. The above-described analysis and definition are carried out periodically at certain intervals.
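The periodic analysis of stages 20 to 23 can be sketched as follows. Here `detect_direction` stands in for whichever recognition method (e.g. OCR-based) is used, and all names are illustrative assumptions:

```python
import math
import time

def examine_text_direction(view, detect_direction, saved_directions):
    """One periodic pass of stages 20-23.

    detect_direction is an assumed helper that returns the text-line
    angle in radians for the visible view, or None when no text is
    detected.  A detected direction is saved together with the current
    time (stage 23); None leaves free scrolling in force (stage 21).
    """
    direction = detect_direction(view)
    if direction is None:
        return None  # stage 21: no text detected, free scrolling allowed
    saved_directions.append((time.time(), direction))  # stage 23: save with timestamp
    return direction

# Hypothetical detector reporting horizontal text lines (D = pi/2).
saved = []
result = examine_text_direction("current view", lambda view: math.pi / 2, saved)
```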
A flowchart in FIG. 3 is used to demonstrate how the need for guidance is examined and carried out while one is browsing a large document on the display screen of a terminal by scrolling the display.
Coordinates of the current partial view of the full-size image are read at a fixed reading point (xi, yi) of the display. In this example, the fixed reading point is the middle point of the display. The coordinates of the current view at the middle point of the display are then saved at certain time instants, e.g. every second, at stage 30. With the aid of the set of coordinates, a set of motion vectors is obtained. The motion distribution as a function of time is calculated at stage 31 by using the motion vector set.
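Stages 30 and 31 can be sketched as below; the spread measure used here (largest deviation from the mean angle) is one illustrative choice for judging whether the motion distribution is narrow:

```python
import math

def motion_vectors(samples):
    """Stage 30: successive (x, y) coordinates of the fixed reading
    point yield one motion vector per consecutive pair of samples."""
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(samples, samples[1:])]

def direction_spread(vectors):
    """Stage 31: a simple spread measure for the motion distribution.

    Returns the largest deviation (radians) of any vector's angle from
    the mean angle; a small value means a narrow distribution, i.e. the
    movements follow one main scrolling direction.
    """
    angles = [math.atan2(dy, dx) for dx, dy in vectors]
    mean = sum(angles) / len(angles)
    return max(abs(a - mean) for a in angles)

# Nearly horizontal scrolling with slight hand jitter:
samples = [(0, 0), (10, 1), (20, -1), (30, 0), (40, 1)]
print(direction_spread(motion_vectors(samples)))
```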
At stage 32 the need for guidance is examined. If the distribution is narrow, it means that movements have been found which do not deflect much from the main movement or scrolling direction. In other words, if the scrolling direction is in the direction of the x-axis, the coordinates of the page, which have been read at the middle point of the display, seem to follow the general direction of the x-axis (the text line), even though the scrolling motion is not exactly rectilinear. If such movement is not detected or the distribution is relatively broad, free scrolling is still allowed, at stage 35. Otherwise, the next stage 33 is to examine more accurately whether there is a need to guide the reading direction. In this case the distribution is compared with the saved information (see FIG. 2, stage 23) about the direction of the text lines. At stage 34 it is analyzed whether the result of the comparison indicates, within a given time, successive movements which are roughly in the same direction. If the result does not indicate such movements, free scrolling of the data is still allowed at stage 35. Otherwise, the data is filtered in such a way that the direction of the text lines is taken into account, at stage 36. Regardless of the result of the comparison, the steps above are repeated from stage 30.
FIG. 4 demonstrates how the defining and filtering are carried out mathematically. The assumption in this example is that the direction of the text lines on the display is defined to be horizontal by the character recognition method. A vector 42 represents that direction. The additional assumption is that coordinates of the partial view 40 of the full-size image are read at the middle point 43 of the display. Based on the successive coordinates, a set of motion vectors is obtained; if the directions of said motion vectors are roughly in the direction of the text line, this indicates that the user is reading the text lines. The curve 41 represents the set of motion vectors.
The similarity of directions between the vector 42 and curve 41 is examined. One way of examining the similarity is described in the following.
Coordinates of the partial view of the full-size image are controlled within a certain time window. In other words, while the document is scrolled, the position of the middle point 43 on the full-size image 10 varies as a function of time.
The curve 41 is sampled by a calculating unit using a sample rate FS (e.g. FS = 1 Hz, i.e. one sample per second). Each sample consists of the coordinates (xi, yi) of a point Pi on the curve at a given time. For example, a current coordinate pair (xi, yi) is saved every second within a certain time interval, such as 10 s. These samples form a temporal matrix T, where the first row of the matrix represents the x-coordinates and the second row the y-coordinates, so that each column presents one coordinate pair (xi, yi). The parameter D represents the direction (angle) of the text line under examination. The definition value for D is 0 radians when the text line is in the vertical direction and π/2 radians when the text line is in the horizontal direction. Of course, parameter D can have any angle value between 0 and 2π radians. The current direction (angle) between each pair of consecutive points in the temporal matrix T is calculated. For example, if at time t1 the coordinates at point P1 are (x1, y1) and at time t2 = t1 + Δt (t2 > t1) the coordinates at point P2 are (x2, y2), the vector between the two points P1 and P2 defines the direction (angle) at time t1. In general the result of the calculation is N−1 directions or vectors when the number of samples is N. The calculated N−1 directions form a vector U. Here, within 10 seconds the result is nine vectors, each of which has a direction (angle) between 0 and 2π radians. That is, the result is the distribution of directions within 10 seconds. When the shape of the distribution of directions is narrow, many movements in the same direction have been found. Correspondingly, when the shape of the distribution is broad, only some of the movements are in the same direction.
The similarity of curve 41 and vector 42 is examined by comparing the above vectors in the following way. First a comparison is made as to whether the absolute value of the difference between the mean of vector U and parameter D is smaller than the predetermined threshold parameter V, and whether the standard deviation of vector U is smaller than the predetermined parameter X, using the following equations:

Abs(Mean(U) − D) < V (1)

Std(U) < X (2)
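As a numerical illustration of equations (1) and (2), the following sketch computes the direction vector U from a temporal matrix T and applies both threshold tests. The function names and threshold values are illustrative, and, for simplicity, angles here are measured from the x-axis (so a horizontal text line corresponds to D = 0, unlike the convention stated above where the horizontal direction is π/2).

```python
import numpy as np

def direction_vector(T):
    """Angles of the motion between consecutive sample points.

    T is a 2 x N temporal matrix: row 0 holds x-coordinates, row 1
    holds y-coordinates, one column per sample. Returns the N-1
    direction angles, i.e. the vector U.
    """
    dx = np.diff(T[0, :])
    dy = np.diff(T[1, :])
    return np.arctan2(dy, dx)

def same_direction(U, D, V, X):
    """Equations (1) and (2): the mean direction is within V of the
    text-line direction D and the spread of directions is below X."""
    return abs(np.mean(U) - D) < V and np.std(U) < X

# Ten samples of nearly horizontal scrolling, one per second.
T = np.array([[0, 1, 2, 3, 4, 5, 6, 7, 8, 9],
              [0.0, 0.1, 0.0, -0.1, 0.0, 0.1, 0.0, -0.1, 0.0, 0.1]])
U = direction_vector(T)          # nine directions, as in the text above
print(same_direction(U, D=0.0, V=0.2, X=0.3))   # prints True
```

When both comparisons hold, the scrolling would be guided (filtered) toward the text-line direction; otherwise free scrolling continues.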
If both comparisons hold, curve 41 and vector 42 are in the same direction; otherwise they are in different directions. In the first case, i.e. when curve 41 and vector 42 are in the same direction, the scrolling direction is guided to follow accurately the reading direction of the text lines by filtering out from the scrolling motion those components which differ from the direction of the text lines.
The effect of filtering is seen in FIG. 4 with the help of curve 44 and vector 45. Curve 44 demonstrates movement of the page before filtering, and vector 45 represents movement of the page after filtering. Thus, filtering makes the partial view move over the full-size image accurately in the direction of the text lines. The result is that the user is able to read the text lines with ease, because unwanted effects that might make reading uncomfortable are eliminated.
One way to carry out filtering is described with reference to FIG. 5a - 5b and 6a - 6b.
First, it is assumed that filtering is not in action in FIG. 5a and 5b. FIG. 5a illustrates how the view shown on the display looks at time t1. It is seen that the middle point of the current view 500 is in point P1 at time t1. Within the time from t1 to (t1 + Δt), the view on the screen has moved, i.e. the middle point of the previous view has moved on the full-size image from point P1 to point P2, so that the current view 500 becomes view 501 as shown in FIG. 5b. FIG. 6a and 6b illustrate a situation similar to what was explained above with reference to FIG. 5a-5b, but here the filtering is in action.
In FIG. 6a the vector a from the initial point P1 to the terminal point P2 represents the displacement of the full-size image within a given time. The vector b is a unit vector (i.e. | b | =1) in the direction of the x-axis. The initial point of vector b is at point P1.
In this case it is enough to determine only the component of a which is parallel with the x-axis, because the text lines are in the direction of the x-axis.
FIG. 6b illustrates said determination. First the projection of a on b is determined by a scalar product:

p = a · b = |a||b| cos θ (3)

where p is the product of the magnitudes of a and b and the cosine of the angle θ between them. In this example θ is π/4. Then the vector ax represents the component vector of a in the x direction, as denoted by

ax = p b (4)
The initial point of the vector ax is at point P1 and the terminal point at point P3. The vector ax represents the displacement of the full-size image in the x-direction within the time Δt.
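A minimal numeric sketch of equations (3) and (4), assuming horizontal text lines so that b is the unit vector along the x-axis; the function name is illustrative.

```python
def filter_displacement(a, b):
    """Keep only the component of displacement a along unit vector b.

    Equation (3): p = a . b = |a||b|cos(theta)  (scalar projection)
    Equation (4): the filtered displacement is p * b.
    """
    p = a[0] * b[0] + a[1] * b[1]     # scalar product, eq. (3)
    return (p * b[0], p * b[1])       # component vector, eq. (4)

# Displacement at 45 degrees (theta = pi/4) with horizontal text lines:
a = (1.0, 1.0)
b = (1.0, 0.0)                        # unit vector along the x-axis
print(filter_displacement(a, b))      # prints (1.0, 0.0)
```

Applied once per sample period 1/FS while filtering is active, this retains only the text-line component of each displacement, as with vector ax in FIG. 6b.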
The calculation described above is repeated correspondingly for successive displacements as long as the filtering is activated. The time period between displacements is the sample period 1/FS.
The above-described method can be used in a hand-held terminal having a motion controlled scrolling operation by adapting the terminal to: read coordinates of the scrolling display at a fixed reading point of the screen at predetermined time intervals, wherein a set of motion vectors is obtained; determine the scrolling direction based on the set of motion vectors; examine whether the scrolling direction is roughly the same as the direction of a text line; and if so, reinforce the displayed scrolling in the direction of the text line as long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value.
The implementation and embodiments of the present invention have been explained above with some examples. However, it is to be understood that the invention is not restricted by the details of the above embodiments and that numerous changes and modifications can be made by those skilled in the art without departing from the characteristic features of the invention. The described embodiments are to be considered illustrative but not restrictive. Therefore, the invention should be limited only by the attached claims. Thus, alternative implementations defined by the claims, as well as equivalent implementations, are included in the scope of the invention. For example, guidance can be implemented in any scrolling direction, also diagonally. Parameters such as the sample rate and the number of samples, as well as the time intervals, depend on the application used. In some cases it is also possible for a user to adjust some of the above parameters, so that the guidance method reacts differently to a slow user and a quick user.

Claims

1. A guidance method for scrolling data shown on the display of a terminal, where the scrolling direction depends on the tilt angle of the terminal, characterized by the steps of: reading the coordinates of the scrolling display at a fixed reading point of the screen at predetermined time intervals, wherein a set of motion vectors is obtained, determining the scrolling direction based on the set of motion vectors, examining whether the scrolling direction is roughly the same as the direction of a text line, and if so, reinforcing the displayed scrolling in the direction of the text line as long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value.
2. The method according to claim 1, characterized further by comprising the steps of: detecting periodically whether text is shown on the display, defining the direction of the text lines in response to the detection and saving the result together with the detection time, calculating the motion distribution as a function of time by using the motion vector set, and comparing the motion distribution with the saved result.
3. The method according to claim 1, characterized in that the direction of text lines is defined by character recognition.
4. The method according to claim 1, characterized in that the direction of text lines is defined on the basis of the coordinates.
5. A hand-held terminal having a motion-controlled scrolling operation, characterized in that the terminal is adapted to: read coordinates of the scrolling display at a fixed reading point of the screen at predetermined time intervals, wherein a set of motion vectors is obtained, determine the scrolling direction based on the set of motion vectors, examine whether the scrolling direction is roughly the same as the direction of a text line, and if so, reinforce the displayed scrolling in the direction of the text line as long as the difference between the current scrolling direction and the text line direction is less than a predetermined threshold value.
6. The hand-held terminal as in claim 5, characterized in that the user terminal is further adapted to detect periodically whether text is shown on the display, define the direction of the text lines in response to the detection and save the result together with the detection time, calculate the motion distribution as a function of time by using the motion vector set, and compare the motion distribution with the saved result.
PCT/FI2003/000598 2002-09-02 2003-08-12 Guidance method and device WO2004021166A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP03790973A EP1535142A1 (en) 2002-09-02 2003-08-12 Guidance method and device
AU2003249135A AU2003249135A1 (en) 2002-09-02 2003-08-12 Guidance method and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20021560 2002-09-02
FI20021560A FI115255B (en) 2002-09-02 2002-09-02 Monitor control method for a mobile terminal and a mobile terminal

Publications (1)

Publication Number Publication Date
WO2004021166A1 true WO2004021166A1 (en) 2004-03-11

Family

ID=8564507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2003/000598 WO2004021166A1 (en) 2002-09-02 2003-08-12 Guidance method and device

Country Status (4)

Country Link
EP (1) EP1535142A1 (en)
AU (1) AU2003249135A1 (en)
FI (1) FI115255B (en)
WO (1) WO2004021166A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5602566A (en) * 1993-08-24 1997-02-11 Hitachi, Ltd. Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor
EP0805389A2 (en) * 1996-04-30 1997-11-05 Sun Microsystems, Inc. Tilt-scrolling on the sunpad
WO1998014863A2 (en) * 1996-10-01 1998-04-09 Philips Electronics N.V. Hand-held image display device
GB2336747A (en) * 1998-04-22 1999-10-27 Nec Corp Hand held communication terminal and method of scrolling display screen of the same.


Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2412048A (en) * 2004-03-09 2005-09-14 Jitendra Jayantilal Ranpura Viewing an image larger than the display device
US8022970B2 (en) * 2004-10-14 2011-09-20 Canon Kabushiki Kaisha Image processing result display apparatus, image processing result display method, and program for implementing the method
US9933913B2 (en) 2005-12-30 2018-04-03 Apple Inc. Portable electronic device with interface reconfiguration mode
US11650713B2 (en) 2005-12-30 2023-05-16 Apple Inc. Portable electronic device with interface reconfiguration mode
US11449194B2 (en) 2005-12-30 2022-09-20 Apple Inc. Portable electronic device with interface reconfiguration mode
US10915224B2 (en) 2005-12-30 2021-02-09 Apple Inc. Portable electronic device with interface reconfiguration mode
US10884579B2 (en) 2005-12-30 2021-01-05 Apple Inc. Portable electronic device with interface reconfiguration mode
US10359907B2 (en) 2005-12-30 2019-07-23 Apple Inc. Portable electronic device with interface reconfiguration mode
US11106326B2 (en) 2006-09-06 2021-08-31 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US10228815B2 (en) 2006-09-06 2019-03-12 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US11023122B2 (en) 2006-09-06 2021-06-01 Apple Inc. Video manager for portable multifunction device
US11921969B2 (en) 2006-09-06 2024-03-05 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US11736602B2 (en) 2006-09-06 2023-08-22 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US9690446B2 (en) 2006-09-06 2017-06-27 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US7864163B2 (en) 2006-09-06 2011-01-04 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US11592952B2 (en) 2006-09-06 2023-02-28 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US11240362B2 (en) 2006-09-06 2022-02-01 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US9927970B2 (en) 2006-09-06 2018-03-27 Apple Inc. Portable electronic device performing similar operations for different gestures
US8531423B2 (en) 2006-09-06 2013-09-10 Apple Inc. Video manager for portable multifunction device
US11481106B2 (en) 2006-09-06 2022-10-25 Apple Inc. Video manager for portable multifunction device
US10222977B2 (en) 2006-09-06 2019-03-05 Apple Inc. Portable electronic device performing similar operations for different gestures
US8547355B2 (en) 2006-09-06 2013-10-01 Apple Inc. Video manager for portable multifunction device
US10838617B2 (en) 2006-09-06 2020-11-17 Apple Inc. Portable electronic device performing similar operations for different gestures
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8669950B2 (en) 2006-09-06 2014-03-11 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US11481112B2 (en) 2006-09-06 2022-10-25 Apple Inc. Portable electronic device performing similar operations for different gestures
US10778828B2 (en) 2006-09-06 2020-09-15 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US10656778B2 (en) 2006-09-06 2020-05-19 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US8842074B2 (en) 2006-09-06 2014-09-23 Apple Inc. Portable electronic device performing similar operations for different gestures
US8214768B2 (en) 2007-01-05 2012-07-03 Apple Inc. Method, system, and graphical user interface for viewing multiple application windows
US10732821B2 (en) 2007-01-07 2020-08-04 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US10254949B2 (en) 2007-01-07 2019-04-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11169691B2 (en) 2007-01-07 2021-11-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11586348B2 (en) 2007-01-07 2023-02-21 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US9367232B2 (en) 2007-01-07 2016-06-14 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US10761691B2 (en) 2007-06-29 2020-09-01 Apple Inc. Portable multifunction device with animated user interface transitions
US11507255B2 (en) 2007-06-29 2022-11-22 Apple Inc. Portable multifunction device with animated sliding user interface transitions
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US11604559B2 (en) 2007-09-04 2023-03-14 Apple Inc. Editing interface
US10620780B2 (en) 2007-09-04 2020-04-14 Apple Inc. Editing interface
US11010017B2 (en) 2007-09-04 2021-05-18 Apple Inc. Editing interface
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US11861138B2 (en) 2007-09-04 2024-01-02 Apple Inc. Application menu user interface
US10628028B2 (en) 2008-01-06 2020-04-21 Apple Inc. Replacing display of icons in response to a gesture
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US8736561B2 (en) 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US8438504B2 (en) 2010-01-06 2013-05-07 Apple Inc. Device, method, and graphical user interface for navigating through multiple viewing areas
US9733812B2 (en) 2010-01-06 2017-08-15 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US9720514B2 (en) 2013-12-27 2017-08-01 Google Technology Holdings LLC Method and system for tilt-based actuation
US9383818B2 (en) 2013-12-27 2016-07-05 Google Technology Holdings LLC Method and system for tilt-based actuation

Also Published As

Publication number Publication date
FI20021560A0 (en) 2002-09-02
FI115255B (en) 2005-03-31
FI20021560A (en) 2004-03-03
EP1535142A1 (en) 2005-06-01
AU2003249135A1 (en) 2004-03-19

Similar Documents

Publication Publication Date Title
WO2004021166A1 (en) Guidance method and device
US7129926B2 (en) Navigation tool
EP1507192B1 (en) Detection of a dwell gesture by examining parameters associated with pen motion
EP1328919B2 (en) Pointer tool
JP4043447B2 (en) Written motion image classification recognition system and recognition method thereof
US20070070037A1 (en) Graphic signal display apparatus and method for hand-held terminal
US8531410B2 (en) Finger occlusion avoidance on touch display devices
US20070018966A1 (en) Predicted object location
US20090002324A1 (en) Method, Apparatus and Computer Program Product for Providing a Scrolling Mechanism for Touch Screen Devices
EP1887776A1 (en) Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
EP2306291A2 (en) Information display device
US20080030477A1 (en) Display Motion Multiplier
US20040208394A1 (en) Image display device and method for preventing image Blurring
JP5604279B2 (en) Gesture recognition apparatus, method, program, and computer-readable medium storing the program
JPH0772970A (en) Apparatus and method for selection of information
KR100777107B1 (en) apparatus and method for handwriting recognition using acceleration sensor
JPWO2011064895A1 (en) MAP DISPLAY DEVICE, MAP DISPLAY METHOD, MAP DISPLAY PROGRAM, AND RECORDING MEDIUM
WO2008054185A1 (en) Method of moving/enlarging/reducing a virtual screen by movement of display device and hand helded information equipment using the same
EP1993025A2 (en) Delay judgment systems and methods
US20110043453A1 (en) Finger occlusion avoidance on touch display devices
KR20070051312A (en) Video device
US8355599B2 (en) Methods and devices for detecting changes in background of images using multiple binary images thereof and hough transformation
US20060195800A1 (en) Apparatus for displaying screen and recording medium recording a program thereof
EP3125089B1 (en) Terminal device, display control method, and program
EP3379451A1 (en) Information processing device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003790973

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2003790973

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP