WO2001009861A2 - Method and system for interactive motion training - Google Patents

Method and system for interactive motion training

Info

Publication number
WO2001009861A2
Authority
WO
WIPO (PCT)
Prior art keywords
animation
user
sequence
animated
motion
Prior art date
Application number
PCT/US2000/020752
Other languages
French (fr)
Other versions
WO2001009861A3 (en)
Inventor
John T. Ragland, Jr.
Douglas K. Fulcher
Original Assignee
Ragland John T Jr
Fulcher Douglas K
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ragland John T Jr, Fulcher Douglas K filed Critical Ragland John T Jr
Priority to AU65033/00A priority Critical patent/AU6503300A/en
Publication of WO2001009861A2 publication Critical patent/WO2001009861A2/en
Publication of WO2001009861A3 publication Critical patent/WO2001009861A3/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Definitions

  • the invention accordingly comprises the several steps and the relation of one or more of such steps with respect to each of the others, the system embodying features of construction, combinations of elements and arrangement of parts which are adapted to effect such steps, and the interface embodying features of construction, combinations of elements and arrangement of parts which are adapted to allow user interaction therewith, all as exemplified in the following detailed disclosure.
  • Fig. 4B is the main control user-interface illustrating the product marketing introduction sequence as it can be displayed on a monitor according to one embodiment of the present invention
  • Fig. 5C is the main control user-interface illustrating a third frame from the start segment of the animation motion sequence from a side view as it can be displayed on a monitor according to one embodiment of the present invention
  • Fig. 10C illustrates another additional instructional topic as it can be displayed on a monitor according to one embodiment of the present invention
  • active display panel 406 may show a live-action video, as shown in Fig. 4B
  • static display panel 410 may provide additional information links 418, such as hyperlinks and the like, to additional explanatory materials. Selecting any of information links 418 may provide additional information to the user, either in static display panel 410, active display panel 406 or by opening an additional browser window, and the like. After reviewing this additional material, the user will typically press next button 416 to enter the lesson itself.
  • Fig. 7A shows the segment identified as step two from a side camera perspective
  • Fig. 7B shows the same frame from a front camera perspective
  • Fig. 8A shows the segment or subsequence identified as step 3, and in particular, the initial frame of that segment or subsequence.
  • Slider 800 shows a segment frame region 802 associated with the frames of the segment or subsequence, and a frame marker 804 associated with the particular frame being shown.
  • Fig. 8B and Fig. 8C show later frames of the segment, as shown by frame marker 804.
  • Fig. 8D shows the final frame of the segment from a side camera perspective
  • Fig. 8E shows the same frame from a front camera perspective.
  • an animation software unit is made up of one or more instructional motion sequences or subsequences embedded into a software unit.
  • a software element known as a drop-down or pick list allows the user to have control over which instructional lesson he or she wants to view and practice. Most students learn the basic fundamentals and then go on to learn the intermediate and advanced motion techniques at their own pace and time schedule.
  • Fig. 12 is a block diagram illustrating the hierarchy and navigational flow for the instructional content according to one embodiment of the present invention. Referring to Fig. 12, in step 702, the user selects the "Lessons" option.
  • a software display program using the animation unit as an input provides a user interface allowing the user to view and control the animation contained within the animation unit.
  • the software display program may interface with an Internet browser, television, or any hand-held computer device to display and control the animation.
  • the software display program accepts and processes user input to control the digitized animation motion sequence, individual motion segments, the speed of animation playback, and one or more different camera views of the same motion sequence.
  • the software display program may also allow a user to access the additional information provided with the animation by, for example, using a pointing device to manipulate a cursor to indicate an object for which additional information is desired.
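The player controls described in the definitions above — segment selection, camera views, and frame-by-frame playback confined to one segment — can be sketched as follows. This is a minimal illustration, not the patent's implementation; all class, method, and attribute names are hypothetical.

```python
# A minimal sketch of the player controls described above. All names
# are hypothetical; the patent does not specify an implementation.

class AnimationPlayer:
    def __init__(self, segments, views):
        # segments: list of (start_frame, end_frame) pairs, inclusive
        # views: available camera views, e.g. ["side", "front"]
        self.segments = segments
        self.views = views
        self.view = views[0]
        self.speed = 1.0             # playback-rate multiplier for a render loop
        self.frame = segments[0][0]

    def select_segment(self, index):
        """Jump to the first frame of an individual motion segment."""
        self.frame = self.segments[index][0]
        return self.frame

    def set_view(self, view):
        """Switch camera perspective, e.g. from side to front."""
        if view in self.views:
            self.view = view

    def step(self):
        """Advance one frame, stopping at the end of the current segment."""
        for start, end in self.segments:
            if start <= self.frame <= end:
                self.frame = min(self.frame + 1, end)
                break
        return self.frame
```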

Abstract

The invention employs digitized animation motion sequences to provide training in the performance of a physical activity utilizing a computer-based interactive user-interface. The digitized animation motion sequences are moving images of an instructor dynamically performing particular techniques from a pre-written script (410) using motion capture hardware and software. A digitized animation unit contains one or more digitized animation motion sequences condensed into a single software computer data file. In addition to the pre-written script text, the animation unit contains audio (426), static images (406), and one or more different views (420, 422) of the same animation motion sequences. An interactive computer-based user-interface allows a student to display and control different aspects of the animation unit, such as playback speed and camera views, on a computer connected to a communications network, the Internet, or a television, or on any hand-held computer communications device.

Description

METHOD AND SYSTEM FOR INTERACTIVE
MOTION TRAINING AND A USER INTERFACE
THEREFOR
CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of priority from the prior provisional application entitled "METHOD AND SYSTEM FOR INTERACTIVE MOTION TRAINING AND A USER INTERFACE THEREFOR", serial number 60/146,933, filed in the United States Patent and Trademark Office on July 30, 1999.
FIELD OF THE INVENTION
The present invention relates generally to a process for capturing three-dimensional motions of an instructor, actor or other demonstrator and converting the three-dimensional motion information into a three- or two-dimensional digital display format; developing a digital character including equipment such as clothing and gear appropriate for the motion being taught and having the same physical characteristics, height, weight, and shape as the instructor; dividing the full training motion sequence into individual motion segments; compressing the motion segments into a single, digitized software animation unit; and adding textual, audio or visual explanatory material associated with a particular motion segment to the animation unit. The present invention also relates generally to a computer software program that retrieves the animation unit from a remote file server or a local database or other location; and controls the animation unit to display the training motion, individual motion segments and/or the explanatory material on a display, or, for example, an Internet browser, television, or any hand-held or other computer device capable of running the software program and providing a display.
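The animation unit described above, bundling motion segments with their explanatory material, might be organized along the following lines. The field names and layout are assumptions for illustration only; the patent does not specify a file format.

```python
# A hypothetical sketch of an "animation unit" that bundles motion
# segments with explanatory material. Field names are illustrative;
# the patent leaves the data layout unspecified.

from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class MotionSegment:
    name: str            # e.g. "step one: starting stance"
    start_frame: int
    end_frame: int       # inclusive
    text: str = ""       # explanatory text shown with this segment
    audio: bytes = b""   # optional narration


@dataclass
class AnimationUnit:
    views: List[str]                                        # e.g. ["side", "front"]
    frames: Dict[str, list] = field(default_factory=dict)   # vector data per view
    segments: List[MotionSegment] = field(default_factory=list)

    def segment_for_frame(self, n: int) -> Optional[MotionSegment]:
        """Return the motion segment containing frame n, if any."""
        for seg in self.segments:
            if seg.start_frame <= n <= seg.end_frame:
                return seg
        return None
```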
BACKGROUND OF THE INVENTION
Motion training is used to teach millions of people how to perform or perfect a new technique. The method of learning a motion most often employed by those seeking to learn new techniques is self-education using books or videotapes or other educational material. For example, students may purchase instructional books or videotapes to read or review the proper techniques performed by an instructor, and then proceed to perform the motions on their own using the guidelines provided by the material. These methods, however, are static and typically do not allow the student to interact or control the way in which motions are displayed. Also, the student may not gain the benefit of seeing the same motion from different vantage points or camera views. Furthermore, for the average person, this process can be difficult and unrewarding because there is typically no review of whether the student's technique was performed accurately, and it therefore often leads students to learn the motions improperly. Motion training can also be taught by an instructor verbally directing the student to recognize the desired positions and sequence of the motions by imitation, feel and through the explanatory comments of the instructor. To receive the most money for his or her time, the instructor usually assembles a group of people, or class, and performs the techniques for many people at once. This method of learning can be too costly and time-restrictive for the student, as he or she is subject to the time schedule and location at which the instructor teaches the techniques. And, if the student wants to repeat any of the techniques with the instructor, or have the benefit of one-to-one instruction, an additional fee is charged to the student above and beyond the cost of the original lesson plans.
Furthermore, the motions of the instructor are presented at one speed — as quickly as the instructor naturally moves; in one temporal direction — forward in time; and usually in one complete sequence. The segments or subsequences which may comprise the sequence of an activity are not easily viewed or separated by either the instructor or a student viewing a live demonstration.
What is desired is a method and software system for training motions to a student which provides a digitally displayed character wearing appropriate equipment such as clothing and gear for the motion to be taught, where the character resembles an actual instructor who has performed the motion, possibly in accordance with sanctioned script instructions before a motion capture setup; and where the student may select individual motion segments comprising the complete motion sequence to display and redisplay along with explanatory material associated with that segment or subsequence; and where the student may select one of several camera angles for viewing the motion; and where the student may select individual portions of the digital character display to view additional information related to that portion.
SUMMARY OF THE INVENTION
Generally speaking, in accordance with the present invention, a method and system are presented for condensing one or more three-dimensional, digital animation motion sequences into a self-contained, digitized animation unit, and dividing the motion sequences into separate motion segments or subsequences associated with explanatory text, audio, graphic or other material; and for providing an interactive user-interface for displaying and controlling the animation unit. The separate motion segments or subsequences of the overall animation sequence may be played as individual self-contained units by a user, allowing a user to focus on one particular portion of a physical activity (e.g., only the ball toss subsequence in a tennis serve sequence). The user may play such a subsequence so that the motion display starts at the beginning of the subsequence, and not before, and ends at the end of the subsequence, and not after. Further, explanatory material associated with that segment may only be presented while the user is viewing that motion segment. Thus other portions or subsequences of the animation do not interfere with the user's study of the subsequence.
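The segment-isolated playback described above — display clamped to the boundaries of a subsequence, with only that segment's explanatory material presented — can be sketched as follows. The function name and argument shapes are illustrative assumptions, not the patent's implementation.

```python
# Sketch of segment-isolated playback: frames before the subsequence
# start or after its end are never shown, and only notes attached to
# frames within the segment are presented.

def play_segment(frames, notes, start, end):
    """Yield (frame, note) pairs restricted to one motion segment.

    frames: full list of animation frames
    notes:  dict mapping frame index -> explanatory text
    start, end: inclusive frame range of the segment
    """
    for i in range(start, end + 1):
        yield frames[i], notes.get(i)
```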
The present invention provides for a method of capturing the dynamic motion techniques of an instructor performing a motion, preferably using a script sanctioned or approved by an appropriate organization, digitizing the motion into data files, superimposing a character graphic on the three-dimensional animation data, converting the data into three- or two-dimensional vector data, cleaning and filtering the animation data to smooth the appearance and highlight those portions of the character which are most relevant to instruction of the technique, dividing the motion sequence into individual motion segments, adding explanatory material associated with each individual segment, and efficiently delivering the animation unit to a student using an interactive control interface over a communication network, the Internet, television, or any hand-held computer device capable of running an interactive user interface for controlling the animation unit. The present invention also provides for creating a simple, inexpensive and easy-to-use motion training user-interface, providing instructional information to a user, which displays a vector graphical representation or animation of a motion technique of an instructor who has correctly performed the motion sequence, and which also provides audio, text, or other material and Internet hyperlinks associated with one or more digitized motion segments or portions of the sequence, or associated with a portion of the digital character representation in an interactive environment. The software display program and user interface use the animation unit as input. The animation unit may also include an entire course of training motion sequences performed by an instructor dynamically performing particular motion techniques.
The present invention further provides for a software display program or player, capable of displaying the motion techniques from an animation unit on an Internet browser, television, or any hand-held computer device capable of running the software player. The software player accepts and processes user input to control the digitized animation motion sequence, individual motion segments, the speed of animation playback, and one or more different camera views of the same motion sequence. The software player typically also allows the user to display and review the animation motion sequences and individual segments to help the student isolate the different motions that make up the complete technique and provide additional explanatory material associated with certain portions of the display.
Accordingly, it is an important object of this invention to provide a method and system for interactive motion training including converting three dimensional training motion sequences performed by an instructor to three- or two- dimensional vector graphic data; including explanatory text, audio, graphical or other material associated with an individual motion segment of the training motion sequence; and providing an interactive user interface for displaying and controlling the training motion sequence and individual motion segments while viewing the explanatory material.
Another object of the invention is to provide a method and system for training a user to perform a motion sequence including a three- or two-dimensional vector graphic representation of an instructor performing the motion sequence, and an interactive user display allowing the user to select specific motion segments from the training motion sequence to review, which motion segments are associated with explanatory material also displayed on the user interface.
A further object of the invention is to provide a user interface for displaying three- or two-dimensional vector graphic representations of a training sequence and explanatory material associated with individual segments of the training sequence motion wherein the user may select one of several camera angles with which to view the motion segments, and may select individual portions of the representation whereby the user is shown additional information associated with that portion of the representation.
Still another object of the invention is to provide a user interface for displaying training motion sequences in which the user may select individual motion segments for display, may freeze, reverse or fast forward individual frames or a range of frames from the training motion sequence, and may view additional material associated with an individual frame or range of frames or associated with a user selected portion of the representation.
Still a further object of the invention is to provide a digitized, sanctioned and approved motion training sequence as performed by an experienced instructor and explanatory materials associated with individual motion segments of the sequence; and a software program for controlling and displaying the sequence, whereby the student may control the camera angle for viewing the sequence, and a frame or range of frames comprising a portion of the training sequence or an individual motion segment.
Yet another object of the invention is to provide a user interface displaying an animation unit including individual motion segments forming a motion sequence; and displaying explanatory text associated with a frame or range of frames of the motion sequence; and further providing additional information, such as audio, graphic or other material, in response to user interaction.
Yet a further object of the invention is to provide a system and method for easily teaching a motion technique by allowing a student to view a motion sequence broken down into individual motion segments, wherein the motion sequence is generated from an instructor performing a script approved or sanctioned by an appropriate organization and creating a vector graphic representation, and providing explanatory material associated with portions of the motion sequence.
Still other objects and advantages of the invention will in part be obvious and will in part be apparent from the following detailed specification. The invention accordingly comprises the several steps and the relation of one or more of such steps with respect to each of the others, the system embodying features of construction, combinations of elements and arrangement of parts which are adapted to effect such steps, and the interface embodying features of construction, combinations of elements and arrangement of parts which are adapted to allow user interaction therewith, all as exemplified in the following detailed disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present invention and its associated advantages, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram of the major elements of one implementation of the present invention;
Fig. 2A is a block diagram of the individual elements that make up the major element 102 in Fig. 1;
Fig. 2B is a block diagram of the individual elements that make up the major element 106 in Fig. 1;
Fig. 2C is a block diagram of the individual elements that make up the major element 110 in Fig. 1;
Fig. 2D is a block diagram of the individual elements that make up the major element 114 in Fig. 1;
Fig. 3 is a diagram illustrating the individual instructional segments that make up an animation sequence to be displayed on a monitor according to one embodiment of the present invention;
Fig. 4A is the main control user-interface illustrating a number of animation sequences, with level of expertise, as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 4B is the main control user-interface illustrating the product marketing introduction sequence as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 5A is the main control user-interface illustrating one frame of the starting segment of the animation motion sequence from a side view as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 5B is the main control user-interface illustrating another frame from the start segment of the animation motion sequence from a side view as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 5C is the main control user-interface illustrating a third frame from the start segment of the animation motion sequence from a side view as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 6 is the main control user-interface illustrating the last frame from the first individual instructional segment and highlighted areas of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 7A is the main control user-interface illustrating the last frame from the second individual instructional segment and highlighted area of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 7B is the main control user-interface illustrating the last frame from the second individual instructional segment and highlighted area of focus as it can be displayed on a monitor from a front camera view according to one embodiment of the present invention;
Fig. 8A is the main control user-interface illustrating the first frame from the third individual instructional segment as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 8B is the main control user-interface illustrating a latter frame from the third individual instructional segment as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 8C is the main control user-interface illustrating still a later frame from the third individual instructional segment as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 8D is the main control user-interface illustrating the last frame from the third individual instructional segment and highlighted areas of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 8E is the main control user-interface illustrating the last frame from the third individual instructional segment and highlighted areas of focus as it can be displayed on a monitor from a front camera view according to one embodiment of the present invention;
Fig. 9A is the main control user-interface illustrating the last frame of the final individual instructional segment and highlighted areas of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 9B is the main control user-interface illustrating the individual highlighted area of focus on the final individual instructional segment as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 9C is the main control user-interface illustrating another individual highlighted area of focus on the final individual instructional segment as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 10A is the main control user-interface illustrating the ending instructional segment as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 10B illustrates an additional instructional topic as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 10C illustrates another additional instructional topic as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 11A is the main control user-interface describing the equipment used for the instructional sport with highlighted areas of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention;
Fig. 11B illustrates the textual description for the individual highlighted area of focus for an equipment item used for the instructional sport as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 11C illustrates the textual description for an individual highlighted area of focus for another equipment item used for the instructional sport as it can be displayed on a monitor according to one embodiment of the present invention;
Fig. 12 is a block diagram illustrating the hierarchy and navigational flow for the instructional content according to one embodiment of the present invention;
Fig. 13 is a block diagram illustrating the individual components of the animation unit according to one embodiment of the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENT
In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without these specific details. Furthermore, well-known features may be omitted or simplified in the description in order not to obscure the present invention. The present invention is utilized for providing instructional information including motion training techniques to a user; in particular, motion training for various sports, physical therapy, dance, workplace or recreational activities. The present invention is particularly useful in training a motion sequence in which the positions of the body and its limbs, as well as sport, therapy, work, or recreational equipment, are to be compared during the motion sequence and emulated by the student.
In a sporting activity, a participant who is trained in the proper motion technique may be able to improve his or her skill level or avoid an injury from performing a repetitive motion accurately. Physical therapy can be more effective when exercising a specific muscle needing therapy through a particular motion when that motion technique is performed correctly. In the workplace, a worker's safety may be protected or a gain in production efficiency may result from proper technique in performing a particular motion.
In many repetitive motion activities the larger muscles of the body typically do most of the work. Golf, tennis, baseball, football, basketball, running, and many physical therapies, to name a few, are based on fundamental movements and techniques that typically result from the correct positioning of the body as a whole. Correct motion technique and placement of the major body parts with respect to one another is of major importance to the successful completion of the intended action. Even where the perceived motion action (i.e., throwing a baseball) is performed more by the joints or extremities than the larger body parts, it is still usually the proper sequence of movements of the large muscles of the body which brings the joints and extremities into correct alignment with the appropriate timing of execution.
In fact, for proper results in successfully performing the technique in many motions of sports such as golf, tennis, and baseball, where the object is to hit a ball and the like, it is actually more important for the extremities to be in relatively correct alignment, through proper positioning of the larger muscles of the body, than for them to move in ways that are subtly counterproductive to maximum impact. Incorrect motion existing solely in the movements of the joints and extremities (i.e., poorly timed rolling of the forearms in a golf swing) is more clearly demonstrated to the student with the present invention, which, unlike books or videotapes, can interactively focus on a particular portion of the movement. The student can see the actual shapes of the limbs in question moving in concert with one another and can highlight, through interaction with the display as described further below, the position of the extremity. That is, because the motion display sequence has certain highlighted areas, the student viewing a segment or subsequence may focus on a detailed view of the portion of the display highlighted to learn the correct positioning.
Although the present invention will now be described with regard to a particular in-line skating stop, this is but one example. The present invention may also be used for, among other techniques, training strokes or serves in tennis, hitting or pitching in baseball, running, hitting a nail with a hammer, the proper technique for picking up heavy objects, and the like. Furthermore, the system and method of the present invention may be used for training in any sort of physical activity, such as physical therapy or dance.
The following embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that structural, logical, and graphical changes may be made without departing from the spirit and scope of the present invention.
A governing body of representatives in the field in which the technique resides, or other appropriate organization, may be consulted so that a sanctioned technique can be developed. A script may be written for an instructor who performs the technique in the approved way. When used herein, the term "instructor" may mean any person demonstrating a physical activity, and is not limited to a person who is a certified trainer or a professional in a given field.
Fig. 1 is a block diagram of the major elements of one implementation of the present invention. Referring to Fig. 1, in step 102, motion capture cameras surrounding the area of movement detect light reflecting from optical balls positioned on the instructor and motion capture software digitizes the three-dimensional movements into data points. In step 104, the software further generates HTR (hierarchical translation-rotation) motion data from the data points captured. It is well known to capture the motion of an actor wearing devices such as optical balls or lights indicating the relative motion of the actor's body parts; other methods of motion capture may be used with the system and method of the present invention. The creation and use of HTR file data is well known. In step 106, a three-dimensional character is created which typically includes all equipment, including equipment such as safety gear and clothing used by people who perform the motion technique, in a computer software program. The HTR motion data is applied to the three-dimensional character model, which creates a three-dimensional version of the motion sequence. To optimize the viewing of the motion sequences on an on-line interactive display such as a browser, an additional computer software program may be installed on the computer to allow the main software program to call individual functions of the installed program. This type of linked program is known as a plug-in.
In step 108, a plug-in program is used to render the three-dimensional motion sequences into two-dimensional vector data. One example of such a computer software program is 3D Studio Max running on a Microsoft Windows operating system; other software programs or modules may be used. A preferred minimum configuration supporting such a program includes 128 Mbytes of main memory and an Intel Pentium 333 MHz processor. Those skilled in the art will recognize, however, that other comparable hardware and software programs may be employed without diverting from the spirit or scope of the present invention. Furthermore, where bandwidth is not an issue, the invention may be performed using a three-dimensional representation.
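The reduction of three-dimensional motion data to two-dimensional vector data in step 108 is performed by existing rendering software; the patent does not specify the mathematics. As a rough illustration only, a simple pinhole-style perspective projection maps each 3-D point to a 2-D one:

```python
# Illustrative perspective projection of 3-D points to 2-D, as one way
# the dimensional reduction of step 108 could work. This is a sketch;
# the patent relies on existing rendering software for this step.

def project(points3d, focal_length=1.0):
    """Project (x, y, z) points onto the z = focal_length image plane.

    Assumes all points lie in front of the camera (z > 0).
    """
    points2d = []
    for x, y, z in points3d:
        scale = focal_length / z
        points2d.append((x * scale, y * scale))
    return points2d
```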
In step 110, the two-dimensional vector data files are imported into a vector- based animation software program. One such software program is Macromedia's Flash development environment; other software programs may be used. In step 112, animation data is created, and in step 114 an interactive animation unit or an animation data file is created. The images are treated, which may include editing unwanted data points or distortion, making line fixes, and colorizing, among other processes. The images may also be registered to remove lateral motion. Thus, a clean, two-dimensional graphic representation of the instructor wearing appropriate equipment, such as clothing and gear for performing the motion technique, is created to form the animation unit. In alternate embodiments, any representation of the animation information, such as an animation data file or remotely accessible animation data, may be created for later viewing by a user.
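Registration to remove lateral motion, mentioned above, can be sketched as translating every frame so that a chosen anchor point stays horizontally fixed, leaving the figure's pose intact while removing drift across the capture volume. This is a minimal illustration; the choice of anchor and the per-frame point layout are assumptions, not details from the patent.

```python
def register_frames(frames, anchor_index=0):
    """Translate each frame's points so the anchor point's x-coordinate
    matches frame 0, removing lateral (horizontal) drift while keeping
    the relative pose within each frame unchanged."""
    ref_x = frames[0][anchor_index][0]
    registered = []
    for frame in frames:
        dx = frame[anchor_index][0] - ref_x
        registered.append([(x - dx, y) for (x, y) in frame])
    return registered

# Two frames of a 2-point figure that has drifted 5 units to the right:
raw = [[(0.0, 0.0), (1.0, 2.0)],
       [(5.0, 0.0), (6.0, 2.1)]]
clean = register_frames(raw)
```

After registration the figure animates "in place," which suits the looped segment display described later; the moving-background technique mentioned in connection with Fig. 5A can then reintroduce a sense of speed.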
Fig. 2A is a block diagram of the individual elements that make up the major element 102 in Fig. 1. Referring to Fig. 2A, in step 202 the MOCAP cameras are set up. In step 204, the instructor is fitted with a MOCAP suit or other equipment facilitating motion capture. Preferably, the instructor is fitted with a body suit for capturing motion, such as one including light-reflecting optical balls, approximately 1 inch in diameter, placed on the suit at key joint locations such as the elbow, knee, neck, etc. In step 206, the instructor performs the physical activity in the motion capture environment, for example the approved motion techniques outlined in an approved script. If such a script is used, it may be written with the cooperation and approval of the governing body of representatives or other appropriate organization that has joined together to promote and grow the activity or sport. The script comprises the steps that the instructor performs to teach the particular technique in the motion capture environment.
Preferably, the motion capture environment includes light sensitive, optical cameras, computer hardware and the software to control the environment, but may include other combinations of equipment. One example of such an environment is a system provided by Motion Analysis Corporation. Other systems and methods of capturing motion may be used. In alternate embodiments a sanctioned script need not be provided; for example, an actor or instructor may simply execute a physical activity, such as jumping rope or performing therapeutic exercises according to a physical therapy regimen. In step 208, motion data points are generated. In step 210, the system and software generates HTR motion data from the data points captured. Fig. 2B is a block diagram of the individual elements that make up the major element 106 in Fig. 1. Referring to Fig. 2B, in step 212, a three-dimensional character is created. The character may correspond in physical likeness to the person whose image has been captured, or alternately may differ in such physical likeness. In step 214, HTR motion data is applied to the three-dimensional character model. In step 216, a 3-D motion sequence is generated. In step 218, vector graph data is rendered from the 3D motion sequence in known ways.
Fig. 2C is a block diagram of the individual elements that make up the major element 110 in Fig. 1. Referring to Fig. 2C, in step 220, the two-dimensional vector data files are imported into a vector-based animation software program. In step 222, the images are colorized and smoothed. In step 224, the images are also registered to remove lateral motion.
Fig. 2D is a block diagram of the individual elements that make up the major element 114 in Fig. 1. Referring to Fig. 2D, in step 226 interactive interface controls are applied. In step 228 explanatory or other materials may be added. In step 230 the explanatory materials may be associated with the relevant segments. In step 232 hot links to the animation unit may be added to provide highlighted areas of the animation sequence display, and, preferably, additional explanatory materials are added. In step 234 the animation unit may be exported. The animation unit provides instructional information for a motion sequence to a user.
The motion technique sequence which is thus created may be divided into individual motion segments or subsequences, each of which may correspond to portions of an approved script or to logical breaks in the motion sequence. For a given motion sequence to be taught, several individual motion segments may be performed in sequence. For example, segments may correspond to a person tossing a ball in the air, drawing a racquet back, moving the racquet to strike the ball, and following through with the racquet.
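A segment table of the kind just described, with named subsequences covering contiguous frame ranges of the racquet-stroke example, might look like the following sketch. The frame numbers are invented for illustration; the patent does not specify frame counts.

```python
# Hypothetical segment table for the racquet-stroke example in the text:
segments = [
    {"name": "toss ball",      "start": 0,  "stop": 29},
    {"name": "draw racquet",   "start": 30, "stop": 59},
    {"name": "strike ball",    "start": 60, "stop": 89},
    {"name": "follow through", "start": 90, "stop": 119},
]

def segment_for_frame(frame, table):
    """Return the name of the subsequence containing a given frame,
    or None if the frame lies outside the sequence."""
    for seg in table:
        if seg["start"] <= frame <= seg["stop"]:
            return seg["name"]
    return None
```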
Explanatory materials, such as text, graphics, audio or other material may be associated with each motion segment. In addition, certain areas of the representation may be linked to additional content. For example, a typical Internet hyperlink may be used to associate a portion of the representation with such additional content. A user may indicate or request that more information is desired about an object (e.g., a body part or a piece of clothing) shown in the animation by, for example, moving an on-screen indicator, such as a cursor, over the object and possibly clicking on the object using, for example, a pointing device such as a mouse or trackball. Additional information, such as the name of the object or a brief description of the desired position of the object (e.g., the ankle is angled inward), may appear if a user drags a pointer over the object. Clicking on the object may also display additional information - for example, an opportunity to purchase the object if the object is an item of equipment. Methods of associating interactive information such as labels or hyperlinks with displayed images and objects are known. In the example provided, if a user of the interactive training interface clicks on the in-line skates of the digital representation, that user might be brought to a Web site provided by a manufacturer of in-line skates who wishes to advertise. Alternatively, content explaining the correct fit for in-line skates or what features of performance a user should look for when purchasing in-line skates may be shown to the user.
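The object-to-information association described (hover text plus click-through links) can be sketched as a simple lookup keyed by selectable object. The object names and URL below are invented for the sketch; the patent leaves the association mechanism to known methods.

```python
# Illustrative mapping of selectable objects to hover text and click links;
# all names and the URL are hypothetical examples:
object_info = {
    "ankle":  {"hover": "The ankle is angled inward.", "link": None},
    "skates": {"hover": "In-line skates",
               "link": "https://example.com/skates"},
}

def on_hover(obj):
    """Text shown when the cursor is dragged over an object, if any."""
    info = object_info.get(obj)
    return info["hover"] if info else None

def on_click(obj):
    """Link followed when an object is clicked, if any."""
    info = object_info.get(obj)
    return info["link"] if info else None
```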
In another example, involving a baseball pitch, a user may click on the throwing hand of the digital representation of the trainer. If the user clicks the hand at the beginning of the motion sequence the user may be shown a close-up of the grip of that hand around a baseball. If the user clicks the hand at the release point in the motion, the user may be shown a slow motion close-up of the wrist flick being taught at the release point. Of course, it will be readily apparent to those of ordinary skill in the art that any sort of additional information content may be associated with portions or objects of the digital representation at any time during the motion sequence. Allowing a user to control which objects need expanded explanation filters and limits the amount of information provided to a user, allowing for more effective training. Such methods are not easily effected with more traditional physical training, such as live classes, videotapes or books.
The animation unit also includes control structures which allow the animation unit to be affected by the user interface. For example, an index may be provided in the animation unit which determines the start frame and stop frame of the individual motion segments or subsequences within the complete motion sequence and may associate those segments or subsequences with particular explanatory material or other content. Typically, the animation unit also contains additional control information for performing other functions, such as tracking a particular user's viewing of that animation unit and/or monitoring a user's progress through a number of individual motion sequences or subsequences which may be contained in a course of related lessons.
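The control structures just described — an index of segment start and stop frames with associated explanatory material, plus per-user progress tracking — might be sketched as follows. The class and method names are illustrative, not taken from the patent.

```python
class AnimationUnit:
    """Minimal sketch of the described control structures: an index of
    segment frame ranges with associated explanatory text, plus tracking
    of which segments a user has viewed."""

    def __init__(self, index):
        # index maps segment id -> (start_frame, stop_frame, text)
        self.index = index
        self.viewed = set()

    def play_segment(self, segment_id):
        """Look up a segment's frame range and explanatory material,
        recording that the user has viewed it."""
        start, stop, text = self.index[segment_id]
        self.viewed.add(segment_id)
        return start, stop, text

    def progress(self):
        """Fraction of the lesson's segments the user has viewed."""
        return len(self.viewed) / len(self.index)

unit = AnimationUnit({
    1: (0, 30, "Move to the dynamic ready position."),
    2: (31, 60, "Start a right A-frame turn."),
})
```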
The interactive control interface template is a software program adapted to control the display of the animation unit. Therefore, it includes a portion for displaying the motion sequence. Additional video control functions will appear in the control interface template, typically arrayed and applied around the motion animation sequence viewing portion. Other content such as text, Internet hyperlinks, additional still images, and audio comprising, for example, explanatory material associated with each motion segment within the motion sequence, may be embedded into the interactive control interface, making a single, self-contained software program for playing any animation unit on a computer. In a preferred embodiment, the interactive interface works in conjunction with an Internet browser, and utilizes the internal functionality of the Internet browser to perform many of the display tasks and file management tasks such as retrieving the animation unit from a remote Internet Web site. Of course, it will be easily appreciated by those of ordinary skill in the art that the interactive interface can be adapted for use on any sort of computing system, such as a personal computing device having any sort of configuration, a television, a hand-held computer device, or any other device capable of running a software program and providing a display.
In one embodiment, the control user-interface contains software elements allowing the user to view the instruction techniques from several different views or camera angles. This allows the user to have the best vantage point for learning the motion technique being displayed. One example of the benefit of viewing different camera angles is the stride technique for in-line skating. In Fig. 5A, the stride motion technique is shown from the side view, where the user cannot see how far apart the instructor's legs are from each other, or how far out from the center of the instructor's body the legs are. By viewing the stride technique from a different camera angle, such as the front view in Fig. 7B, the user learns the proper leg positions performed by the instructor.
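Offering several camera views of the same three-dimensional motion data can be illustrated with simple orthographic projections. This sketch assumes a fixed axis convention (z toward the front camera, x lateral) that the patent does not specify; a real renderer would use full perspective camera transforms.

```python
def project_to_2d(points3d, camera="front"):
    """Orthographic projection of 3-D character points into 2-D vector
    data for a chosen camera view - a simplified stand-in for rendering
    the same motion sequence from multiple angles."""
    if camera == "front":   # looking along z: drop depth
        return [(x, y) for (x, y, z) in points3d]
    if camera == "side":    # looking along x: drop lateral position
        return [(z, y) for (x, y, z) in points3d]
    raise ValueError("unknown camera: %s" % camera)

pose = [(1.0, 2.0, 3.0)]   # one invented 3-D character point
```

Generating both views from the one captured pose is what lets the interface swap between the side view 420 and front view 422 without re-capturing the motion.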
Fig. 6 is the main control user-interface illustrating the last frame from the first individual instructional segment and highlighted areas of focus as it can be displayed on a monitor from a side camera view according to one embodiment of the present invention. Referring to Fig. 6, the control user-interface enhances the learning experience for the user by displaying visual cues, such as highlighting selected extremities on the instructor's body or equipment. For example, highlights may be applied to the animation sequence along the instructor's hands, knees, the bend in the elbows, or the position of the wrists. These visual cues provide the user with visual assistance in matching corresponding extremities or joints of the user's body as the motion dynamically progresses through the sequence. Fig. 3 is a diagram illustrating the individual instructional segments that make up an animation sequence to be displayed on a monitor according to one embodiment of the present invention. Referring to Fig. 3, in step 302 the user begins the animated sequence. In step 304, the skater is shown skating continuously. In step 306, Step 1 of the animation sequence (corresponding to segment or subsequence 1) begins. In step 308, the animation of the skater moves to the dynamic ready position. In step 310, Step 2 of the animation sequence (corresponding to subsequence 2) begins. In step 312, the skater starts a right A-frame turn. In step 314, Step 3 of the animation sequence (corresponding to subsequence 3) begins. In step 316, the animated skater rotates its head, upper body, and right knee. In step 318, the animated skater applies pressure to the inside edge of the left skate. In step 320, Step 4 of the animation sequence (corresponding to subsequence 4) begins. In step 322, the animated skater pulls its heels together and stands to end the stop. In step 326, the animation sequence ends.
Any individual subsequence may be animated, and the animation of such subsequences may be controlled by being played forwards, backwards, at various speeds, in stop frame form, or by using other methods. The user may play each subsequence without playing any other part of the overall animation, starting at the beginning of the subsequence and ending at the end of the subsequence.
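Playing a subsequence forwards, backwards, or at a reduced frame rate amounts to choosing the order, and stride, in which its frames are shown. A minimal sketch, with the function name and parameters invented for illustration:

```python
def frame_order(start, stop, direction=1, step=1):
    """Return the frame numbers of a subsequence in playback order.
    direction=1 plays forwards, direction=-1 backwards; step > 1 skips
    frames, which a player could use to vary apparent speed."""
    frames = list(range(start, stop + 1, step))
    return frames if direction == 1 else frames[::-1]
```

Stop-frame viewing is then just presenting one element of this list at a time, and playing a subsequence in isolation is simply restricting `start` and `stop` to that subsequence's range.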
For purposes of illustration, a user's interaction with the animation unit will now be described. The animation unit will be accessed by a software player which may show the animation sequence, individual segments, explanatory text and certain control and other interface elements on a computer monitor, for example, or as part of another computer program, such as a browser. It will be readily understood that other means for displaying and controlling the animation unit may be used.
With reference to Fig. 4A, a computer monitor shows an introductory page showing a topic 402, in this case, In-Line Skating Techniques, and subtopic 404, in this case, braking lessons. An active display panel 406 may show various animation sequences related to subtopic 404, for example, standard heel stop, cuff-activated heel stop and grass stop. The user may select a particular animation sequence by clicking on the appropriate portion of active display panel 406 to form a highlight portion 408. A static display panel 410 will typically show text associated with highlight portion 408 or other explanatory materials as described herein. Typical control and navigation elements will also be provided, such as a home button 412, a back button 414 and a next button 416. Upon hitting next button 416, for example, the user may be shown additional introductory or explanatory materials related to highlight portion 408. In this case, active display panel 406 may show a live-action video, as shown in Fig. 4B, and static display panel 410 may provide additional information links 418, such as hyperlinks and the like, to additional explanatory materials. Selecting any of information links 418 may provide additional information to the user, either in static display panel 410, active display panel 406 or by opening an additional browser window, and the like. After reviewing this additional material, the user will typically press next button 416 to enter the lesson itself.
Referring to Fig. 5A, each motion technique may be divided into a number of smaller, logical instructional units, segments or subsequences. This allows the student to isolate an individual instructional segment and replay it using the slide control. The user can view an instructor's motion techniques against a moving background (e.g., a wall or a traffic cone) to give the user a sense of the speed at which the image of the instructor performs the techniques. The user can control different aspects of the animation by a known software element such as a slide control 800 or another element. The user can view the entire motion technique or view the motion techniques one frame at a time. In one embodiment, each mark on the slide control 800 corresponds to one frame of the instructional animation. Preferably, the user can also control the frame rate of the individual motion techniques with the slide control 800. As the animation proceeds, instructional information corresponding to the portion of the animation which is being displayed may be shown on a portion of the screen. Such instructional information may be included in the animation unit or animation file containing animation information.
Referring to Fig. 5A, active display panel 406 will show a looped animation segment associated with the start phase of the lesson. Static display panel 410 will show explanatory materials associated with the start phase of the lesson. The control interface will provide for separate camera angles to view the animation segment in active display panel 406, such as side view 420 and front view 422. Again, additional control buttons linking to additional information may be provided, such as a see more button 424 and a hear more button 426. Also, typical motion control buttons may be provided, such as a rewind button 428, a pause button 430, a stop button 432 and a fast forward button 434. Fig. 5B and Fig. 5C show additional frames in the looped animation segment. Static display panel 410 may include tabbed portions 436 providing direct access to individual segments of the sequence, for example, step one, shown in Fig. 6. In this case, static display panel 410 will show explanatory materials associated with the segment identified as step one and active display panel 406 will show only the frames of the animation sequence associated with that segment. The animated character shown in active display panel 406 may also show highlighted regions 438 which are links to additional information. Upon selecting a highlighted region 438, the user may be shown such additional information.
Fig. 7A shows the segment identified as step two from a side camera perspective, and Fig. 7B shows the same frame from a front camera perspective. Fig. 8A shows the segment or subsequence identified as step 3, and in particular, the initial frame of that segment or subsequence. Slider 800 shows a segment frame region 802 associated with the frames of the segment or subsequence, and a frame marker 804 associated with the particular frame being shown. Fig. 8B and Fig. 8C show later frames of the segment, as shown by frame marker 804. Finally, Fig. 8D shows the final frame of the segment from a side camera perspective, and Fig. 8E shows the same frame from a front camera perspective. By interacting with slider 800, the user can select any specific frame from the segment for closer study, and by choosing different camera views, can see that frame from different perspectives.
Fig. 9A shows the segment identified as step four, with static display panel 410 displaying explanatory materials associated with that segment. Fig. 9B and Fig. 9C show additional explanatory material in static display panel 410 associated with the user selecting highlight regions 438.
Fig. 10A shows explanatory material associated with the end or completion of the lesson. In this case, static display panel 410 may also include additional information links 418. Fig. 10B and Fig. 10C show additional explanatory material in active display panel 406 associated with additional information links 418.
The above-mentioned introductory page, shown in Fig. 4A, may also include links to information other than lessons, such as a link to equipment useful for that topic. Fig. 11A shows the explanatory material in static panel 410 for such information, and active display panel 406 will typically show the animation character, including highlight regions 438 which link to information, such as the information shown in Fig. 11B and in Fig. 11C.
In one embodiment of the system and method of the present invention, an animation software unit is made up of one or more instructional motion sequences or subsequences embedded into a software unit. A software element known as a drop-down or pick list allows the user to control which instructional lesson he or she wants to view and practice. Most students learn the basic fundamentals and then go on to learn the intermediate and advanced motion techniques at their own pace and time schedule. Fig. 12 is a block diagram illustrating the hierarchy and navigational flow for the instructional content according to one embodiment of the present invention. Referring to Fig. 12, in step 702, the user selects the "Lessons" option. From this step, the user may select a number of sports, for example in-line skating (704), golf (706), physical therapy (708) or other activities (710). If the user selects in-line skating (704), the user may be presented with a set of skills comprising in-line skating, for example braking (712), aggressive skating (714), or other options (716). If the user selects braking (712), the user may be presented with various types of braking lessons, for example the standard heel stop (718), the grass stop (720), the spin stop (722) or other methods of stopping (724).
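The Fig. 12 hierarchy can be represented as a nested structure. This sketch uses the sports and lessons named in the text; the tree layout and function name are assumptions made for illustration.

```python
# Hypothetical content tree mirroring the Fig. 12 navigational hierarchy:
lessons = {
    "in-line skating": {
        "braking": ["standard heel stop", "grass stop", "spin stop"],
        "aggressive skating": [],
    },
    "golf": {},
    "physical therapy": {},
}

def available_lessons(tree, sport, skill):
    """Return the lesson options at a given point in the hierarchy,
    or an empty list if the sport or skill is not present."""
    return tree.get(sport, {}).get(skill, [])
```

A drop-down or pick list in the interface would simply be populated from whichever level of this tree the user has navigated to.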
Fig. 13 is a block diagram illustrating the individual components of the animation unit according to one embodiment of the present invention. Referring to Fig. 13, an animation unit (802) according to one embodiment of the present invention includes various files (804), including files containing general information (806) and motion sequence information (808). The motion sequence information (808) includes motion segments (810), which in turn comprises explanatory material (812) and animation frames (814). The explanatory material (812) and animation frames (814) are related and may be operated to interact with each other. In other embodiments, an animation unit may be of different structures, and may include different data.
In an exemplary embodiment a software display program using the animation unit as an input provides a user interface allowing the user to view and control the animation contained within the animation unit. The software display program may interface with an Internet browser, television, or any hand-held computer device to display and control the animation. The software display program accepts and processes user input to control the digitized animation motion sequence, individual motion segments, the speed of animation playback, and one or more different camera views of the same motion sequence. The software display program may also allow a user to access the additional information provided with the animation by, for example, using a pointing device to manipulate a cursor to indicate an object for which additional information is desired.
Accordingly, the present invention provides a process for capturing three-dimensional motions of an instructor and converting the three-dimensional motion information into a three- or two-dimensional digital display format; developing a digital character including equipment appropriate for the motion being taught and having the same physical characteristics, height, weight, and shape as the instructor; dividing the full training motion sequence into individual motion segments; compressing the motion segments into a single, digitized software animation unit; and adding textual, audio or visual explanatory material associated with a particular motion segment to the animation unit. The present invention also relates generally to a computer software program that retrieves the animation unit from a remote file server, a local database or other location, and controls the animation unit to display the training motion, individual motion segments and/or the explanatory material on a display, for example, an Internet browser, television, or any hand-held or other computer device capable of running the software program and providing a display.
It will thus be seen that the objects set forth above, among those made apparent from the preceding description, are efficiently attained and, since certain changes may be made in carrying out the above method and in the system set forth without departing from the spirit and scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense. The present invention may be implemented in different manners and used with different applications.
It is also to be understood that the following claims are intended to cover all of the generic and specific features of the invention herein described and all statements of the scope of the invention which, as a matter of language, might be said to fall therebetween.

Claims

What is claimed is:
1. A method of displaying to a user instructional information for a physical activity on a computing system, the method comprising: displaying an animated sequence of a physical activity and associated explanatory material, the animated sequence being divided into a set of subsequences which may be displayed continuously as the animated sequence or individually as individual animated subsequences, the explanatory material being divided into a set of portions, each portion associated with one of the subsequences, the animated sequence including images of objects which may be selected by a user; and while an object is selected by a user, displaying to that user additional information for that object.
2. The method of claim 1 further comprising allowing the user to select a subsequence and to display the selected subsequence and the associated portion of explanatory material such that the resulting animation display does not begin before the start of the selected subsequence and does not end after the end of the selected subsequence and unassociated portions of explanatory material are not shown.
3. The method of claim 1 wherein the selection of an object is effected by a user operating a pointing device to move a cursor over an object.
4. The method of claim 1 wherein the object is one of a representation of a human body part and a representation of an item of equipment.
5. The method of claim 1, comprising allowing a user to alter the camera angle at which the animated sequence is displayed.
6. The method of claim 1, comprising allowing a user to alter the speed of the display of the animated sequence.
7. The method of claim 1, comprising allowing a user to alter the time direction of the display of the animated sequence.
8. The method of claim 1, comprising providing information links in the explanatory material and displaying items of instructional information in response to the user's selection of the links.
9. The method of claim 1, wherein the physical activity to be performed is performed in accordance with a script and each subsequence is associated with a step in the script.
10. A system for displaying to a user instructional information for a physical activity, the system comprising: a computer module for displaying an animated sequence of a physical activity and associated explanatory material, the animated sequence being divided into a set of subsequences which may be displayed continuously as the animated sequence or individually as individual animated subsequences, the explanatory material being divided into a set of portions, each portion associated with one of the subsequences, the animated sequence including images of objects which may be selected by a user; and a computer module for receiving user input allowing a user to select an object and, while the object is selected, displaying additional information for that object.
11. The system of claim 10 wherein the computer module for receiving user input allows the user to select a subsequence and causes the computer module for displaying the animated sequence to display the selected subsequence and the associated portion of explanatory material such that the resulting animation display does not begin before the start of the selected subsequence and does not end after the end of the selected subsequence and unassociated portions of explanatory material are not shown.
12. The system of claim 10 wherein the selection of an object is effected by a user operating a pointing device to move a cursor over an object.
13. The system of claim 10 wherein the object is one of a representation of a human body part and a representation of an item of equipment.
14. The system of claim 10, wherein the user may alter the camera angle at which the animated sequence is displayed.
15. The system of claim 10, wherein the user may alter the speed of the display of the animated sequence.
16. The system of claim 10, wherein the user may alter the time direction of the display of the animated sequence.
17. The system of claim 10, comprising an animation unit providing animation information to the computer module for displaying the animated sequence.
18. The system of claim 10, wherein the computer module for displaying an animated sequence may display information links in the explanatory material and may further display items of instructional information in response to the user's selection of the links and the computer module for receiving user input allows the user to select a link in the explanatory material.
19. A method of creating a training animation unit comprising: recording a moving image sequence of a live person demonstrating a physical activity; converting the moving image sequence into a series of datapoints representing the moving image sequence; forming, from the series of datapoints, an animation sequence corresponding to the moving image sequence; forming, from the animation sequence, at least one vector data file; registering the at least one vector data file to remove lateral motion; and forming an animation data file including a training animation.
20. The method of claim 19, wherein the step of forming the animation sequence includes coloration and adding images of equipment.
21. The method of claim 19, wherein the step of forming the animation sequence includes the step of forming a three dimensional digital character.
22. The method of claim 19, comprising adding additional explanatory information to the animation data file, wherein the additional explanatory information may be displayed at the option of a user viewing the training animation.
23. The method of claim 22, wherein the additional information includes links which allow a user viewing the training animation to view further information.
24. The method of claim 19, wherein the animation data file includes selectable object representations and where a user viewing the training animation may view further information regarding a selected object by selecting such object while viewing the training animation.
25. The method of claim 22, wherein the additional information includes items of instructional information, each item of instructional information corresponding to a portion of the training animation.
26. The method of claim 19, wherein the training animation includes moving representations of an animated person.
27. The method of claim 26, wherein the animated person has the same characteristics as the live person demonstrating the physical activity.
28. The method of claim 19, wherein the training animation includes a plurality of views, each from a different viewpoint.
29. The method of claim 19, wherein the training animation data file is stored in a database.
30. A method of creating a training animation sequence comprising: recording a live person demonstrating a physical activity and forming a series of datapoints corresponding to the demonstration; forming, from the series of datapoints, an animation data set corresponding to the demonstration; including in the animation data set additional information corresponding to certain objects animated in the animation data set; and forming a training animation sequence from the animation data set.
31. The method of claim 30, wherein the step of forming the animation data set includes coloration and adding equipment.
32. The method of claim 30, wherein the step of forming the animation data set includes the step of forming a three dimensional digital character.
33. The method of claim 30, comprising adding additional information to the training animation sequence, wherein the additional information may be displayed at the option of a user viewing the training animation sequence.
34. The method of claim 33, wherein the additional information includes hyperlinks.
35. The method of claim 33, wherein the additional information includes information regarding selectable objects which are animated in the training animation sequence and the information is displayed when the object is selected by a user viewing the training animation sequence.
36. The method of claim 33, wherein the additional information includes items of instructional information, each item of instructional information corresponding to a portion of the training animation sequence.
37. The method of claim 30, wherein the training animation sequence includes moving representations of an animated person.
38. The method of claim 37, wherein the animated person has the same characteristics as the live person.
39. The method of claim 30, wherein the training animation sequence includes a plurality of views, each from a different viewpoint.
40. The method of claim 30, wherein the training animation sequence is stored in a database.
41. The method of claim 30, wherein the training animation sequence may be distributed to and accessed by a user.
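The pipeline recited in claim 30 (and elaborated in claims 33-35) can be sketched in ordinary code: record a demonstration as datapoints, form an animation data set, attach per-object information, assemble the sequence, and display an object's information when the user selects it. This is a minimal illustrative sketch only; every name (`AnimatedObject`, `attach_information`, and so on) is hypothetical and does not reflect the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class AnimatedObject:
    name: str
    frames: list          # per-frame positions derived from the recorded datapoints
    info: str = ""        # additional information shown when the object is selected

@dataclass
class TrainingAnimation:
    objects: list = field(default_factory=list)

def record_demonstration(raw_samples):
    """Step 1: form a series of datapoints from the live demonstration."""
    return [tuple(s) for s in raw_samples]

def form_animation_data_set(datapoints):
    """Step 2: form an animation data set (here, a single animated 'person')."""
    return [AnimatedObject(name="person", frames=list(datapoints))]

def attach_information(objects, notes):
    """Step 3: include additional information for certain animated objects."""
    for obj in objects:
        if obj.name in notes:
            obj.info = notes[obj.name]
    return objects

def form_training_sequence(objects):
    """Step 4: form the training animation sequence from the data set."""
    return TrainingAnimation(objects=objects)

def select_object(animation, name):
    """Per claim 35: return an object's information when a user selects it."""
    for obj in animation.objects:
        if obj.name == name:
            return obj.info
    return ""

# Usage: build a sequence from two mock motion-capture samples.
seq = form_training_sequence(
    attach_information(
        form_animation_data_set(record_demonstration([(0, 0), (1, 2)])),
        {"person": "Keep the elbow high through the swing."},
    )
)
print(select_object(seq, "person"))
```

Selecting the animated person returns its attached instructional note, while selecting an unknown object returns nothing, mirroring the optional, user-driven display of additional information in claim 33.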
PCT/US2000/020752 1999-07-30 2000-07-28 Method and system for interactive motion training WO2001009861A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU65033/00A AU6503300A (en) 1999-07-30 2000-07-28 Method and system for interactive motion training and a user interface therefor

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14693399P 1999-07-30 1999-07-30
US14693300P 2000-07-28 2000-07-28
US60/146,933 2000-07-28

Publications (2)

Publication Number Publication Date
WO2001009861A2 true WO2001009861A2 (en) 2001-02-08
WO2001009861A3 WO2001009861A3 (en) 2001-05-10

Family

ID=26844432

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/020752 WO2001009861A2 (en) 1999-07-30 2000-07-28 Method and system for interactive motion training

Country Status (2)

Country Link
AU (1) AU6503300A (en)
WO (1) WO2001009861A2 (en)

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GAVRON ET AL.: 'How to Use Microsoft Windows NT 4 Workstation', pages 6-7, XP002937726 *
'Teach Me Piano' (software description), VOYETRA TURTLE BEACH, INC., 1996, pages 2-3, XP002937724 *
'User Guide', ADOBE PREMIERE 3.0, pages 54, 138, 211, XP002937725 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6831603B2 (en) 2002-03-12 2004-12-14 Menache, Llc Motion tracking system and method
US7009561B2 (en) 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
US7432810B2 (en) 2003-03-11 2008-10-07 Menache Llc Radio frequency tags for use in a motion tracking system
US9418470B2 (en) 2007-10-26 2016-08-16 Koninklijke Philips N.V. Method and system for selecting the viewing configuration of a rendered figure

Also Published As

Publication number Publication date
WO2001009861A3 (en) 2001-05-10
AU6503300A (en) 2001-02-19

Similar Documents

Publication Publication Date Title
US6722888B1 (en) Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
Chen et al. Computer-assisted yoga training system
US5890906A (en) Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
Hughes et al. Notational analysis of sport: Systems for better coaching and performance in sport
US6749432B2 (en) Education system challenging a subject's physiologic and kinesthetic systems to synergistically enhance cognitive function
US6164973A (en) Processing system method to provide users with user controllable image for use in interactive simulated physical movements
US8861091B2 (en) System and method for tracking and assessing movement skills in multidimensional space
AU2020209768A1 (en) Augmented cognition methods and apparatus for contemporaneous feedback in psychomotor learning
US20140078137A1 (en) Augmented reality system indexed in three dimensions
US20050250083A1 (en) Method and apparatus for instructors to develop pre-training lessons using controllable images
US20160049089A1 (en) Method and apparatus for teaching repetitive kinesthetic motion
CN106457045A (en) Method and system for portraying a portal with user-selectable icons on large format display system
EP1059970A2 (en) System and method for tracking and assessing movement skills in multidimensional space
Cannavò et al. A movement analysis system based on immersive virtual reality and wearable technology for sport training
Petri et al. Development of an autonomous character in karate kumite
Tisserand et al. Preservation and gamification of traditional sports
Bideau et al. Virtual reality applied to sports: do handball goalkeepers react realistically to simulated synthetic opponents?
Ruttkay et al. Elbows higher! Performing, observing and correcting exercises by a virtual trainer
CA2489926A1 (en) An athletic game learning tool, capture system, and simulator
US20080003554A1 Interactive system and method whereby users direct idiosyncratically controllable images to reverse simulated movements of actual physical movements, thereby learning usual sequences of actual physical movements
WO2001009861A2 (en) Method and system for interactive motion training
Kincaid et al. A Study on VR Training of Baseball Athletes
Zhu Augmented Reality for Exercises for Elderly People
CN116974374A (en) Visual body-building auxiliary method based on augmented reality technology
Bleikli et al. VR Walk: Game. VR simulator for treadmill as a tool for constraint-based gait rehabilitation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP