WO2010128423A1 - Method and system for analyzing motion data

Method and system for analyzing motion data

Info

Publication number
WO2010128423A1
Authority
WO
WIPO (PCT)
Prior art keywords
sound
motion data
movements
events
processor
Prior art date
Application number
PCT/IB2010/051825
Other languages
French (fr)
Inventor
Gerd Lanfermann
Juergen Te Vrugt
Stefan Winter
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2010128423A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1123 Discriminating type of movement, e.g. walking or running
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/1036 Measuring load distribution, e.g. podologic studies
    • A61B 5/1038 Measuring plantar pressure during gait
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 7/00 Instruments for auscultation
    • A61B 7/006 Detecting skeletal, cartilage or muscle noise

Abstract

There is provided a method of analyzing motion data representing movements of a user, the method comprising the steps of identifying sound events occurring during the movements; and dividing the motion data into a plurality of segments using the identified sound events.

Description

Method and system for analyzing motion data
TECHNICAL FIELD OF THE INVENTION
The invention relates to motion analysis, and in particular to a method and system for analyzing motion data.
BACKGROUND TO THE INVENTION
The segmentation of motion data is an essential task in motion analysis. It refers to the decomposition of a continuous stream of motion data into segments that each relate to a basic or simple movement in an exercise (such as the movement of a limb, or an individual step when the motion data relates to walking) or to repetitions of an exercise. In some known techniques, motion data can be matched to templates in order to identify particular movements within the motion data.
It is possible to determine a quality of movement (in relation to a specific physical exercise) by analyzing these basic movements, and to provide feedback to the user on the quality of their movements (for example in a rehabilitation system or a sport training system). However, if the division of the motion data into segments is not done accurately, the quality of the analysis will usually be poor, rendering the outcome unusable.
The task of identifying the "correct" segments in a stream of motion data can be made more complex for a number of reasons. Firstly, as shown in Fig. 1, a particular movement or motion is always performed with slight variations, even if the motion is repetitive and performed by the same person. In Fig. 1, the lines represent the repeated execution of the same drinking movement (i.e. lifting a cup to the mouth and replacing the cup on a table).
Secondly, as shown in Fig. 2, the person may have a disability (for example caused by a stroke) which means that they are unable to perform the movement in the "normal" way (as defined in stored templates). In Fig. 2, the graph in the top left shows the movements of a healthy person lifting their arm repetitively, whereas the other three graphs show the movements recorded from three patients who have suffered strokes attempting the same movement. Each graph shows three lines, with each line representing the respective angles of the upper arm, lower arm and torso of the person performing the exercise.
Thirdly, as shown in Fig. 3, the person might not execute the exercises properly (particularly if they are inexperienced with the exercise), which means, again, that the recorded movements will not match the stored templates. In Fig. 3, the left part of the graph represents the movements of three people performing an exercise in which they (correctly) keep their trunk stable (in terms of the angle of their torso), whereas the right part of the graph shows the movements recorded during the same exercise when the trunks of the three people are unstable (in terms of the angle of the torso).
Finally, the obtained motion data can be affected if the sensors are not calibrated correctly or if they are not attached to the person in the right way. In each of these cases, the motion data can be completely different to the expected or desired form and it is difficult, if not impossible, to use a pattern or template matching process to identify the proper segments.
Therefore there is a need for a method and system for segmenting motion data that overcomes, or at least mitigates, these difficulties.
SUMMARY OF THE INVENTION
According to a first aspect of the invention, there is provided a method of analyzing motion data representing movements of a user, the method comprising the steps of identifying sound events occurring during the movements; and dividing the motion data into a plurality of segments using the identified sound events.
According to a second aspect of the invention, there is provided a system for analyzing motion data representing movements of a user, the system comprising a processor for identifying sound events from measurements of sound occurring during the movements and for dividing the motion data into a plurality of segments using the identified sound events.
A third aspect of the invention relates to a computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method defined above.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described, by way of example only, with reference to the following drawings, in which:
Fig. 1 is a graph illustrating the variation in motion data across a repeated movement;
Fig. 2 is a set of graphs illustrating the variation in motion data between healthy and impaired people;
Fig. 3 is a graph illustrating the variation in motion data when an exercise is executed with a stable trunk and an unstable trunk;
Fig. 4 is an illustration of a system for segmenting motion data in accordance with the invention;
Fig. 5 is a more detailed block diagram of the system according to the invention;
Fig. 6 is a flow chart illustrating a method of segmenting motion data in accordance with the invention;
Fig. 7 is a graph illustrating the use of sound events to segment motion data from a healthy person; and
Fig. 8 is a graph illustrating the use of sound events to segment motion data from an impaired person.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
An exemplary system in accordance with the invention is shown in Fig. 4. The system 2 comprises a processing unit 4 that receives motion data relating to the movements or characteristics of the movement of a user 6. The motion data can be captured using any conventional technique, such as by placing movement sensors 8a, 8b on a part or parts of the user 6 and connecting the movement sensors 8a, 8b wirelessly or through a wired connection to the processing unit 4. Alternatively, the movement of the user 6 can be captured using one or more cameras (not shown) that track the movement of markers that are attached to different parts of the user 6.
In accordance with the invention, the system 2 also comprises a sound sensor 10, such as a microphone, that collects sound information while motion data is being collected. In the illustrated embodiment, the sound sensor 10 is connected to the processing unit 4 through a wired connection, but it could alternatively be connected to the processing unit 4 wirelessly. The sound sensor 10 detects sounds that occur while the user 6 is exercising or performing some movement, and the processing unit 4 uses this sound information to assist in segmenting the collected motion data into particular basic or distinct movements.
For example, in addition to the system 2, Fig. 4 shows a table 12 and a cup 14 that the user 6 is to repeatedly pick up and put down on the table 12. Each time that the cup 14 contacts the table 12, there will be an audible sound that is detected by the sound sensor 10. The processing unit 4 can analyze the output of the sound sensor 10 to identify the sound of the cup 14 contacting the table 12 and can use the detected sound (and specifically the time that the sound event occurred) to assist in segmenting the collected motion data into separate repetitions of the exercise.
Fig. 5 shows the system 2 in accordance with the invention in more detail. The processing unit 4 comprises a processor 16 that is connected to the movement sensors 8a, 8b and sound sensor 10. The processing unit 4 also comprises a database 18 that stores data for use by the processor 16 in segmenting the motion data and identifying particular basic or distinct movements. For example, the database 18 can store templates for particular sound events (for example a cup being placed onto a table or a footstep), and templates for particular basic movements (such as an arm being raised and lowered). The processing unit 4 also comprises a memory 20 connected to the processor that temporarily stores data collected from the sound sensor 10 and movement sensors 8a, 8b. It will be appreciated by those skilled in the art that the sound event templates can be stored in a format that is suitable for comparison to the output from the sound sensor 10. For example, the sound sensor 10 can provide measurements of the detected sound amplitude, and the templates can comprise corresponding amplitude measurements expected during that type of event. Alternatively, the sound templates can be provided in terms of the "shape" of the sound event as represented by a plot of the measured sound amplitude over time and the processor 16 can use appropriate shape recognition techniques to compare the measured sound event with the stored templates to identify the measured sound.
As with the sound templates above, the movement templates can be stored in a format that is suitable for comparison to the output from the movement sensors 8a, 8b. For example the templates can comprise amplitude or angular measurements over time, or an expected "shape" of the movement.
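By way of illustration only, the following is a minimal sketch of how the sound and movement templates held in the database 18 might be organized in software. All field names, sampling rates and numerical values are assumptions introduced for this sketch; the description specifies only that templates comprise amplitude or angular measurements over time, or a characteristic "shape", together with an indication of where a sound event is expected (as described below).

```python
import numpy as np

SOUND_SAMPLE_RATE_HZ = 8000   # assumed audio sampling rate
MOTION_SAMPLE_RATE_HZ = 50    # assumed motion sampling rate

# Sound event templates: a short amplitude "shape" plus a detection threshold.
sound_templates = {
    "cup_on_table": {
        "amplitude": np.hanning(160),        # ~20 ms synthetic sharp peak
        "min_peak_amplitude": 0.3,
    },
    "footstep": {
        "amplitude": 0.6 * np.hanning(400),  # softer, longer contact sound
        "min_peak_amplitude": 0.2,
    },
}

# Movement templates: an expected angular trajectory for one repetition and
# the point within it at which the associated sound event is expected.
movement_templates = {
    "cup_lift": {
        "upper_arm_angle_deg": np.concatenate(
            [np.linspace(10.0, 90.0, 50), np.linspace(90.0, 10.0, 50)]
        ),
        "sample_rate_hz": MOTION_SAMPLE_RATE_HZ,
        "expected_event": "cup_on_table",
        "expected_event_offset_s": 2.0,      # cup touches the table ~2 s in
    },
}
```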
As described further below, while a user 6 is performing a particular exercise or movement, the sound sensor 10 collects sound measurements, the processor 16 identifies a particular sound event from the sound measurements using templates stored in the database 18 and uses a timestamp associated with a particular identified sound event or events to assist in segmenting a stream of motion data that is being collected by movement sensors 8a, 8b on the user 6. To perform this segmentation, the processor 16 may use the movement templates stored in the database 18. In particular, the sound event may identify a specific phase in the movement performed by the user 6, or, when multiple, different, sound events are identified, they can identify multiple phases in the movement. The movement templates can indicate a position or a range of positions in which a specific sound event (such as a cup contacting a table or a footstep) is expected to be found, and the processor 16 can use the timing of the identified sound event to match the motion data to the movement template. Of course, those skilled in the art will appreciate that motion data can be segmented using alternative techniques that do not use movement templates.
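As a small worked example of this matching principle, the hypothetical helper below derives a suggested segment start time from the timestamp of an identified sound event and the offset at which the movement template expects that event to occur; the field name, the helper and the numerical values are assumptions, not part of the disclosure.

```python
def propose_segment_start(event_time_s, movement_template):
    """Suggest when a repetition started, given the time of an identified
    sound event and the offset at which the movement template expects that
    event to occur (field name is an assumption)."""
    return event_time_s - movement_template["expected_event_offset_s"]

# A cup-on-table sound detected 12.4 s into the recording, expected 2.0 s
# into each repetition, suggests the repetition started at about 10.4 s.
cup_lift = {"expected_event_offset_s": 2.0}
print(propose_segment_start(12.4, cup_lift))   # about 10.4
```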
Fig. 6 illustrates a method of segmenting motion data in more detail. In step 101, motion data is collected. As described above, the motion data can be collected using movement sensors 8a, 8b located on particular parts of the user's body, or can be collected using any other suitable form of arrangement, such as a camera-based tracking system. The motion data is stored in the memory 20 together with timing information that indicates the time at which each motion data sample or set of motion data samples was measured.
As described above, the format of the motion data will depend on the sensors used to collect it. For example, movement sensors 8a, 8b can provide angular information for the relevant limbs of the user 6 over time, while a camera-based system can provide positional information for the user 6 over time.
In step 103, while the motion data is being collected (as illustrated by step 103 being placed next to step 101 in Fig. 6), sound data is collected by the sound sensor 10. The sound sensor 10 can be a microphone, such as a piezoelectric microphone, and can be placed close to, or on, a surface that makes contact with parts of the body of the user 6 or with objects during the training or exercise, so that the resulting contact sounds can be used to segment the motion data collected during the training or exercise. For example, the sound sensor 10 can be placed on a table (as shown in Fig. 4) to detect a cup or other object being placed on the table, or on the ground to detect footsteps.
In step 105, continuing from step 103, the processor 16 retrieves one or more sound profiles or templates from the database 18. As described above, the profiles or templates may comprise the shape of the relevant sound event or its typical amplitude.
In step 107, the sound profiles or templates retrieved from the database 18 in step 105 are compared to the sound data collected in step 103 to identify sound events corresponding to the profiles or templates in the sound data. In this step, the processor 16 can use pattern recognition algorithms or speech processing algorithms to identify particular sound events using the profiles or templates. As described above, a sound event can comprise a single signal or sample whose amplitude is above a threshold or a more complex sound pattern. Complex sound patterns may be generated by, for example, an item (such as a shoe or hand) rubbing over a certain surface. The processor 16 can distinguish and identify complex sounds such as these by using frequency analysis (for example using Fourier Transforms) and/or pattern recognition algorithms (for example Discriminative Time Warping). If a sound event or events corresponding to the sound profiles or templates is identified in the sound data, a signal identifying the sound event is generated and the time that the sound event occurred is determined from the sound data (step 109).
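One plausible realization of steps 107 and 109 is sketched below: the stored template is slid over the sound recording, each window is scored with a normalized cross-correlation, and a sound event is reported (with its time) wherever the score exceeds a threshold. The threshold value, the minimum gap between events and the use of plain cross-correlation are assumptions; the frequency-analysis and time-warping techniques mentioned above would be alternative or complementary approaches for more complex sounds.

```python
import numpy as np

def detect_sound_events(sound, template, sample_rate_hz,
                        score_threshold=0.7, min_gap_s=0.5):
    """Return the times (in seconds) at which the measured sound matches the
    stored template, by sliding the template over the recording and scoring
    each window with a normalized cross-correlation."""
    sound = np.asarray(sound, dtype=float)
    template = np.asarray(template, dtype=float)
    template = (template - template.mean()) / (template.std() + 1e-12)
    n = len(template)
    event_times, last_event = [], -np.inf
    for i in range(len(sound) - n + 1):
        window = sound[i:i + n]
        window = (window - window.mean()) / (window.std() + 1e-12)
        score = float(np.dot(window, template)) / n   # correlation in [-1, 1]
        t = i / sample_rate_hz
        if score > score_threshold and (t - last_event) > min_gap_s:
            event_times.append(t)
            last_event = t
    return event_times
```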
In step 111, the processor 16 combines the output of step 109 and step 101. In particular, the processor 16 applies the signals identifying the type and timing of the sound events to the stream of motion data, and uses the locations (in terms of time) of the sound events to determine or suggest how the motion data should be segmented into particular basic movements or exercises. As described above, the processor 16 can use motion data templates that include information on when a specific sound event is expected to be heard or other information regarding the exercise or movement being performed by the user 6 to determine how the motion data should be segmented.
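A minimal sketch of this combining step is given below, assuming the motion data is available as an array of samples with matching timestamps: the stream is cut at boundaries derived from the sound event times, optionally shifted by the offset at which a movement template expects the event to occur. The offset parameter and the data layout are illustrative assumptions.

```python
import numpy as np

def segment_motion_data(motion_times_s, motion_samples, event_times_s,
                        expected_event_offset_s=0.0):
    """Cut a stream of motion data into per-repetition segments at boundaries
    derived from the identified sound events (a sketch of step 111)."""
    motion_times_s = np.asarray(motion_times_s)
    motion_samples = np.asarray(motion_samples)
    boundaries = [t - expected_event_offset_s for t in event_times_s]

    segments, start = [], 0
    for boundary in boundaries:
        end = int(np.searchsorted(motion_times_s, boundary))
        if end > start:   # skip empty segments (e.g. boundary before data start)
            segments.append((motion_times_s[start:end], motion_samples[start:end]))
        start = end
    if start < len(motion_times_s):   # samples after the final boundary
        segments.append((motion_times_s[start:], motion_samples[start:]))
    return segments
```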
Subsequent analysis of the motion data segments can be performed using any conventional technique, including by comparing each segment to a movement template.
Figs. 7 and 8 illustrate the application of the invention to motion data obtained from a user 6 performing the lifting exercise shown in Fig. 4. As described above, the exercise involves the user 6 picking up a cup 14 from a table 12 and placing it back down on the table 12. The sound sensor 10 is placed on the table 12 to detect sound events corresponding to the cup 14 contacting the table 12. Thus, the database 18 in the processing unit 4 includes a sound template for a cup 14 contacting the table 12. Typically, this sound template will show a relatively sharp peak. Fig. 7 shows the motion data (in terms of the angle of the upper arm of the user 6) obtained from a healthy user 6. The motion data is marked with reference numeral 22 and represents the movement of the arm of the user 6. As the user 6 is healthy, the motion data 22 is almost periodic, and there are only relatively small variations in the motion data 22 between repetitions of the exercise. The detected sound events that correspond to the sound template for a cup 14 contacting the table 12 are superimposed on the motion data graph 22. Each of these sound events is marked with reference numeral 24. Thus, it can be seen that each of the sound events 24 occurs at approximately the same point in the exercise cycle, when the arm is at its lowest point. Using this information, the motion data can be segmented into separate repetitions of the exercise. In this case, each segment starts at the time that the arm is at its highest point before the cup 14 contacts the table 12. The start of each segment is marked by line 26.
Fig. 8 shows the motion data (again in terms of the angle of the upper arm of the user 6) obtained from a user 6 who has an impairment that prevents them from performing the particular exercise properly. Again, the motion data is marked with reference numeral 22 and represents the movement of the arm of the user 6. Due to the impairment, the user 6 is unable to perform a steady arm movement, and this is represented by the motion data shown in the graph. In particular, it appears that there is no discernible periodic pattern, and there are significant variations between each repetition of the exercise.
The sound data collected from the sound sensor 10 is shown in the lower graph in Fig. 8. It can be seen that four sound events were detected (each corresponding to the sound template for a cup 14 contacting a table 12) and these have been superimposed on the motion data graph. Again, each sound event is marked by a line 24. Using this sound event information, it is possible to segment the motion data 22 into separate repetitions of the exercise. Again, each segment starts at the time that the arm is at its highest point before the cup 14 contacts the table 12, and is marked by a respective line 26. Thus, it can be seen from the use of the sound event data that three of the repetitions (starting at lines 26a, 26c and 26d) are relatively similar (in terms of their duration and highest and lowest arm position), and that the user 6 had some trouble in performing the repetition starting at line 26b.
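The boundary rule used in Figs. 7 and 8 (each segment starting at the arm's highest point shortly before the cup 14 contacts the table 12), together with the per-repetition comparison, can be sketched as follows. The look-back window length and the assumption that the largest upper-arm angle corresponds to the highest arm position are illustrative choices, not details taken from the description.

```python
import numpy as np

def repetition_starts(times_s, upper_arm_angle_deg, event_times_s, lookback_s=2.0):
    """Place each segment start at the arm's highest point in a short window
    before the corresponding cup-on-table sound (assuming the largest
    upper-arm angle corresponds to the highest arm position)."""
    times_s = np.asarray(times_s)
    angles = np.asarray(upper_arm_angle_deg)
    starts = []
    for t_event in event_times_s:
        in_window = np.flatnonzero((times_s >= t_event - lookback_s)
                                   & (times_s <= t_event))
        if in_window.size:
            starts.append(float(times_s[in_window[np.argmax(angles[in_window])]]))
    return starts

def summarize_repetition(times_s, angles_deg):
    """Per-repetition figures of the kind compared in Fig. 8: duration and
    highest/lowest arm position of one segment."""
    return {
        "duration_s": float(times_s[-1] - times_s[0]),
        "max_angle_deg": float(np.max(angles_deg)),
        "min_angle_deg": float(np.min(angles_deg)),
    }
```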
Thus, it can be seen from the above example that sound event data can be used to assist in segmenting motion data.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Claims

CLAIMS:
1. A method of analyzing motion data representing movements of a user, the method comprising the steps of: identifying sound events occurring during the movements; and dividing the motion data into a plurality of segments using the identified sound events.
2. A method as claimed in claim 1, further comprising the step of: measuring sound during the movements; and the step of identifying comprises identifying the sound events from the measured sound.
3. A method as claimed in claim 2, wherein the step of identifying sound events comprises: comparing the measured sound to one or more sound event profiles; and identifying sound events in the measured sound that match or substantially match one of the sound event profiles.
4. A method as claimed in claim 1, 2 or 3, wherein the step of identifying sound events comprises identifying a time at which each of said sound events occurred.
5. A method as claimed in claim 4, wherein the step of dividing the motion data into a plurality of segments includes applying the time associated with each of said identified sound events to said motion data.
6. A method as claimed in claim 5, wherein the step of dividing the motion data into a plurality of segments further comprises applying the type of sound event associated with each of said identified sound events to said motion data.
7. A method as claimed in any preceding claim, further comprising the step of: obtaining motion data representing the movements of the user.
8. A method as claimed in any preceding claim, wherein the step of dividing the motion data into a plurality of segments comprises: comparing the identified sound event and motion data to a movement template, the movement template including an indication of an expected sound event.
9. A system for analyzing motion data representing movements of a user, the system comprising: a processor for identifying sound events from measurements of sound occurring during the movements and for dividing the motion data into a plurality of segments using the identified sound events.
10. A system as claimed in claim 9, further comprising a sound sensor for measuring the sound occurring during the movements and for providing the measurements to the processor.
11. A system as claimed in claim 9 or 10, wherein the processor is configured to identify sound events by comparing the measured sound to one or more sound event profiles.
12. A system as claimed in claim 9, 10 or 11, wherein the processor is configured to identify a time at which each of said sound events occurred.
13. A system as claimed in claim 9, 10, 11 or 12, further comprising: means for collecting motion data representing the movements of the user.
14. A system as claimed in any of claims 9 to 13, wherein the processor is configured to divide the motion data into a plurality of segments by comparing the identified sound events and motion data to a movement template, the movement template including an indication of an expected sound event.
15. A computer program product comprising computer program code that, when executed on a suitable computer or processor, causes the computer or processor to perform the method as claimed in any one of claims 1 to 8.
PCT/IB2010/051825 2009-05-04 2010-04-27 Method and system for analyzing motion data WO2010128423A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP09159320 2009-05-04
EP09159320.2 2009-05-04

Publications (1)

Publication Number Publication Date
WO2010128423A1 (en) 2010-11-11

Family

ID=42238697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/051825 WO2010128423A1 (en) 2009-05-04 2010-04-27 Method and system for analyzing motion data

Country Status (1)

Country Link
WO (1) WO2010128423A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0062459A2 (en) * 1981-04-03 1982-10-13 National Research Development Corporation Orthopaedic diagnostic apparatus
US4836218A (en) * 1984-04-09 1989-06-06 Arthrotek, Inc. Method and apparatus for the acoustic detection and analysis of joint disorders
US4836218B1 (en) * 1984-04-09 1991-12-17 Arthrotek Inc
US4823807A (en) * 1988-02-11 1989-04-25 Board Of Regents, Univ. Of Texas System Device for non-invasive diagnosis and monitoring of articular and periarticular pathology
US20020107649A1 (en) * 2000-12-27 2002-08-08 Kiyoaki Takiguchi Gait detection system, gait detection apparatus, device, and gait detection method
WO2002067449A2 (en) * 2001-02-20 2002-08-29 Ellis Michael D Modular personal network systems and methods
WO2003065891A2 (en) * 2002-02-07 2003-08-14 Ecole Polytechnique Federale De Lausanne (Epfl) Body movement monitoring device
US20080146968A1 (en) * 2006-12-14 2008-06-19 Masuo Hanawaka Gait analysis system
DE102007038392A1 (en) * 2007-07-11 2009-01-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for predicting a loss of control of a muscle

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10717853

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10717853

Country of ref document: EP

Kind code of ref document: A1