US20140180632A1 - Motion Analysis Device - Google Patents

Motion Analysis Device

Info

Publication number
US20140180632A1
Authority
US
United States
Prior art keywords
data
trajectory
observation
axis
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/132,531
Inventor
Koji Yataka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yamaha Corp
Original Assignee
Yamaha Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Corp filed Critical Yamaha Corp
Assigned to YAMAHA CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YATAKA, KOJI
Publication of US20140180632A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 Details of notification to user or communication with user or patient; user input means using sound
    • A61B5/7415 Sound rendering of measured values, e.g. by pitch or volume variation
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B60/00 Details or accessories of golf clubs, bats, rackets or the like
    • A63B60/46 Measurement devices associated with golf clubs, bats, rackets or the like for measuring physical parameters relating to sporting activity, e.g. baseball bats with impact indicators or bracelets for measuring the golf swing
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/36 Training appliances or apparatus for special sports for golf
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6895 Sport equipment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0625 Emitting sound, noise or music
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2102/00 Application of clubs, bats, rackets or the like to the sporting activity; particular sports involving the use of balls and clubs, bats, rackets, or the like
    • A63B2102/32 Golf

Definitions

  • the present invention relates to a technology for analyzing a motion of a user.
  • Priority is claimed on Japanese Patent Application No. 2012-279501, filed on Dec. 21, 2012, the content of which is incorporated herein by reference.
  • Japanese Unexamined Patent Application, First Publication No. H06-39070 discloses a technology which displays a moving image of a swing motion of the user concurrently with a moving image of a pre-recorded reference swing motion (for example, a swing motion of a professional golfer) on an identical screen.
  • the user analyzes their own swing motion by visually comparing it with the reference swing motion.
  • however, it is difficult to accurately and precisely grasp the difference between the two motions merely by visually checking the moving images on the screen.
  • the present invention therefore aims to enable the user to easily understand the difference between the motion of the user and the reference motion.
  • a motion analysis device of the present invention includes observation data acquisition means for acquiring observation data which indicates a trajectory of a target observation point moving in conjunction with a motion of a user; comparison means for comparing reference data which indicates a predetermined trajectory of the target observation point with the observation data acquired by the observation data acquisition means; and audio control means for generating an audio signal according to a comparison result from the comparison means.
  • the audio signal can be generated according to the comparison result between the observation data and the reference data. Therefore, the user can easily understand a difference between the trajectory of the target observation point and the predetermined trajectory indicated by the reference data.
  • FIG. 1 is an external view illustrating a motion analysis system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating the motion analysis system according to the first embodiment of the present invention.
  • FIG. 3 illustrates a target observation point according to the first embodiment of the present invention.
  • FIG. 4 is a schematic diagram illustrating observation data according to the first embodiment of the present invention.
  • FIG. 5 is a schematic diagram illustrating a reference data sequence according to the first embodiment of the present invention.
  • FIG. 6 is a schematic diagram illustrating comparison data according to the first embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a comparison process performed by a comparison unit according to the first embodiment of the present invention.
  • FIG. 8 is a graph illustrating an audio signal generated by an audio control unit according to the first embodiment of the present invention.
  • FIG. 9 is a graph illustrating an audio signal generated by an audio control unit according to a second embodiment of the present invention.
  • FIG. 10 is a block diagram illustrating a motion analysis system according to a third embodiment of the present invention.
  • FIG. 1 is an external view of a motion analysis system 100 according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram of the motion analysis system 100 .
  • the motion analysis system 100 includes a motion analysis device 10 and an acceleration sensor 20 .
  • the motion analysis device 10 analyzes a motion of a user U and notifies the user U of an analysis result, and is preferably used when practicing a specific action in various sports.
  • the motion analysis device 10 of the first embodiment analyzes a motion of the user U swinging a golf club C (hereinafter, referred to as a “swing motion”). More specifically, the motion analysis device 10 analyzes movement of a point P which moves in conjunction with the swing motion of the user U (hereinafter, referred to as a “target observation point”).
  • the target observation point P of the first embodiment is a specific point in the club C used by the user U.
  • a tip portion of a grip Cg fixed to a shaft Cs of the club C (the end portion on the head Ch side) is set to be the target observation point P.
  • other points of the club C (for example, a point on the head Ch or on the shaft Cs), or a point on the body of the user U which moves in conjunction with the swing motion, may also be set to be the target observation point P.
  • the acceleration sensor 20 in FIGS. 1 and 2 is a detector which detects movement of the target observation point P (swing motion of the user U), and sequentially generates a sensor output Da corresponding to the movement of the target observation point P at a predetermined cycle.
  • the acceleration sensor 20 of the present embodiment is a three-axis acceleration sensor that detects acceleration in each direction of three axes (X-axis, Y-axis and Z-axis) which are fixed to the target observation point P and are orthogonal to one another.
  • the Z-axis is an axis which is parallel to a longitudinal direction of the shaft Cs of the club C.
  • the Y-axis and the X-axis are axes on a plane which is orthogonal to the Z-axis.
  • One sensor output Da is configured to include acceleration Ax in the X-axis direction, acceleration Ay in the Y-axis direction and acceleration Az in the Z-axis direction.
  • Each sensor output Da which is sequentially generated by the acceleration sensor 20 is transmitted to the motion analysis device 10 in a time-series manner.
  • the acceleration sensor 20 and the motion analysis device 10 perform data communication with each other in a wireless manner, but may perform the data communication by wire.
  • the motion analysis device 10 is operated by a computer system which includes an arithmetic processing unit 12 , a storage device 14 and a sound emitting device 16 .
  • the storage device 14 stores programs executed by the arithmetic processing unit 12 and various data items used by the arithmetic processing unit 12 (for example, audio data W or a reference data series Sref). A known recording medium such as a semiconductor storage medium or a magnetic recording medium, or a combination of multiple types of recording media, may be optionally employed as the storage device 14.
  • the sound emitting device 16 is audio equipment (for example, a speaker) which reproduces a sound wave corresponding to an audio signal S generated by the arithmetic processing unit 12 .
  • the arithmetic processing unit 12 implements a plurality of functions (an observation data acquisition unit 32, a comparison unit 34 and an audio control unit 36) for analyzing the motion of the user U by executing a program stored in the storage device 14. It is also possible to distribute each function of the arithmetic processing unit 12 to a plurality of devices.
  • the observation data acquisition unit 32 sequentially acquires observation data Db indicating a trajectory (hereinafter, referred to as an “observation trajectory”) Oa of the target observation point P corresponding to the swing motion of the user U. More specifically, the observation data acquisition unit 32 sequentially generates the observation data Db from the sensor outputs Da supplied by the acceleration sensor 20 in the time-series manner. As illustrated in FIG. 4 , one observation data item Db is configured to include an observation value Bx, an observation value By and an observation value Bz.
  • the observation value Bx is a difference (variation) in the acceleration Ax between two sensor outputs Da which are generated in succession
  • the observation value By is a difference in the acceleration Ay between two sensor outputs Da which are generated in succession
  • the observation value Bz is a difference in the acceleration Az between two sensor outputs Da which are generated in succession.
  • the cycle where the observation data acquisition unit 32 acquires the observation data Db is set to be a sufficiently short time (for example, one millisecond) as compared to the time for the user U to perform the swing motion.
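The difference-taking described above can be sketched in a few lines. This is an illustrative sketch only (the patent specifies no code): the representation of a sensor output Da as an (Ax, Ay, Az) triple and the function name are assumptions.

```python
# A minimal sketch, assuming each sensor output Da is an (Ax, Ay, Az)
# acceleration triple and one observation data item Db = (Bx, By, Bz)
# is the element-wise difference between two successive sensor outputs.

def observation_data(sensor_outputs):
    """Yield one Db per successive pair of sensor outputs Da."""
    prev = None
    for da in sensor_outputs:
        if prev is not None:
            # Bx, By, Bz: variation of Ax, Ay, Az between successive outputs
            yield tuple(a - p for a, p in zip(da, prev))
        prev = da

# hypothetical sensor outputs, one per sampling cycle (e.g. 1 ms)
da_stream = [(0, 0, 10), (1, -2, 11), (5, -1, 16)]
db_items = list(observation_data(da_stream))  # [(1, -2, 1), (4, 1, 5)]
```

Generating Db as a difference of successive outputs means each item captures the change in acceleration over one sampling cycle, which is the quantity compared against the reference data below.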
  • the storage device 14 in FIG. 2 stores the audio data W and the reference data series Sref.
  • the audio data W of the present embodiment is data which indicates specific audio waveforms. For example, a wind noise generated by the club C during the swing motion is recorded, and digital data sampled at a predetermined frequency (for example, 44.1 kHz) is stored in the storage device 14 in advance as the audio data W.
  • FIG. 5 is a schematic diagram of the reference data series Sref.
  • the reference data series Sref indicates a trajectory (hereinafter, referred to as a “reference trajectory”) Oref of the target observation point P over predetermined time duration.
  • the reference data series Sref is a time series of a plurality of reference data items Dref.
  • Each reference data Dref is compared with each observation data Db in order to evaluate the swing motion of the user U, and is configured to include a reference value Rx, a reference value Ry and a reference value Rz.
  • the reference trajectory Oref is a standard of an observation trajectory Oa specified by each observation data Db.
  • a trajectory of the target observation point P when an action performer such as a professional golfer skilled in the swing motion performs standard swing motion is preferably employed as the reference trajectory Oref.
  • time series of the plurality of observation data Db generated by the observation data acquisition unit 32 are stored in the storage device 14 in advance as the reference data series Sref (each reference data Dref).
  • the reference value Rx of each reference data Dref corresponds to a change amount of the acceleration Ax when performing the standard swing motion
  • the reference value Ry corresponds to a change amount of the acceleration Ay
  • the reference value Rz corresponds to a change amount of the acceleration Az.
  • the comparison unit 34 in FIG. 2 compares each observation data Db acquired by the observation data acquisition unit 32 with each reference data Dref of the reference data series Sref stored in the storage device 14 . More specifically, the comparison unit 34 reads out the reference data Dref from the reference data series Sref of the storage device 14 in chronological order each time the observation data acquisition unit 32 acquires the observation data Db, and generates comparison data Dc by calculating the difference between the observation data Db and the reference data Dref.
  • one comparison data item Dc is configured to include a comparison value ΔTx, a comparison value ΔTy and a comparison value ΔTz.
  • the comparison value ΔTx is a difference between the observation value Bx of the observation data Db and the reference value Rx of the reference data Dref.
  • the comparison value ΔTy is a difference between the observation value By and the reference value Ry.
  • the comparison value ΔTz is a difference between the observation value Bz and the reference value Rz.
  • time series of the plurality of observation data Db correspond to the observation trajectory Oa, and the reference data series Sref corresponds to the reference trajectory Oref. Accordingly, the comparison data Dc corresponds to data indicating a difference between the observation trajectory Oa and the reference trajectory Oref.
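The per-sample comparison reduces to an element-wise subtraction. The sketch below assumes Db and Dref are plain 3-tuples; the function name is illustrative.

```python
def comparison_data(db, dref):
    """Dc = (dTx, dTy, dTz): element-wise difference between one
    observation data item Db and the corresponding reference item Dref."""
    return tuple(b - r for b, r in zip(db, dref))

dc = comparison_data((3, 1, -2), (1, 1, 1))  # (2, 0, -3)
```

When every component of Dc is zero, the observation trajectory locally coincides with the reference trajectory; larger magnitudes indicate a larger local deviation.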
  • the comparison unit 34 of the present embodiment compares the observation data Db with the reference data Dref in a predetermined analysis section within a section from the start to the completion of the swing motion performed by the user U.
  • the analysis section is a section from a point of time when the user U starts downswing motion (action of swinging the club C down) after takeaway action in backswing (hereinafter, an “action start point”) until predetermined time duration T elapses.
  • the time duration T of the analysis section is set according to the time duration from the action start point of the user U until the user U completes the follow-through action (the finishing action in swinging the club C).
  • the time duration from an actual action start point until the swing is completed varies depending on a swing speed of the user U.
  • an average swing speed of the user U is calculated based on results on the time series of the observation data Db which are previously measured multiple times.
  • the time duration T of the analysis section corresponding to the average swing speed is selected for each user U and is stored in the storage device 14 .
  • FIG. 7 is a flowchart in a process where the comparison unit 34 compares each observation data Db and each reference data Dref (hereinafter, referred to as a “comparison process”). For example, when the user U instructs the analysis to start by operating an input device (not illustrated), the comparison process in FIG. 7 is performed.
  • the comparison unit 34 detects the action start point by utilizing each observation data Db (S 1 ). Considering that a change amount in the acceleration of the target observation point P has a tendency to increase immediately after the start of the downswing motion, the comparison unit 34 of the first embodiment detects the action start point according to a change amount ΔA of the acceleration indicated by each observation data Db.
  • the comparison unit 34 sequentially determines whether or not the change amount ΔA in the acceleration indicated by the observation data Db which is sequentially supplied from the observation data acquisition unit 32 is beyond a predetermined threshold value ATH.
  • the change amount ΔA is the sum of an absolute value of the observation value Bx, an absolute value of the observation value By and an absolute value of the observation value Bz.
  • the comparison unit 34 repeats Step S 1 until the change amount ΔA is beyond the predetermined threshold value ATH (S 1 : NO), and detects a point in time when the change amount ΔA is beyond the predetermined threshold value ATH as the action start point (S 1 : YES). Then, the process proceeds to Step S 2 .
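Step S1 can be sketched as a threshold test on the L1 norm of each Db. This is a sketch under the same 3-tuple assumption as above; the threshold value is hypothetical.

```python
def detect_action_start(db_stream, a_th):
    """Return the index of the first observation data Db whose change
    amount dA = |Bx| + |By| + |Bz| exceeds the threshold ATH
    (None if the threshold is never exceeded)."""
    for i, (bx, by, bz) in enumerate(db_stream):
        if abs(bx) + abs(by) + abs(bz) > a_th:
            return i  # action start point detected (S1: YES)
    return None

detect_action_start([(0, 0, 1), (1, 1, 1), (5, 5, 5)], a_th=10)  # -> 2
```

In the device this test runs on the live stream, so the returned position marks the moment the downswing is judged to have started.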
  • the comparison unit 34 stretches (expands or contracts) the reference data series Sref on a time axis according to the time duration T of the analysis section which is selected and stored in advance based on the average swing speed of the user U (S 2 ). More specifically, time duration Tref from foremost reference data Dref of the reference data series Sref to backmost reference data Dref (time duration from a start point to an end point of the reference trajectory Oref) is adjusted to be the time duration T.
  • when the time duration T is longer than the time duration Tref, the comparison unit 34 increases the amount of reference data Dref by performing an interpolation process on the reference data series Sref, thereby equalizing the amount of reference data Dref with the amount of observation data Db.
  • a known technology (for example, a linear interpolation process or a spline interpolation process) may be optionally employed for the interpolation process of the reference data series Sref.
  • when the time duration T is shorter than the time duration Tref, the comparison unit 34 decreases the amount of reference data Dref by performing a thinning process on the reference data series Sref, thereby equalizing the amount of reference data Dref with the amount of observation data Db.
  • a known technology may be optionally employed for the thinning process of the reference data series Sref.
  • the reference trajectory Oref itself does not vary in the process of Step S 2 .
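Step S2 (interpolation when T > Tref, thinning when T < Tref) can be sketched with a simple linear resampler. This is an assumption: the patent only says a known interpolation or thinning technique is used, and the function below is one such choice.

```python
def stretch_reference(sref, n_out):
    """Resample the reference data series Sref to n_out items:
    linear interpolation when n_out exceeds len(sref), even-spaced
    thinning when n_out is smaller. The trajectory itself is unchanged."""
    n_in = len(sref)
    out = []
    for i in range(n_out):
        # map output index i to a fractional position on the input series
        pos = i * (n_in - 1) / (n_out - 1) if n_out > 1 else 0.0
        lo = int(pos)
        hi = min(lo + 1, n_in - 1)
        frac = pos - lo
        out.append(tuple(a + (b - a) * frac for a, b in zip(sref[lo], sref[hi])))
    return out
```

Because each output item is a point on the line segment between two adjacent reference items, the resampled series traces the same reference trajectory Oref, only with a different number of samples, consistent with the note that Oref itself does not vary in Step S2.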
  • the comparison unit 34 compares each observation data Db which is sequentially generated by the observation data acquisition unit 32 with each of the reference data Dref of the reference data series Sref after adjustment in Step S 2 (S 3 ).
  • the comparison unit 34 reads out each reference data Dref of the reference data series Sref after the adjustment, starting from the foremost, in chronological order each time the observation data acquisition unit 32 generates the observation data Db, and sequentially generates the comparison data Dc by setting a difference between the reference data Dref and the observation data Db to be the comparison data Dc.
  • the generation of the comparison data Dc (S 3 ) is repeated from the action start point detected in Step S 1 until it is determined in Step S 4 that the time duration T of the analysis section has elapsed (S 4 : NO).
  • when the time duration T has elapsed (S 4 : YES), the comparison unit 34 completes the comparison process.
  • the comparison data Dc indicating a difference between the observation trajectory Oa and the reference trajectory Oref is sequentially generated within the analysis section while the swing motion is performed.
  • the audio control unit 36 in FIG. 2 generates an audio signal S according to the comparison data Dc (comparison result of each observation data Db and each reference data Dref) which is sequentially generated by the comparison unit 34 . More specifically, from the action start point detected by the comparison unit 34 , the audio control unit 36 sequentially acquires each sample of audio data W from the storage device 14 in chronological order, and converts pitch and/or tempo of each sample of the audio data W according to the comparison data Dc generated by the comparison unit 34 immediately before each reading.
  • the audio signal S which is generated by the audio control unit 36 is supplied to the sound emitting device 16 to be reproduced as a sound wave.
  • a D/A converter which converts the digital audio signal S into an analog signal is not illustrated for convenience.
  • the audio control unit 36 changes a pitch in each sample of the audio data W according to the comparison data Dc.
  • the audio control unit 36 converts the audio data W so that the change in the pitch of each sample is larger as each comparison value (ΔTx, ΔTy and ΔTz) of the comparison data Dc is greater (that is, as the difference between the observation trajectory Oa and the reference trajectory Oref is larger).
  • Each sample of the audio data W is sequentially (on a real-time basis) converted and output concurrently with the swing motion of the user U. That is, within the analysis section of the swing motion performed by the user U, the pitch in a reproduced sound varies moment by moment according to the difference between the observation trajectory Oa and the reference trajectory Oref.
  • a known method is used for the pitch adjustment performed by modulating each sample of the audio data W.
  • a pitch adjustment method of adjusting a reading-out speed of the audio data W will be described below.
  • the sampled audio data W, which is waveform data having a predetermined time duration, is configured to have a plurality of frames. Each frame is adapted to correspond to one section within a plurality of sections configuring the analysis section.
  • if the user U performs the swing motion, the comparison unit 34 generates the comparison data Dc corresponding to each section. Based on the corresponding comparison data Dc, the audio control unit 36 determines the reading-out speed of the audio data W for the frame corresponding to each section, and reads out the audio data W of the corresponding frame from the storage device 14 at the determined reading speed.
  • when a reading speed faster than the standard reading speed is determined, a sound of the corresponding frame is reproduced so as to have a pitch higher than a reference pitch, and reading-out of the entire frame is completed before the time duration of the corresponding frame elapses.
  • in this case, the reading process re-starts from the foremost sample of the corresponding frame, and is continuously repeated until the time duration of the corresponding frame elapses.
  • when a reading speed slower than the standard reading speed is determined, a sound of the corresponding frame is reproduced with a pitch lower than the reference pitch, and reading of the entire frame is not completed before the time duration of the corresponding frame elapses.
  • in this case, a method can be considered in which reading of the corresponding frame is stopped at the point in time when the end of the time duration of the corresponding frame is reached, and the process proceeds to a new reading process for the samples of the subsequent frame.
  • with this method, however, the waveform can be discontinuous at the connecting portion between frames.
  • smooth waveform connection between the frames is achieved by using a known cross-fade process.
  • the above-described pitch adjustment method is also called a cut and splice method, and is disclosed in the related art of U.S. Pat. No. 5,952,596, for example.
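A heavily simplified sketch of such a cut-and-splice read-out follows (one frame, nearest-sample hops, and no cross-fade, all of which are simplifications relative to the method described above):

```python
def cut_and_splice_frame(frame, speed):
    """Read `frame` at `speed` times the standard reading speed while
    keeping the frame's duration: speed > 1 raises the pitch and wraps
    back to the start of the frame when reading finishes early; speed < 1
    lowers the pitch and never reaches the tail of the frame."""
    n = len(frame)
    out = []
    pos = 0.0
    for _ in range(n):                   # output duration == input frame duration
        out.append(frame[int(pos) % n])  # % n: re-start from the foremost sample
        pos += speed
    return out

cut_and_splice_frame([0, 1, 2, 3], 2.0)  # -> [0, 2, 0, 2]
cut_and_splice_frame([0, 1, 2, 3], 0.5)  # -> [0, 0, 1, 1]
```

The wrap-around visible in the first example is the splice point where a real implementation would apply the cross-fade mentioned above to avoid a waveform discontinuity.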
  • FIG. 8 is an explanatory view of a pitch in a reproduced sound according to a difference between the observation trajectory Oa and the reference trajectory Oref.
  • in FIG. 8 , a portion of the observation trajectory Oa (Oa 1 , Oa 2 ) in the analysis section and the time variation in the pitch Pa (Pa 1 , Pa 2 ) of the reproduced sound are illustrated in parallel.
  • a ball hitting point Q in FIG. 8 corresponds to the target observation point P at a point of time when the head Ch hits a ball.
  • each observation trajectory Oa is shown together with the reference trajectory Oref.
  • each pitch Pa is illustrated as a relative value with respect to the reference pitch Pref of the audio data W.
  • the audio control unit 36 converts the pitch of the audio data W according to each comparison data Dc so that the pitch Pa of the reproduced sound is raised as the observation trajectory Oa deviates toward the opposite side to the user U when viewed from the reference trajectory Oref, and the pitch Pa of the reproduced sound is lowered as the observation trajectory Oa deviates toward the user U side when viewed from the reference trajectory Oref.
  • the pitch Pa 1 in FIG. 8 is the pitch of the reproduced sound when the target observation point P is moved on the observation trajectory Oa 1 .
  • the observation trajectory Oa 1 before passing through the ball hitting point Q, is positioned at the opposite side to the user U when viewed from the reference trajectory Oref, and after passing through the ball hitting point Q, is positioned at the user U side when viewed from the reference trajectory Oref (from outside to inside). Therefore, when the target observation point P is moved on the observation trajectory Oa 1 , the pitch Pa 1 of the reproduced sound is higher than the reference pitch Pref before the target observation point P passes through the ball hitting point Q, and is lowered as the target observation point P is closer to the ball hitting point Q. Then, the pitch Pa 1 is lower than the reference pitch Pref after the target observation point P passes through the ball hitting point Q.
  • the pitch Pa 2 in FIG. 8 is the pitch of the reproduced sound when the target observation point P is moved on the observation trajectory Oa 2 .
  • the observation trajectory Oa 2 before passing through the ball hitting point Q, is positioned at the user U side when viewed from the reference trajectory Oref, and after passing through the ball hitting point Q, is positioned at the opposite side to the user U side when viewed from the reference trajectory Oref (from inside to outside). Therefore, when the target observation point P is moved on the observation trajectory Oa 2 , the pitch Pa 2 of the reproduced sound is lower than the reference pitch Pref before the target observation point P passes through the ball hitting point Q, and is raised as the target observation point P is closer to the ball hitting point Q. Then, the pitch Pa 2 is higher than the reference pitch Pref after the target observation point P passes through the ball hitting point Q.
  • the user U can intuitively understand how the difference between the observation trajectory Oa and the reference trajectory Oref is changed at each point in time (with the lapse of time).
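The side-dependent pitch behavior of FIG. 8 could be modeled, for instance, by an exponential curve around the reference pitch Pref. The actual mapping used in the device is not specified; the function name, the Pref value and the sensitivity factor below are all illustrative assumptions.

```python
def reproduced_pitch(deviation, pref=440.0, sensitivity=0.5):
    """Map a signed deviation of Oa from Oref to a pitch Pa around Pref:
    positive deviation (opposite side to the user U) -> Pa above Pref,
    negative deviation (user U side)                 -> Pa below Pref,
    zero deviation (Oa on Oref)                      -> Pa == Pref."""
    return pref * (2.0 ** (sensitivity * deviation))
```

An exponential mapping is one natural choice because pitch perception is roughly logarithmic in frequency, so equal deviations on either side of Oref sound like equal pitch steps above and below Pref.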
  • the audio signal S is generated according to the comparison result between the observation data Db and the reference data Dref. Therefore, the user U can easily understand the difference between the observation trajectory Oa of the target observation point P and the reference trajectory Oref indicated by the reference data Dref.
  • the audio signal S is generated with respect to the swing motion on a real-time basis. Therefore, as compared to a configuration where a sound is reproduced after the swing motion is performed, the user U can intuitively understand the relationship between the actual swing motion and the difference between the observation trajectory Oa and the reference trajectory Oref.
  • the comparison unit 34 sequentially compares the observation data with reference data concurrently with the swing motion of the user U, and the audio control unit 36 generates the audio signal concurrently with each comparison performed by the comparison unit 34 .
  • an audio signal is generated with respect to the action of a user on a real time basis. Therefore, as compared to a configuration where the audio signal is generated after the action to be analyzed is performed, the user can intuitively understand the actual action and a difference between a trajectory of a target observation point and a predetermined trajectory.
  • the reference data series Sref are stretched on the time axis according to an average swing speed of the user U. Therefore, it is possible to appropriately evaluate the difference between the observation trajectory of the swing motion of the user U and the reference trajectory Oref.
  • the comparison unit 34 stretches time series of reference data on a time axis and compares each stretched reference data with observation data.
  • the time series of the reference data are stretched on the time axis. Therefore, for example, if the time series of the reference data are stretched according to an action speed of a user, it is possible to appropriately evaluate a difference between a trajectory of a target observation point and a predetermined trajectory, as compared to a case of fixed time duration of time series of the reference data.
  • the storage device 14 of the second embodiment stores three types of audio data W (Wx, Wy and Wz), each indicating the waveform of a different sound (for example, warning sounds such as “beeping sounds” that differ in pitch or sound quality).
  • the audio control unit 36 of the present embodiment controls the audio data Wx to be reproduced/stopped according to a comparison result where the comparison unit 34 compares the observation trajectory Oa with the reference trajectory Oref in the X-axis direction (comparison value ⁇ Tx), controls the audio data Wy to be reproduced/stopped according to a comparison result in the Y-axis direction (comparison value ⁇ Ty), and controls the audio data Wz to be reproduced/stopped according to a comparison result in the Z-axis direction (comparison value ⁇ Tz).
  • the audio signal S is generated by adding the audio data Wx, the audio data Wy and the audio data Wz.
  • when the comparison value ΔT is below the threshold value TH (when the difference between the observation value B and the reference value R is small), the audio control unit 36 stops reproducing the audio data W corresponding to the associated axis direction; when the comparison value ΔT is beyond the threshold value TH (when the difference is large), the audio control unit 36 reproduces the audio data W. It is also possible to individually set the threshold value TH for each axis direction.
  • FIG. 9 is an explanatory view illustrating the reproduction/stop of the audio data W for each period of time (t1, t2 and t3) of the observation trajectory Oa.
  • Within the observation trajectory Oa, during the period of time t1 while the comparison value ΔTx and the comparison value ΔTz are below the threshold value TH and the comparison value ΔTy is beyond the threshold value TH, only the audio data Wy corresponding to the Y-axis direction is reproduced, and the reproduction of the audio data Wx and the audio data Wz is stopped.
  • during the period of time while all of the comparison values ΔTx, ΔTy and ΔTz are below the threshold value TH, none of the audio data Wx to Wz is reproduced.
  • during the period of time while the comparison value ΔTx and the comparison value ΔTy are beyond the threshold value TH and the comparison value ΔTz is below the threshold value TH, a mixed sound of the audio data Wx and the audio data Wy is reproduced and the audio data Wz is not reproduced.
  • the same advantageous effects can be achieved as those in the first embodiment.
  • the comparison result between the observation trajectory Oa and the reference trajectory Oref is individually reflected on the audio signal S in each axis direction. Therefore, the user U can recognize which direction of three axis directions has caused the difference between the observation trajectory Oa and the reference trajectory Oref.
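The per-axis reproduce/stop control with threshold TH can be sketched as follows. The dictionary layout, the sample values, and the function name are assumptions made for illustration:

```python
def mix_axis_audio(comparison, sounds, threshold):
    """Reproduce only the audio data whose axis comparison value exceeds
    the threshold TH, and sum (mix) the reproduced waveform samples.
    `comparison` maps axis -> comparison value; `sounds` maps axis ->
    the current sample of that axis's warning sound."""
    return sum(sounds[axis] for axis, value in comparison.items()
               if abs(value) > threshold)

sounds = {"x": 0.2, "y": 0.5, "z": 0.3}
# A period like t1 in FIG. 9: only the Y-axis deviation exceeds TH,
# so only the audio data Wy contributes to the output.
t1 = mix_axis_audio({"x": 0.1, "y": 2.0, "z": 0.4}, sounds, threshold=1.0)
# A period where no deviation exceeds TH: nothing is reproduced.
t2 = mix_axis_audio({"x": 0.1, "y": 0.2, "z": 0.4}, sounds, threshold=1.0)
```

Mixing by simple summation corresponds to the description of generating the audio signal S by adding the audio data Wx, Wy and Wz.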
  • the audio control unit 36 selects the audio data according to an instruction from the user within a plurality of audio data items indicating different sounds, and converts pitch and/or tempo of the selected data according to the comparison result obtained by using the comparison unit, thereby generating the audio signal.
  • FIG. 10 is a block diagram of the motion analysis system 100 according to a third embodiment.
  • the motion analysis system 100 in the third embodiment is configured to additionally include a delay device 15 in the motion analysis system 100 of the first embodiment.
  • the delay device 15 delays the audio signal S by delay time ⁇ . Therefore, the audio signal S is reproduced in the sound emitting device 16 after the delay time ⁇ elapses from when the audio control unit 36 starts the generation.
  • the generation of the audio signal S (generation of the comparison data Dc) is started at the action start point. Accordingly, the reproduction of the audio signal S is started at a point of time when the delay time ⁇ elapses from the action start point. That is, the audio signal S is not reproduced from the action start point until the delay time ⁇ elapses.
  • An element (buffer) which temporarily holds and outputs the audio signal S is used as the delay device 15 .
  • the reproduction of the audio signal S is started at the point of time when the delay time ⁇ elapses from the action start point. Accordingly, it is possible to prevent concentration of the user U from being hindered before and after the action start point.
  • the time duration from the action start point until the ball is hit is approximately 500 milliseconds. Therefore, if the delay time τ is set to approximately 500 milliseconds, there is an advantageous effect in that the user U is not disturbed during the action from the action start point until the ball is hit, which is the time when the user U particularly needs to concentrate their attention.
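The delay device 15 can be sketched as a fixed-length FIFO buffer: samples written in come back out a fixed number of steps later, so reproduction starts only after the delay time τ has elapsed. The class name and the sampling rate used in the example are assumptions:

```python
from collections import deque

class DelayDevice:
    """Minimal fixed-delay buffer sketch of the delay device 15."""
    def __init__(self, delay_samples):
        # Pre-fill with silence: the first `delay_samples` outputs are
        # zero, which realizes "no sound until tau elapses".
        self.buffer = deque([0.0] * delay_samples)

    def process(self, sample):
        self.buffer.append(sample)
        return self.buffer.popleft()

# For tau = 500 ms at an assumed 44.1 kHz sampling rate:
delay = DelayDevice(delay_samples=int(0.5 * 44100))
```

Feeding the audio signal S through `process` sample by sample reproduces it shifted by exactly the buffer length.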
  • the configuration of the third embodiment (delay device 15 ) can also be applied to the second embodiment.
  • the motion analysis device of the third embodiment includes the delay device 15 which delays the audio signal after the generation by using the audio control unit 36 .
  • in the embodiments described above, the change amount in the acceleration (Ax, Ay and Az) in each axis direction is set to be the observation data Db as an example. It is also possible to use the acceleration (Ax, Ay and Az) itself as the observation data Db; in this case, a numerical value of the acceleration itself in each direction can be used as the reference data Dref.
  • an element for detecting (detector) the movement of the target observation point P is not limited to the acceleration sensor 20 .
  • instead of the acceleration sensor 20, or together with the acceleration sensor 20, it is also possible to use a speed sensor which detects a speed of the target observation point P or a direction sensor (for example, a gyro sensor) which detects the direction of the movement of the target observation point P.
  • the observation data Db may be time-series data indicating the observation trajectory Oa of the target observation point P.
  • the reference data Dref may be time-series data indicating the reference trajectory Oref.
  • the pitch of the reproduced sound is changed according to the difference between the observation trajectory Oa and the reference trajectory Oref, but any modulation method of the audio data W may be used.
  • when the audio control unit 36 provides the audio data W with various sound effects (for example, an echo effect), it is also possible to control the extent of the sound effect provided to the audio data W according to the difference between the observation trajectory Oa and the reference trajectory Oref.
  • the audio control unit 36 may generate the audio signal S according to the comparison result (comparison data Dc) using the comparison unit 34 , and specific content in the process thereof is not limited thereto.
  • the audio control unit 36 may be configured to select the audio data W according to an instruction of the user U out of the plurality of audio data W, and to convert the pitch and/or tempo of the selected audio data W.
  • a plurality of audio data W, each indicating the wind noise generated by a different type of club C during the swing motion, are stored in the storage device 14.
  • the audio control unit 36 selects the audio data W according to the types of the club C used by the user U from the storage device 14 , and generates the audio signal S by way of the modulation of the audio data W to which the comparison data Dc is applied.
  • the type of the club C (for example, a driver or an iron) is indicated by the user to the motion analysis device 10 by operating, for example, an input device.
  • the storage device 14 stores sound effect data indicating a wave form of the sound effects.
  • the sound effects are sounds such as a shout of joy or a sound of applause heard when a ball enters a hole cup.
  • the audio control unit 36 counts the number N of comparison data Dc, out of the comparison data Dc sequentially generated by the comparison unit 34, in which a comparison value ΔT (ΔTx, ΔTy or ΔTz) is beyond a threshold value.
  • when the number N is below a predetermined value, the audio control unit 36 acquires sound effect data from the storage device 14 and supplies the sound effect data to the sound emitting device 16 as the audio signal S. That is, the audio signal S is reproduced as a sound in which the sound effects are added immediately after the sound indicated by the audio data W (the wind noise generated by the club C during the swing motion).
  • the audio control unit 36 controls whether to add the sound effects to the audio signal S according to a degree of approximation between the observation trajectory Oa and the reference trajectory Oref.
  • according to a degree of approximation between a trajectory specified by observation data and a predetermined trajectory, the audio control unit 36 generates an audio signal in which predetermined sound effects are added to a sound corresponding to a comparison result obtained by using the comparison unit 34.
  • according to the degree of approximation between the trajectory of the target observation point and the predetermined trajectory, the sound corresponding to the comparison result obtained by using the comparison unit 34 and the predetermined sound effects are reproduced. Therefore, there is an advantage in that the user can intuitively recognize whether the action is good or not.
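The decision of whether to add the sound effects — counting how many comparison data exceed the threshold and adding the effect only when the count N stays small — can be sketched as below. The function name, the threshold, and the count limit are assumptions:

```python
def should_add_sound_effect(comparison_series, threshold, max_exceed):
    """Count the comparison data Dc in which any axis value (dTx, dTy,
    dTz) is beyond the threshold; add the congratulatory sound effect
    only when that count N is small, i.e. when the observation
    trajectory sufficiently approximates the reference trajectory."""
    n = sum(1 for dtx, dty, dtz in comparison_series
            if abs(dtx) > threshold or abs(dty) > threshold or abs(dtz) > threshold)
    return n <= max_exceed

# A swing whose deviations stay small earns the applause effect;
# one with repeated large deviations does not.
good_swing = [(0.1, 0.0, 0.2), (0.0, 0.1, 0.0)]
bad_swing = [(2.0, 0.0, 0.0), (0.0, 3.0, 0.0), (0.0, 0.0, 4.0)]
```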
  • the delay device 15 delays the audio signal S by the predetermined delay time ⁇ , but the delay time ⁇ can be controlled so as to be variable.
  • the delay device 15 delays the audio signal S from the action start point to the point of time a ball is hit (that is, the delay time ⁇ is set to be the time from the action start point to the point of time a ball is hit).
  • the sound is not reproduced from the action start point to the point of time a ball is hit, but the sound is reproduced after the point of time a ball is hit.
  • each element described above as an example can be appropriately omitted.
  • various data items may also be stored in the storage device 14 by incorporating them from an external device separate from the motion analysis device 10.
  • the sound emitting device 16 can be omitted in a configuration where the audio signal S generated by the audio control unit 36 is transmitted to the external device via a communication network or a portable recording medium and is reproduced in the sound emitting device 16 of the external device.
  • the observation data acquisition unit 32 sequentially generates the observation data Db by using the sensor output Da supplied from the acceleration sensor 20 .
  • the observation data acquisition unit 32 may instead receive the observation data Db which is sequentially generated by the acceleration sensor 20. That is, an element acquiring the observation data Db (observation data acquisition means) includes both an element which itself generates the observation data Db from the detection result of the acceleration sensor 20 and an element which receives the observation data Db from an external device (the acceleration sensor 20).
  • the motion analysis device 10 which analyzes the swing motion of the golf club C has been described as an example.
  • a motion for which the motion analysis system 100 (motion analysis device 10) can be used is not limited to the swing motion in golf described above as an example; the motion analysis device 10 can also be used to analyze other actions.
  • when the comparison unit 34 increases or decreases the amount of reference data Dref by way of the interpolation process or the thinning process for the reference data series Sref, it is also possible to change the amount of reference data Dref per unit of time between the inside and the outside of a predetermined section in the analysis section.
  • the motion analysis device is operated by hardware (electronic circuit) such as a digital signal processor (DSP) which is exclusively used to analyze motions of a user, or by cooperation of a program with a general-purpose arithmetic processing unit such as a central processing unit (CPU).
  • a program of the present invention causes a computer to implement an observation data acquisition process for acquiring observation data indicating a trajectory of a target observation point which moves in conjunction with action of a user; a comparison process for comparing each of a plurality of reference data indicating a predetermined trajectory of the target observation point with the observation data acquired by the observation data acquisition process; and an audio control process used to generate an audio signal according to a result of the comparison process.
  • the recording medium is a non-transitory recording medium, and is preferably an optical recording medium (optical disc) such as a CD-ROM.
  • the recording medium can include any known type of recording medium such as a semiconductor recording medium and a magnetic recording medium.
  • the program of the present invention can be provided in a distributing manner via a communication network, for example, by a distributing server device, and then can be installed in the computer.

Abstract

A motion analysis device includes an observation data acquisition unit that acquires observation data indicating a trajectory of a target observation point which moves in conjunction with motion of a user (for example, a specified point in a club used by the user); a comparison unit that compares each of a plurality of reference data indicating a predetermined trajectory of the target observation point with the observation data acquired by the observation data acquisition unit; and an audio control unit that generates an audio signal according to a comparison result obtained by using the comparison unit.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a technology for analyzing a motion of a user. Priority is claimed on Japanese Patent Application No. 2012-279501, filed on Dec. 21, 2012, the content of which is incorporated herein by reference.
  • 2. Description of Related Art
  • In the related art, various technologies have been proposed in order to analyze a motion of a user. For example, Japanese Unexamined Patent Application, First Publication No. H06-39070 discloses a technology which displays a moving image of a swing motion of the user concurrently with a moving image of a pre-recorded reference use swing motion (for example, a swing motion of a professional golfer) on an identical screen.
  • SUMMARY OF THE INVENTION
  • According to the technology disclosed in Japanese Unexamined Patent Application, First Publication No. H06-39070, the user analyzes their own swing motion by visually comparing their own swing motion with a reference use swing motion. However, in practice, it is difficult to understand the difference between both motions by accurately and precisely comparing the motion of the user with the reference-purpose motion while visually checking the moving images on the screen. In view of the above-described circumstances, the present invention aims to enable the user to easily understand the difference between the motion of the user and the reference use motion.
  • In order to solve the above-described problem, a motion analysis device of the present invention includes observation data acquisition means for acquiring observation data which indicates a trajectory of a target observation point moving in conjunction with a motion of a user; comparison means for comparing reference data which indicates a predetermined trajectory of the target observation point with the observation data which is acquired and generated by the observation data acquisition means; and audio control means used to generate an audio signal according to a comparison result from the comparison means.
  • According to the present invention, the audio signal can be generated according to the comparison result between the observation data and the reference data. Therefore, the user can easily understand a difference between the trajectory of the target observation point and the predetermined trajectory indicated by the reference data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an external view illustrating a motion analysis system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating the motion analysis system according to the first embodiment of the present invention.
  • FIG. 3 illustrates a target observation point according to the first embodiment of the present invention.
  • FIG. 4 is a schematic diagram illustrating observation data according to the first embodiment of the present invention.
  • FIG. 5 is a schematic diagram illustrating a reference data sequence according to the first embodiment of the present invention.
  • FIG. 6 is a schematic diagram illustrating comparison data according to the first embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a comparison process performed by a comparison unit according to the first embodiment of the present invention.
  • FIG. 8 is a graph illustrating an audio signal generated by an audio control unit according to the first embodiment of the present invention.
  • FIG. 9 is a graph illustrating an audio signal generated by an audio control unit according to a second embodiment of the present invention.
  • FIG. 10 is a block diagram illustrating a motion analysis system according to a third embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION First Embodiment
  • FIG. 1 is an external view of a motion analysis system 100 according to a first embodiment of the present invention. FIG. 2 is a block diagram of the motion analysis system 100.
  • As illustrated in FIGS. 1 and 2, the motion analysis system 100 includes a motion analysis device 10 and an acceleration sensor 20. The motion analysis device 10 analyzes a motion of a user U and notifies the user U of an analysis result, and is preferably used when practicing a specific action in various sports. The motion analysis device 10 of the first embodiment analyzes a motion of the user U swinging a golf club C (hereinafter, referred to as a “swing motion”). More specifically, the motion analysis device 10 analyzes movement of a point P which moves in conjunction with the swing motion of the user U (hereinafter, referred to as a “target observation point”). The target observation point P of the first embodiment is a specific point in the club C used by the user U.
  • More specifically, as illustrated in FIG. 3, a tip portion of a grip Cg fixed to a shaft Cs of the club C (the end portion on the head Ch side) is set to be the target observation point P. Other points of the club C (for example, a point on the head Ch or on the shaft Cs) or a point within the body of the user U which moves in conjunction with the swing motion can also be set to be the target observation point P.
  • The acceleration sensor 20 in FIGS. 1 and 2 is a detector which detects movement of the target observation point P (swing motion of the user U), and sequentially generates a sensor output Da corresponding to the movement of the target observation point P at a predetermined cycle. As illustrated in FIG. 3, the acceleration sensor 20 of the present embodiment is a three-axis acceleration sensor that detects acceleration in each direction of three axes (X-axis, Y-axis and Z-axis) which are fixed to the target observation point P and are orthogonal to one another. The Z-axis is an axis which is parallel to a longitudinal direction of the shaft Cs of the club C. The Y-axis and the X-axis are axes on a plane which is orthogonal to the Z-axis. One sensor output Da is configured to include acceleration Ax in the X-axis direction, acceleration Ay in the Y-axis direction and acceleration Az in the Z-axis direction. Each sensor output Da which is sequentially generated by the acceleration sensor 20 is transmitted to the motion analysis device 10 in a time-series manner. The acceleration sensor 20 and the motion analysis device 10 perform data communication with each other in a wireless manner, but may perform the data communication by wire.
  • As illustrated in FIG. 2, the motion analysis device 10 is operated by a computer system which includes an arithmetic processing unit 12, a storage device 14 and a sound emitting device 16. The storage device 14 stores programs implemented by the arithmetic processing unit 12 and various data items used by the arithmetic processing unit 12 (for example, audio data W or reference data series Sref). A known recording medium such as a semiconductor storage medium or a magnetic recording medium, or a combination of multiple types of recording media, may be optionally employed as the storage device 14. The sound emitting device 16 is audio equipment (for example, a speaker) which reproduces a sound wave corresponding to an audio signal S generated by the arithmetic processing unit 12.
  • The arithmetic processing unit 12 realizes a plurality of functions (observation data acquisition unit 32, comparison unit 34 and audio control unit 36) for analyzing the motion of the user U by executing a program stored in the storage device 14. It is also possible to distribute each function of the arithmetic processing unit 12 to a plurality of devices.
  • The observation data acquisition unit 32 sequentially acquires observation data Db indicating a trajectory (hereinafter, referred to as an “observation trajectory”) Oa of the target observation point P corresponding to the swing motion of the user U. More specifically, the observation data acquisition unit 32 sequentially generates the observation data Db from the sensor outputs Da supplied by the acceleration sensor 20 in a time-series manner. As illustrated in FIG. 4, one observation data item Db is configured to include an observation value Bx, an observation value By and an observation value Bz. The observation value Bx is a difference (variation) in the acceleration Ax between two sensor outputs Da which are generated in succession, the observation value By is a difference in the acceleration Ay between two sensor outputs Da which are generated in succession, and the observation value Bz is a difference in the acceleration Az between two sensor outputs Da which are generated in succession. The cycle at which the observation data acquisition unit 32 acquires the observation data Db is set to be a sufficiently short time (for example, one millisecond) as compared to the time for the user U to perform the swing motion.
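The derivation of each observation data item Db from two successive sensor outputs Da can be sketched as follows; plain tuples stand in for the data structures, and the function name is an assumption:

```python
def observation_data(sensor_outputs):
    """Each observation data item Db = (Bx, By, Bz) is the difference
    in acceleration between two sensor outputs Da = (Ax, Ay, Az)
    generated in succession."""
    return [tuple(cur - prev for prev, cur in zip(da1, da2))
            for da1, da2 in zip(sensor_outputs, sensor_outputs[1:])]
```

Note that n sensor outputs yield n − 1 observation data items, since each Db compares a pair of consecutive outputs.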
  • The storage device 14 in FIG. 2 stores the audio data W and the reference data series Sref. The audio data W of the present embodiment is data which indicates specific audio waveforms. For example, a wind noise generated by the club C during the swing motion is recorded, and digital data sampled at a predetermined frequency (for example, 44.1 kHz) is stored in the storage device 14 in advance as the audio data W.
  • FIG. 5 is a schematic diagram of the reference data series Sref. The reference data series Sref indicates a trajectory (hereinafter, referred to as a “reference trajectory”) Oref of the target observation point P over predetermined time duration. As illustrated in FIG. 5, the reference data series Sref are time series of a plurality of reference data Dref. Each reference data Dref is compared with each observation data Db in order to evaluate the swing motion of the user U, and is configured to include a reference value Rx, a reference value Ry and a reference value Rz.
  • The reference trajectory Oref is a standard of an observation trajectory Oa specified by each observation data Db. For example, a trajectory of the target observation point P when a performer such as a professional golfer skilled in the swing motion performs a standard swing motion is preferably employed as the reference trajectory Oref. More specifically, when the standard performer performs the swing motion, time series of the plurality of observation data Db generated by the observation data acquisition unit 32 are stored in the storage device 14 in advance as the reference data series Sref (each reference data Dref). Therefore, the reference value Rx of each reference data Dref corresponds to a change amount of the acceleration Ax when performing the standard swing motion, the reference value Ry corresponds to a change amount of the acceleration Ay and the reference value Rz corresponds to a change amount of the acceleration Az.
  • The comparison unit 34 in FIG. 2 compares each observation data Db acquired by the observation data acquisition unit 32 with each reference data Dref of the reference data series Sref stored in the storage device 14. More specifically, the comparison unit 34 reads out the reference data Dref from the reference data series Sref of the storage device 14 in chronological order each time the observation data acquisition unit 32 acquires the observation data Db, and generates comparison data Dc by calculating the difference between the observation data Db and the reference data Dref.
  • As illustrated in FIG. 6, one comparison data Dc is configured to include a comparison value ΔTx, a comparison value ΔTy and a comparison value ΔTz. The comparison value ΔTx is a difference between the observation value Bx of the observation data Db and the reference value Rx of the reference data Dref. Similarly, the comparison value ΔTy is a difference between the observation value By and the reference value Ry, and the comparison value ΔTz is a difference between the observation value Bz and the reference value Rz. Time series of the plurality of observation data Db correspond to the observation trajectory Oa and the reference data series Sref correspond to the reference trajectory Oref. Accordingly, the comparison data Dc corresponds to data indicating a difference between the observation trajectory Oa and the reference trajectory Oref.
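Computed per axis, one comparison data item is simply the element-wise difference between the observation values and the reference values (a sketch; tuple representation and name are assumptions):

```python
def comparison_data(db, dref):
    """One comparison data item Dc = (dTx, dTy, dTz): the per-axis
    difference between the observation values (Bx, By, Bz) of Db and
    the reference values (Rx, Ry, Rz) of Dref."""
    return tuple(b - r for b, r in zip(db, dref))
```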
  • The comparison unit 34 of the present embodiment compares the observation data Db with the reference data Dref in a predetermined analysis section within a section from the start to the completion of the swing motion performed by the user U. The analysis section is a section from a point of time when the user U starts downswing motion (action of swinging the club C down) after takeaway action in backswing (hereinafter, an “action start point”) until predetermined time duration T elapses.
  • The time duration T of the analysis section is set according to the time duration from the action start point of the user U until the user U completes the follow-through action (the finishing action in swinging the club C). The time duration from an actual action start point until the swing is completed varies depending on the swing speed of the user U. In the present embodiment, an average swing speed of the user U is calculated based on time series of the observation data Db measured in advance multiple times. The time duration T of the analysis section corresponding to the average swing speed is selected for each user U and is stored in the storage device 14.
  • FIG. 7 is a flowchart in a process where the comparison unit 34 compares each observation data Db and each reference data Dref (hereinafter, referred to as a “comparison process”). For example, when the user U instructs the analysis to start by operating an input device (not illustrated), the comparison process in FIG. 7 is performed.
  • The comparison unit 34 detects the action start point by utilizing each observation data Db (S1). Considering that a change amount in the acceleration of the target observation point P has a tendency to increase immediately after the start of the downswing motion, the comparison unit 34 of the first embodiment detects the action start point according to a change amount ΔA of the acceleration indicated by each observation data Db.
  • More specifically, the comparison unit 34 sequentially determines whether or not the change amount ΔA in the acceleration indicated by the observation data Db which is sequentially supplied from the observation data acquisition unit 32 is beyond a predetermined threshold value ATH. For example, the change amount ΔA is the sum of an absolute value of the observation value Bx, an absolute value of the observation value By and an absolute value of the observation value Bz. The comparison unit 34 repeats Step S1 until the change amount ΔA is beyond the predetermined threshold value ATH (S1: NO), and detects a point in time when the change amount ΔA is beyond the predetermined threshold value ATH as the action start point (S1: YES). Then, the process proceeds to Step S2.
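Step S1 can be sketched as a scan over the incoming observation data; the function name and the return convention are assumptions:

```python
def detect_action_start(observation_stream, ath):
    """Step S1: return the index of the first observation data Db whose
    change amount dA = |Bx| + |By| + |Bz| is beyond the threshold ATH,
    or None if the threshold is never exceeded."""
    for i, (bx, by, bz) in enumerate(observation_stream):
        if abs(bx) + abs(by) + abs(bz) > ath:
            return i
    return None
```

The large jump in the summed absolute change at the start of the downswing is what makes this simple threshold test sufficient.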
  • The comparison unit 34 stretches (expands or contracts) the reference data series Sref on a time axis according to the time duration T of the analysis section which is selected and stored in advance based on the average swing speed of the user U (S2). More specifically, time duration Tref from foremost reference data Dref of the reference data series Sref to backmost reference data Dref (time duration from a start point to an end point of the reference trajectory Oref) is adjusted to be the time duration T.
  • More specifically, when the time duration T is longer than the time duration Tref, the comparison unit 34 increases the amount of reference data Dref by performing an interpolation process on the reference data series Sref, thereby equalizing the amount of reference data Dref with the amount of observation data Db. For the interpolation process of the reference data series Sref, a known technology (for example, a linear interpolation process or a spline interpolation process) may be optionally employed.
  • On the other hand, when the time duration T is shorter than the time duration Tref, the comparison unit 34 decreases the amount of reference data Dref by performing a thinning process on the reference data series Sref, thereby equalizing the amount of reference data Dref with the amount of observation data Db. For the thinning process of the reference data series Sref, a known technology may be optionally employed. The reference trajectory Oref itself does not vary in the process of Step S2.
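Both directions of step S2 (interpolation when T > Tref, thinning when T < Tref) can be covered by one linear resampler over the time axis; this sketch operates on scalar values for brevity, and the function name is an assumption:

```python
def stretch_reference(ref_values, n_target):
    """Step S2 sketch: resample a time series of reference values to
    n_target items by linear interpolation on the time axis.  A larger
    n_target interpolates (adds data); a smaller one thins it.  The
    trajectory the series describes is unchanged."""
    n = len(ref_values)
    if n_target == 1 or n == 1:
        return [ref_values[0]] * n_target
    out = []
    for i in range(n_target):
        pos = i * (n - 1) / (n_target - 1)   # fractional index in the source
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(ref_values[lo] * (1.0 - frac) + ref_values[hi] * frac)
    return out
```

In the patent's setting each element would be a full reference data item (Rx, Ry, Rz), interpolated per axis.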
  • The comparison unit 34 compares each observation data Db which is sequentially generated by the observation data acquisition unit 32 with each of the reference data Dref of the reference data series Sref after adjustment in Step S2 (S3).
  • More specifically, the comparison unit 34 reads out each reference data Dref of the reference data series Sref after the adjustment, starting from the foremost, in chronological order each time the observation data acquisition unit 32 generates the observation data Db, and sequentially generates the comparison data Dc by setting a difference between the reference data Dref and the observation data Db to be the comparison data Dc. As illustrated in FIG. 7, the generation of the comparison data Dc (S3) is repeated from the action start point detected in Step S1 until it is determined in Step S4 that the time duration T of the analysis section elapses (S4: NO).
  • When it is determined that the time duration T elapses from the action start point (S4: YES), the comparison unit 34 completes the comparison process. As will be appreciated from the above description, the comparison data Dc indicating a difference between the observation trajectory Oa and the reference trajectory Oref is sequentially generated within the analysis section while the swing motion is performed.
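Steps S3 and S4 can be sketched as follows, processing the whole analysis section at once for clarity (the actual device operates sample by sample). Treating each data item as an (x, y, z) tuple and the function name are assumptions made here.

```python
def generate_comparison(observed, reference):
    """Each comparison data Dc is the per-axis difference between the
    reference data Dref and the observation data Db; the two series have
    equal length after the Step S2 adjustment."""
    return [tuple(r - b for r, b in zip(dref, db))
            for db, dref in zip(observed, reference)]

obs = [(1.0, 0.0, 0.0), (0.5, 0.5, 0.0)]
ref = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(generate_comparison(obs, ref))  # → [(0.0, 0.0, 0.0), (0.5, -0.5, 0.0)]
```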
  • The audio control unit 36 in FIG. 2 generates an audio signal S according to the comparison data Dc (comparison result of each observation data Db and each reference data Dref) which is sequentially generated by the comparison unit 34. More specifically, from the action start point detected by the comparison unit 34, the audio control unit 36 sequentially acquires each sample of audio data W from the storage device 14 in chronological order, and converts pitch and/or tempo of each sample of the audio data W according to the comparison data Dc generated by the comparison unit 34 immediately before each reading. The audio signal S which is generated by the audio control unit 36 is supplied to the sound emitting device 16 to be reproduced as a sound wave. A D/A converter which converts the digital audio signal S into the analog audio signal S is not illustrated for convenience.
  • More specifically, the audio control unit 36 changes the pitch in each sample of the audio data W according to the comparison data Dc. For example, the audio control unit 36 converts the audio data W so that the change in the pitch in each sample is larger as each comparison value (ΔTx, ΔTy and ΔTz) of the comparison data Dc is greater (that is, as the difference between the observation trajectory Oa and the reference trajectory Oref is larger). Each sample of the audio data W is sequentially (on a real-time basis) converted and output concurrently with the swing motion of the user U. That is, within the analysis section of the swing motion performed by the user U, the pitch in the reproduced sound varies moment by moment according to the difference between the observation trajectory Oa and the reference trajectory Oref.
  • Here, a known method is used for the pitch adjustment process that modulates each sample of the audio data W. As an example, a pitch adjustment method of adjusting the reading-out speed of the audio data W will be described below.
  • The sampled audio data W, which is waveform data having a predetermined time duration, is configured to have a plurality of frames. Each frame corresponds to one section within a plurality of sections configuring the associated analysis section. When the user U performs the swing motion, the comparison unit 34 generates the comparison data Dc corresponding to each section. Based on the corresponding comparison data Dc, the audio control unit 36 determines the speed at which the audio data W for the frame corresponding to each section is read out from the storage device 14, and reads out the audio data W of the corresponding frame from the storage device 14 at the determined reading speed. When a reading speed faster than the standard reading speed is determined according to a value of the comparison data Dc, the sound of the corresponding frame is reproduced with a pitch higher than the reference pitch. In this case, reading-out of the entire frame is completed before the time duration of the corresponding frame elapses. At the point of time when reading-out of the entire frame is completed at the fast speed, the reading process re-starts from the foremost sample of the corresponding frame, and is continuously repeated until the time duration of the corresponding frame elapses. In contrast, when a reading speed slower than the standard reading speed is determined, reading of the entire frame is not completed before the time duration of the corresponding frame elapses. In this case, a method can be considered in which reading of the corresponding frame is stopped at the point of time when the end of the time duration of the corresponding frame is reached, and the process proceeds to a new reading process for the samples of the subsequent frame. At a fast reading speed as well as at a slow reading speed, the waveform can be discontinuous in a connecting portion between frames.
However, it can be considered that smooth waveform connection between the frames is achieved by using a known cross-fade process.
  • The above-described pitch adjustment method is also called a cut and splice method, and is disclosed in the related art of U.S. Pat. No. 5,952,596, for example.
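The looping and truncation behavior of the cut-and-splice reading described above can be sketched as follows. This is a simplified illustration (the cross-fade smoothing between frames is omitted), and the function name and sample layout are assumptions made here.

```python
def render_frame(frame, speed):
    """Read one frame at `speed` times the standard rate while always
    emitting the frame's original duration: at fast speeds the read
    position wraps back to the foremost sample (looping); at slow
    speeds the tail of the frame is never reached (truncation)."""
    n = len(frame)
    out = []
    pos = 0.0
    for _ in range(n):                   # one output sample per original sample
        out.append(frame[int(pos) % n])  # wrap around when reading fast
        pos += speed
    return out

print(render_frame([0, 1, 2, 3], 2.0))  # → [0, 2, 0, 2]  (loops, pitch raised)
print(render_frame([0, 1, 2, 3], 0.5))  # → [0, 0, 1, 1]  (truncated, pitch lowered)
```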
  • FIG. 8 is an explanatory view of a pitch in a reproduced sound according to a difference between the observation trajectory Oa and the reference trajectory Oref.
  • In FIG. 8, a portion of the observation trajectory Oa (Oa1, Oa2) in the analysis section and a time variation in a pitch Pa (Pa1, Pa2) of the reproduced sound are illustrated in parallel. A ball hitting point Q in FIG. 8 corresponds to the target observation point P at a point of time when the head Ch hits a ball. In FIG. 8, each observation trajectory Oa is shown together with the reference trajectory Oref. Each pitch Pa is illustrated as a relative value in which a pitch in the audio data W is adapted to be a reference pitch Pref.
  • As illustrated in FIG. 8, the audio control unit 36 converts the pitch of the audio data W according to each comparison data Dc so that the pitch Pa in the reproduced sound is raised as the observation trajectory Oa is separated toward the user U side when viewed from the reference trajectory Oref, and is lowered as the observation trajectory Oa is separated to the side opposite to the user U when viewed from the reference trajectory Oref. A more specific description is as follows.
  • The pitch Pa1 in FIG. 8 is the pitch of the reproduced sound when the target observation point P is moved on the observation trajectory Oa1. The observation trajectory Oa1, before passing through the ball hitting point Q, is positioned at the opposite side to the user U when viewed from the reference trajectory Oref, and after passing through the ball hitting point Q, is positioned at the user U side when viewed from the reference trajectory Oref (from outside to inside). Therefore, when the target observation point P is moved on the observation trajectory Oa1, the pitch Pa1 of the reproduced sound is higher than the reference pitch Pref before the target observation point P passes through the ball hitting point Q, and is lowered as the target observation point P is closer to the ball hitting point Q. Then, the pitch Pa1 is lower than the reference pitch Pref after the target observation point P passes through the ball hitting point Q.
  • On the other hand, the pitch Pa2 in FIG. 8 is the pitch of the reproduced sound when the target observation point P is moved on the observation trajectory Oa2. The observation trajectory Oa2, before passing through the ball hitting point Q, is positioned at the user U side when viewed from the reference trajectory Oref, and after passing through the ball hitting point Q, is positioned at the opposite side to the user U side when viewed from the reference trajectory Oref (from inside to outside). Therefore, when the target observation point P is moved on the observation trajectory Oa2, the pitch Pa2 of the reproduced sound is lower than the reference pitch Pref before the target observation point P passes through the ball hitting point Q, and is raised as the target observation point P is closer to the ball hitting point Q. Then, the pitch Pa2 is higher than the reference pitch Pref after the target observation point P passes through the ball hitting point Q.
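The signed pitch mapping of FIG. 8 can be sketched as follows. The exponential mapping, the gain constant k, and the convention that a positive deviation means "toward the user U side" are all assumptions made here for illustration; the patent only specifies that the pitch rises on the user side and falls on the opposite side.

```python
def pitch_for_deviation(signed_dev, pref=440.0, k=0.5):
    """Map a signed trajectory deviation to a pitch Pa.
    signed_dev > 0: trajectory deviates toward the user -> pitch raised
    signed_dev < 0: trajectory deviates away from the user -> pitch lowered
    signed_dev = 0: the reference pitch Pref is reproduced unchanged."""
    return pref * (2.0 ** (k * signed_dev))

print(pitch_for_deviation(0.0))   # on the reference trajectory: Pref
print(pitch_for_deviation(1.0))   # user side: above Pref
print(pitch_for_deviation(-1.0))  # opposite side: below Pref
```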
  • According to the above-described configuration, by checking a change in the pitch Pa in the reproduced sound, the user U can intuitively understand how the difference between the observation trajectory Oa and the reference trajectory Oref is changed at each point in time (with the lapse of time).
  • As described above, in the first embodiment, the audio signal S is generated according to the comparison result between the observation data Db and the reference data Dref. Therefore, the user U can easily understand the difference between the observation trajectory Oa of the target observation point P and the reference trajectory Oref indicated by the reference data Dref.
  • In addition, the audio signal S is generated with respect to the swing motion on a real-time basis. Therefore, as compared to a configuration where a sound is reproduced after the swing motion is performed, the user U can intuitively understand the relationship between the actual swing motion and the difference between the observation trajectory Oa and the reference trajectory Oref.
  • That is, the comparison unit 34 sequentially compares the observation data with the reference data concurrently with the swing motion of the user U, and the audio control unit 36 generates the audio signal concurrently with each comparison performed by the comparison unit 34. In the above-described configuration, an audio signal is generated with respect to the action of a user on a real-time basis. Therefore, as compared to a configuration where the audio signal is generated after the action to be analyzed is performed, the user can intuitively understand the relationship between the actual action and a difference between a trajectory of a target observation point and a predetermined trajectory.
  • In a configuration of fixing the time duration of the reference data series Sref, when the time duration of the swing motion is different from the time duration of the reference data series Sref, even though the observation trajectory Oa itself approximates to the reference trajectory Oref, it can be evaluated that the observation trajectory Oa is different from the reference trajectory Oref. In the present embodiment, the reference data series Sref is stretched on the time axis according to an average swing speed of the user U. Therefore, it is possible to appropriately evaluate the difference between the observation trajectory of the swing motion of the user U and the reference trajectory Oref.
  • That is, the comparison unit 34 stretches time series of reference data on a time axis and compares each stretched reference data with observation data. In the above-described configuration, the time series of the reference data are stretched on the time axis. Therefore, for example, if the time series of the reference data are stretched according to an action speed of a user, it is possible to appropriately evaluate a difference between a trajectory of a target observation point and a predetermined trajectory, as compared to a case of fixed time duration of time series of the reference data.
  • Second Embodiment
  • A second embodiment of the present invention will be described below. In the following description, the reference numerals used in the above description will be given to configuring elements having an operation and a function which are the same as those in the first embodiment, and a detailed description thereof will be appropriately omitted here.
  • The storage device 14 of the second embodiment stores three types of audio data W (Wx, Wy and Wz) indicating waveforms of different sounds (for example, warning sounds such as "beeping sounds" having different pitches or sound qualities). The audio control unit 36 of the present embodiment controls the audio data Wx to be reproduced/stopped according to a comparison result where the comparison unit 34 compares the observation trajectory Oa with the reference trajectory Oref in the X-axis direction (comparison value ΔTx), controls the audio data Wy to be reproduced/stopped according to a comparison result in the Y-axis direction (comparison value ΔTy), and controls the audio data Wz to be reproduced/stopped according to a comparison result in the Z-axis direction (comparison value ΔTz). The audio signal S is generated by adding the audio data Wx, the audio data Wy and the audio data Wz.
  • More specifically, when the comparison value ΔT (ΔTx, ΔTy and ΔTz) in each axis direction is below the predetermined threshold value TH (when a difference is small between the observation value B and the reference value R), the audio control unit 36 stops reproducing of the audio data W corresponding to the associated axis direction. When the comparison value ΔT is beyond the threshold value TH (when the difference is large between the observation value B and the reference value R), the audio control unit 36 reproduces the audio data W. It is also possible to individually set the threshold value TH for each axis direction.
  • FIG. 9 is an explanatory view illustrating the reproduction/stop of the audio data W for each period of time (t1, t2 and t3) of the observation trajectory Oa. Within the observation trajectory Oa, during the period of time t1 while the comparison value ΔTx and the comparison value ΔTz are below the threshold value TH and the comparison value ΔTy is beyond the threshold value TH, only the audio data Wy corresponding to the Y-axis direction is reproduced, and the reproduction of the audio data Wx and the audio data Wz is stopped. Similarly, during the period of time t2 while all of the comparison value ΔTx, the comparison value ΔTy and the comparison value ΔTz are below the threshold value TH, none of the audio data Wx to Wz is reproduced. During the period of time t3 while the comparison value ΔTx and the comparison value ΔTy are beyond the threshold value TH and the comparison value ΔTz is below the threshold value TH, a mixed sound of the audio data Wx and the audio data Wy is reproduced and the audio data Wz is not reproduced.
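The per-axis gating of the second embodiment can be sketched as follows. The function name and the tuple layout of one comparison data Dc are assumptions made here; only the threshold rule is taken from the description.

```python
def active_sounds(dc, th):
    """Given one comparison data Dc = (dTx, dTy, dTz), return the names
    of the per-axis warning sounds to mix into the audio signal S; each
    axis sound plays only while its deviation is beyond the threshold TH."""
    names = ('Wx', 'Wy', 'Wz')
    return [name for name, dt in zip(names, dc) if abs(dt) > th]

# The three periods of FIG. 9: Y only, then none, then X and Y mixed
print(active_sounds((0.1, 0.9, 0.2), th=0.5))  # → ['Wy']
print(active_sounds((0.1, 0.2, 0.3), th=0.5))  # → []
print(active_sounds((0.8, 0.7, 0.1), th=0.5))  # → ['Wx', 'Wy']
```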
  • Even in the second embodiment, the same advantageous effects can be achieved as those in the first embodiment. In addition, in the second embodiment, the comparison result between the observation trajectory Oa and the reference trajectory Oref is individually reflected on the audio signal S in each axis direction. Therefore, the user U can recognize which direction of three axis directions has caused the difference between the observation trajectory Oa and the reference trajectory Oref.
  • That is, the audio control unit 36 selects the audio data according to an instruction from the user within a plurality of audio data items indicating different sounds, and converts pitch and/or tempo of the selected data according to the comparison result obtained by using the comparison unit, thereby generating the audio signal. In the above-described configuration, it is possible to diversify types of the reproduced sound of the audio signal as compared to a configuration of generating the audio signal by modulating one type of audio data.
  • Third Embodiment
  • FIG. 10 is a block diagram of the motion analysis system 100 according to a third embodiment. As illustrated in FIG. 10, the motion analysis system 100 in the third embodiment is configured to additionally include a delay device 15 in the motion analysis system 100 of the first embodiment. The delay device 15 delays the audio signal S by delay time δ. Therefore, the audio signal S is reproduced in the sound emitting device 16 after the delay time δ elapses from when the audio control unit 36 starts the generation. The generation of the audio signal S (generation of the comparison data Dc) is started at the action start point. Accordingly, the reproduction of the audio signal S is started at a point of time when the delay time δ elapses from the action start point. That is, the audio signal S is not reproduced from the action start point until the delay time δ elapses. An element (buffer) which temporarily holds and outputs the audio signal S is used as the delay device 15.
  • Even in the third embodiment, the same advantageous effects can be achieved as those in the first embodiment. In addition, in the third embodiment, the reproduction of the audio signal S is started at the point of time when the delay time δ elapses from the action start point. Accordingly, it is possible to prevent concentration of the user U from being hindered before and after the action start point. For example, the time duration from the action start point until ball hitting is 500 milliseconds. Therefore, if the delay time δ is set to be approximately 500 milliseconds, there is an advantageous effect in that it is possible to prevent the user U from being hindered during the action from the action start point until the ball hitting, which is the time for the user U to particularly concentrate their attention. The configuration of the third embodiment (delay device 15) can also be applied to the second embodiment.
  • That is, the motion analysis device of the third embodiment includes the delay device 15 which delays the audio signal after the generation by using the audio control unit 36. In this configuration, since the audio signal is delayed, it is possible to prevent the concentration of the user from being hindered during a period from when the generation of the audio signal is started until the delayed time elapses, for example.
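The buffer-based delay device 15 can be sketched as a simple FIFO line, assuming a fixed per-sample delay. The class name and the zero-filled initial state (silence until the delay time δ elapses) are illustrative choices, not the patent's implementation.

```python
from collections import deque

class DelayLine:
    """Delay an audio signal by a fixed number of samples using a FIFO
    buffer; the first outputs are silence, matching the behavior that
    no sound is reproduced until the delay time elapses."""
    def __init__(self, delay_samples):
        self.buf = deque([0.0] * delay_samples)

    def process(self, x):
        self.buf.append(x)       # newest input goes in at the back
        return self.buf.popleft()  # oldest sample comes out delayed

d = DelayLine(3)
print([d.process(x) for x in [1.0, 2.0, 3.0, 4.0, 5.0]])
# → [0.0, 0.0, 0.0, 1.0, 2.0]
```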
  • Modification Example
  • The above-described respective embodiments can be modified in various ways. Specific modification aspects will be described below. Two or more aspects which are optionally selected from the following example can be appropriately combined with one another.
  • (1) In each embodiment described above, the change amount in the acceleration (Ax, Ay and Az) in each axis direction is set to be the observation data Db as an example. However, it is also possible to use the acceleration (Ax, Ay and Az) itself as the observation data Db. Similarly, a numerical value itself of the acceleration in each axis direction can be used as the reference data Dref.
  • In addition, an element for detecting (detector) the movement of the target observation point P (swing motion of the user U) is not limited to the acceleration sensor 20. For example, instead of the acceleration sensor 20 (or together with the acceleration sensor 20), it is also possible to use a speed sensor which detects a speed of the target observation point P or a direction sensor (for example, a gyro sensor) which detects the direction of the movement of the target observation point P.
  • In addition, it is also possible to identify the observation trajectory Oa from video images in which the swing motion of the user U is videotaped using a video camera.
  • As will be appreciated from the above description, the observation data Db may be time-series data indicating the observation trajectory Oa of the target observation point P. Similarly, the reference data Dref may be time-series data indicating the reference trajectory Oref.
  • (2) In each embodiment described above, the pitch in the reproduced sound is changed according to the difference between the observation trajectory Oa and the reference trajectory Oref, but the modulation method of the audio data W may be optionally used. For example, it is also possible to change a sound volume of the audio data W according to the difference between the observation trajectory Oa and the reference trajectory Oref (each comparison data Dc), for example. In addition, in a configuration where the audio control unit 36 provides the audio data W with various sound effects (for example, an echo effect), it is also possible to control the extent of the sound effect to be provided for the audio data W according to the difference between the observation trajectory Oa and the reference trajectory Oref.
  • As will be appreciated from the above description, the audio control unit 36 may generate the audio signal S according to the comparison result (comparison data Dc) using the comparison unit 34, and specific content in the process thereof is not limited thereto.
  • (3) It is also possible to selectively use a plurality of audio data W indicating different sounds. More specifically, it is preferable that the audio control unit 36 is configured so as to select the audio data W according to an instruction of the user U out of the plurality of audio data W and to convert pitch and/or tempo of the selected audio data W. For example, the plurality of audio data W indicating the wind noise generated by different types of the club C during the swing motion is stored in the storage device 14. The audio control unit 36 selects the audio data W from the storage device 14 according to the type of the club C used by the user U, and generates the audio signal S by modulating the selected audio data W according to the comparison data Dc. The type of the club C (for example, a driver or an iron) is designated by the user to the motion analysis device 10 by operating, for example, an input device.
  • According to the above-described configuration, it is possible to diversify the types of the reproduced sound.
  • (4) It is also possible to reproduce a specific sound (for example, sound effects) when the observation trajectory Oa approximates to the reference trajectory Oref. For example, the storage device 14 stores sound effect data indicating a waveform of the sound effects. For example, the sound effects are sounds such as the sound of a ball entering a hole cup, a shout of joy, or the sound of applause.
  • The audio control unit 36 counts the amount N of the comparison data Dc, out of the comparison data Dc which is sequentially generated by the comparison unit 34, in which each comparison value ΔT (ΔTx, ΔTy and ΔTz) is beyond a threshold value. When the amount N after the completion of the swing motion is below a predetermined threshold value (that is, when the observation trajectory Oa approximates to the reference trajectory Oref), the audio control unit 36 acquires the sound effect data from the storage device 14 and supplies the sound effect data to the sound emitting device 16 as the audio signal S. That is, the audio signal S of a sound in which the sound effects are added immediately after the sound indicated by the audio data W (wind noise generated by the club C during the swing motion) is reproduced.
  • In the above-described description, the sound effects are reproduced when the observation trajectory Oa approximates to the reference trajectory Oref. Therefore, there is an advantage in that the user U can intuitively recognize whether or not their own swing motion (observation trajectory Oa) is good.
  • When the observation trajectory Oa is different from the reference trajectory Oref (when the above-described amount N is beyond the threshold value), the sound effects can be added to the audio signal S. That is, the audio control unit 36 controls whether to add the sound effects to the audio signal S according to a degree of approximation between the observation trajectory Oa and the reference trajectory Oref.
  • That is, in this modification example, according to a degree of approximation between a trajectory specified by observation data and a predetermined trajectory, the audio control unit 36 generates an audio signal in which predetermined sound effects are added to a sound according to a comparison result obtained by using the comparison unit 34. In this configuration, according to the degree of approximation between the trajectory of the target observation point and the predetermined trajectory, the sound according to the comparison result obtained by using the comparison unit 34 and the predetermined sound effects are reproduced. Therefore, there is an advantage in that the user can intuitively recognize whether the action is good or not.
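The count-based trigger of this modification example can be sketched as follows. The function name, the per-axis deviation test, and the two threshold parameters are assumptions made here; the description only states that the count N of large-deviation comparison data decides whether the sound effects are added.

```python
def should_add_applause(dc_series, dev_th, count_th):
    """Count the comparison data Dc whose deviation on any axis is
    beyond dev_th; the celebratory sound effect is added only when
    that count N stays below count_th after the swing completes."""
    n = sum(1 for (dtx, dty, dtz) in dc_series
            if abs(dtx) > dev_th or abs(dty) > dev_th or abs(dtz) > dev_th)
    return n < count_th

close_swing = [(0.1, 0.1, 0.1)] * 10   # trajectory stays near Oref
rough_swing = [(0.9, 0.1, 0.1)] * 10   # X-axis deviation is large throughout
print(should_add_applause(close_swing, dev_th=0.5, count_th=3))  # → True
print(should_add_applause(rough_swing, dev_th=0.5, count_th=3))  # → False
```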
  • (5) In the third embodiment, the delay device 15 delays the audio signal S by the predetermined delay time δ, but the delay time δ can also be controlled so as to be variable. For example, in a configuration where the comparison unit 34 detects the point of time of ball hitting by the club C according to a temporal change of the observation data Db (or the comparison data Dc), a configuration can be employed where the delay device 15 delays the audio signal S from the action start point to the point of time the ball is hit (that is, the delay time δ is set to be the time from the action start point to the point of time the ball is hit). In the above-described configuration, the sound is not reproduced from the action start point to the point of time the ball is hit, but is reproduced after the point of time the ball is hit.
  • (6) Each element described above as an example can be appropriately omitted. For example, it is possible to omit the storage device 14 by incorporating various data items from an external device separate from the motion analysis device 10. In addition, the sound emitting device 16 can be omitted in a configuration where the audio signal S generated by the audio control unit 36 is transmitted to the external device via a communication network or a portable recording medium and is reproduced in the sound emitting device 16 of the external device.
  • (7) In the first embodiment, the observation data acquisition unit 32 sequentially generates the observation data Db by using the sensor output Da supplied from the acceleration sensor 20. However, a configuration can also be employed where the observation data acquisition unit 32 receives the observation data Db which is sequentially generated by the acceleration sensor 20. That is, an element acquiring the observation data Db (observation data acquisition means) includes both an element generating the observation data Db from the detection result obtained by using the acceleration sensor 20 for itself and an element receiving the observation data Db from the external device (acceleration sensor 20).
  • (8) In each embodiment described above, the motion analysis device 10 which analyzes the swing motion of the golf club C has been described as an example. However, a motion for which the motion analysis device 10 can be used is not limited to the action in golf. For example, the motion analysis system 100 (motion analysis device 10) can also be used when analyzing swing motion of a bat in baseball, swing motion of a racket in tennis, or casting action of a fishing rod in fishing.
  • (9) It is also possible to change the amount of reference data Dref for each unit time (sampling cycle) within the analysis section. For example, with regard to a section immediately before or immediately after an impact within the analysis section, it is preferable to increase the amount of the reference data Dref for each unit time compared with other sections. As a section has a larger amount of reference data Dref within the unit time, the comparison between the observation data Db and the reference data Dref is performed at a shorter interval, and the observation trajectory Oa and the reference trajectory Oref are more closely compared with each other. Therefore, it is possible to analyze in detail a difference between the trajectories in the section immediately before and immediately after the impact, for example. In addition, the amount of reference data Dref is increased only in a certain portion within the analysis section. Therefore, as compared to a configuration of increasing the amount in the entire analysis section, there is an advantage in that the amount of data can be reduced.
  • A configuration may be considered where the amount of reference data Dref is changed in advance inside and outside a predetermined section in the analysis section. However, when the comparison unit 34 increases or decreases the amount of reference data Dref by way of the interpolation process or the thinning process for the reference data series Sref, it is also possible to change the amount of reference data Dref for each unit of time inside and outside the predetermined section in the analysis section.
  • In addition, it is also possible to decrease the amount of reference data Dref for each unit time with regard to a section where a detailed analysis is not required within the analysis section.
  • The motion analysis device according to each embodiment described above is operated by hardware (an electronic circuit) such as a digital signal processor (DSP) which is exclusively used to analyze motions of a user, or is operated in cooperation with a program and a general-purpose arithmetic processing unit such as a central processing unit (CPU).
  • A program of the present invention causes a computer to implement an observation data acquisition process for acquiring observation data indicating a trajectory of a target observation point which moves in conjunction with action of a user; a comparison process for comparing each of a plurality of reference data indicating a predetermined trajectory of the target observation point with the observation data acquired by the observation data acquisition process; and an audio control process used to generate an audio signal according to a result of the comparison process.
  • The above-described program is provided by being stored in a computer-readable recording medium and is installed in the computer. For example, the recording medium is a non-transitory recording medium, and is preferably an optical recording medium (optical disk) such as CD-ROM. However, the recording medium can include any known type of recording medium such as a semiconductor recording medium and a magnetic recording medium.
  • In addition, for example, the program of the present invention can be provided in a distributing manner via a communication network, for example, by a distributing server device, and then can be installed in the computer.
  • While preferred embodiments of the invention have been described and illustrated above, it should be understood that these are exemplary of the invention and are not to be considered as limiting. Additions, omissions, substitutions, and other modifications can be made without departing from the spirit or scope of the present invention. Accordingly, the invention is not to be considered as being limited by the foregoing description, and is only limited by the scope of the appended claims.

Claims (16)

What is claimed is:
1. A motion analysis device comprising:
at least one processor; and
at least one memory including computer program instructions, the at least one memory and the computer program instructions being configured to, in cooperation with the at least one processor, cause the motion analysis device to:
acquire observation data indicating a trajectory of a target observation point which moves in conjunction with motion of a user;
compare reference data indicating a predetermined trajectory with the observation data to generate a comparison result; and
generate an audio signal according to the comparison result.
2. The motion analysis device according to claim 1, wherein the at least one memory and the computer program instructions are configured to cause the motion analysis device further to delay an output of the audio signal.
3. The motion analysis device according to claim 1, wherein a time series of the reference data is expanded or contracted on a time axis, and each item of expanded or contracted reference data is compared with the observation data.
4. The motion analysis device according to claim 1, wherein, according to a degree of approximation between a trajectory specified by the observation data and the predetermined trajectory, an additional audio signal is generated in which predetermined sound effects are added to the audio signal generated according to the comparison result.
5. The motion analysis device according to claim 1, wherein, according to the comparison result, a pitch in the audio signal is changed.
6. The motion analysis device according to claim 1, wherein:
a pitch in the audio signal is raised when a trajectory of the target observation point is closer to the user than a reference trajectory obtained based on the reference data; and
the pitch in the audio signal is lowered when the trajectory of the target observation point is farther from the user than the reference trajectory.
7. The motion analysis device according to claim 1, further comprising a storage device which stores mutually different audio data items in correspondence with components in X-axis, Y-axis and Z-axis directions which constitute the observation data and the reference data, wherein the at least one memory and the computer program instructions are configured to cause the motion analysis device further to
compare the observation data with the reference data for each component in the X-axis, Y-axis and Z-axis directions, select the audio data corresponding to the component in the X-axis, Y-axis or Z-axis direction according to a comparison result for each component in the X-axis, Y-axis and Z-axis directions, read the selected audio data out from the storage device, and output an audio signal according to the selected audio data.
8. The motion analysis device according to claim 1, wherein motion of the user is a golf swing.
9. A motion analysis method, performed by one or more processors, comprising:
causing an observation data acquisition unit to acquire observation data indicating a trajectory of a target observation point which moves in conjunction with motion of a user;
causing a comparison unit to compare reference data indicating a predetermined trajectory with the observation data acquired by the observation data acquisition unit; and
causing an audio control unit to generate an audio signal according to a comparison result obtained by using the comparison unit.
10. The motion analysis method according to claim 9, wherein an output of the audio signal generated by the audio control unit is delayed.
11. The motion analysis method according to claim 9, further comprising causing the comparison unit to expand or contract time series of the reference data on a time axis and to compare each expanded or contracted reference data with the observation data.
12. The motion analysis method according to claim 9, further comprising causing the audio control unit to generate, according to a degree of approximation between a trajectory specified by the observation data and the predetermined trajectory, an audio signal in which predetermined sound effects are added to the audio signal generated according to the comparison result obtained by using the comparison unit.
13. The motion analysis method according to claim 9, further comprising causing the audio control unit to change a pitch in the audio signal according to a comparison result obtained by using the comparison unit.
14. The motion analysis method according to claim 9, further comprising:
causing the audio control unit to raise a pitch in the audio signal when a trajectory of the target observation point is closer to the user than a reference trajectory obtained based on the reference data; and
causing the audio control unit to lower the pitch in the audio signal when the trajectory of the target observation point is farther from the user than the reference trajectory.
15. The motion analysis method according to claim 9, further comprising:
storing mutually different audio data items in a storage unit in correspondence with components in X-axis, Y-axis and Z-axis directions which constitute the observation data and the reference data;
comparing the observation data with the reference data for each component in the X-axis, Y-axis and Z-axis directions;
selecting the audio data corresponding to the component in the X-axis, Y-axis or Z-axis direction according to a comparison result for each component in the X-axis, Y-axis and Z-axis directions, and reading the selected audio data out from the storage unit; and
outputting an audio signal according to the selected audio data.
16. The motion analysis method according to claim 9, wherein motion of the user is a golf swing.
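Claims 3 and 11 above describe expanding or contracting the time series of the reference data on the time axis and comparing each stretched copy with the observation data, so that a swing performed at a different tempo can still be matched against the reference. A minimal sketch of that idea, in which the function names, the linear resampling, and the squared-error distance are all assumptions for illustration rather than the patented method:

```python
# Illustrative sketch of claims 3/11: the reference data's time series is
# expanded or contracted on the time axis, and each stretched copy is
# compared with the observation data to find the best-matching tempo.

def resample(series, n):
    """Linearly resample a 1-D time series to n points."""
    if n == 1:
        return [series[0]]
    step = (len(series) - 1) / (n - 1)
    out = []
    for i in range(n):
        t = i * step
        lo = int(t)
        hi = min(lo + 1, len(series) - 1)
        frac = t - lo
        out.append(series[lo] * (1 - frac) + series[hi] * frac)
    return out

def best_tempo_match(observation, reference, scales=(0.5, 0.75, 1.0, 1.5, 2.0)):
    """Return (scale, distance) for the expanded/contracted copy of the
    reference that lies closest to the observation."""
    best_scale, best_dist = None, float("inf")
    for s in scales:
        stretched = resample(reference, max(2, int(len(reference) * s + 0.5)))
        n = min(len(observation), len(stretched))
        # Mean squared difference over the overlapping samples.
        dist = sum((a - b) ** 2 for a, b in zip(observation, stretched)) / n
        if dist < best_dist:
            best_scale, best_dist = s, dist
    return best_scale, best_dist

# An observation tracing the reference shape at roughly double tempo:
obs = [0.0, 0.5, 1.0, 0.5, 0.0]
ref = [0.0, 0.25, 0.5, 0.75, 1.0, 0.75, 0.5, 0.25, 0.0]
scale, dist = best_tempo_match(obs, ref)  # contracting the reference fits best
```

Here the contracted copy at scale 0.5 coincides with the observation, so the device would know the user swung at about twice the reference tempo.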
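Claims 6 and 14 above tie the direction of the pitch change to whether the observed trajectory passes closer to or farther from the user than the reference trajectory. A minimal sketch of such a feedback rule, where the function name, the signed-deviation convention, and the exponential mapping are assumptions rather than the claimed implementation:

```python
# Illustrative sketch of the pitch-feedback rule in claims 6/14.
# Assumed convention: deviation < 0 means the target observation point
# passed closer to the user than the reference trajectory; deviation > 0
# means it passed farther away.

def pitch_ratio(deviation: float, sensitivity: float = 0.05) -> float:
    """Map a signed trajectory deviation to a playback pitch ratio.

    Closer than the reference (deviation < 0) raises the pitch
    (ratio > 1); farther (deviation > 0) lowers it (ratio < 1);
    zero deviation leaves the pitch unchanged (ratio == 1).
    """
    return 2.0 ** (-deviation * sensitivity)

ratio_inside = pitch_ratio(-2.0)   # trajectory inside the reference: ratio > 1
ratio_outside = pitch_ratio(2.0)   # trajectory outside the reference: ratio < 1
```

A smooth mapping like this gives the user continuous feedback on how far the swing strays, rather than only a binary inside/outside cue.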
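Claims 7 and 15 above describe storing a distinct audio data item for each of the X-, Y- and Z-axis components and selecting audio according to the per-axis comparison results. A toy sketch under assumed file names and a simple threshold rule, none of which are specified by the claims:

```python
# Illustrative sketch of per-axis audio selection in claims 7/15.
# The file names, the threshold rule, and the dict representation are
# assumptions for illustration only.

AXIS_AUDIO = {"x": "tone_x.wav", "y": "tone_y.wav", "z": "tone_z.wav"}

def select_axis_audio(observation, reference, threshold=0.1):
    """Compare observation and reference per axis and return the audio
    items for every axis whose component deviates beyond the threshold."""
    selected = []
    for axis in ("x", "y", "z"):
        if abs(observation[axis] - reference[axis]) > threshold:
            selected.append(AXIS_AUDIO[axis])
    return selected

# Example: only the Y component strays from the reference swing plane.
obs = {"x": 1.0, "y": 0.6, "z": -0.2}
ref = {"x": 1.05, "y": 0.9, "z": -0.25}
```

Because each axis maps to its own sound, the user can hear not just that the swing deviated but along which direction, which is the point of storing mutually different audio data items per component.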
US14/132,531 2012-12-21 2013-12-18 Motion Analysis Device Abandoned US20140180632A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012279501A JP5835206B2 (en) 2012-12-21 2012-12-21 Motion analyzer
JP2012-279501 2012-12-21

Publications (1)

Publication Number Publication Date
US20140180632A1 true US20140180632A1 (en) 2014-06-26

Family

ID=50947043

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/132,531 Abandoned US20140180632A1 (en) 2012-12-21 2013-12-18 Motion Analysis Device

Country Status (4)

Country Link
US (1) US20140180632A1 (en)
JP (1) JP5835206B2 (en)
KR (1) KR20140081695A (en)
CN (1) CN103877715B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160024477A (en) * 2014-08-26 2016-03-07 김홍채 Golf swing analyzer
EP3252736A1 (en) * 2016-06-03 2017-12-06 West & Bergh IT Consulting AB Motion training aid
JP7134418B2 (en) * 2020-05-18 2022-09-12 カシオ計算機株式会社 Motion analysis device, motion analysis method and program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020115047A1 (en) * 2001-02-16 2002-08-22 Golftec, Inc. Method and system for marking content for physical motion analysis
US6514081B1 (en) * 1999-08-06 2003-02-04 Jeffrey L. Mengoli Method and apparatus for automating motion analysis
US20040172213A1 (en) * 2001-07-11 2004-09-02 Kainulainen Raimo Olavi Motion analyzing device
US20070135225A1 (en) * 2005-12-12 2007-06-14 Nieminen Heikki V Sport movement analyzer and training device
US20070238538A1 (en) * 2006-03-16 2007-10-11 Priester William B Motion training apparatus and method
US20110021318A1 (en) * 2009-07-20 2011-01-27 Joanna Lumsden Audio feedback for motor control training
US20110143866A1 (en) * 2009-12-14 2011-06-16 William Dean McConnell Core Tempo Golf Swing Training Tones
US20120088612A1 (en) * 2009-06-17 2012-04-12 Vernon Ralph Johnson training aid
US20120196693A1 (en) * 2011-02-02 2012-08-02 Seiko Epson Corporation Swing analysis device, program, and swing analysis method

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5221088A (en) * 1991-01-22 1993-06-22 Mcteigue Michael H Sports training system and method
JPH0947535A (en) * 1995-08-05 1997-02-18 Yoshikazu Nakamura Golf swing practice device
JP2004024627A (en) * 2002-06-26 2004-01-29 Yamaha Corp Device for movement practice
KR100631035B1 (en) * 2004-06-03 2006-10-02 이기영 swing training equipment in ball game sports
US7219033B2 (en) * 2005-02-15 2007-05-15 Magneto Inertial Sensing Technology, Inc. Single/multiple axes six degrees of freedom (6 DOF) inertial motion capture system with initial orientation determination capability
JP2009125499A (en) * 2007-11-27 2009-06-11 Panasonic Electric Works Co Ltd Tennis swing improvement supporting system
JP5338104B2 (en) * 2008-03-27 2013-11-13 ヤマハ株式会社 Exercise support apparatus and program
JP5381293B2 (en) * 2009-04-28 2014-01-08 ヤマハ株式会社 Sound emission control device
JP2011019793A (en) * 2009-07-17 2011-02-03 Ishida Co Ltd Sports technique-improving device
JP2011062352A (en) * 2009-09-17 2011-03-31 Koki Hashimoto Exercise motion teaching device and play facility
JP2011120644A (en) * 2009-12-08 2011-06-23 Yamaha Corp Rotation movement analyzer and program
JP5948011B2 (en) * 2010-11-19 2016-07-06 セイコーエプソン株式会社 Motion analysis device
JP5641222B2 (en) * 2010-12-06 2014-12-17 セイコーエプソン株式会社 Arithmetic processing device, motion analysis device, display method and program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017031247A1 (en) 2015-08-18 2017-02-23 University Of Miami Method and system for adjusting audio signals based on motion deviation
EP3337401A4 (en) * 2015-08-18 2019-04-03 University of Miami Method and system for adjusting audio signals based on motion deviation
EP4268713A3 (en) * 2015-08-18 2024-01-10 University of Miami Method and system for adjusting audio signals based on motion deviation
US10750279B2 (en) 2016-10-14 2020-08-18 Sony Corporation Signal processing device and signal processing method for detection of direction of movement of a region
US20180140925A1 (en) * 2016-11-21 2018-05-24 Casio Computer Co., Ltd. Movement analysis device, movement analysis method and recording medium
US10661142B2 (en) * 2016-11-21 2020-05-26 Casio Computer Co., Ltd. Movement analysis device for determining whether a time range between a start time and a completion time of a predetermined movement by a target person is valid, and movement analysis method and recording medium
CN106512362A (en) * 2016-12-13 2017-03-22 中山市得高行知识产权中心(有限合伙) Table tennis auxiliary training system and method
CN111433831A (en) * 2017-12-27 2020-07-17 索尼公司 Information processing apparatus, information processing method, and program
US11508344B2 (en) 2017-12-27 2022-11-22 Sony Corporation Information processing device, information processing method and program

Also Published As

Publication number Publication date
CN103877715A (en) 2014-06-25
JP2014121456A (en) 2014-07-03
CN103877715B (en) 2017-08-08
JP5835206B2 (en) 2015-12-24
KR20140081695A (en) 2014-07-01

Similar Documents

Publication Publication Date Title
US20140180632A1 (en) Motion Analysis Device
US9162130B2 (en) Swing analyzing device, swing analyzing program, and recording medium
CN104169995B (en) Information processing equipment, information processing system
US20150285834A1 (en) Sensor, computing device, and motion analyzing apparatus
US20120316004A1 (en) Swing analyzing device, swing analyzing program, and recording medium
US20170215771A1 (en) Motion analysis method, motion analysis apparatus, motion analysis system, and program
US9017079B2 (en) Information notification apparatus that notifies information of data of motion
JP6354461B2 (en) Feedback providing method, system, and analysis apparatus
KR101556055B1 (en) Notification control apparatus, notification control method and computer readable recording medium for storing program thereof
US20130251287A1 (en) Image processing device that generates a composite image
CN108211302B (en) Motion analysis device, motion analysis method, and storage medium
JP2018153295A (en) Motion analysis device, motion analysis method, and motion analysis system
KR101705836B1 (en) System and Method for analyzing golf swing motion using Depth Information
US20170087409A1 (en) Imaging control method, imaging control apparatus, imaging control system, and program
JP2011234018A (en) Information processing device, method, and program
JP2021119993A (en) Sway detector and sway detection program
JP2015150134A (en) Sway detector, sway detection system, and sway detection program
JP2016073764A (en) Golf training support device
JP2016209180A (en) Swing analyzer, swing analysis system, swing analysis program and recording medium
AU2015291766A1 (en) Systems for reviewing sporting activities and events
US20230153610A1 (en) Using machine trained networks to analyze golf swings
JP2021007448A (en) Program, method, information processing device and plate space
JP2015073823A (en) Motion analysis method and motion analyzer
WO2023123373A1 (en) Vibration signal source positioning method, system, and medium
JP2016209282A (en) Swing analyzer, swing analysis system, swing analysis program and recording medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: YAMAHA CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YATAKA, KOJI;REEL/FRAME:031811/0662

Effective date: 20131213

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION