US8812258B2 - Identifying a type of motion of an object - Google Patents

Identifying a type of motion of an object

Info

Publication number
US8812258B2
Authority
US
United States
Prior art keywords
acceleration
motion
signatures
signature
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US13/975,170
Other versions
US20130346014A1 (en)
Inventor
Vijay Nadkarni
Jeetendra Jangle
John Bentley
Umang Salgia
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Newyu Inc
IMETRIKUS Inc dba NUMERA
Original Assignee
NUMERA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/560,069 (published as US20100217533A1)
Application filed by NUMERA Inc
Priority to US13/975,170
Assigned to IMETRIKUS, INC. DBA NUMERA. Assignors: BENTLEY, JOHN; JANGLE, JEETENDRA; NADKARNI, VIJAY; SALGIA, UMANG
Publication of US20130346014A1
Assigned to MULTIPLIER CAPITAL, LP (security interest). Assignors: NUMERA, INC.
Application granted
Publication of US8812258B2
Assigned to NUMERA, INC. (acknowledgment of termination of intellectual property security agreement). Assignors: MULTIPLIER CAPITAL, LP
Assigned to NEWYU, INC. (change of name). Assignors: NUMERA, INC.
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C22/00 Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01C22/006 Pedometers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • G06F19/345
    • G06K9/00342
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7253 Details of waveform analysis characterised by using transforms
    • A61B5/726 Details of waveform analysis characterised by using transforms using Wavelet transforms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing

Definitions

  • An embodiment includes initiating a low-power sleep mode of the object if sensed acceleration is below a threshold for a predetermined amount of time. That is, if, for example, a person is sensed to be sleeping, power can be saved by de-activating at least a portion of the motion sensing device.
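A minimal sketch of such a sleep policy follows, assuming two hypothetical device hooks (a function returning the current acceleration deviation and a power-mode setter); the thresholds and polling cadence are invented for illustration, not taken from the patent.

```python
import time

ACTIVITY_G = 0.05      # deviation (in g) below which motion counts as stillness
SLEEP_AFTER_S = 300.0  # quiet time before entering the low-power mode

def power_manager(read_deviation, set_low_power):
    """Enter a low-power mode when sensed acceleration stays below a
    threshold for a predetermined time, and wake on the first sample
    above it. `read_deviation` and `set_low_power` are hypothetical
    device hooks, not a real driver API."""
    quiet_since = None
    while True:
        if read_deviation() < ACTIVITY_G:
            if quiet_since is None:
                quiet_since = time.monotonic()
            elif time.monotonic() - quiet_since > SLEEP_AFTER_S:
                set_low_power(True)   # e.g. lower sample rate, gate the radio
        else:
            quiet_since = None
            set_low_power(False)
        time.sleep(1.0)               # polling cadence, illustrative
```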
  • FIG. 7 is a flow chart that includes steps of one example of a method of a motion detection device checking network availability for improvements in speed and/or processing power of acceleration signature matching, wherein the motion detection device includes motion detection sensors that generate the acceleration signal.
  • a first step 710 includes the motion detection device determining what network connections are available to the motion detection device.
  • a second step 720 includes the motion detection device distributing at least some of the acceleration signature matching processing if processing capability is available to the motion detection device through available network connections.
  • the motion detection device distributes the acceleration signature matching processing if the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing saves the motion detection device processing power.
  • the motion detection device distributes the acceleration signature matching processing if the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing increases a speed of the motion detection device processing.
  • the motion detection device distributes the processing to optimize both power and processing speed. Additionally, the processing distribution can be dependent upon the bandwidths of the available network connections. That is, some network connections can generally support higher data transfer rates, and therefore influence the processing speed.
  • the motion detection device scales its processing to the level of processing available. That is, as additional processing power becomes available to the motion detection device, the motion detection device can increase the complexity of the signature matching processing.
  • the processing can be distributed as processing capability becomes available through network connections. The processing can be performed in different locations as network connectivity becomes available, which can advantageously reduce the power consumption of the motion detection device and/or increase the speed of the processing.
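A hedged sketch of such an offload decision follows: weigh whether any available network offers remote processing and enough bandwidth to ship the signature quickly, and fall back to simplified local matching otherwise. Every field name and threshold below is an illustrative assumption, not something specified by the patent.

```python
def plan_matching(networks, battery_frac, signature_bytes):
    """Decide where to run signature matching, in the spirit of FIG. 7.

    networks -- list of dicts such as
        {"name": "cellular", "kbps": 200, "remote_cpu": True};
    battery_frac -- remaining battery, 0.0 to 1.0.
    All field names and thresholds are illustrative assumptions."""
    usable = [n for n in networks if n.get("remote_cpu")]
    if not usable:
        # no network processing available: match locally, simplified
        return {"where": "local", "algorithm": "coefficient-threshold"}
    best = max(usable, key=lambda n: n["kbps"])
    upload_s = signature_bytes * 8.0 / (best["kbps"] * 1000.0)
    # offload when the signature ships quickly, or when the battery is low
    # enough that saving local compute outweighs the radio cost
    if upload_s < 0.5 or battery_frac < 0.2:
        return {"where": best["name"], "algorithm": "full-library"}
    return {"where": "local", "algorithm": "coefficient-threshold"}
```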
  • FIG. 8 shows a motion detection device 300 that can be connected to one of multiple networks.
  • Examples of possible networks the motion detection device 300 can connect to include a cellular network 820 through, for example, a Bluetooth wireless link 810, or a home base station 840 through, for example, a Zigbee wireless link 845.
  • the wireless links 810 , 845 can each provide different levels of bandwidth.
  • Each of the networks includes available processing capabilities 830 , 850 .
  • If the motion detection device 300 does not have any network connections available, the motion detection device 300 must perform its own matching processing. If this is the case, the processing algorithms may be made less complex to reduce the processing power and/or processing time required. For example, the matching processing can be simplified by extracting the significant wavelet coefficients and comparing them against threshold levels for elemental motions. Acceleration data acquisition is performed in short bursts of processing every few milliseconds, with the processor waking for each burst; at all other times the processor rests in a low-power mode. Except in emergency situations, RF communication is performed only periodically: when the data is in a steady state (for example, when the object is sedentary) there is no need to send it to the network, and only changes in state are communicated.
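The coefficient-threshold idea mentioned above can be sketched as follows: compute a Haar decomposition of the window, keep only the significant coefficients, and compare them against per-motion templates with simple distance thresholds. The template layout and thresholding rule are assumptions; PyWavelets supplies the Haar transform.

```python
import numpy as np
import pywt  # PyWavelets

def cheap_elemental_match(mag, templates, keep=8):
    """Standalone-mode matching: keep only the `keep` largest-magnitude
    Haar coefficients of the window and accept the first motion template
    within its distance threshold.

    templates -- {label: (reference_coeffs, max_distance)}, where the
    references were built from windows of the same length. All values
    here are illustrative."""
    coeffs = np.concatenate(pywt.wavedec(np.asarray(mag, float), "haar"))
    small = np.argsort(np.abs(coeffs))[:-keep]
    coeffs[small] = 0.0              # retain only the significant coefficients
    for label, (ref, max_dist) in templates.items():
        if np.linalg.norm(coeffs - ref) < max_dist:
            return label
    return None                      # no elemental motion recognized
```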
  • Depending on the available network connections, the operation of the motion detection device 300 may be altered. For example, if the motion detection device 300 detects an emergency situation (such as a fall) and no network connection is available, the motion detection device 300 may generate an audio alert. If a network connection is available, the audio alert may not be generated, but an alert may be transmitted over the available network.
  • the motion detection device 300 includes a processor in which at least a portion of the analysis and signature matching processing can be completed. However, if the motion detection device 300 has one or more networks available to it, the motion detection device can off-load some of the processing to one of the processors 830, 850 associated with the networks.
  • the determination of whether to off-load the processing can be based on both the processing capabilities provided by available networks, and the data rates (bandwidth) provided by each of the available networks.

Abstract

An apparatus for identifying a type of motion and condition of a user is disclosed. One apparatus includes a motion detection sensor operative to generate an acceleration signature based on sensed acceleration of the user, and a controller. The controller is operative to determine what network connections are available to the motion detection device, match the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion of the user, and wherein the apparatus distributes at least some of the acceleration signature matching processing when processing capability is available to the motion detection device through available network connections, and identify the type of motion of the user and identify a condition of the user based on the matching of the acceleration signature.

Description

RELATED APPLICATIONS
This patent application is a continuation of U.S. patent application Ser. No. 12/883,304, filed Sep. 16, 2010, which is a continuation in part (CIP) of U.S. patent application Ser. No. 12/560,069 filed on Sep. 15, 2009, which claims priority to U.S. provisional patent application Ser. No. 61/208,344 filed on Feb. 23, 2009 which are all incorporated by reference.
FIELD OF THE DESCRIBED EMBODIMENTS
The described embodiments relate generally to motion detecting. More particularly, the described embodiments relate to a method and apparatus for identifying a type of motion of an animate or inanimate object.
BACKGROUND
There is an increasing need for remote monitoring of individuals, animals and inanimate objects in their daily or natural habitats. Many seniors live independently and need to have their safety and wellness tracked. A large percentage of society is fitness conscious, and desires to have, for example, workouts and exercise regimens assessed. Public safety officers, such as police and firemen, encounter hazardous situations on a frequent basis, and need their movements, activities and location to be mapped out precisely.
The value in such knowledge is enormous. Physicians, for example, like to know their patients' sleeping patterns so they can treat sleep disorders. A senior living independently wants peace of mind that if he has a fall it will be detected automatically and help summoned immediately. A fitness enthusiast wants to track her daily workout routine, capturing the various types of exercises, intensity, duration and caloric burn. A caregiver wants to know that her father is living an active, healthy lifestyle and taking his daily walks. The police would like to know instantly when someone has been involved in a car collision, and whether the victims are moving or not.
Existing products for the detection of animate and inanimate motions are simplistic in nature, and incapable of interpreting anything more than simple atomic movements, such as jolts, changes in orientation and the like. It is not possible to draw reliable conclusions about human behavior from these simplistic assessments.
It is desirable to have an apparatus and method that can accurately monitor motion of either animate or inanimate objects.
SUMMARY
An embodiment includes a method of detecting the human condition and activity of a user. The method includes generating, by a motion detection device that includes a motion detection sensor, an acceleration signature based on sensed acceleration of an object that represents motion of the user; the motion detection device determining what network connections are available to the motion detection device; matching the acceleration signature with at least one of a plurality of stored acceleration signatures of motions of human beings, wherein each stored acceleration signature corresponds with a type of motion, and wherein the motion detection device distributes at least some of the acceleration signature matching processing when processing capability is available to the motion detection device through available network connections; and identifying a type of motion of the user and identifying a condition of the user based on the matching of the acceleration signature with a stored acceleration signature.
Another embodiment includes an apparatus for identifying a type of motion and condition of a user. The apparatus includes a motion detection sensor operative to generate an acceleration signature based on sensed acceleration of the user, and a controller. The controller is operative to determine what network connections are available to the motion detection device, match the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion of the user, and wherein the apparatus distributes at least some of the acceleration signature matching processing when processing capability is available to the motion detection device through available network connections, and identify the type of motion of the user and identify a condition of the user based on the matching of the acceleration signature.
Other aspects and advantages of the described embodiments will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the described embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense.
FIGS. 2A, 2B, 2C show examples of time-lines of several different acceleration curves (signatures), wherein each signature is associated with a different type of sensed or detected motion.
FIG. 3 is an example of a block diagram of a motion detection device.
FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as, a fall.
FIG. 5 is a flowchart that includes the steps of a method for detection of a fall.
FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object.
FIG. 7 is a flow chart that includes steps of one example of a method of a motion detection device checking network availability for improvements in speed and/or processing power of acceleration signature matching.
FIG. 8 shows a motion detection device that can be connected to one of multiple networks.
DETAILED DESCRIPTION
The monitoring of human activities generally falls into three categories: safety, daily lifestyle, and fitness. By carefully interpreting human movements it is possible to draw accurate and reasonably complete inferences about the state of well being of individuals. A high degree of sophistication is required in these interpretations. Simplistic assessments of human activity lead to inaccurate determinations, and ultimately are of questionable value. By contrast, a comprehensive assessment leads to an accurate interpretation and can prove to be indispensable in tracking the well being and safety of the individual.
To draw accurate inferences about the behavior of humans, it turns out that the atomic movements become simply alphabets that include elemental motions. Furthermore, specific sequences of elemental motions become the vocabulary that comprises human behavior. As an example, take the case of a person who leaves the home and drives to the shopping center. In such a scenario, the behavioral pattern of the person is walking to the door of the house, opening and closing the door, walking further to the car, settling down in the car, starting the engine, accelerating the car, going through a series of stops, starts and turns, parking the car, getting out and closing the car door, and finally walking to the shopping center. This sequence of human behavior is comprised of individual motions such as standing, walking, sitting, accelerating (in the car), decelerating, and turning left or right. Each individual motion, for example walking, is comprised of multiple atomic movements such as acceleration in an upward direction, acceleration in a downward direction, a modest forward acceleration with each step, a modest deceleration with each step, and so on.
With written prose, letters by themselves convey almost no meaning at all. Words taken independently convey individual meaning, but do not provide the context to comprehend the situation. It takes a complete sentence to obtain that context. Along the same line of reasoning, it requires a comprehension of a complete sequence of movements to be able to interpret human behavior.
Although there is an undeniable use for products that are able to detect complex human movements accurately, the key to the success of such technologies lies in whether users adopt them or not. The technology needs to capture a wide range of human activities. The range of movements should ideally extend to all types of daily living activities that a human being expects to encounter—sleeping, standing, walking, running, aerobics, fitness workouts, climbing stairs, vehicular movements, falling, jumping and colliding, to name some of the more common ones.
It is important to detect human activities with a great deal of precision. In particular, activities that relate to safety, fitness, vehicular movements, and day-to-day lifestyle patterns such as walking, sleeping, and climbing stairs are important to identify precisely. For example, it is not enough to know that a person is walking. One needs to know the pace and duration of the walk, and additional knowledge of gait, unsteadiness, limping, cadence and the like is important.
It is critical that false positives as well as false negatives be eliminated. This is especially important for cases of safety, such as falls, collisions, and the like. Human beings come in all types: short, tall, skinny, obese, male, female, athletic, couch potato, people walking with a stick or rollator, people with disabilities, old and young. The product needs to be able to adapt to their individuality and lifestyle.
The embodiments described provide identification of types of motion of an animate or inanimate object. Motion is identified by generating acceleration signatures based on the sensed acceleration of the object. The acceleration signatures are compared with a library of motion signatures, allowing the motion of the object to be identified. Further, sequences of the motions can be determined, allowing identification of activities of, for example, a person the object is attached to.
Just as the handwritten signatures of a given human being are substantively similar from one signature instance to the next, yet have minor deviations with each new instance, so too will the motion signatures of a given human be substantively similar from one motion instance to the next, yet have minor deviations.
Algorithms used for pattern recognition (signature matching) should have the sophistication to accurately handle a wide range of motions. Such algorithms should have the ability to recognize the identical characteristics of a particular motion by a given human being, yet allow for minor variations arising from human randomness. Additionally, the devices used to monitor peoples' movement need to be miniature and easy to wear. These two objectives are fundamentally opposed. However, the described embodiments provide a single cohesive system that is both sophisticated enough to detect a wide range of motions and compact enough to be comfortably worn.
FIG. 1 shows examples of different types of motions of a human being that an object attached to the human being can be used to detect or sense. The human motions can include, for example, standing, sleeping, walking, and running. A first motion 110 can include walking. A second motion 120 can include falling. A third motion 130 can include running. Each of the motions generates a unique motion signature. As will be described, the signatures can be universal to, for example, many individuals. Additionally, the signatures can have additional characteristics that are unique to an individual.
FIGS. 2A, 2B, 2C show examples of different types of acceleration and orientation signatures for various sample motions by human beings. It should be noted that these signatures are expected to have certain components that are common from one human being to the next, but also have certain components that vary from one human to the next.
The signatures of FIGS. 2A, 2B, 2C are depicted in only one orientation. That is, three accelerometers can be used to generate acceleration signatures in the X, Y and Z (three) orientations. The signatures of FIGS. 2A, 2B, 2C only show the signature of one of the three orientations. It is to be understood that matching can use the other orientations as well.
FIG. 2A shows an example of an acceleration signature of a person doing a slow fall and lying-down somersault. FIG. 2B shows an example of an acceleration signature of a person slipping and falling back on a bouncy surface (for example, an air mattress). FIG. 2C shows an acceleration signature of a person falling on their face with their knees flexed. By matching an acceleration signature that has been generated by sensing the motion of a person with one of many stored signatures, the motion of the person can be determined.
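To make the tri-axial representation concrete, the sketch below assembles a candidate signature window from raw X/Y/Z accelerometer samples and adds an orientation-independent magnitude trace. It is a minimal illustration in Python; the function name, sample rate, and window length are assumptions, not values taken from the patent.

```python
import numpy as np

def extract_signature(ax, ay, az, fs=50, window_s=4.0):
    """Cut one candidate signature window from tri-axial samples.

    ax, ay, az -- equal-length sequences of acceleration (in g) on X, Y, Z.
    fs -- sample rate in Hz; window_s -- window length in seconds.
    Returns a (4, n) array: the three axis traces plus a magnitude trace.
    """
    n = int(fs * window_s)
    ax, ay, az = (np.asarray(a[:n], dtype=float) for a in (ax, ay, az))
    # the magnitude is orientation-independent, which helps when the
    # device's mounting orientation on the body is unknown
    mag = np.sqrt(ax**2 + ay**2 + az**2)
    return np.vstack([ax, ay, az, mag])
```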
FIG. 3 is an example of a block diagram of a motion detection device. The motion detection device can be attached to an object, and therefore, detect motion of the object that can be identified. Based on the identified motion, estimates of the behavior and conditions of the object can be determined.
The motion detection device includes sensors (such as, accelerometers) that detect motion of the object. One embodiment of the sensors includes accelerometers 312, 314, 316 that can sense, for example, acceleration of the object in X, Y and Z directional orientations. It is to be understood that other types of motion detection sensors can alternatively be used.
An analog to digital converter (ADC) digitizes analog accelerometer signals. The digitized signals are received by compare processing circuitry 330 that compares the digitized accelerometer signals with signatures that have been stored within a library of signatures 340. Each signature corresponds with a type of motion. Therefore, when a match is found between the digitized accelerometer signals and a signature stored in the library 340, the type of motion experienced by the motion detection device can be determined.
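As a hedged sketch of the kind of comparison the compare processing circuitry 330 might perform, the following matches a digitized signal against each entry of a signature library using peak normalized cross-correlation. The scoring rule and the dictionary-based library are illustrative choices, not the patent's specified method.

```python
import numpy as np

def best_match(sig, library):
    """Return (label, score) for the stored signature that best matches
    `sig`, scoring by peak normalized cross-correlation. `library` maps
    a motion label to a 1-D reference signature."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return np.correlate(a, b, mode="full").max() / min(len(a), len(b))
    scores = {label: ncc(np.asarray(sig, float), np.asarray(ref, float))
              for label, ref in library.items()}
    label = max(scores, key=scores.get)
    return label, scores[label]
```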
An embodiment includes filtering the accelerometer signals before attempting to match the signatures. Additionally, the matching process can be made simpler by reducing the possible signature matches.
An embodiment includes identifying a previous human activity context. That is, for example, by knowing that the previous human activity was walking, certain signatures can intelligently be eliminated from the possible matches of the present activity that occurs subsequent to the previous human activity (walking).
An embodiment includes additionally reducing the number of possible signature matches by performing a time-domain analysis on the accelerometer signal. The time-domain analysis can be used to identify a transient or steady-state signature of the accelerometer signal. That is, for example, a walk may have a prominent steady-state signature, whereas a fall may have a prominent transient signature. Identification of the transient or steady-state signature of the accelerometer signal can further reduce or eliminate the number of possible signature matches, and therefore make the task of matching the accelerometer signature with a signature within the library of signatures simpler and easier to accomplish. More specifically, the required signal processing is simpler and requires less computing power.
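One plausible way to perform this time-domain triage is to compare the window's peak deviation against its overall spread, and to look for a periodicity peak in its autocorrelation at step-rate lags. The thresholds and lag range below are assumptions for illustration only.

```python
import numpy as np

def classify_window(mag, fs=50):
    """Label a magnitude window 'transient' or 'steady-state'.

    A spike that is large relative to the window's spread suggests a
    transient (e.g. a fall); a strong autocorrelation peak at step-rate
    lags suggests steady-state motion (e.g. walking)."""
    x = np.asarray(mag, float)
    x = x - x.mean()
    peak_ratio = np.abs(x).max() / (x.std() + 1e-9)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..n-1
    ac = ac / (ac[0] + 1e-9)
    lo, hi = int(0.3 * fs), min(int(2.0 * fs), len(ac))  # 0.3 s..2 s lags
    periodicity = ac[lo:hi].max() if hi > lo else 0.0
    if peak_ratio > 4.0 and periodicity < 0.5:
        return "transient"
    return "steady-state"
```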
Upon detection of certain types of motion, an audio device 360 and/or a global positioning system (GPS) 370 can be engaged to provide additional information that can be used to determine the situation of, for example, a human being the motion detection device is attached to.
The condition of, or information relating to, the motion detection device can be communicated through a wired or wireless connection. A receiver of the information can process it, and make a determination regarding the status of the human being the motion detection device is attached to.
FIG. 4 is a flowchart that includes the steps of an example of a method for detecting various motions of daily living activities and emergency situations, such as a fall. A first step 410 includes monitoring an activity of a person the motion detection device is attached to. Raw signal data is collected from, for example, an accelerometer sensor. A second step 420 includes performing instantaneous computations over the raw signals to compute atomic motions along with a gravity vector and a tilt vector. A third step 430 includes applying a series of digital filters to remove noise in the atomic motion data. A fourth step 440 includes performing state analysis on a series of atomic data samples and forming a context. Depending on the state analysis, the series of atomic data is passed through either periodic or steady-state data analysis (a step 445) or transient-state data analysis (a step 450). A sixth step 460 includes formation of macro motion signatures. The macro motion signatures are built from an output of state analysis vectors using known wavelet transformation techniques (for example, a Haar transform). The current motion pattern is matched against the existing motion pattern library using, for example, DWT (Discrete Wavelet Transform) techniques. Complex motion wavelets are later matched using statistical pattern matching techniques, such as HHMM (Hidden Heuristic Markov Model). The statistical pattern matching includes detecting and classifying events of interest. The events of interest are built by observing various motion and orientation state data of an animate or inanimate object. This data is used to train the statistical model which performs the motion/activity detection. Each activity has its own model trained on the observed data. A seventh step 470 includes a learning system providing the right model for the user from a set of models. It also aids in building newer (personal) patterns which are not in the library for the person who is wearing the motion detection device. An eighth step 480 includes pre-building a motion database of motion libraries against which motion signatures are compared. The database adds new motion/state signatures dynamically as they are identified.
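The sketch below illustrates the wavelet stage of this pipeline: it reduces an acceleration-magnitude window to Haar DWT band energies and matches them against library entries by Euclidean distance. The patent names Haar/DWT techniques, but the specific feature (per-band energy) and distance rule are illustrative substitutions; the PyWavelets package supplies the transform.

```python
import numpy as np
import pywt  # PyWavelets

def haar_features(mag, level=4):
    """Compact macro-motion signature: per-band energies of a Haar DWT
    of an acceleration-magnitude window (all windows the same length)."""
    coeffs = pywt.wavedec(np.asarray(mag, float), "haar", level=level)
    return np.array([float(np.sum(c ** 2)) for c in coeffs])

def match_wavelet(features, library_features):
    """Nearest stored signature in wavelet-energy space; `library_features`
    maps a motion label to a feature vector built the same way."""
    return min(library_features,
               key=lambda label: np.linalg.norm(features - library_features[label]))
```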
FIG. 5 is a flowchart that includes the steps of an example of a method for detecting a fall. A first step 510 includes monitoring an activity of, for example, a person the motion detection device is attached to. A step 515 includes recording and reporting deviations in the normal motion patterns of the person. A step 520 includes detecting the acceleration magnitude deviation exceeding a threshold. The acceleration magnitude deviation exceeding the threshold can be sensed as a probable fall, and audio recording is initiated. Upon detection of this condition, sound recording of the person the motion detection device is connected to can be activated. The activated sound recording can provide additional information that can be useful in assessing the situation of the person. A step 530 includes monitoring the person after the probable fall. A step 525 includes detection of another acceleration having a magnitude less than the threshold, and continuing monitoring of audio. A step 535 includes detecting a short period of inactivity. A step 540 includes monitoring the person after determining a fall probably occurred. A step 545 includes subsequently detecting normal types of motion and turning off the audio because the person seems to be performing normal activity. A step 550 includes monitoring a period of inactivity. A step 555 includes additional analysis of detected information and signals. A step 560 includes further analysis, including of motion data and orientation detection, all indicating whether the person is functioning normally. A step 560 includes determining that a fall has occurred based on the analysis of the motion data, and analysis of a concluded end position and orientation of the person. The sound recording can be de-activated. A step 565 includes concluding that a fall has occurred. A step 570 includes sending an alert and reporting sound recordings. A step 575 includes the fall having been reported. A step 580 includes an acknowledgement of the fall.
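Reduced to its core, the FIG. 5 flow is a small state machine: a large magnitude deviation marks a probable fall, and a following window of stillness upgrades it to a concluded fall. The sketch below captures that skeleton; the 2.5 g deviation threshold, 0.1 g stillness level, and 3-second inactivity window are invented for illustration.

```python
import numpy as np

FALL_G = 2.5          # deviation from 1 g marking an impact (illustrative)
STILL_G = 0.1         # deviation below which the person counts as still
STILL_SECONDS = 3.0   # inactivity needed to conclude a fall

def detect_fall(mag, fs=50):
    """Skeleton of the FIG. 5 flow: an impact (step 520) followed by a
    period of inactivity (steps 535/550) is concluded to be a fall
    (step 565). Returns 'fall', 'probable_fall', or 'normal'."""
    dev = np.abs(np.asarray(mag, float) - 1.0)  # deviation from 1 g at rest
    hits = np.where(dev > FALL_G)[0]
    if hits.size == 0:
        return "normal"
    after = dev[hits[0]:]
    still = int(STILL_SECONDS * fs)
    for i in range(len(after) - still):
        if np.all(after[i:i + still] < STILL_G):
            return "fall"            # impact followed by stillness
    return "probable_fall"           # impact, but activity resumed (step 545)
```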
FIG. 6 is a flow chart that includes the steps of one example of a method of identifying a type of motion of an animate or inanimate object. A first step 610 includes generating an acceleration signature (for example, a tri-axial signature) based on the sensed acceleration of the object. A second step 620 includes matching the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion. A third step 630 includes identifying the type of motion of the object based on the statistical (pattern) matching or exact matching of the acceleration signature. As will be described, the acceleration signal can be created using a wavelet transformation.
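For the statistical branch of step 630, one way to realize per-activity models of the kind described for FIG. 4 is to train one hidden Markov model per activity and pick the activity whose model best explains the observed feature sequence. The patent's term is HHMM (Hidden Heuristic Markov Model); the sketch below substitutes ordinary Gaussian HMMs from the hmmlearn package, so treat it as an assumption-laden stand-in rather than the patent's algorithm.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # stand-in for the patent's HHMM

def train_activity_models(training):
    """Train one model per activity (step 470): `training` maps an
    activity label to an (n_windows, n_features) array of feature rows,
    e.g. the wavelet-band energies of successive windows."""
    models = {}
    for label, X in training.items():
        model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
        model.fit(np.asarray(X, float))
        models[label] = model
    return models

def classify_activity(models, X):
    """Statistical matching: the activity whose model assigns the observed
    feature sequence the highest log-likelihood wins."""
    X = np.asarray(X, float)
    return max(models, key=lambda label: models[label].score(X))
```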
For embodiments, the type of motion includes at least one of atomic motion, elemental motion and macro-motion.
Though embodiments of generating and matching acceleration signatures are described, it is to be understood that additional or alternate embodiments can include generating and matching of orientation and/or audio signatures. Correspondingly, the first step 610 can include generating an acceleration signature, and/or an orientation and audio signature, based on the sensed acceleration and orientation of the object and audio generated by the object, for example, a thud of a fall, or a cry for help.
Atomic motion includes but is not limited to a sharp jolt, a gentle acceleration, complete stillness, a light acceleration that becomes stronger, a strong acceleration that fades, a sinusoidal or quasi-sinusoidal acceleration pattern, vehicular acceleration, vehicular deceleration, vehicular left and right turns, and more.
Elemental motion includes but is not limited to motion patterns for walking, running, fitness motions (e.g. elliptical machine exercises, rowing, stair climbing, aerobics, skipping rope, bicycling . . . ), vehicular traversal, sleeping, sitting, crawling, turning over in bed, getting out of bed, getting up from chair, and more.
Macro-motion includes but is not limited to going for a walk in the park, leaving home and driving to the shopping center, getting out of bed and visiting the bathroom, performing household chores, playing a game of tennis, and more.
Each of the plurality of stored acceleration signatures corresponds with a particular type of motion. By matching the detected acceleration signature of the object with at least one of a plurality of stored acceleration signatures, an estimate or educated guess can be made about the detected acceleration signature.
An embodiment includes a common library and a specific library, and matching the acceleration signature includes matching the acceleration signature with stored acceleration signatures of the common library, and then matching the acceleration signature with stored acceleration signatures of the specific library. For a particular embodiment, the general (common) library includes universal acceleration signatures, and the specific library includes personal acceleration signatures. That is, for example, the stored acceleration signatures of the common library are useable for matching acceleration signatures of motions of multiple humans, and the stored acceleration signatures of the specific library are useable for matching acceleration signatures of motions of a particular human. Additionally, each library can be further categorized to reduce the number of possible matches. For example, at an initialization, a user may enter characteristics such as age, sex and/or physical characteristics (such as, the user has a limp). Thereby, the possible signature matches within the general library can be reduced. The signature entries within the specific library can be learned (built) over time as the human wearing the motion detection device goes through his or her normal activities. The specific library can be added to, and improved, over time.
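A two-stage lookup of this kind might look like the following sketch: the common library is pruned with profile tags gathered at initialization, matched first, and the personal library is then given the chance to override on a better score. The data layout (tag sets, a pluggable match function) is an assumption for illustration, not the patent's data structure.

```python
def match_two_stage(features, common_lib, specific_lib, profile, match_fn):
    """Two-stage lookup: prune the common (universal) library using the
    user's profile tags, match against it first, then let the personal
    library override on a better score.

    common_lib -- {label: (reference, tags)}; entries with no tags always
    survive pruning.  specific_lib -- {label: reference}.
    match_fn(features, {label: reference}) -> (label, score)."""
    tags = profile.get("tags", set())
    candidates = {label: ref for label, (ref, entry_tags) in common_lib.items()
                  if not entry_tags or tags & entry_tags}
    if not candidates:                     # fall back to the full library
        candidates = {label: ref for label, (ref, _) in common_lib.items()}
    label, score = match_fn(features, candidates)
    source = "common"
    if specific_lib:
        p_label, p_score = match_fn(features, specific_lib)
        if p_score > score:
            label, score, source = p_label, p_score, "specific"
    return label, score, source
```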
An embodiment includes filtering the acceleration signal. Additional embodiments include reducing the number of stored acceleration signature matches by identifying a previous activity of the object, and performing a time domain analysis on the filtered acceleration signal to identify transient signatures or steady-state signatures of the filtered acceleration signal. That is, by identifying a previous activity (for example, a human walking or sleeping), the number of plausible present activities can be reduced, and therefore the number of possible stored acceleration signature matches is reduced. Additionally, the transient and/or steady-state signatures can be used to reduce the number of possible stored acceleration signature matches, which can improve the processing speed.
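A sketch of both pruning steps follows; the transition table and the variance-based steady-state test are illustrative assumptions, not requirements of the embodiments.

```python
import numpy as np

# Illustrative transition table: given a previous activity, only a few
# present activities are plausible, which shrinks the candidate match set.
PLAUSIBLE_NEXT = {
    "sleeping": {"sleeping", "turning over in bed", "getting out of bed", "falling"},
    "walking": {"walking", "running", "sitting", "falling"},
}

def prune_candidates(library, previous_activity):
    """Keep only stored signatures whose motion type is plausible given
    the previously identified activity."""
    allowed = PLAUSIBLE_NEXT.get(previous_activity)
    if allowed is None:
        return library
    return [(motion, sig) for motion, sig in library if motion in allowed]

def is_steady_state(filtered_signal, window=32, rel_tol=0.25):
    """Crude time-domain test: treat the signal as steady-state when the
    variance of a recent window stays near the long-run variance;
    otherwise treat it as a transient signature."""
    x = np.asarray(filtered_signal, dtype=float)
    long_var = x.var() + 1e-12
    recent_var = x[-window:].var()
    return abs(recent_var - long_var) / long_var < rel_tol
```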
Another embodiment includes activating audio sensing of the object if matches are made with at least portions of particular stored acceleration signatures. For example, if the acceleration signature exceeds a threshold value, then audio sensing of the object is activated. This is useful because the audio information can provide additional clues about, for example, the condition of a person. That is, a fall may be detected, and audio information can be used to confirm that a fall has in fact occurred.
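A sketch of such a trigger, reusing match_signature from the first sketch; the threshold values and the stub device call are assumptions.

```python
def start_audio_capture():
    """Stub for the platform-specific call that powers up the microphone."""
    print("audio sensing activated")

def maybe_activate_audio(signature, peak_g, fall_library,
                         jolt_threshold_g=2.5, match_threshold=0.8):
    """Activate audio sensing when the acceleration evidence suggests a
    fall, so the audio (a thud, a cry for help) can confirm or refute it."""
    _motion, score = match_signature(signature, fall_library)
    if peak_g > jolt_threshold_g or score > match_threshold:
        start_audio_capture()
        return True
    return False
```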
Another embodiment includes transmitting the sensed audio. For example, if a user wearing the object has fallen, and the fall has been detected, audio information can be very useful for determining the condition of the user. The audio information can allow a receiver of the audio information to determine, for example, whether the user is in pain, unconscious, or in a dangerous situation (for example, in a shower or in a fire).
An embodiment includes the object being associated with a person, and the stored acceleration signatures corresponding with different types of motion related to the person. A particular embodiment includes identifying an activity of the person based on a sequence of identified motions of the person. The activity of the person can include, for example, falling (the most important activity in some applications), walking, running, driving, and more. Furthermore, the activities can be classified as daily living activities, such as walking, running, sitting, sleeping, driving, and climbing stairs, or as sporadic activities, such as falling, having a car collision, or having a seizure.
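One plausible way to lift identified motions into activities is an ordered-subsequence test over a small activity grammar; the patterns below are illustrative assumptions.

```python
# Illustrative activity grammar: an activity is recognized when its
# component motions appear, in order, in the sequence of identified motions.
ACTIVITY_PATTERNS = {
    "getting out of bed and visiting the bathroom": [
        "getting out of bed", "walking"],
    "driving to the shopping center": [
        "walking", "vehicular acceleration", "vehicular traversal"],
}

def identify_activity(motion_sequence):
    """Return the first activity whose motion pattern occurs as an ordered
    subsequence of the identified motions, or None if no pattern fits."""
    for activity, pattern in ACTIVITY_PATTERNS.items():
        it = iter(motion_sequence)
        # 'step in it' advances the iterator, so this also checks ordering.
        if all(step in it for step in pattern):
            return activity
    return None
```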
An embodiment includes transmitting information related to the identified type of motion if matches are made with particular stored acceleration signatures. The information related to the identified type of motion can include motions associated with a person with whom the object is associated. The motions can include, for example, a heartbeat of the person, muscular spasms, facial twitches, or involuntary reflex movements, which can be sensed by, for example, an accelerometer. Additionally, the information related to the identified type of motion can include at least one of a location of the object, audio sensed by the object, and a temperature of the object.
Another embodiment includes storing at least one of the plurality of stored acceleration signatures during an initialization cycle. The initialization cycle can be influenced by what the object is attached to. That is, initializing the stored acceleration signatures (motion patterns) based on what the object is attached to can reduce the number of signatures required to be stored within, for example, the common library, reduce the number of possible matches, and reduce the processing required to identify a match. Alternatively or additionally, initializing the stored acceleration signatures can be based on who the object is attached to, which can influence the specific library. The initialization can be used to determine motions unique to, for example, an individual. For example, a unique motion can be identified for a person who walks with a limp, and the device can be initialized with motion patterns of the person walking with a limp.
An embodiment includes initiating a low-power sleep mode of the object if sensed acceleration is below a threshold for a predetermined amount of time. That is, if, for example, a person is sensed to be sleeping, power can be saved by de-activating at least a portion of the motion detection device.
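A sketch of such a gate; the threshold and quiet-interval values are illustrative.

```python
import time

class SleepGate:
    """Signal entry into a low-power sleep mode once sensed acceleration
    stays below a threshold for a predetermined amount of time."""

    def __init__(self, threshold_g=0.05, quiet_seconds=300):
        self.threshold_g = threshold_g
        self.quiet_seconds = quiet_seconds
        self.quiet_since = None

    def update(self, accel_magnitude_g, now=None):
        """Feed one acceleration magnitude sample; returns True when the
        device should de-activate part of its motion sensing."""
        now = time.monotonic() if now is None else now
        if accel_magnitude_g >= self.threshold_g:
            self.quiet_since = None        # movement detected: stay awake
            return False
        if self.quiet_since is None:
            self.quiet_since = now
        return (now - self.quiet_since) >= self.quiet_seconds
```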
FIG. 7 is a flow chart that includes steps of one example of a method of a motion detection device checking network availability for improvements in speed and/or processing power of acceleration signature matching, wherein the motion detection device includes motion detection sensors that generate the acceleration signal. A first step 710 includes the motion detection device determining what network connections are available to the motion detection device. A second step 720 includes the motion detection device distributing at least some of the acceleration signature matching processing if processing capability is available to the motion detection device through available network connections.
For an embodiment, the motion detection device distributes the acceleration signature matching processing if the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing saves the motion detection device processing power. For another embodiment, the motion detection device distributes the acceleration signature matching processing if the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing increases a speed of the motion detection device processing. Alternatively, the motion detection device distributes the processing to optimize both power and processing speed. Additionally, the processing distribution can be dependent upon the bandwidths of the available network connections. That is, some network connections can generally support higher data transfer rates, and therefore influence the processing speed.
Generally, the motion detection device scales its processing to the level of processing available. That is, as additional processing power becomes available to the motion detection device, the motion detection device can increase the complexity of the signature matching processing. The processing can be distributed as processing capability becomes available through network connections. The processing can be performed in different locations as network connectivity becomes available, which can advantageously reduce the power consumption of the motion detection device and/or increase the speed of the processing.
FIG. 8 shows a motion detection device 300 that can be connected to one of multiple networks. Examples of possible networks (not a comprehensive list) the motion detection device 300 can connect to include a cellular network 820 through, for example, a Bluetooth wireless link 810, or a home base station 840 through, for example, a ZigBee wireless link 845. The wireless links 810, 845 can each provide different levels of bandwidth. Each of the networks includes available processing capabilities 830, 850.
If the motion detection device 300 does not have any network connections available, the motion detection device 300 must perform its own matching processing. If this is the case, then the processing algorithms may be made less complex to reduce the processing power and processing time required. For example, the matching processing can be simplified by extracting significant wavelet coefficients and comparing them against threshold levels for elemental motions. Acceleration signal data acquisition can be performed in short bursts of processing every few milliseconds, with the processor waking up for each burst and resting in a low-power mode at all other times. Except in an emergency situation, RF communication is performed only periodically; when the data is in a steady state (for example, when the object is sedentary), there is no need to send data to the network, and only a change in state is communicated. Additionally, if no network connections are available, the operation of the motion detection device 300 may be altered. For example, if the motion detection device 300 detects an emergency situation (such as a fall), the motion detection device 300 may generate an audio alert. If a network connection were available, the audio alert might not be generated, and an alert may instead be transmitted over the available network.
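A sketch of such a simplified on-device matcher, which keeps only the largest-magnitude wavelet coefficients and tests their energy against per-motion threshold bands; the bands and coefficient count are assumptions for illustration.

```python
import numpy as np

ELEMENTAL_THRESHOLDS = {  # illustrative energy bands per elemental motion
    "walking": (0.5, 2.0),
    "running": (2.0, 6.0),
}

def simple_match(signature, keep=8):
    """Reduced-complexity matching for when no network is available:
    retain the 'keep' most significant wavelet coefficients and compare
    their energy against threshold bands for elemental motions."""
    coeffs = np.sort(np.abs(np.asarray(signature, dtype=float)))[-keep:]
    energy = float(np.sum(coeffs ** 2))
    for motion, (lo, hi) in ELEMENTAL_THRESHOLDS.items():
        if lo <= energy < hi:
            return motion
    return "unknown"
```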
The motion detection device 300 includes a processor in which at least a portion of the analysis and signature matching processing can be completed. However, if the motion detection device 300 has one or more networks available to it, the motion detection device can off-load some of the processing to one of the processors 830, 850 associated with the networks.
The determination of whether to off-load the processing can be based on both the processing capabilities provided by available networks, and the data rates (bandwidth) provided by each of the available networks.
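A sketch of that decision; the link descriptors and cut-off values are assumptions for illustration, and simple_match refers to the reduced-complexity matcher sketched above.

```python
def plan_processing(network_links, min_bandwidth_kbps=50):
    """Decide where signature matching runs.  Each link is a hypothetical
    dict such as {"name": "cellular", "bandwidth_kbps": 200,
    "remote_processing": True}.  With no usable link, fall back to the
    simpler on-device matcher."""
    usable = [link for link in network_links
              if link.get("remote_processing")
              and link.get("bandwidth_kbps", 0) >= min_bandwidth_kbps]
    if not usable:
        return {"where": "on-device", "matcher": "simple_match"}
    best = max(usable, key=lambda link: link["bandwidth_kbps"])
    return {"where": best["name"], "matcher": "full statistical matching"}
```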
Although specific embodiments have been described and illustrated, the embodiments are not to be limited to the specific forms or arrangements of parts so described and illustrated.

Claims (20)

What is claimed:
1. A method of detecting human condition and activity of a user, comprising:
generating, by a motion detection device that includes a motion detection sensor, an acceleration signature based on sensed acceleration of an object that represents motion of the user;
the motion detection device determining what network connections are available to the motion detection device;
matching the acceleration signature with at least one of a plurality of stored acceleration signatures of motions of human beings, wherein each stored acceleration signature corresponds with a type of motion, wherein the motion detection device distributes at least some of the acceleration signature matching processing when processing capability is available to the motion detection device through available network connections;
identifying a type of motion of the user and identifying a condition of the user based on the matching of the acceleration signature with a stored acceleration signature.
2. The method of claim 1, wherein the type of motion comprises at least one of atomic motion, elemental motion and macro-motion.
3. The method of claim 1, wherein the stored acceleration signatures are stored in a common library and a specific library, and matching the acceleration signature comprises matching the acceleration signature with stored acceleration signatures of the common library, and then matching the acceleration signature with stored acceleration signatures of the specific library.
4. The method of claim 3, wherein the common library includes universal motion and activities acceleration signatures, and the specific library includes personal acceleration signatures.
5. The method of claim 3, wherein the stored acceleration signatures of the common library are useable for matching acceleration signatures of motions of multiple humans, and the stored acceleration signatures of the specific library are useable for matching acceleration signatures of motions of a particular human.
6. The method of claim 1, wherein when matches are made with at least portions of particular stored acceleration signatures, then audio sensing of the object is activated.
7. The method of claim 1, wherein matching further comprises filtering the acceleration signature.
8. The method of claim 7, further comprising reducing a number of stored acceleration signature matches by identifying a previous activity of the user, and performing a time domain analysis on the filtered acceleration signal to identify transient signatures or steady-state signatures of the filtered acceleration signal.
9. The method of claim 1, wherein if the acceleration signature exceeds a threshold value, then audio sensing of the object is activated.
10. The method of claim 9, further comprising transmitting the sensed audio.
11. The method of claim 1, further comprising identifying an activity of the user based on a sequence of identified motions of the user.
12. The method of claim 1, further comprising transmitting information related to the identified type of motion when matches are made with particular stored acceleration signatures.
13. The method of claim 1, further comprising storing at least one of the plurality of stored acceleration signatures during an initialization cycle.
14. The method of claim 1, further comprising initiating a low-power sleep mode of the object when sensed acceleration is below a threshold for a predetermined amount of time.
15. The method of claim 1, wherein the motion detection device distributes the acceleration signature matching processing when the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing saves the motion detection device processing power.
16. The method of claim 1, wherein the motion detection device distributes the acceleration signature matching processing when the processing capability is available to the motion detection device through available network connections, and distributing the acceleration signature matching processing increases a speed of the motion detection device processing.
17. An apparatus for identifying a type of motion and condition of a user, comprising:
a motion detection sensor operative to generate an acceleration signature based on sensed acceleration of the user;
a controller, wherein the controller is operative to:
determine what network connections are available to the motion detection device;
match the acceleration signature with at least one of a plurality of stored acceleration signatures, wherein each stored acceleration signature corresponds with a type of motion of the user, wherein the apparatus distributes at least some of the acceleration signature matching processing when processing capability is available to the motion detection device through available network connections;
identify the type of motion of the user and identify a condition of the user based on the matching of the acceleration signature.
18. The apparatus of claim 17, wherein the type of motion comprises at least one of atomic motion, elemental motion and macro-motion.
19. The apparatus of claim 17, wherein the stored acceleration signatures are stored in a common library and a specific library, and matching the acceleration signature comprises matching the acceleration signature with stored acceleration signatures of the common library, and then matching the acceleration signature with stored acceleration signatures of the specific library.
20. The apparatus of claim 19, wherein the common library includes universal motion and activities acceleration signatures, and the specific library includes personal acceleration signatures.
US13/975,170 2009-02-23 2013-08-23 Identifying a type of motion of an object Active US8812258B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/975,170 US8812258B2 (en) 2009-02-23 2013-08-23 Identifying a type of motion of an object

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US20834409P 2009-02-23 2009-02-23
US12/560,069 US20100217533A1 (en) 2009-02-23 2009-09-15 Identifying a Type of Motion of an Object
US12/883,304 US8560267B2 (en) 2009-09-15 2010-09-16 Identifying one or more activities of an animate or inanimate object
US13/975,170 US8812258B2 (en) 2009-02-23 2013-08-23 Identifying a type of motion of an object

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/883,304 Continuation US8560267B2 (en) 2009-02-23 2010-09-16 Identifying one or more activities of an animate or inanimate object

Publications (2)

Publication Number Publication Date
US20130346014A1 US20130346014A1 (en) 2013-12-26
US8812258B2 true US8812258B2 (en) 2014-08-19

Family

ID=43731375

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/883,304 Active 2030-05-19 US8560267B2 (en) 2009-02-23 2010-09-16 Identifying one or more activities of an animate or inanimate object
US13/975,170 Active US8812258B2 (en) 2009-02-23 2013-08-23 Identifying a type of motion of an object

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/883,304 Active 2030-05-19 US8560267B2 (en) 2009-02-23 2010-09-16 Identifying one or more activities of an animate or inanimate object

Country Status (2)

Country Link
US (2) US8560267B2 (en)
WO (1) WO2012036958A2 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160166180A1 (en) * 2014-12-11 2016-06-16 David Martin Enhanced Real Time Frailty Assessment for Mobile
US20160232201A1 (en) * 2015-02-11 2016-08-11 Google Inc. Methods, systems, and media for recommending computerized services based on an animate object in the user's environment
US10155131B2 (en) 2016-06-20 2018-12-18 Coreyak Llc Exercise assembly for performing different rowing routines
US10425725B2 (en) 2015-02-11 2019-09-24 Google Llc Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US10556167B1 (en) 2016-06-20 2020-02-11 Coreyak Llc Exercise assembly for performing different rowing routines
US10785203B2 (en) 2015-02-11 2020-09-22 Google Llc Methods, systems, and media for presenting information related to an event based on metadata
US10881936B2 (en) 2016-06-20 2021-01-05 Coreyak Llc Exercise assembly for performing different rowing routines
US11024142B2 (en) 2017-07-27 2021-06-01 NXT-ID, Inc. Event detector for issuing a notification responsive to occurrence of an event
US11048855B2 (en) 2015-02-11 2021-06-29 Google Llc Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application
US11158179B2 (en) 2017-07-27 2021-10-26 NXT-ID, Inc. Method and system to improve accuracy of fall detection using multi-sensor fusion
US11382511B2 (en) 2017-07-27 2022-07-12 Logicmark, Inc. Method and system to reduce infrastructure costs with simplified indoor location and reliable communications
US11504029B1 (en) 2014-10-26 2022-11-22 David Martin Mobile control using gait cadence

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7733224B2 (en) 2006-06-30 2010-06-08 Bao Tran Mesh network personal emergency response appliance
US8560267B2 (en) * 2009-09-15 2013-10-15 Imetrikus, Inc. Identifying one or more activities of an animate or inanimate object
US20100217533A1 (en) * 2009-02-23 2010-08-26 Laburnum Networks, Inc. Identifying a Type of Motion of an Object
US8475371B2 (en) 2009-09-01 2013-07-02 Adidas Ag Physiological monitoring garment
US8843101B2 (en) * 2010-10-04 2014-09-23 Numera, Inc. Fall detection system using a combination of accelerometer, audio input and magnetometer
US8768865B2 (en) * 2011-01-19 2014-07-01 Qualcomm Incorporated Learning situations via pattern matching
US9724600B2 (en) 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US9700222B2 (en) 2011-12-02 2017-07-11 Lumiradx Uk Ltd Health-monitor patch
US9734304B2 (en) 2011-12-02 2017-08-15 Lumiradx Uk Ltd Versatile sensors with data fusion functionality
US9519909B2 (en) 2012-03-01 2016-12-13 The Nielsen Company (Us), Llc Methods and apparatus to identify users of handheld computing devices
US10922383B2 (en) * 2012-04-13 2021-02-16 Adidas Ag Athletic activity monitoring methods and systems
US8473975B1 (en) 2012-04-16 2013-06-25 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
US9999376B2 (en) 2012-11-02 2018-06-19 Vital Connect, Inc. Determining body postures and activities
EP2960627B1 (en) * 2013-02-22 2020-01-01 Asahi Kasei Kabushiki Kaisha Carry-state change detection device, carry-state change detection method, and program
US9223297B2 (en) 2013-02-28 2015-12-29 The Nielsen Company (Us), Llc Systems and methods for identifying a user of an electronic device
US20140288878A1 (en) * 2013-03-15 2014-09-25 Aliphcom Identification of motion characteristics to determine activity
US20140288876A1 (en) * 2013-03-15 2014-09-25 Aliphcom Dynamic control of sampling rate of motion to modify power consumption
US20140288875A1 (en) * 2013-03-15 2014-09-25 Aliphcom Methods and architecture for determining activity and activity types from sensed motion signals
WO2014145122A2 (en) * 2013-03-15 2014-09-18 Aliphcom Identification of motion characteristics to determine activity
US20140288870A1 (en) * 2013-03-15 2014-09-25 Aliphcom Inline calibration of motion sensor
US20140288877A1 (en) * 2013-03-15 2014-09-25 Aliphcom Intermediate motion signal extraction to determine activity
WO2014153665A1 (en) * 2013-03-29 2014-10-02 Engage Biomechanics Inc. System and method for monitoring a subject
US8948783B2 (en) 2013-06-28 2015-02-03 Facebook, Inc. User activity tracking system
US9125015B2 (en) 2013-06-28 2015-09-01 Facebook, Inc. User activity tracking system and device
US9805577B2 (en) 2013-11-05 2017-10-31 Nortek Security & Control, LLC Motion sensing necklace system
US9444804B2 (en) * 2013-11-25 2016-09-13 Roy S. Melzer Dynamic security question generation
US10360907B2 (en) 2014-01-14 2019-07-23 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10024679B2 (en) * 2014-01-14 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9915545B2 (en) * 2014-01-14 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9578307B2 (en) 2014-01-14 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9629774B2 (en) 2014-01-14 2017-04-25 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10248856B2 (en) 2014-01-14 2019-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10024667B2 (en) 2014-08-01 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
US9619039B2 (en) * 2014-09-05 2017-04-11 The Boeing Company Obtaining metrics for a position using frames classified by an associative memory
US9911031B2 (en) * 2014-09-05 2018-03-06 The Boeing Company Obtaining metrics for a position using frames classified by an associative memory
US10024678B2 (en) * 2014-09-17 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable clip for providing social and environmental awareness
US9922236B2 (en) 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
USD768024S1 (en) 2014-09-22 2016-10-04 Toyota Motor Engineering & Manufacturing North America, Inc. Necklace with a built in guidance device
EP3206654B1 (en) * 2014-10-17 2020-11-25 Stryker Corporation Person support apparatuses with motion monitoring
US10347108B2 (en) * 2015-01-16 2019-07-09 City University Of Hong Kong Monitoring user activity using wearable motion sensing device
US9576460B2 (en) 2015-01-21 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device for hazard detection and warning based on image and audio data
KR102278945B1 (en) 2015-01-27 2021-07-19 삼성전자주식회사 Image processing method and electronic device supporting the same
US10490102B2 (en) 2015-02-10 2019-11-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for braille assistance
WO2016137960A1 (en) 2015-02-24 2016-09-01 Otis Elevator Company System and method of measuring and diagnosing ride quality of an elevator system
US9586318B2 (en) 2015-02-27 2017-03-07 Toyota Motor Engineering & Manufacturing North America, Inc. Modular robot with smart device
US9811752B2 (en) 2015-03-10 2017-11-07 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device and method for redundant object identification
US9677901B2 (en) 2015-03-10 2017-06-13 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing navigation instructions at optimal times
US9972216B2 (en) 2015-03-20 2018-05-15 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for storing and playback of information for blind users
US20160292584A1 (en) * 2015-03-31 2016-10-06 Microsoft Technology Licensing, Llc Inferring User Sleep Patterns
US9898039B2 (en) 2015-08-03 2018-02-20 Toyota Motor Engineering & Manufacturing North America, Inc. Modular smart necklace
JP6598570B2 (en) * 2015-08-11 2019-10-30 日本光電工業株式会社 Biological information measuring device and program
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
CA3026740C (en) 2016-06-08 2021-12-28 Aerial Technologies Inc. System and methods for smart intrusion detection using wireless signals and artificial intelligence
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US10043366B2 (en) 2016-10-18 2018-08-07 International Business Machines Corporation Personal safety monitoring
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
US10172760B2 (en) 2017-01-19 2019-01-08 Jennifer Hendrix Responsive route guidance and identification system
EP3738506A1 (en) 2017-03-07 2020-11-18 Motionize Israel Ltd. Footwear sensor mounting system
ES2684386B1 (en) * 2017-03-31 2019-07-18 Planetus S L SYSTEM AND METHOD OF DETERMINATION OF FALL IN TWO-WHEELED VEHICLES
US20180293359A1 (en) 2017-04-10 2018-10-11 International Business Machines Corporation Monitoring an individual's condition based on models generated from e-textile based clothing
US11153364B1 (en) 2017-11-29 2021-10-19 Parallels International Gmbh Embedding remote applications into HTML pages
CN111684535A (en) * 2018-02-02 2020-09-18 皇家飞利浦有限公司 System and method for optimal sensor placement
CN108653974A (en) * 2018-05-11 2018-10-16 珠海云麦科技有限公司 A kind of intelligence rope skipping with data analysis
CN108831527B (en) * 2018-05-31 2021-06-04 古琳达姬(厦门)股份有限公司 User motion state detection method and device and wearable device
CN110859629A (en) * 2019-10-29 2020-03-06 北京机械设备研究所 Exoskeleton gait identification method and device
JP7385826B2 (en) * 2019-11-08 2023-11-24 オムロン株式会社 Motion analysis device, motion analysis method, and motion analysis program
CN111444812A (en) * 2020-03-23 2020-07-24 星汉智能科技股份有限公司 Human body posture assessment method and system for daily public security training
US11006860B1 (en) * 2020-06-16 2021-05-18 Motionize Israel Ltd. Method and apparatus for gait analysis

Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4513437A (en) 1982-06-30 1985-04-23 International Business Machines Corporation Data input pen for Signature Verification
US6028626A (en) 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
US6265982B1 (en) 2000-05-31 2001-07-24 Storage Technology Corporation Method and system for monitoring vibration of robots in an automated storage library
US20030158489A1 (en) 2002-02-18 2003-08-21 Colin Corporation Pressure-pulse-wave detecting apparatus
US6626728B2 (en) 2000-06-27 2003-09-30 Kenneth C. Holt Motion-sequence activated toy wand
US6675649B2 (en) 2000-04-17 2004-01-13 Fujitsu Takamisawa Component Limited Acceleration detection device, method of detecting acceleration, input device, and recording medium
US6756889B2 (en) 2002-09-12 2004-06-29 General Motors Corporation Dual sensor crash sensing system
US6816766B2 (en) 2002-11-26 2004-11-09 General Motors Corporation Continuous collision severity prediction
US20050154512A1 (en) 2004-01-08 2005-07-14 Schubert Peter J. Vehicle rollover detection and method of anticipating vehicle rollover
US20060005578A1 (en) 2004-07-08 2006-01-12 Stefano Tortoli Device and method for positioning ornaments onto elongated ornamental articles
US6999863B2 (en) 2003-01-06 2006-02-14 General Motors Corporation Variation manager for crash sensing algorithms
US20060089538A1 (en) 2004-10-22 2006-04-27 General Electric Company Device, system and method for detection activity of persons
US7145461B2 (en) 2001-01-31 2006-12-05 Ilife Solutions, Inc. System and method for analyzing activity of a body
US20060282021A1 (en) 2005-05-03 2006-12-14 Devaul Richard W Method and system for fall detection and motion analysis
US20070167693A1 (en) 2005-11-15 2007-07-19 Bernd Scholler Display means for vital parameters
US7248172B2 (en) 2005-03-22 2007-07-24 Freescale Semiconductor, Inc. System and method for human body fall detection
US20070293781A1 (en) 2003-11-04 2007-12-20 Nathaniel Sims Respiration Motion Detection and Health State Assesment System
US20080256796A1 (en) 2007-04-17 2008-10-23 Fix Sandra L Necklace stabilizer
US7467060B2 (en) 2006-03-03 2008-12-16 Garmin Ltd. Method and apparatus for estimating a motion parameter
US20090303204A1 (en) 2007-01-05 2009-12-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US20100073284A1 (en) 2008-09-25 2010-03-25 Research In Motion Limited System and method for analyzing movements of an electronic device
US7715982B2 (en) 2002-11-01 2010-05-11 M.B.T.L. Limited Monitoring sports
US20100121226A1 (en) 2007-04-19 2010-05-13 Koninklijke Philips Electronics N.V. Fall detection system
US20100217533A1 (en) 2009-02-23 2010-08-26 Laburnum Networks, Inc. Identifying a Type of Motion of an Object
US7827000B2 (en) 2006-03-03 2010-11-02 Garmin Switzerland Gmbh Method and apparatus for estimating a motion parameter
US7881902B1 (en) * 2006-12-22 2011-02-01 Dp Technologies, Inc. Human activity monitoring device
US8560267B2 (en) * 2009-09-15 2013-10-15 Imetrikus, Inc. Identifying one or more activities of an animate or inanimate object

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6160478A (en) * 1998-10-27 2000-12-12 Sarcos Lc Wireless health monitoring system
US6166639A (en) * 1999-03-12 2000-12-26 Advanced Marketing Systems Corporation Personal emergency response system
US6885932B2 (en) * 2003-08-08 2005-04-26 Motorola, Inc. Misfire detection in an internal combustion engine
US7301526B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
CN101919273B (en) * 2007-11-09 2013-07-10 谷歌公司 Activating applications based on accelerometer data
US7898428B2 (en) * 2008-03-06 2011-03-01 Research In Motion Limited Safety for mobile device users while driving

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4513437A (en) 1982-06-30 1985-04-23 International Business Machines Corporation Data input pen for Signature Verification
US6028626A (en) 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
US6675649B2 (en) 2000-04-17 2004-01-13 Fujitsu Takamisawa Component Limited Acceleration detection device, method of detecting acceleration, input device, and recording medium
US6265982B1 (en) 2000-05-31 2001-07-24 Storage Technology Corporation Method and system for monitoring vibration of robots in an automated storage library
US6626728B2 (en) 2000-06-27 2003-09-30 Kenneth C. Holt Motion-sequence activated toy wand
US7145461B2 (en) 2001-01-31 2006-12-05 Ilife Solutions, Inc. System and method for analyzing activity of a body
US20030158489A1 (en) 2002-02-18 2003-08-21 Colin Corporation Pressure-pulse-wave detecting apparatus
US6802814B2 (en) 2002-02-18 2004-10-12 Colin Medical Technology Corporation Pressure-pulse-wave detecting apparatus
US6756889B2 (en) 2002-09-12 2004-06-29 General Motors Corporation Dual sensor crash sensing system
US7715982B2 (en) 2002-11-01 2010-05-11 M.B.T.L. Limited Monitoring sports
US6816766B2 (en) 2002-11-26 2004-11-09 General Motors Corporation Continuous collision severity prediction
US6999863B2 (en) 2003-01-06 2006-02-14 General Motors Corporation Variation manager for crash sensing algorithms
US20070293781A1 (en) 2003-11-04 2007-12-20 Nathaniel Sims Respiration Motion Detection and Health State Assesment System
US20050154512A1 (en) 2004-01-08 2005-07-14 Schubert Peter J. Vehicle rollover detection and method of anticipating vehicle rollover
US20060005578A1 (en) 2004-07-08 2006-01-12 Stefano Tortoli Device and method for positioning ornaments onto elongated ornamental articles
US20060089538A1 (en) 2004-10-22 2006-04-27 General Electric Company Device, system and method for detection activity of persons
US7248172B2 (en) 2005-03-22 2007-07-24 Freescale Semiconductor, Inc. System and method for human body fall detection
US20060282021A1 (en) 2005-05-03 2006-12-14 Devaul Richard W Method and system for fall detection and motion analysis
US20070167693A1 (en) 2005-11-15 2007-07-19 Bernd Scholler Display means for vital parameters
US7467060B2 (en) 2006-03-03 2008-12-16 Garmin Ltd. Method and apparatus for estimating a motion parameter
US7827000B2 (en) 2006-03-03 2010-11-02 Garmin Switzerland Gmbh Method and apparatus for estimating a motion parameter
US8060337B2 (en) 2006-03-03 2011-11-15 Garmin Switzerland Gmbh Method and apparatus for estimating a motion parameter
US7881902B1 (en) * 2006-12-22 2011-02-01 Dp Technologies, Inc. Human activity monitoring device
US20090303204A1 (en) 2007-01-05 2009-12-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US20080256796A1 (en) 2007-04-17 2008-10-23 Fix Sandra L Necklace stabilizer
US20100121226A1 (en) 2007-04-19 2010-05-13 Koninklijke Philips Electronics N.V. Fall detection system
US20100073284A1 (en) 2008-09-25 2010-03-25 Research In Motion Limited System and method for analyzing movements of an electronic device
US20100217533A1 (en) 2009-02-23 2010-08-26 Laburnum Networks, Inc. Identifying a Type of Motion of an Object
US8560267B2 (en) * 2009-09-15 2013-10-15 Imetrikus, Inc. Identifying one or more activities of an animate or inanimate object

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Response to Office Action submitted Apr. 2, 2013, for U.S. Appl. No. 12/621,099, filed Nov. 18, 2009.
Response to Office Action submitted Jan. 23, 2011, for U.S. Appl. No. 12/560,069, filed Sep. 15, 2009.
Response to Office Action submitted May 10, 2012, for U.S. Appl. No. 12/883,304, filed Sep. 16, 2010.

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11504029B1 (en) 2014-10-26 2022-11-22 David Martin Mobile control using gait cadence
US20160166180A1 (en) * 2014-12-11 2016-06-16 David Martin Enhanced Real Time Frailty Assessment for Mobile
US10880641B2 (en) 2015-02-11 2020-12-29 Google Llc Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US10425725B2 (en) 2015-02-11 2019-09-24 Google Llc Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US11910169B2 (en) 2015-02-11 2024-02-20 Google Llc Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US10785203B2 (en) 2015-02-11 2020-09-22 Google Llc Methods, systems, and media for presenting information related to an event based on metadata
US11494426B2 (en) 2015-02-11 2022-11-08 Google Llc Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application
US11841887B2 (en) 2015-02-11 2023-12-12 Google Llc Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application
US11671416B2 (en) 2015-02-11 2023-06-06 Google Llc Methods, systems, and media for presenting information related to an event based on metadata
US11048855B2 (en) 2015-02-11 2021-06-29 Google Llc Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application
US11516580B2 (en) 2015-02-11 2022-11-29 Google Llc Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US20160232201A1 (en) * 2015-02-11 2016-08-11 Google Inc. Methods, systems, and media for recommending computerized services based on an animate object in the user's environment
US11392580B2 (en) * 2015-02-11 2022-07-19 Google Llc Methods, systems, and media for recommending computerized services based on an animate object in the user's environment
US10556167B1 (en) 2016-06-20 2020-02-11 Coreyak Llc Exercise assembly for performing different rowing routines
US10881936B2 (en) 2016-06-20 2021-01-05 Coreyak Llc Exercise assembly for performing different rowing routines
US10155131B2 (en) 2016-06-20 2018-12-18 Coreyak Llc Exercise assembly for performing different rowing routines
US11382511B2 (en) 2017-07-27 2022-07-12 Logicmark, Inc. Method and system to reduce infrastructure costs with simplified indoor location and reliable communications
US11158179B2 (en) 2017-07-27 2021-10-26 NXT-ID, Inc. Method and system to improve accuracy of fall detection using multi-sensor fusion
US11024142B2 (en) 2017-07-27 2021-06-01 NXT-ID, Inc. Event detector for issuing a notification responsive to occurrence of an event

Also Published As

Publication number Publication date
WO2012036958A9 (en) 2012-05-24
US20110066383A1 (en) 2011-03-17
WO2012036958A2 (en) 2012-03-22
US20130346014A1 (en) 2013-12-26
WO2012036958A3 (en) 2012-07-12
US8560267B2 (en) 2013-10-15

Similar Documents

Publication Publication Date Title
US8812258B2 (en) Identifying a type of motion of an object
US20100217533A1 (en) Identifying a Type of Motion of an Object
Vallabh et al. Fall detection monitoring systems: a comprehensive review
Qi et al. Examining sensor-based physical activity recognition and monitoring for healthcare using Internet of Things: A systematic review
US10319209B2 (en) Method and system for motion analysis and fall prevention
de la Concepción et al. Mobile activity recognition and fall detection system for elderly people using Ameva algorithm
US8972197B2 (en) Method and system for analyzing breathing of a user
Mubashir et al. A survey on fall detection: Principles and approaches
Doukas et al. Patient fall detection using support vector machines
US9060714B2 (en) System for detection of body motion
Jafari et al. Physical activity monitoring for assisted living at home
Song et al. Speed estimation from a tri-axial accelerometer using neural networks
Estudillo-Valderrama et al. Design and implementation of a distributed fall detection system—personal server
WO2014052505A2 (en) Biometric identification method and apparatus to authenticate identity of a user of a wearable device that includes sensors
Rasheed et al. Evaluation of human activity recognition and fall detection using android phone
Li et al. Grammar-based, posture-and context-cognitive detection for falls with different activity levels
Dong et al. Meal-time and duration monitoring using wearable sensors
US20110288784A1 (en) Monitoring Energy Expended by an Individual
Fujimoto et al. Wearable human activity recognition by electrocardiograph and accelerometer
Bisio et al. Towards IoT-based eHealth services: A smart prototype system for home rehabilitation
WO2017081829A1 (en) Behavior detection device, behavior detection method, and behavior detection program
Doukas et al. Advanced classification and rules-based evaluation of motion, visual and biosignal data for patient fall incident detection
Luštrek et al. Confidence: ubiquitous care system to support independent living
Valero et al. Reprint of: Vibration sensing-based human and infrastructure safety/health monitoring: A survey
Tahafchi et al. Freezing-of-gait detection using wearable sensor technology and possibilistic k-nearest-neighbor algorithm

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMETRIKUS, INC. DBA NUMERA, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NADKARNI, VIJAY;JANGLE, JEETENDRA;BENTLEY, JOHN;AND OTHERS;SIGNING DATES FROM 20130629 TO 20130718;REEL/FRAME:031074/0339

AS Assignment

Owner name: MULTIPLIER CAPITAL, LP, MARYLAND

Free format text: SECURITY INTEREST;ASSIGNOR:NUMERA, INC.;REEL/FRAME:033083/0092

Effective date: 20140527

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: NUMERA, INC., WASHINGTON

Free format text: ACKNOWLEDGMENT OF TERMINATION OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:MULTIPLIER CAPITAL, LP;REEL/FRAME:036584/0956

Effective date: 20150701

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551)

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

AS Assignment

Owner name: NEWYU, INC., WASHINGTON

Free format text: CHANGE OF NAME;ASSIGNOR:NUMERA, INC.;REEL/FRAME:058957/0923

Effective date: 20150813