CA2953856A1 - System for monitoring health related information for individuals - Google Patents

System for monitoring health related information for individuals

Info

Publication number
CA2953856A1
Authority
CA
Canada
Prior art keywords
user
wearer
processor
gait
food
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA2953856A
Other languages
French (fr)
Inventor
Jay William SALES
Richard Chester Klosinski, Jr.
Matthew Allen WORKMAN
Meghan Kathleen MURPHY
Matthew David STEEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vision Service Plan VSP
Original Assignee
Vision Service Plan VSP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Service Plan VSP filed Critical Vision Service Plan VSP
Publication of CA2953856A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112Gait analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/112Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/1032Determining colour for diagnostic purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103Detecting eye twinkling
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1176Recognition of faces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/145Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/1455Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • A61B5/14552Details of sensors specially adapted therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/42Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
    • A61B5/4261Evaluating exocrine secretion production
    • A61B5/4266Evaluating exocrine secretion production sweat secretion
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/443Evaluating skin constituents, e.g. elastin, melanin, water
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Bio-feedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4884Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7246Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7278Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00Instruments for auscultation
    • A61B7/02Stethoscopes
    • A61B7/04Electric stethoscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/34User authentication involving the use of external additional devices, e.g. dongles or smart cards
    • G06F21/35User authentication involving the use of external additional devices, e.g. dongles or smart cards communicating wirelessly
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/0423Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0461Sensor means for detecting integrated or attached to an item closely associated with the person but not worn by the person, e.g. chair, walking stick, bed sensor
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0476Cameras to detect unsafe condition, e.g. video cameras
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/0092Nutrition
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/06Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04Constructional details of apparatus
    • A61B2560/0475Special features of memory means, e.g. removable memory cards
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0223Magnetic field sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0257Proximity sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/053Measuring electrical impedance or conductance of a portion of the body
    • A61B5/0531Measuring skin impedance
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7282Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50Prostheses not implantable in the body
    • A61F2/76Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50Prostheses not implantable in the body
    • A61F2/76Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • A61F2002/7695Means for testing non-implantable prostheses
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00Non-optical adjuncts; Attachment thereof
    • G02C11/10Electronic devices other than hearing aids
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing

Abstract

A computer-implemented method, and related system, for monitoring the wellbeing of an individual by providing eyewear that includes at least one sensor for monitoring the motion of the user. In various embodiments, the system receives data generated by the at least one sensor, uses the data to determine the user's movements, and compares the user's movements to previously established movement patterns of the user. If the system detects one or more inconsistencies between the user's current movements and the previously established movement patterns of the user, the system may notify the user or a third party of the detected one or more inconsistencies. The system may similarly monitor a user's compliance with a medical regime and notify the user or a third party of the user's compliance with the regime.

Description

SYSTEM FOR MONITORING HEALTH RELATED INFORMATION FOR INDIVIDUALS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Patent Application No.
14/610,589, filed January 30, 2015, entitled "Systems and Method for Monitoring an Individual's Compliance with a Weight Loss Plan," U.S. Patent Application No. 14/550,406, filed November 21, 2014, entitled "Wearable Gait Monitoring Apparatus, Systems, and Related Methods,"
and U.S. Patent Application No. 14/562,454, filed December 5, 2014, entitled "System for Monitoring Individuals as They Age in Place," which all claim the benefit of U.S.
Provisional Patent Application No. 62/046,406, filed September 5, 2014, entitled "Wearable Health Computer Apparatus, Systems, and Related Methods," the entire disclosures of which are incorporated herein by reference in their entirety.
BACKGROUND
[0002] Being able to monitor elderly individuals who live independently at home has become increasingly important due, in part, to the high cost of elder care facilities. Accordingly, there is a need for improved systems and methods for monitoring the activities and wellbeing of elderly individuals living at home. There is a similar need for monitoring the activities and wellbeing of individuals with special needs living outside of an institutional setting.
[0003] Furthermore, observing a person's gait is often an important clinical step in diagnosing certain types of musculoskeletal and neurological conditions. Proper gait diagnosis may also be valuable in properly fitting a patient with a prosthesis. Currently, gait analysis is largely dependent on the subjective perception of a trained professional. Such manual diagnosis methods may rely on a small set of observable activities and may have inherent inaccuracies, creating the potential for misdiagnosis and the delay of proper treatment.
Thus, there is currently a need for improved systems and methods for diagnosing the gait of an individual.
[0004] Moreover, public health officials concur that Americans are not adhering to current healthful lifestyle recommendations. Only one in five of the US population follows recommendations for fruit and vegetable consumption. Only one in four adheres to the recommendations for exercise, and three of four adhere to recommendations not to smoke.
Possibly the most telling statistic is that only three of every 100 US adults follow all recommendations to consume five fruits and vegetables daily, get regular physical activity, maintain a healthy weight, and not smoke. Being able to monitor an individual's compliance with weight loss recommendations is advantageous for the individual, for health care providers, and for insurance companies. Accordingly, there is a need for improved systems and methods for monitoring an individual's compliance with a weight loss regime.
[0005] Various embodiments of the present systems and methods recognize and address the foregoing considerations, and others, of prior art systems and methods.
SUMMARY OF THE VARIOUS EMBODIMENTS
[0006] A computer-implemented method of monitoring the wellbeing of an individual according to various embodiments comprises the steps of: (1) providing a user with computerized eyewear comprising at least one sensor for monitoring the motion of the user; (2) receiving data generated by the at least one sensor; (3) determining the user's movements using the received data; (4) comparing the user's movements to one or more previously established movement patterns for the user; (5) determining whether one or more inconsistencies exist between the user's current movements and the one or more previously established movement patterns; and (6) at least partially in response to determining that such one or more inconsistencies exist, notifying a recipient selected from a group consisting of the user and a third party of the detected one or more inconsistencies.
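The deviation check in steps (3) through (6) can be pictured as comparing a handful of movement features against a stored baseline. The following Python sketch is purely illustrative: the feature names, the 25% tolerance, and the notification mechanism are assumptions made for exposition, not details taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class MovementPattern:
    mean_step_rate: float      # steps per minute while walking
    mean_active_hours: float   # hours of activity per day

def detect_inconsistencies(current: MovementPattern,
                           baseline: MovementPattern,
                           tolerance: float = 0.25) -> list[str]:
    """Flag any metric that deviates from the baseline by more than `tolerance`."""
    issues = []
    if abs(current.mean_step_rate - baseline.mean_step_rate) > tolerance * baseline.mean_step_rate:
        issues.append("step rate deviates from established pattern")
    if abs(current.mean_active_hours - baseline.mean_active_hours) > tolerance * baseline.mean_active_hours:
        issues.append("daily activity deviates from established pattern")
    return issues

def notify(recipients: list[str], issues: list[str]) -> None:
    # Placeholder for step (6); a real system might send an SMS, email, or push alert.
    for recipient in recipients:
        print(f"Alert to {recipient}: {'; '.join(issues)}")

issues = detect_inconsistencies(MovementPattern(70, 2.5), MovementPattern(95, 6.0))
if issues:
    notify(["user", "caregiver"], issues)
```

In practice the baseline would be learned from the wearer's sensor history rather than hard-coded, and the comparison could use richer statistics than a per-feature tolerance.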
[0007] A computer-implemented method of monitoring the wellbeing of an individual according to further embodiments comprises the steps of: (1) providing a user with a computerized wearable device comprising at least one sensor for monitoring actions taken by a user; (2) receiving a medicine regime associated with the user; (3) receiving data generated by at least one of the wearable device's sensors; (4) analyzing the received data generated by the at least one sensor to determine: (a) the type of medicine taken by the wearer; (b) the time the medicine is taken by the wearer; and/or (c) the dose of medicine taken by the wearer; (5) comparing the medicine regime for the user to the determined one or more of the type of medicine taken by the wearer, the time the medicine is taken by the wearer, and/or the dose of medicine taken by the wearer; (6) detecting one or more inconsistencies between the medicine regime associated with the user and the determined one or more of the type of medicine taken by the user, the time the medicine is taken by the user, and/or the dose of medicine taken by the user; and
(7) notifying the user and/or third party of the detected inconsistencies.
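Steps (4) through (7) amount to checking each observed medication event against a schedule. Here is a minimal sketch of that comparison, assuming a simple regime record and a fixed compliance window; none of these structures or values come from the disclosure itself.

```python
from datetime import datetime, timedelta

# Illustrative regime: 500 mg twice daily, with a 60-minute compliance window.
REGIME = {"medicine": "metformin", "dose_mg": 500,
          "times": ["08:00", "20:00"], "window_minutes": 60}

def check_event(medicine: str, dose_mg: float, taken_at: datetime) -> list[str]:
    """Return the inconsistencies between one observed event and the regime."""
    inconsistencies = []
    if medicine != REGIME["medicine"]:
        inconsistencies.append("wrong medicine type")
    if dose_mg != REGIME["dose_mg"]:
        inconsistencies.append("wrong dose")
    window = timedelta(minutes=REGIME["window_minutes"])
    scheduled = [taken_at.replace(hour=int(t[:2]), minute=int(t[3:]),
                                  second=0, microsecond=0)
                 for t in REGIME["times"]]
    if not any(abs(taken_at - s) <= window for s in scheduled):
        inconsistencies.append("taken outside scheduled window")
    return inconsistencies

print(check_event("metformin", 500, datetime(2015, 1, 30, 9, 15)))
# -> ['taken outside scheduled window']
```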
[0008] In various embodiments, a method of monitoring the health of an individual comprises:
(1) receiving information obtained from at least one sensor worn adjacent the individual's head;
(2) in response to receiving the information from a user, utilizing the information to assess the gait of the individual; and (3) at least partially in response to receiving the assessed gait, determining whether the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition.
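One way to realize step (3) is to summarize the assessed gait as a feature vector and compare it against reference signatures for known conditions. The sketch below uses cosine similarity over three made-up features (cadence, stride symmetry, lateral sway); both the features and the reference values are illustrative assumptions, and the disclosure does not prescribe this particular matching technique.

```python
import math

# Hypothetical reference gait signatures, as (cadence, symmetry, sway) vectors.
KNOWN_PATTERNS = {
    "shuffling gait (possible parkinsonism)": [0.6, 0.9, 0.2],
    "antalgic gait (possible injury)":        [0.8, 0.5, 0.3],
}

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def match_gait(assessed: list[float], threshold: float = 0.95) -> list[str]:
    """Return the known patterns whose similarity to the assessed gait meets the threshold."""
    return [name for name, pattern in KNOWN_PATTERNS.items()
            if cosine_similarity(assessed, pattern) >= threshold]

print(match_gait([0.62, 0.88, 0.22]))
# -> ['shuffling gait (possible parkinsonism)']
```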
[0009] In various embodiments, a method of monitoring the proper fit of a prosthesis worn by an individual comprises: (1) receiving information obtained from at least one sensor worn adjacent an individual's head; (2) at least partially in response to receiving the information, utilizing the information to assess the gait of the individual and to analyze the assessed gait to determine one or more gait patterns associated with the individual's gait; (3) determining whether the one or more gait patterns are consistent with a particular gait abnormality; and (4) in response to identifying a gait pattern that is consistent with a particular gait abnormality, generating an alert to indicate that the individual may have a gait abnormality, which may further evidence an improper fit of the prosthesis.
[0010] In various embodiments, a computer system for monitoring the gait of an individual comprises a pair of glasses comprising one or more sensors for assessing the gait of the individual. In various particular embodiments, the system is configured to analyze a user's assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are consistent with one or more particular medical conditions. In response to determining that the assessed gait includes one or more particular gait patterns that are consistent with one or more particular medical conditions, the system may generate an alert that communicates the particular medical condition to a user (e.g., the individual or a caregiver of the individual).
[0011] A computer-implemented method of monitoring compliance with a weight loss plan by a wearer of computerized eyewear, the method comprising (1) receiving, by at least one processor, at least one signal generated by one or more sensors operatively coupled to the computerized eyewear; (2) at least partially in response to receiving the at least one signal from the one or more sensors, determining, by at least one processor, an identity of a food that the wearer is preparing to ingest; (3) determining, by at least one processor, a quantity of the food that the wearer is preparing to ingest; (4) tracking, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest; and (5) comparing, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan. In various embodiments, determining the identity of the food further comprises capturing, by the forward-facing camera, an image of packaging associated with the food, at least partially in response to capturing the image, detecting a barcode contained on the packaging, and searching a database of barcodes to determine the nutritional value associated with the food. In other embodiments, determining a quantity of the food further comprises detecting in the captured image a vessel used by the wearer to measure the quantity of the food that the wearer is preparing to ingest, and detecting at least one marking on the vessel that indicates the quantity of the food placed in the vessel.
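The barcode path described in [0011] reduces to decode-then-lookup. Below is a minimal sketch of that flow; the decoder stub and the tiny nutrition table are stand-ins, since the disclosure does not name a particular decoding library or database.

```python
NUTRITION_DB = {
    "0123456789012": {"name": "oat cereal", "kcal_per_100g": 380},
}

def decode_barcode(image_bytes: bytes) -> str | None:
    """Stand-in for a real barcode decoder (e.g., a zbar or zxing wrapper)."""
    return "0123456789012"  # fixed result for illustration only

def identify_food_from_packaging(image_bytes: bytes) -> dict | None:
    """Decode a barcode from a packaging image, then search the nutrition database."""
    barcode = decode_barcode(image_bytes)
    if barcode is None:
        return None
    return NUTRITION_DB.get(barcode)

print(identify_food_from_packaging(b"..."))
# -> {'name': 'oat cereal', 'kcal_per_100g': 380}
```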
[0012] A system for monitoring compliance with a weight loss plan by a wearer of computerized eyewear comprises one or more processors and memory operatively coupled to the one or more processors. In various embodiments, the one or more processors is configured to (1) receive at least one signal generated by one or more sensors operatively coupled to the computerized eyewear; (2) at least partially in response to receiving the at least one signal from the one or more sensors, determine an identity of a food that the wearer is preparing to ingest; (3) determine a quantity of the food that the wearer is preparing to ingest; (4) track the identity and the quantity of the food that the wearer is preparing to ingest; (5) identify the date and time that the food is ingested by the wearer; and (6) compare the identity and the quantity of the food that the wearer ingests to a predetermined dietary plan.
[0013] A computer-implemented method of monitoring compliance with a weight loss plan by a wearer of computerized eyewear comprising the steps of (1) receiving, by at least one processor, at least one signal generated by one or more sensors operatively coupled to the computerized eyewear; (2) at least partially in response to receiving the at least one signal from the one or more sensors, determining, by at least one processor, an identity of a food that the wearer is preparing to ingest; (3) determining, by at least one processor, a quantity of the food that the wearer is preparing to ingest; (4) comparing, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan; (5) at least partially in response to comparing the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined dietary plan, calculating, by at least one processor, one or more recommendations to assist the wearer in complying with the predetermined dietary plan; and (6) notifying, by at least one processor, the wearer of the one or more recommendations.
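Steps (4) through (6) of the method in [0013] boil down to a budget comparison followed by a message. A minimal sketch, assuming a simple daily calorie budget stands in for the "predetermined dietary plan" (the disclosure does not fix any particular rule):

```python
def recommend(food_kcal: float, eaten_today_kcal: float,
              daily_budget_kcal: float = 2000) -> str:
    """Compare a prospective food against the remaining daily budget."""
    remaining = daily_budget_kcal - eaten_today_kcal
    if food_kcal <= remaining:
        return f"Within plan: {remaining - food_kcal:.0f} kcal remaining after this food."
    overage = food_kcal - remaining
    return (f"Over plan by {overage:.0f} kcal; consider a smaller portion "
            f"or additional activity to offset it.")

print(recommend(food_kcal=650, eaten_today_kcal=1600))
# -> "Over plan by 250 kcal; consider a smaller portion or additional activity to offset it."
```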
[0014] A wearable health monitoring device comprising at least one processor; and one or more sensors, wherein the wearable health monitoring device is configured for (1) receiving, from the one or more sensors, substantially current health information for a user; (2) receiving information associated with one or more items taken orally by the user; (3) providing the substantially current health information and the information associated with one or more items taken orally by the user to a person selected from: (a) the user; and (b) a healthcare professional. In various embodiments, the wearable health monitoring device is selected from a group consisting of a pair of eyewear; a set of clothing; and a wristwatch. In some embodiments, the one or more items taken orally comprise one or more food items and may comprise one or more medications.
[0015] In various embodiments, the one or more sensors comprise a heart rate monitor; and the substantially current health information comprises a heart rate of the user.
In other embodiments, the one or more sensors comprise one or more accelerometers; and the substantially current health information comprises information selected from a group consisting of (1) a number of steps taken by the user over a particular period of time;
(2) a posture of the user; and (3) an activity level of the user. In other embodiments, the wearable health monitoring device is further configured for determining, based at least in part on the substantially current health information, one or more health risks posed to the user.
[0016] A wearable health monitoring device comprises (1) at least one processor; (2) one or more cameras; and (3) one or more sensors. The wearable health monitoring device is configured for (1) determining, by the one or more sensors, one or more vital signs of a user; (2) determining one or more exercises performed by the user; (3) determining diet information for the user; (4) determining sleep pattern information for the user; (5) determining eye health for the user; and (6) providing information associated with the one or more vital signs of the user, the one or more exercises performed by the user, the diet information for the user, the sleep pattern information for the user, and the eye health for the user to a person selected from a group consisting of the user; and a healthcare professional. In certain embodiments, the one or more sensors are selected from a group consisting of one or more heart rate monitors; one or more thermometers; one or more accelerometers; and one or more blood pressure sensors.
[0017] In some embodiments, the one or more vital signs comprise a heart rate; a temperature; a blood pressure; and a respiratory rate. In certain embodiments, the sleep pattern information for the user comprises sleep cycle data. In various embodiments, the wearable health monitoring device is a pair of eyewear; the one or more cameras are disposed adjacent a front portion of the pair of eyewear facing substantially in a direction the user is viewing while wearing the pair of eyewear; and determining the diet information for the user comprises receiving a first image from the one or more cameras; identifying a first piece of food in the first image; and at least partially in response to identifying the first piece of food, determining nutritional information for the first piece of food.
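The diet-information flow in [0017] (capture a frame, identify the food, look up nutrition) can be sketched as a two-stage pipeline. The classifier below is a placeholder returning a fixed label, since the disclosure does not specify a recognition model; the nutrition table is likewise illustrative.

```python
NUTRITION = {"apple": {"kcal": 95, "carbs_g": 25}}

def classify_food(image_bytes: bytes) -> str:
    """Stand-in for an image classifier (e.g., a CNN run on-device or via an API)."""
    return "apple"  # fixed label for illustration only

def diet_info_from_frame(image_bytes: bytes) -> dict | None:
    """Identify a piece of food in a camera frame and return its nutritional information."""
    food = classify_food(image_bytes)
    return NUTRITION.get(food)

print(diet_info_from_frame(b"..."))
# -> {'kcal': 95, 'carbs_g': 25}
```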
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Various embodiments of systems and methods for assessing a user's activities and movements are described below. In the course of this description, reference will be made to the accompanying drawings, which are not necessarily drawn to scale and wherein:
[0019] Figure 1A is a block diagram of a Behavior Pattern Analysis System in accordance with an embodiment of the present system;
[0020] Figure 1B is a block diagram of an exemplary system for monitoring an individual's gait in accordance with an embodiment of the present system;
[0021] Figure 1C is a block diagram of a weight loss compliance system in accordance with an embodiment of the present system;
[0022] Figure 1D is a block diagram of a wearable health monitoring system in accordance with an embodiment of the present system;
[0023] Figure 2 is a block diagram of the computer for use in any one of Figures 1A-1C;
[0024] Figure 3 is a perspective view of computerized eyewear according to a particular embodiment for use with the systems shown in Figures 1A-1C;
[0025] Figure 4A is a flowchart that generally illustrates various steps executed by a Behavior Pattern Analysis Module according to a particular embodiment;
[0026] Figure 4B depicts a flowchart that generally illustrates a method of monitoring an individual's gait;
[0027] Figure 4C illustrates a flowchart that generally illustrates various steps executed by a Weight Loss Compliance Module according to a particular embodiment;
[0028] Figure 5 depicts a flowchart that generally illustrates various steps executed by a Health Data Acquisition Module according to a particular embodiment;
[0029] Figures 6A-6C depict a flowchart that generally illustrates various steps executed by an Actionable Data Collection Module according to a particular embodiment;
[0030] Figure 7 depicts a flowchart that generally illustrates various steps executed by a Decision Making Data Acquisition Module according to a particular embodiment;
and
[0031] Figure 8 depicts a flowchart that generally illustrates various steps executed by a Processing and Reporting Module according to a particular embodiment.
DETAILED DESCRIPTION OF SOME EMBODIMENTS
[0032] Various embodiments will now be described more fully hereinafter with reference to the accompanying drawings. It should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Like numbers refer to like elements throughout.
Behavior Pattern Analysis System Overview
[0033] A wearable health monitoring system, according to various embodiments, may include a suitable wearable device that is configured to monitor one or more movements, activities, and/or health attributes of a wearer (e.g., user). Suitable wearable devices may include, for example: (1) a pair of eyewear (e.g., goggles or eyeglasses); (2) one or more contact lenses;
(3) a wristwatch;
(4) an article of clothing (e.g., a suitable shirt, pair of pants, undergarment, compression sleeve, etc.); (5) footwear; (6) a hat; (7) a helmet; (8) an adhesive strip or other tag that may be selectively attached to an individual or the individual's clothing; (9) a computing device that is embedded into a portion of an individual's body (e.g., under the individual's skin, or within a medical device, such as a pacemaker); (10) an orthopedic cast; or (11) any other suitable wearable item. In a particular example, a wearable health monitoring system embodied as a pair of eyewear may enable the system to monitor what an individual is sensing (e.g., touching, seeing, hearing, smelling, and/or tasting) based at least in part on a proximity of the eyewear to the wearer's sensory systems (e.g., skin, eyes, mouth, ears, nose) when worn by the wearer.
[0034] In various embodiments, the system comprises one or more sensors that are configured to determine one or more current physical attributes of the wearer (e.g., heart rate, brain wave activity, movement, body temperature, blood pressure, oxygen saturation level, etc.). The one or more sensors may include, for example: (1) one or more heart rate monitors;
(2) one or more electrocardiograms (EKG); (3) one or more electroencephalograms (EEG); (4) one or more pedometers; (5) one or more thermometers; (6) one or more transdermal transmitter sensors; (7) one or more front-facing cameras; (8) one or more eye-facing cameras; (9) one or more microphones; (10) one or more accelerometers; (11) one or more blood pressure sensors; (12) one or more pulse oximeters; (13) one or more respiratory rate sensors; (14) one or more blood alcohol concentration (BAC) sensors; (15) one or more near-field communication sensors; (16) one or more motion sensors; (17) one or more gyroscopes; (18) one or more geomagnetic sensors; (19) one or more global positioning system sensors; (20) one or more impact sensors;
and/or (21) any other suitable one or more sensors.
[0035] In particular embodiments, the system is configured to gather data, for example, using the one or more sensors, about the wearer (e.g., the wearer's body temperature, balance, heart rate, level of physical activity, diet (e.g., food recently eaten), compliance with a prescribed medical regimen (e.g., medications recently taken), position, movements (e.g., body movements, facial muscle movements), location, distance traveled, etc.). In various embodiments, the system is configured to, for example: (1) store the gathered data associated with the user; (2) provide the data to one or more medical professionals, for example, to aid in the diagnosis and/or treatment of the user; (3) use the data to predict one or more medical issues with the user (e.g., the illness or death of the user); and/or (4) take any other suitable action based at least in part on the gathered data.
[0036] In a particular implementation, the system's wearable device is a pair of computerized eyewear that comprises one or more sensors for monitoring one or more day-to-day activities of an elderly individual as they "age in place" (e.g., as they live in a non-institutional setting). In particular embodiments, the one or more sensors are coupled to (e.g., connected to, embedded in, etc.) the pair of glasses, which may be, for example, a pair of computerized or non-computerized eyeglasses. In particular embodiments, the individual is a senior citizen who lives at least substantially independently.
[0037] In particular embodiments, the wearable computing device comprises one or more location sensors (e.g., geomagnetic sensors, etc.), motion sensors (e.g., accelerometers, gyroscopes, magnetic sensors, pressure sensors, etc.), and/or impact sensors that are adapted to sense the movement and location of the individual. In various embodiments, the wearable device is adapted to facilitate the transmission of this movement information to a remote computing device (e.g., a handheld computing device, an automated dialing device, a central server, or any other suitable smart device that may, in various embodiments, contain a wireless communications device that can connect to the wearable computing device) that analyzes the information to determine whether the individual's movement patterns are consistent with the individual's typical (e.g., past) movement patterns. If the movement patterns are inconsistent with the individual's typical movement patterns, the system may, for example, generate and transmit an alert to a third party (e.g., a physician, relative of the individual, other caretaker, police, etc.) informing the third party of the irregularities in the individual's movement. The third party may then, for example, check on the individual to make sure that the individual does not require assistance.
[0038] In further embodiments, the wearable device may be adapted, for example, to monitor:
(1) an individual's compliance with a prescribed treatment plan (e.g., compliance with a medication schedule); (2) an individual's compliance with a diet; and/or (3) whether an individual leaves a prescribed area defined by a geo-fence (e.g., a virtual fence). The system may do this, for example, by using any suitable sensors (e.g., location sensors, cameras, etc.) associated with the wearable device.
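For item (3), a geo-fence test can be as simple as a great-circle distance check against a fixed center and radius. The following sketch uses the standard haversine formula; the fence coordinates and radius are illustrative values, not parameters from the disclosure.

```python
import math

FENCE_CENTER = (33.7490, -84.3880)  # illustrative home coordinates (lat, lon)
FENCE_RADIUS_M = 500.0              # illustrative fence radius in meters

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def outside_fence(lat: float, lon: float) -> bool:
    """True if a GPS reading from the wearable falls outside the prescribed area."""
    return haversine_m(lat, lon, *FENCE_CENTER) > FENCE_RADIUS_M

print(outside_fence(33.7600, -84.3880))  # ~1.2 km north of center -> True
```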
Wearable Gait Monitoring System Overview
[0039] A system, according to various embodiments, includes eyewear (or any other suitable wearable device) that includes one or more sensors (e.g., one or more heart rate monitors, one or more electrocardiograms (EKG), one or more electroencephalograms (EEG), one or more pedometers, one or more thermometers, one or more transdermal sensors, one or more front-facing cameras, one or more eye-facing cameras, one or more microphones, one or more accelerometers, one or more blood pressure sensors, one or more pulse oximeters, one or more respiratory rate sensors, one or more blood alcohol concentration (BAC) sensors, one or more motion sensors, one or more gyroscopes, one or more geomagnetic sensors, one or more global positioning system sensors, one or more impact sensors, or any other suitable one or more sensors) that may be used to monitor the gait of an individual. The system may further include one or more suitable computing devices for analyzing the individual's gait.
This information may then be used, for example, to: (1) identify one or more medical conditions associated with the individual; (2) assess the fit of a prosthetic device worn by the individual; and/or (3) assess an individual's recovery from a particular injury or medical procedure.
System for Monitoring an Individual's Compliance with a Weight Loss Plan Overview
[0040] A weight loss compliance system, according to various embodiments, may include a suitable wearable device that is configured to monitor one or more of a wearer's (e.g., a user's) food consumption, physical activities, sleep patterns, and/or compliance with a medicine regime (e.g., weight loss medicine/supplement). In particular embodiments, the system is configured to gather data, for example, using one or more sensors, about the wearer (e.g., the wearer's body temperature, heart rate, level of physical activity, geographic location, distance traveled, diet (e.g., food consumed, calories for each meal, total calories, etc.), compliance with a prescribed medical regimen (e.g., medications recently taken, dose, side effects), sleeping patterns (e.g., hours slept, type of sleep, quality of sleep), etc.). In various embodiments, the system is configured to, for example: (1) identify a food and quantity of the food that the wearer is preparing to ingest; (2) track the identity and the quantity of food (e.g., the nutritional value, the calories associated with the food, the date and time the food is ingested, etc.); (3) compare the identity and quantity of the food to a predetermined weight loss plan (e.g., a dietary plan); (4) calculate one or more recommendations to assist the wearer in complying with the predetermined weight loss plan; and/or (5) notify the wearer of the one or more recommendations.
[0041] In particular embodiments, the wearable device is adapted to facilitate the transmission of the data captured by the one or more sensors to a remote computing device (e.g., a handheld computing device, a central server, or any other suitable smart device that may, in various embodiments, contain a wireless communications device that can connect to the wearable computing device) that analyzes the information to determine whether the wearer is complying with the predetermined weight loss plan (e.g., a dietary plan, an exercise plan, a medicine plan, a sleep plan, or a combination of one or more of the preceding). As the system tracks the wearer's food intake (e.g., calories, nutritional values, etc.), physical activities, sleep patterns, and/or medicine intake, the system may provide the wearer feedback, warnings, and/or recommendations on (1) how to better comply with the weight loss plan (e.g., eat more green vegetables or fruit, don't eat fast food, eat smaller portions, etc.), (2) how a particular food or quantity will impact the weight loss plan (e.g., if you eat this, you must perform a particular physical activity to offset the calories, etc.), and/or (3) how a medicine can impact the wearer's ability to lose weight (e.g., a particular side effect of the medicine is weight gain/loss, etc.).
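A minimal sketch of the feedback step described above might compare logged and pending food intake against a daily plan and suggest an offsetting activity. The calorie budget, the 5 kcal-per-minute walking estimate, and all names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MealEntry:
    food: str
    calories: int

# Illustrative daily plan; a real plan would come from the dietary database.
DAILY_CALORIE_BUDGET = 1800

def compliance_feedback(entries, pending_food=None, pending_calories=0):
    """Compare logged intake to the plan and suggest an offsetting action."""
    consumed = sum(e.calories for e in entries)
    projected = consumed + pending_calories
    if projected <= DAILY_CALORIE_BUDGET:
        return f"{projected} of {DAILY_CALORIE_BUDGET} kcal used; on track."
    excess = projected - DAILY_CALORIE_BUDGET
    walk_min = round(excess / 5)  # rough assumption: ~5 kcal burned per minute walking
    item = pending_food or "this meal"
    return (f"Eating {item} puts you {excess} kcal over plan; "
            f"about {walk_min} minutes of brisk walking would offset it.")

log = [MealEntry("oatmeal", 350), MealEntry("chicken salad", 550)]
print(compliance_feedback(log, pending_food="cheeseburger", pending_calories=950))
```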
[0042] Suitable wearable devices may include, for example: (1) a pair of eyewear (e.g., goggles or eyeglasses); (2) one or more contact lenses; (3) a wristwatch; (4) an article of clothing (e.g., such as a suitable shirt, pair of pants, undergarment, compression sleeve, etc.); (5) footwear; (6) a hat; (7) a helmet; (8) an adhesive strip or other tag that may be selectively attached to an individual or the individual's clothing; (9) a computing device that is embedded into a portion of an individual's body (e.g., under the individual's skin, or within a medical device, such as a pacemaker); (10) an orthopedic cast; or (11) any other suitable wearable item.
[0043] In various embodiments, the wearable device comprises one or more sensors that are configured to sense the wearer's food intake, physical activities, sleep patterns, and medicine intake (e.g., prescription weight loss drugs, over-the-counter weight loss drugs, nutritional supplements, prescription drugs, etc.). The one or more sensors may include, for example: (1) one or more forward facing cameras; (2) one or more global positioning sensors; (3) one or more olfactory sensors; (4) one or more pedometers; (5) one or more microphones; (6) one or more accelerometers; (7) one or more blood pressure sensors; (8) one or more pulse oximeters;
(9) one or more respiratory rate sensors; (10) one or more near-field communication chips; (11) one or more gyroscopes; (12) one or more geomagnetic sensors; (13) one or more global positioning system chips; and/or (14) any other suitable one or more sensors.
[0044] In a particular implementation, the system's wearable device includes a pair of computerized eyewear that comprises the one or more sensors for monitoring the wearer's compliance with a weight loss plan. In particular embodiments, the one or more sensors are coupled to (e.g., connected to, embedded in, etc.) the pair of glasses, which may be, for example, a pair of computerized or non-computerized eyeglasses. In particular embodiments, the wearer is a person trying to lose weight, a person trying to become more physically active, a person under a doctor's orders to lose weight for medical reasons, or any other person trying to live a more active and healthy life.

Health Monitoring System Overview
[0045] A wearable health monitoring system, in various embodiments, may, for example, be embodied in any suitable wearable device configured to monitor one or more health attributes of a wearer. The system may, for example, be embodied as a pair of eyewear, as contact lenses, as a wristwatch, as a suitable piece of clothing (e.g., such as a suitable shirt, pair of pants, undergarment, compression sleeve, etc.), as footwear, as a hat, as an orthopedic cast, or as any other suitable wearable item. In a particular example, a wearable health monitoring system embodied as a pair of eyewear may enable the system to access all five of a user's senses (e.g., touch, sight, sound, smell, and taste) based at least in part on a proximity of the eyewear to the user's sensory systems (e.g., eyes, mouth, ears, nose) when worn by the user.
[0046] In various embodiments, the system comprises one or more sensors configured to determine one or more attributes of the wearer. The one or more sensors may include, for example, one or more heart rate monitors, one or more pedometers, one or more thermometers, one or more cameras, one or more microphones, one or more accelerometers, one or more blood pressure sensors or any other suitable one or more sensors. In particular embodiments, the system is configured to gather data, for example, using the one or more sensors, about the user (e.g., such as temperature, balance, heart rate, activity, activity levels, food eaten, medications taken, steps taken, position, etc.). In various embodiments, the system is configured to, for example: (1) store the gathered data associated with the user; (2) provide the data to one or more medical professionals, for example, to aid in the diagnosis and/or treatment of the user; (3) use the data to predict one or more medical issues with the user; and/or (4) take any other suitable action based at least in part on the gathered data.
Exemplary Technical Platforms
[0047] As will be appreciated by one skilled in the relevant field, the present systems and methods may be, for example, embodied as a computer system, a method, or a computer program product. Accordingly, various embodiments may be entirely hardware or a combination of hardware and software. Furthermore, particular embodiments may take the form of a computer program product stored on a computer-readable storage medium having computer-readable instructions (e.g., software) embodied in the storage medium. Various embodiments may also take the form of Internet-implemented computer software. Any suitable computer-readable storage medium may be utilized including, for example, hard disks, compact disks, DVDs, optical storage devices, and/or magnetic storage devices.
[0048] Various embodiments are described below with reference to block diagram and flowchart illustrations of methods, apparatuses (e.g., systems), and computer program products. It should be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by a computer executing computer program instructions. These computer program instructions may be loaded onto a general purpose computer, a special purpose computer, or other programmable data processing apparatus that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the functions specified in the flowchart block or blocks.
[0049] The computer instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on a user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including but not limited to: (1) a local area network (LAN); (2) a wide area network (WAN);
(3) a cellular network; or (4) a connection made to an external computer (for example, through the Internet using an Internet Service Provider).
[0050] These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process (e.g., method) such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.

Example System Architecture - Behavior Pattern Analysis System
[0051] Figure 1A is a block diagram of a behavior pattern analysis system 100A according to particular embodiments. As may be understood from this figure, the behavior pattern analysis system 100A includes one or more networks 115A, one or more third party servers 50A, a pattern analysis server 120A that includes a behavior pattern analysis module 400A, a movement information database 140A, one or more remote computing devices 154A (e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, a desktop computer, a Bluetooth device, an automated dialing apparatus, etc.), and one or more wearable health monitoring devices 156A, which may, for example, be embodied as one or more of eyewear, headwear, clothing, a watch, a hat, a helmet, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, bracelet, etc.), or any other suitable wearable device. In particular embodiments, the one or more computer networks 115A facilitate communication between the one or more third party servers 50A, the pattern analysis server 120A, the movement information database 140A, the one or more remote computing devices 154A, and the one or more health monitoring devices 156A.
[0052] The one or more networks 115A may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computing devices). The communication link between the one or more remote computing devices 154A
and the pattern analysis server 120A may be, for example, implemented via a local area network (LAN) or via the Internet.
Example System Architecture - Wearable Gait Monitoring System
[0053] Figure 1B is a block diagram of a wearable gait monitoring system 100B according to a particular embodiment. As may be understood from this figure, the wearable gait monitoring system 100B includes one or more computer networks 115B, one or more third party servers 50B, a gait server 120B, a database 130B, one or more remote computing devices 110B
(e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, etc.), and one or more wearable gait monitoring device(s) 156B, which may, for example, be embodied as eyewear (e.g., glasses or goggles), clothing, a watch, a hat, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, etc.), and/or any other suitable wearable device.
[0054] In various embodiments, the one or more wearable gait monitoring device(s) 156B may further comprise at least one processor and one or more sensors (e.g., an accelerometer, a magnetometer, a gyroscope, a front-facing camera, a location sensor such as a GPS unit, etc.). In particular embodiments, the system is configured to gather data, for example, using the one or more sensors, regarding the user's gait as the user walks or runs (e.g., the user's stride cadence, the user's speed (e.g., the speed of the user's feet and/or body), the orientation of the user (e.g., the orientation of the user's body and/or feet), the elevation of the user's respective feet from the ground, the movement of the user's head such as bobbing, etc.).
[0055] In various embodiments, the database is configured to store information regarding gait patterns associated with various predetermined medical conditions. The system is configured to store information regarding normal gait patterns for a particular individual or for individuals who are similar in physical stature to the particular individual. In various embodiments, the database stores past information regarding an individual's gait and may include recent gait measurements for the individual, which may, for example, be used to track the individual's progress in improving their gait (e.g., after an injury or a medical procedure).
[0056] The one or more computer networks 115B may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computers). The communication link between the wearable gait monitoring system 100B and the database 130B
may be, for example, implemented via a local area network (LAN) or via the Internet. In particular embodiments, the one or more computer networks 115B facilitate communication between the one or more third party servers 50B, the gait server 120B, the database 130B, and the one or more remote computing devices 110B. In various embodiments, the handheld device 110B is configured to communicate with the wearable gait monitoring device 156B via, for example, Bluetooth. In various other embodiments, the wearable gait monitoring device 156B may communicate with a remote server, for example, the gait server 120B, via a cellular communication or wireless Internet connection. In yet other embodiments, the system may be further configured to allow the wearable gait monitoring device 156B to communicate with the remote server (e.g., the gait server 120B) without the intermediary handheld device 110B.
Example System Architecture - Systems for Monitoring Compliance with a Weight Loss Plan
[0057] Figure 1C is a block diagram of a weight loss compliance system 100C according to particular embodiments. As may be understood from this figure, the weight loss compliance system 100C includes one or more networks 115C, one or more third party servers 50C, a compliance server 120C that includes a weight loss compliance module 400C, a dietary information database 140C, an exercise information database 142C, a medicine database 144C, a sleep information database 146C, one or more remote computing devices 154C
(e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, a desktop computer, a Bluetooth device, an automated dialing apparatus, etc.), and one or more health monitoring devices 156C, which may, for example, be embodied as one or more of eyewear, headwear, clothing, a watch, a hat, a helmet, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, bracelet, etc.), or any other suitable wearable device. In particular embodiments, the one or more computer networks 115C facilitate communication between the one or more third party servers 50C, the compliance server 120C, the dietary information database 140C, the exercise information database 142C, the medicine database 144C, the sleep information database 146C, the one or more remote computing devices 154C, and/or the one or more health monitoring devices 156C.
[0058] The one or more networks 115C may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computing devices). The communication link between the one or more remote computing devices 154C and the compliance server 120C may be, for example, implemented via a Local Area Network (LAN) or via the Internet.
Example System Architecture - Health Monitoring System
[0059] Figure 1D is a block diagram of a wearable health monitor system 100D according to a particular embodiment. As may be understood from this figure, the wearable health monitor system 100D includes one or more computer networks 115D, one or more third party servers 50D, a health server 120D, a database 140D, one or more remote computing devices 154D (e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, etc.), and one or more wearable health monitoring devices 156D, which may, for example, be embodied as eyewear, clothing, a watch, a hat, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, etc.), or any other suitable wearable device. In various embodiments, the one or more wearable health monitoring devices 156D comprise at least one processor and one or more sensors. In particular embodiments, the one or more computer networks 115D facilitate communication between the one or more third party servers 50D, the health server 120D, the database 140D, and the one or more remote computing devices 154D.
[0060] The one or more computer networks 115D may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computers). The communication link between the health server 120D and the database 140D may be, for example, implemented via a Local Area Network (LAN) or via the Internet.
[0061] Figure 2 illustrates a diagrammatic representation of a computer architecture 200 for one or more of the pattern analysis server 120A, the gait server 120B, the compliance server 120C, or the health server 120D that may be used, respectively, within the behavior pattern analysis system 100A, the wearable gait monitoring system 100B, the weight loss compliance system 100C, or the health monitoring system 100D. It should be understood that the computer architecture 200 shown in Figure 2 may also represent the computer architecture for any one of the one or more remote computing devices 154A, 154B, 154C, or 154D, the one or more third party servers 50A, 50B, 50C, or 50D, and the one or more health monitoring devices 156A, 156B, 156C, or 156D shown in Figures 1A, 1B, 1C, or 1D.
[0062] In particular embodiments, the computer 200 may be connected (e.g., networked) to other computing devices in a LAN, an intranet, an extranet, and/or the Internet as shown in Figures 1A-1D. As noted above, the computer 200 may operate in the capacity of a server or a client computing device in a client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment. The computer 200 may be a desktop personal computing device (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any other computing device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computing device. Further, while only a single computing device is illustrated, the term "computing device" shall also be interpreted to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[0063] An exemplary computer 200 includes a processing device 202, a main memory 204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 206 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 218, which communicate with each other via a bus 232.
[0064] The processing device 202 represents one or more general-purpose or specific processing devices such as a microprocessor, a central processing unit (CPU), or the like. More particularly, the processing device 202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device 202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 202 may be configured to execute processing logic 226 for performing various operations and steps discussed herein.
[0065] The computer 200 may further include a network interface device 208. The computer 200 may also include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alpha-numeric input device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a mouse), and a signal generation device 216 (e.g., a speaker).
[0066] The data storage device 218 may include a non-transitory computing device-accessible storage medium 230 (also known as a non-transitory computing device-readable storage medium or a non-transitory computing device-readable medium) on which is stored one or more sets of instructions (e.g., the Movement Pattern Analysis Module 400A, the Gait Module 400B, the Weight Loss Compliance Module 400C, the Health Data Acquisition Module 500, the Actionable Data Collection Module 600, the Decision Making Data Acquisition Module 700 or the Processing and Reporting Module 800) embodying any one or more of the methodologies or functions described herein. The one or more sets of instructions may also reside, completely or at least partially, within the main memory 204 and/or within the processing device 202 during execution thereof by the computer 200, with the main memory 204 and the processing device 202 also constituting computing device-accessible storage media. The one or more sets of instructions may further be transmitted or received over a network 115A, 115B, 115C, or 115D via a network interface device 208.
[0067] While the computing device-accessible storage medium 230 is shown in an exemplary embodiment to be a single medium, the term "computing device-accessible storage medium" should be understood to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computing device-accessible storage medium" should also be understood to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing device that cause the computing device to perform any one or more of the methodologies of the present invention. The term "computing device-accessible storage medium" should accordingly be understood to include, but not be limited to, solid-state memories, optical and magnetic media, etc.
Structure of the Eyewear
[0068] As shown in Figure 3, the one or more wearable health monitoring devices 156A, the one or more wearable gait monitoring device 156B, the one or more health monitoring devices 156C, or the one or more health monitoring device 156D may be configured as eyewear 300, which according to various embodiments, includes: (1) an eyewear frame 310; (2) a first temple 312;
and (3) a second temple 314, as described below.
Eyewear Frame
[0069] Referring still to Figure 3, eyewear 300, in various embodiments, includes any suitable eyewear frame 310 configured to support one or more lenses 318, 320. In the embodiment shown in this figure, the eyewear frame 310 has a first end 302 and a second end 304. The eyewear frame 310 may be made of any suitable material such as metal, ceramic, polymers or any combination thereof. In particular embodiments, the eyewear frame 310 is configured to support the first and second lenses 318, 320 about the full perimeter of the first and second lenses 318, 320. In other embodiments, the eyewear frame 310 may be configured to support the first and second lenses 318, 320 about only a portion of each respective lens. In various embodiments, the eyewear frame 310 is configured to support a number of lenses other than two lenses (e.g., a single lens, a plurality of lenses, etc.). In particular embodiments, the lenses 318, 320 may include prescription lenses, sunglass lenses, or any other suitable type of lens (e.g., reading lenses, non-prescription lenses), which may be formed from glass or polymers.
[0070] The eyewear frame 310 includes first and second nose pads 322, 324 (the first nose pad 322 not shown in the figures), which may be configured to maintain the eyewear 300 adjacent the front of a wearer's face such that the lenses 318, 320 are positioned substantially in front of the wearer's eyes while the wearer is wearing the eyewear 300. In particular embodiments, the nose pads 322, 324 may comprise a material that is configured to be comfortable when worn by the wearer (e.g., rubber, etc.). In other embodiments, the nose pads may include any other suitable material (e.g., plastic, metal, etc.). In still other embodiments, the nose pads may be integrally formed with the frame 310.
[0071] The eyewear frame 310 includes a first and second hinge 326, 328 that attach the first and second temples 312, 314 to the frame first and second ends 302, 304, respectively. In various embodiments, the hinges may be formed by any suitable connection (e.g., tongue and groove, ball and socket, spring hinge, etc.). In particular embodiments, the first hinge 326 may be welded to, or integrally formed with, the frame 310 and the first temple 312 and the second hinge 328 may be welded to, or integrally formed with, the frame 310 and the second temple 314.
First and Second Temples
[0072] Still referring to Figure 3, the first temple 312, according to various embodiments, is rotatably connected to the frame 310 so that it can extend from a position substantially perpendicular to the frame 310 to a position substantially parallel to the frame 310, or to any position in between. The first temple 312 has a first and second end 312a, 312b. Proximate the first temple second end 312b, the first temple 312 includes an earpiece 313 configured to be supported by a wearer's ear. Similarly, the second temple 314, according to various embodiments, is rotatably connected to the frame 310 so that it can extend from a position substantially perpendicular to the frame 310 to a position substantially parallel to the frame 310, or to any position in between. The second temple 314 has a first and second end 314a, 314b. Proximate the second temple second end 314b, the second temple 314 includes an earpiece 315 configured to be supported by a wearer's ear.
Sensors
[0073] In various embodiments, the second temple 314 has one or more sensors 330 connected to the second temple 314. In various embodiments, the one or more sensors 330 may be coupled to the frame 310, the first and second temples 312, 314, the first and second lenses 318, 320, the nose piece 324, or any other portion of the eyewear 300 in any suitable way. For instance, the one or more sensors 330 may be embedded into the eyewear 300, coupled to the eyewear 300, and/or operatively coupled to the eyewear 300. In various embodiments, the one or more sensors may be formed at any point along the eyewear 300. For instance, a fingerprint reader may be disposed adjacent the first temple of the eyewear 300. In various embodiments, the one or more sensors may be formed in any shape. In addition, the one or more sensors may be formed on the inner (back) surface of the frame 310, the first and second temples 312, 314, the first and second lenses 318, 320, or any other portion of the eyewear 300. In other embodiments, the one or more sensors may be formed on the outer (front) surface of the frame 310, the first and second temples 312, 314, the first and second lenses 318, 320, or any other portion of the eyewear 300.
[0074] In various embodiments, the one or more sensors 330 that are coupled to the eyewear (or other wearable device) are adapted to detect the identity of a food, the quantity of a food, the identity of a medicine, the dose of the medicine, one or more physical activities performed by the wearer, the length of the physical activity, etc. In various embodiments, the one or more sensors coupled to the eyewear or other health monitoring device may include, for example, one or more of the following: a near-field communication chip, a gyroscope, a Bluetooth chip, a GPS unit, an RFID tag (passive or active), a fingerprint reader, an iris reader, a retinal scanner, a voice recognition sensor, a forward facing camera, an olfactory sensor, a heart rate monitor, an electrocardiogram (EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a microphone, an accelerometer, a magnetometer, a blood pressure sensor, a pulse oximeter, a skin conductance response sensor, a blood sensor, any suitable biometric reader or any other suitable sensor. In particular embodiments, the sensors coupled to the eyewear may include one or more electronic communications devices such as a near field communication chip, a Bluetooth chip, a forward facing camera, a microphone and a GPS unit.
[0075] In various embodiments, the one or more sensors are coupled to a computing device that is associated with (e.g., embedded within, attached to) the eyewear or other wearable device. In particular embodiments, the eyewear or other wearable device comprises at least one processor, computer memory, suitable wireless communications components (e.g., a Bluetooth chip), and a power supply for powering the wearable device and/or the various sensors. In some such embodiments, the one or more sensors may be coupled to a Bluetooth device that is configured to transmit the one or more signals to a handheld wireless device.
[0076] In particular embodiments, the system is configured to receive input from a user (e.g., a wearer of the eyewear) via one or more gestures, for example, using at least one of the sensors described above. In various embodiments, the system may, for example, be configured to: (1) identify a gesture performed by the user; and (2) at least partially in response to identifying the gesture, perform a function associated with the gesture. In particular embodiments, the system may be configured to perform a particular function in response to identifying a particular gesture, where the particular gesture is associated with the particular function.
In particular embodiments, the system may be configured to enable the user to provide one or more gestures for performing a particular function. In such embodiments, the system may, for example: (1) receive a selection of a particular function from the user; (2) receive input of one or more gestures from the user; and (3) associate the particular function with the one or more gestures.
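The register-then-dispatch flow described in this paragraph might be sketched as follows; the gesture names and bound functions are illustrative placeholders, not part of the specification.

```python
# Minimal sketch of the register-then-dispatch flow described above.
gesture_bindings = {}

def associate(gesture: str, action) -> None:
    """Step (3): associate a user-selected function with a gesture."""
    gesture_bindings[gesture] = action

def on_gesture(gesture: str) -> None:
    """Perform the function bound to an identified gesture, if any."""
    action = gesture_bindings.get(gesture)
    if action:
        action()

associate("thumbs_up", lambda: print("logging meal as eaten"))
associate("head_shake", lambda: print("dismissing notification"))

on_gesture("thumbs_up")   # -> logging meal as eaten
on_gesture("wink")        # unbound gesture: ignored
```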
[0077] In various embodiments, the one or more gestures may include, for example: (1) one or more hand gestures (e.g., a thumbs up, a wave, two thumbs up, holding up any particular number of fingers, making one or more fists, performing a particular movement with one or more hands, etc.); (2) one or more head movements (e.g., shaking of the user's head, a nod, etc.); (3) one or more eye movements (e.g., looking in a particular direction for a particular period of time, a wink, blinking, blinking in a particular pattern, etc.); (4) one or more facial movements (e.g., a smile, a frown, sticking out of a tongue, etc.); and/or (5) any suitable combination of these or any other suitable gestures.
[0078] In particular embodiments, the system is configured to identify the one or more gestures, for example, using a suitable imaging device (e.g., camera) that is part of the system. In particular embodiments, the imaging device may be directed toward an area in front of the user while the user is wearing the eyewear 300 and configured to identify gestures performed by the user's hands, arms, feet, legs, etc. In other embodiments, the system may include an imaging device directed toward the user's face and/or eyes while the user is wearing the eyewear 300 that is configured to identify gestures performed by the user's face and/or eyes.
[0079] In other embodiments, the system comprises one or more gyroscopes and/or accelerometers configured to determine a position or change in position of the eyewear 300 while the user is wearing the eyewear. In such embodiments, the one or more gyroscopes and/or accelerometers are configured to identify one or more gestures performed by the user that include one or more gestures that include movement of the user's head. In still other embodiments, the system comprises one or more gyroscopes and/or one or more accelerometers, disposed on any other portion of the user's body, configured to identify any gesture performed by the user using the other portion of the user's body (e.g., arm, hand, leg, foot, etc.). In various embodiments, the system comprises any other suitable sensor for identifying one or more gestures performed by the user.
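As one hedged example of identifying a head gesture from gyroscope data, the sketch below flags a nod when a downward pitch-rate spike is followed by an upward one. The threshold and the simulated samples are illustrative assumptions; a production detector would filter and debounce the signal.

```python
def detect_nod(pitch_rates, threshold=60.0):
    """Very rough nod detector: a nod appears as a downward pitch-rate spike
    followed by an upward one in the gyroscope stream (degrees per second)."""
    saw_down = False
    for rate in pitch_rates:
        if rate < -threshold:
            saw_down = True
        elif saw_down and rate > threshold:
            return True
    return False

# Simulated gyroscope samples for a single nod.
samples = [2, -5, -80, -40, 10, 75, 20, 1]
print(detect_nod(samples))  # -> True
```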
[0080] In particular embodiments, one or more of the sensors may be detachable from the eyewear. For instance, if a wearer does not need a temperature sensor or other particular sensor, the sensor may be removed from the eyewear. In still other embodiments, some of the one or more sensors may be coupled to the handheld wireless device, while others of the one or more sensors may be coupled to the wearable device.
More Detailed Description of System Functionality
[0081] Various embodiments of systems for monitoring the behavior patterns of an individual, for monitoring the gait of an individual, and for monitoring an individual's compliance with a weight loss program are described below and may be implemented in any suitable context.
Behavior Pattern Analysis Functionality
[0082] As noted above, a system, according to various embodiments, is adapted to monitor one or more patterns of behavior and/or one or more locations of a user of a wearable device.
Various aspects of the system's functionality may be executed by certain system modules, including the behavior pattern analysis module 400A. The behavior pattern analysis module 400A is discussed in greater detail below.
[0083] Figure 4A is a flow chart of operations performed by an exemplary Behavior Pattern Analysis Module 400A, which may, for example, run on the pattern analysis server 120A, or any suitable computing device (such as the one or more health monitoring devices 156A, a handheld computing device coupled to communicate with the one or more health monitoring devices 156A
or a suitable mobile computing device). In particular embodiments, the behavior pattern analysis module 400A may assess a user's behavior and determine the user's location to provide this information to the user or to a third party.
[0084] The system begins, in various embodiments, at Step 405A by providing a user with computerized eyewear comprising at least one sensor for monitoring one or more behaviors of the user and/or any suitable attributes of the user. In various embodiments, the at least one sensor may include a location sensor (e.g., a GPS unit), an accelerometer, a heart rate monitor, an electrocardiogram (EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a front-facing camera, an eye-facing camera, a microphone, a blood pressure sensor, a pulse oximeter, a near-field communication sensor, a motion sensor, a gyroscope, a geomagnetic sensor, an impact sensor, and/or any other suitable sensor. In particular embodiments, the computerized eyewear further comprises: a motion sensor, an accelerometer, a GPS unit, a gyroscope, and/or a front-facing camera.
[0085] In particular embodiments, the sensors may be coupled to the eyewear in any suitable way. For example, in various embodiments, the sensors may be physically embedded into, or otherwise coupled to the eyewear. In some embodiments, the sensors may be positioned: (1) along the brow bar of the eyewear; (2) along the temples of the eyewear; (3) adjacent the lenses of the eyewear; and/or (4) in any other suitable location.
[0086] In particular embodiments: (1) the sensors are coupled to a wireless (e.g., Bluetooth, near-field communications, Wi-Fi, etc.) device that is configured to transmit one or more signals obtained from the one or more sensors to a handheld wireless device (e.g., a smartphone, a tablet, an automated dialing device, etc.); and (2) the step of receiving one or more signals from the one or more sensors further comprises receiving the one or more signals from the wireless handheld device via the Internet. In particular embodiments, one or more of the sensors may be selectively detachable from the eyewear, or other wearable device. For example, if a user does not need the temperature sensor, the temperature sensor may be selectively removed from the eyewear and stored for future use.
[0087] At Step 410A, the system receives data generated by the at least one sensor. In particular embodiments, the data generated by the at least one sensor may include data for a heart rate, a heart rhythm or electrical activity, a brain wave activity, a distance traveled, a temperature, an image, a sound, a speed traveled, a blood pressure, an oxygen saturation level, a near-field communication, a motion, an orientation, a geomagnetic field, a global position, an impact, a medicine regime, or any other suitable data.
[0088] In various embodiments, the system may receive the data substantially automatically after the sensor generates the data. In some embodiments, the system may receive the data periodically (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may receive the data every thirty seconds throughout the day. In other embodiments, the system may receive the data after receiving an indication from the user or a third party that the system should receive the data. For instance, the user may speak a voice command to the wearable device requesting that the device track the user's steps taken. In various embodiments, the system may receive an indication from the user or a third party of when to have the system receive the data. For example, the system may receive an indication from the third party to have the system receive global positioning data at 8:00 a.m. and at 2:00 p.m.
[0089] In particular embodiments, the system may receive an indication from the user or a third party to have particular data received from a particular sensor at the same time that the system receives second particular data from a second particular sensor. For example, when the system receives data that indicates that the user's speed has increased, the system may at least partially in response to receiving the increased speed data, also obtain global position data of the user. In particular embodiments, the system may receive behavior data during a predefined time period.
For instance, the system may receive behavior data for the user during a predefined time period when the user should not be moving (e.g., 11:00 p.m. through 7:00 a.m. because the user should be sleeping). In various embodiments, the system may receive the data when a sensor detects movement of the user. For example, the system may receive data from the global positioning system sensor when the accelerometer or the gyroscope detects movement of the user.
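A minimal sketch of this kind of conditional sampling, assuming an overnight monitoring window and a simple acceleration threshold, might look like the following; all specific values are illustrative.

```python
import datetime

def should_sample_gps(now: datetime.datetime, accel_magnitude: float,
                      quiet_start=datetime.time(23, 0),
                      quiet_end=datetime.time(7, 0),
                      movement_threshold=1.2) -> bool:
    """Sample location only when motion is detected during the window in
    which the wearer is expected to be asleep. All values are illustrative."""
    t = now.time()
    in_quiet_window = t >= quiet_start or t < quiet_end  # window spans midnight
    moving = accel_magnitude > movement_threshold  # in g; rest reads ~1 g
    return in_quiet_window and moving

now = datetime.datetime(2015, 9, 4, 2, 30)
print(should_sample_gps(now, accel_magnitude=1.6))  # -> True: movement at 2:30 a.m.
```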
[0090] In other embodiments, the data generated by the at least one sensor may be whether the user experiences one of sudden acceleration and sudden impact. In still other embodiments, the data generated by the at least one sensor may be a heartbeat and whether the user is breathing. In yet other embodiments, the data generated by the at least one sensor may be a medicine regime associated with the user. For instance, the user or the user's physician may manually input a medicine regime into the system by stating the name of the medicine while the user or the user's physician requests that the front-facing camera capture an image of the medicine and the medicine bottle. In some embodiments, the received data generated by the at least one sensor may be one or more images captured by the forward facing camera. In other embodiments, the received data generated by the at least one sensor may be the level of one or more medicines in the user's bloodstream.
[0091] In various embodiments, the system may receive data from a single sensor. In other embodiments, the system may receive data from all of the sensors. In yet other embodiments, the system may receive multiple data from each of the sensors. In various embodiments, the system may be configured to receive first data from a first sensor at the same time that it receives second data from a second sensor. For example, the system may be configured to receive a global position from the global positioning system sensor at the same time that it receives impact data from the impact sensor.
[0092] In particular embodiments, the system may store the received data.
In various embodiments, the system may store the received data substantially automatically after receiving the data. In other embodiments, the system may store the received data after receiving manual input from the user or a third party requesting that the system store the data. In various embodiments, the system may store the received data for a specified period of time. For instance, the system may store the received data for a day, a month, a year, etc., in the behavior information database 140A. In some embodiments, the system may store the received data on any suitable server, database, or device. In other embodiments, the system may store the received data on the pattern analysis server 120A. In still other embodiments, the system may store the received data in an account associated with the user. In various embodiments, the system may store the received data with a timestamp of when the data was received.
[0093] At Step 415A, the system analyzes the data generated by the at least one sensor. In various embodiments, the system analyzes the data generated by the at least one sensor substantially automatically after receiving the generated data. In various embodiments, the system may analyze the data periodically (e.g., by the second, by the minute, hourly, daily, etc.).
For example, the system may analyze the data every thirty seconds throughout the day. In other embodiments, the system may analyze the data after receiving an indication from the user or a third party that the system should analyze data. For instance, the user may speak a voice command to the wearable device requesting that the device analyze the user's steps taken. In various embodiments, the system may receive an indication from the user or a third party of when to have the system analyze the data. For example, the system may receive an indication from the third party to have the system analyze global positioning data at 8:00 a.m. and at 2:00 p.m.
[0094] In other embodiments, the system may analyze the data to determine one or more of (1) the type of medicine taken by the user; (2) the time the medicine is taken by the user; and (3) the dose of medicine taken by the user. In still other embodiments, the step of analyzing the received data further comprises detecting one or more pills in the one or more images, comparing the one or more detected pills found in the one or more images to known images of pills stored in a database, identifying the one or more pills by matching the one or more pills from the one or more images to the known images of pills stored in the database, and detecting the time that the image was taken. In various embodiments, the system analyzes the level of the one or more medicines in the user's bloodstream.
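By way of illustration only, the matching step might reduce each known pill image to a feature vector and pick the nearest reference within a tolerance. The feature encoding, tolerance, and pill names below are hypothetical stand-ins for a real image-matching pipeline.

```python
import datetime

# Illustrative stand-in for the database of known pill images: each entry
# maps a pill to a pre-computed feature vector (color, shape, marking code).
known_pills = {
    "yellow_triangle_17": (0.9, 0.8, 0.1, 17.0),
    "white_round_5":      (1.0, 1.0, 1.0, 5.0),
}

def identify_pill(detected_features, tolerance=0.5):
    """Match a detected pill's feature vector against stored references.
    A real system would use an actual image-matching pipeline; this just
    picks the nearest reference within a tolerance."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(detected_features, ref)) ** 0.5
    best = min(known_pills, key=lambda name: dist(known_pills[name]))
    return best if dist(known_pills[best]) <= tolerance else None

features_from_camera = (0.88, 0.79, 0.12, 17.0)  # hypothetical detection
taken_at = datetime.datetime.now()               # time the image was taken
print(identify_pill(features_from_camera), taken_at.isoformat())
```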
[0095] Then, at Step 420A, the system determines the user's current movements using the received data in order to generate one or more movement patterns for the user.
In various embodiments, the system determines the user's current movements substantially automatically after receiving the data. In various embodiments, the system may determine the user's current movements periodically (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may determine the user's current movements every thirty seconds throughout the day.
In other embodiments, the system may determine the user's current movements after receiving an indication from the user or a third party that the system should analyze data. For instance, the user may speak a voice command to the wearable device requesting that the device analyze the user's steps taken. In various embodiments, the system may receive an indication from the user or a third party of when to have the system analyze the data. For example, the system may receive an indication from the third party to have the system analyze global positioning data at 8:00 a.m. and at 2:00 p.m.
[0096] In various embodiments, the system determines the user's current movements by calculating the number of steps taken by the user in a particular day. In some embodiments, the system determines the user's current movements by tracking the distance traveled by the user for a particular day. In other embodiments, the system determines the user's current movements by capturing a series of images from the front-facing camera throughout the day.
In still other embodiments, the system determines the user's current movements by tracking the orientation of the user using the gyroscope. In particular embodiments, the current movements of the user may include actions such as lying down, falling, wandering, sitting, standing, walking, running, convulsing, shaking, balancing, etc.
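A naive version of the step-counting logic mentioned above can be sketched as a threshold-crossing counter over accelerometer magnitude samples; the threshold and samples below are illustrative assumptions.

```python
def count_steps(accel_magnitudes, threshold=1.3):
    """Naive step counter: count upward crossings of an acceleration
    threshold (in g). Real pedometers filter and debounce the signal."""
    steps, above = 0, False
    for a in accel_magnitudes:
        if a > threshold and not above:
            steps += 1
            above = True
        elif a <= threshold:
            above = False
    return steps

# Simulated magnitude samples for three steps.
samples = [1.0, 1.5, 1.1, 0.9, 1.6, 1.0, 1.4, 1.0]
print(count_steps(samples))  # -> 3
```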
[0097] In various embodiments, the user's current movements may include the user's current location. For example, the user's current location may be an address, a geographic area, an intersection, a bus stop, a building, or any other suitable definable location. In other embodiments, the user's current movements may help to indicate the user's current status (e.g., asleep, awake, conscious, unconscious, alive, deceased, stable, good, fair, serious, critical, injured, distressed, etc.). In some embodiments, the user's current behaviors may include compliance with prescribed treatment regimens. For instance, the user's current behaviors may include that the user has not been complying with a prescribed treatment regimen, as captured through front-facing camera images showing the user not taking a prescribed medicine.
[0098] In various embodiments, the system tracks current movements, current location, current status, and current compliance to generate one or more movement patterns, location patterns, status patterns, and compliance patterns. In some embodiments, the system generates the one or more patterns substantially automatically after the system determines the user's current movements, location, status, and compliance. In some embodiments, the system may generate the patterns periodically (e.g., by the second, by the minute, hourly, daily, weekly, monthly, etc.). For example, the system may generate a movement pattern for each month.
In other embodiments, the system may generate the pattern after receiving an indication from the user or a third party that the system should generate the pattern. For instance, the user may speak a voice command to the wearable device requesting that the device generate a pattern for the number of steps taken by the user for a typical day. In various embodiments, the system may receive an indication from the user or a third party of when to have the system generate the patterns. For example, the system may receive an indication from the third party to have the system generate a location pattern for the location of the user at 8:00 a.m.
and at 2:00 p.m. for the previous month.
[0099] In various embodiments, the movement patterns may include one or more typical movements made by the user. For example, the movement pattern may include that the user gets out of bed every morning. In particular embodiments, the location patterns may include one or more typical locations of the user. For instance, the location pattern may include that the user is at a first particular address in the morning, at a second particular address during the day, and at the first particular address at night. In some embodiments, the status patterns may include one or more typical statuses of the user. For example, the status pattern may include that the user is awake from 7:00 a.m. until 11:00 p.m. and asleep from 11:00 p.m. until 7:00 a.m. In other embodiments, the compliance patterns may include one or more typical compliance schedules of the user. For example, the compliance pattern may include that the user is prescribed a medicine that the user takes every day in the morning with food. In yet other embodiments, the medicine regime patterns may include one or more typical medicine regimes for the user.
For instance, the medicine regime pattern may include that the user takes a particular yellow pill, a particular white pill, and a particular pink pill in the evening with food. In various embodiments, the system may include one or more typical levels of one or more medicines in the user's bloodstream. For example, the typical level of a particular medicine in the user's bloodstream may be a certain volume at a particular period of time.
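As an illustrative sketch of how such typical patterns might be generated from historical sensor readings, the following collapses per-hour step counts into an hourly baseline; the sample data and the hour granularity are assumptions, not values from the specification.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical history: (hour_of_day, steps_in_that_hour) samples over weeks.
history = [(8, 420), (8, 380), (8, 450), (14, 900), (14, 850), (23, 10), (23, 0)]

def build_hourly_pattern(samples):
    """Collapse historical readings into a typical value for each hour."""
    by_hour = defaultdict(list)
    for hour, steps in samples:
        by_hour[hour].append(steps)
    return {hour: mean(vals) for hour, vals in by_hour.items()}

pattern = build_hourly_pattern(history)
print(pattern)  # e.g., {8: 416.67, 14: 875, 23: 5}
```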
[00100] In particular embodiments, the system may store the generated patterns in an account associated with the user. In some embodiments, the generated patterns may be accessible by the user or a third party. For instance, the generated patterns may be diagramed in a chart that is accessible from the wearable device or from a computing device by the user's physician. In various embodiments, the system may store the generated patterns in the behavior information database 140A. In particular embodiments, the system may store information in the behavior information database 140A regarding past movement patterns associated with the user (e.g., when the user goes into different rooms in the user's house, when the user eats, when the user takes a walk, the destinations along the walk, etc.). In some embodiments, the system may store information in the behavior information database 140A regarding the user's sleep patterns. In other embodiments, the system may store information in the behavior information database 140A
regarding geo-fences associated with the user. In still other embodiments, the system may store information in the behavior information database 140A regarding deviations to the user's typical behavior (e.g., movement) patterns.
[00101] At Step 425A, the system compares the user's behaviors (e.g., movements) to the previously established one or more patterns for the user. In some embodiments, the system compares the user's movement to the previously established one or more movement patterns for the user substantially automatically after the system receives the user's current movements. In some embodiments, the system may compare the user's movement to the previously established one or more movement patterns periodically (e.g., by the second, by the minute, hourly, daily, weekly, monthly, etc.). For example, the system may compare the user's current movement to the previously established one or more movement patterns every thirty minutes throughout the day. In other embodiments, the system may compare the user's movement to the previously established one or more movement patterns after receiving an indication from the user or a third party that the system should compare the user's movement to the previously established movement pattern. For instance, the user may speak a voice command to the wearable device requesting that the device compare the user's movements for the current day to a movement pattern established the previous month. In various embodiments, the system may receive an indication from the user or a third party of when to have the system compare the user's movements to the one or more patterns. For example, the system may receive an indication from the third party to have the system compare the user's current location to a location pattern for the location of the user at 8:00 a.m. and at 2:00 p.m. on a typical day.
[00102] In some embodiments, the system may compare the user's movements to a previously established movement pattern by calculating the number of steps taken by the user in the particular day to a predetermined average number of steps taken by the user in a day. In various embodiments, the system may compare the user's location to a previously established location pattern by determining the average location of the user at a particular time of day. In other embodiments, the system may compare the user's status to a previously established status pattern by determining the user's average status at particular times of day.
[00103] In still other embodiments, the system may compare the user's compliance with a prescribed treatment regime by determining the user's average compliance with the prescribed treatment regime for a particular day. In yet other embodiments, the system may compare the one or more of the type of medicine taken; the time the medicine is taken; and the dose of the medicine taken to the stored medicine regime for the user. In various embodiments, the system may compare the level of one or more medicines in the user's bloodstream by determining the average level of the one or more medicines in the user's bloodstream at particular times of day.
[00104] In particular embodiments, the system may store the comparisons in an account associated with the user. In some embodiments, the comparisons may be accessible by the user or a third party. For instance, the comparisons may be diagramed in a chart that is accessible from the wearable device or from a computing device by the user's physician.
[00105] Continuing to Step 430A, the system detects one or more inconsistencies between the user's current movements as compared to the previously established one or more patterns. In other embodiments, the system does not detect one or more inconsistencies between the user's current movements as compared to the previously established one or more patterns. In various embodiments, the system may detect the one or more inconsistencies by determining that the user's current movements are inconsistent with the previously established patterns. In particular embodiments, the user's current movements may be inconsistent with previously established patterns based on the current movements being different from the established patterns by a particular percentage. For instance, where the user's movement patterns establish that the user walks a total of one mile a day, the system may determine that the user's current movement of walking 1/2 mile for the day is inconsistent with the user's previously established pattern of walking one mile a day because there is a difference of 50%.
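The percentage-based inconsistency test in this paragraph translates directly into a short routine; the 50% threshold below mirrors the walking example above and would presumably be configurable in practice.

```python
def is_inconsistent(current, baseline, threshold_pct=50.0):
    """Flag a reading as inconsistent when it deviates from the established
    pattern by at least the given percentage."""
    if baseline == 0:
        return current != 0
    deviation_pct = abs(current - baseline) / baseline * 100.0
    return deviation_pct >= threshold_pct

# The example from the text: a 1-mile daily pattern vs. 0.5 miles walked today.
print(is_inconsistent(current=0.5, baseline=1.0))  # -> True (50% deviation)
print(is_inconsistent(current=0.9, baseline=1.0))  # -> False (10% deviation)
```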
[00106] In some embodiments, the user's current movements may be inconsistent with the previously established movement patterns based on the user's current movements not matching the previously established movement patterns. For instance, for the movement pattern that includes that the user gets out of bed every morning, where the system detects that the user does not get out of bed on a particular morning, the system may determine that the user's current movements are inconsistent with the previously established pattern.
[00107] In other embodiments, the user's current movements may be inconsistent with the previously established patterns based on the user's current location not matching the previously established location patterns. For example, for the location pattern that includes the user at a first particular address in the morning, at a second particular address during the day, and at the first particular address at night, where the system detects that the user was not at the second particular address during the day, the system may determine that the user's current movements are inconsistent with the previously established pattern.
[00108] In still other embodiments, the user's current movements may be inconsistent with the previously established patterns based on the user's current status not matching the previously established status patterns. For instance, for the status pattern that includes that the user is awake from 7:00 a.m. until 11:00 p.m. and asleep from 11:00 p.m. until 7:00 a.m., where the system detects that the user is asleep from 7:00 a.m. until 2:00 p.m., the system may determine that the user's current movements are inconsistent with the previously established pattern.
[00109] In yet other embodiments, the system may detect one or more inconsistencies between the medicine regime associated with the user and the determined one or more of the type of medicine taken by the user, the time the medicine is taken by the user, and the dose of medicine taken by the user. For instance, for a medicine regime that includes that the user takes a particular pill having a particular color (e.g., yellow), shape (e.g., triangular, square), and marking (e.g., the number 17) in the evening with food, where the system detects that the user did not take the particular yellow pill on a particular evening with food, the system may determine that the user's current movements are inconsistent with the previously established pattern.
[00110] In some embodiments, the system may detect one or more inconsistencies between the level of the one or more medicines in the user's bloodstream and the determined typical level of the one or more medicines in the user's bloodstream. For example, for a typical level of a particular medicine in the user's bloodstream that includes that the level is a certain volume at a particular period of time, where the system detects that the level of the medicine in the user's bloodstream is less than the typical level, the system may determine that the user's current movements are inconsistent with the previously established patterns.
[00111] At Step 435A, the system notifies the user and/or a third party of the detected one or more inconsistencies. In particular embodiments, in addition to notifying at least one recipient selected from a group consisting of: the user and the third party, the system updates the user's account to note that a notification was sent. In various embodiments, the system notifies the user of the detected one or more inconsistencies. In some embodiments, the system notifies the third party of the detected one or more inconsistencies. In particular embodiments, the system may notify the user of the detected one or more inconsistencies by displaying an image on the lens of the eyewear, or on another display associated with the eyewear. In other embodiments, the system notifies the user of the one or more inconsistencies by communicating through a speaker to the user.
[00112] In various embodiments, the third party may be a relative of the user. In other embodiments, the third party may be a police department. In particular embodiments, the third party may be an ambulance service. In some embodiments, the third party may be a physician. In still other embodiments, the third party may be an independent living provider. In yet other embodiments, the third party may be a particular caregiver of the user.
[00113] In some embodiments, the system notifies the user and/or the third party of the one or more inconsistencies by sending a notification to the user's and/or the third party's mobile devices. In particular embodiments, the system notifies the user and/or the third party of the one or more inconsistencies by email or text message. In other embodiments, the system may notify the user and/or the third party of a single inconsistency substantially immediately after the system detects the inconsistency between the user's current movements as compared to the previously established one or more movement patterns. In yet other embodiments, the system may notify the user and/or the third party of all inconsistencies detected on a particular day at the end of that day.
[00114] In various embodiments, the system may notify the user and/or the third party of the one or more inconsistencies after a particular event. For example, the system may notify the user if the system determines that the calculated number of steps of the user for a particular day is less than a predetermined percentage of the predetermined average number of steps taken by the user in a day. In some embodiments, the system may notify the user and/or the third party of the one or more inconsistencies after a particular period of time. For instance, the system may notify the third party of an association one hour after the system detects one or more inconsistencies between the user's current movements as compared to the previously established one or more movement patterns. In still other embodiments, the system may notify the user of the one or more inconsistencies at a particular time of day. As an example, the system may notify the user of one or more inconsistencies between the user's current movements as compared to the previously established one or more movement patterns at the end of the day.
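The notification timing options described in this paragraph (substantially immediate, after a fixed interval, or batched at a particular time of day) amount to a small scheduling policy. A minimal Python sketch follows; the policy names and the one-hour default are hypothetical:

    import datetime

    # Hypothetical sketch of the three notification timings described above.
    def schedule_notification(detected_at, policy="immediate", delay_hours=1.0):
        """Return the time at which a detected inconsistency should be reported."""
        if policy == "immediate":
            return detected_at
        if policy == "delayed":  # e.g., notify one hour after detection
            return detected_at + datetime.timedelta(hours=delay_hours)
        if policy == "end_of_day":  # batch all of the day's inconsistencies
            return detected_at.replace(hour=23, minute=59, second=0, microsecond=0)
        raise ValueError("unknown policy: " + policy)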
[00115] In various embodiments, at least partially in response to detecting whether the user moves during the predefined time period, the system may notify the user and/or third party if the user does not move during the predefined time period. In other embodiments, at least partially in response to detecting one of sudden acceleration and sudden impact (e.g., such as that associated with a fall), the system may notify the user and/or the third party that the user experienced the one of sudden acceleration and sudden impact. In some embodiments, at least partially in response to not detecting one of a heartbeat or breathing associated with the user, the system may notify the user and/or the third party that the heartbeat and/or breathing of the user cannot be detected. This may indicate, for example, a medical emergency associated with the user or a malfunction of one or more system components.
[00116] In particular embodiments, the system may notify the user and/or the third party of detected inconsistencies between the user's current movements and the previously established movement patterns. In some embodiments, the system may notify the user and/or the third party of detected inconsistencies between the user's current location and the previously established location patterns. In other embodiments, the system may notify the user and/or the third party of detected inconsistencies between the user's current status and the previously established status patterns. In still other embodiments, the system may notify the user and/or the third party of detected inconsistencies between the user's current compliance and the previously established compliance patterns. In yet other embodiments, the system may notify the user and/or the third party of detected inconsistencies between the user's current medicine regime and the previously established medicine regime patterns. In various embodiments, the system may notify at least one recipient selected from a group consisting of: the user and the third party of detected inconsistencies between the user's current level of one or more medicines and the previously established typical one or more levels of medicine.
[00117] In particular embodiments, the system may notify the user and/or the third party of detected inconsistencies between the stored medicine regime and one or more of the type of medicine taken, the time the medicine is taken, and the dose of medicine taken. In some embodiments, the system may notify at least one recipient selected from a group consisting of: the user and the third party if the user removes the wearable device for a predetermined period of time. In other embodiments, the system may notify the user and/or the third party if the user does not consume food for a predetermined period of time. In particular embodiments, the system may notify the user and/or the third party if the user does not consume liquids for a predetermined period of time. In various embodiments, the system may notify the user and/or the third party if the user's caloric intake is above or below a predetermined number of calories. In some embodiments, the system may notify the user and/or the third party if the user's oxygen levels fall below a predetermined threshold. In other embodiments, the system may notify the user and/or the third party if the user's blood sugar drops below a predetermined threshold.

[00118] In various embodiments, the system, when executing the Behavior Pattern Analysis Module 300, may omit particular steps, perform particular steps in an order other than the order presented above, or perform additional steps not discussed directly above.
Gait Monitoring Functionality [00119] Various embodiments of a system for monitoring the gait of an individual are described below and may be implemented in any suitable context. For example, particular embodiments may be implemented to: (1) identify one or more medical conditions associated with the individual; (2) assess the fit of a prosthetic device worn by the individual; and/or (3) assess an individual's recovery from a particular injury or medical procedure.
[00120] Various aspects of the system's functionality may be executed by certain system modules, including the gait monitoring module 400B. The gait monitoring module 400B is discussed in greater detail below.
[00121] Referring to Figure 4B, when executing the gait monitoring module 400B, the system begins, in various embodiments, at Step 405B by receiving data from a wearable device worn by an individual whose gait is to be monitored by the system. In particular embodiments, the system is configured to receive data from one or more sensors (e.g. an accelerometer, a gyroscope, a position locating device and/or magnetometer) while: (1) the individual is wearing the wearable device adjacent the user's face and/or head; and (2) the individual is walking or running. In particular embodiments, the system is configured to receive the data while the individual is walking or running within the context of their typical daily routine, and not within the context of a medical diagnostic visit. The system may also or alternatively be configured to receive data within the context of a medical diagnostic visit (e.g., at a doctor's office, hospital, or other medical facility).
[00122] In particular embodiments, one or more of the system's sensors may be embedded in, or otherwise attached to, eyewear or another wearable device (e.g., another wearable device worn adjacent the individual's head or another suitable part of the individual's body). In particular embodiments, at least one or more of the system's sensors may be incorporated into a prosthesis or into a portion of the individual's shoes. In certain embodiments, the system may include one or more sensors that are incorporated into (e.g., embedded in, or attached to) a plurality of wearable devices (e.g., eyewear and the individual's shoes) that are adapted to be worn simultaneously by the user while the system retrieves signals from the sensors to assess the individual's gait.
[00123] In particular embodiments, the system may include a set of eyewear that includes one or more motion sensors (e.g., accelerometers, gyroscopes, or location sensors) for sensing the movement of the head of an individual who is wearing the eyewear as the individual walks. The system may then use this head movement information (e.g., using any suitable technique, such as any suitable technique described herein) to determine whether the user has a gait abnormality.
The system may do this, for example, by comparing one or more of the measured head motions of an individual (e.g., as measured when the individual is walking or running) with the actual or typical head motions experienced by individuals with gait abnormalities as those individuals walk or run.
[00124] In various embodiments, the system is configured to measure and receive at least one of the velocity, height, and orientation of one or more of the individual's feet.
For example, in certain embodiments, the system is configured to measure and receive (e.g., using suitable sensors) the linear acceleration of each of the individual's feet, the height of each of the feet from the ground, and/or the position and/or orientation of each of the feet relative to the central axis of the individual's body as the individual walks or runs.
[00125] The system continues at Step 410B by using the data received from the system's sensors to identify one or more relative peaks in linear acceleration of the individual's body and/or head as the user ambulates (e.g., walks or runs). In various embodiments, the system may do this by processing the data received from the sensor(s) in Step 405B, and then isolating the relative peaks in the data. Such peaks represent the relative maxima and minima of the linear acceleration of the user's head, body, and/or one or more of the individual's lower body parts (e.g., knee, ankle, or foot) as the user ambulates. Alternatively or additionally, the system may be configured to identify the relative peaks in linear acceleration by identifying the slope of the line formed by regression analysis of the data received from the sensors. This regression analysis may indicate the change in magnitude of the linear acceleration with time.
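Isolating relative peaks in the acceleration data, as described in Step 410B, can be done with a simple local-extrema pass over the sampled signal. The following Python/NumPy sketch is illustrative only and assumes uniformly sampled data:

    import numpy as np

    # Hypothetical sketch: find relative maxima and minima in a stream of
    # linear-acceleration samples (e.g., from the head-mounted accelerometer).
    def find_relative_peaks(accel):
        """Return (maxima_indices, minima_indices) of local extrema in accel."""
        d = np.diff(accel)
        # A local maximum is where the slope changes from positive to negative;
        # a local minimum is where it changes from negative to positive.
        maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
        minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
        return maxima, minima

    # Example: a sinusoid standing in for head acceleration while walking.
    t = np.linspace(0, 4 * np.pi, 400)
    maxima, minima = find_relative_peaks(np.sin(t))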
[00126] In identifying the relative peaks in linear acceleration, the system is further configured to identify the peaks such that the magnitude and phase of these peaks may be utilized to aid in the diagnosis of one or more gait abnormalities by comparing the magnitude and phase of the peaks associated with the individual's gait with the magnitude and phase of the peaks associated with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities; (2) a typical gait associated with individuals who are known to have one or more gait abnormalities; and/or (3) the individual's normal gait (which may be determined based on data stored in system memory that the system obtained, for example, when the individual was known to walk or run without a gait abnormality). This comparison may be helpful in determining whether the individual has a gait abnormality and, if so, whether the gait abnormality exists due to an improper prosthetic fit.
[00127] In a particular embodiment, the above comparison may involve comparing the magnitude and/or phase of peaks that represent a user's head movement as the user ambulates with the magnitude and/or phase of peaks that represent the head movement, during ambulation, of: (1) one or more individuals who are known to have one or more gait abnormalities; (2) a typical individual (or model theoretical individual) who is known to have one or more gait abnormalities; and/or (3) the individual themself (this data may be determined, for example, based on data stored in system memory that the system obtained when the individual was known to walk or run without a gait abnormality).
[00128] Continuing at Step 415B, the system is configured to analyze the received gait information to determine whether the individual has an identifiable gait abnormality and to communicate the results of the analysis to the user. In various embodiments, the system may use the gait information to: (1) identify potential, previously undiagnosed medical conditions (e.g., one or more medical conditions, such as ALS or MS, that may be indicated by a particular gait abnormality, such as foot drop); (2) assess the quality of the fit of a prosthesis; and/or (3) assess the individual's progress in recovering from an injury or medical procedure (e.g., knee or hip surgery).
Use of System to Identify Previously Undiagnosed Medical Condition [00129] In identifying a potential, previously undiagnosed medical condition, the system is configured to compare the gait of the individual with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities (e.g., hemiplegic gait, diplegic gait, neuropathic gait, foot drop, myopathic gait, or ataxic gait); (2) a typical gait associated with individuals who are known to have one or more gait abnormalities; and/or (3) the individual's normal gait. To do this, the system may compare one or more gait patterns of a user (e.g., in the manner discussed above or in any other suitable way) with information regarding one or more abnormal gait patterns that is stored in a gait database 130B. The system may do this, for example, by applying any suitable mathematical or other data comparison technique to determine whether one or more of the individual's gait patterns are at least substantially similar to one or more abnormal gait patterns stored in the system's gait database 130B.
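The comparison against stored abnormal gait patterns can be illustrated with a simple shape-similarity measure. The Python sketch below is not the disclosed method; it merely stands in for "any suitable mathematical or other data comparison technique," and the template dictionary is a hypothetical stand-in for the gait database 130B (NumPy arrays are assumed as inputs):

    import numpy as np

    def similarity(a, b):
        """Normalized correlation of two gait signals; 1.0 means identical shape."""
        n = min(len(a), len(b))
        a = (a[:n] - a[:n].mean()) / (a[:n].std() + 1e-9)
        b = (b[:n] - b[:n].mean()) / (b[:n].std() + 1e-9)
        return float(np.dot(a, b) / n)

    def match_abnormal_gaits(measured, templates, threshold=0.9):
        """Return names of stored abnormal-gait templates that are at least
        substantially similar to the measured gait pattern."""
        return [name for name, tpl in templates.items()
                if similarity(measured, tpl) >= threshold]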
[00130] If the system determines that the individual has, or may have, a particular gait abnormality, the system may generate and send a notification to a suitable individual (e.g., the individual or the individual's physician) indicating that the individual may have a gait abnormality and/or that it may be beneficial to examine or monitor the individual for one or more medical conditions that are typically associated with the gait abnormality, e.g., stroke, amyotrophic lateral sclerosis, muscular dystrophy, Charcot-Marie-Tooth disease, multiple sclerosis, cerebral palsy, hereditary spastic paraplegia, and Friedreich's ataxia. The notification may be, for example, a suitable electronic notification (e.g., a message on a display screen, an e-mail, a text), or any other suitable notification.
Use of System to Determine Whether a Prosthesis Fits Correctly [00131] In assessing the quality of fit of the prosthesis, the system may, in various embodiments, be configured to compare the user's assessed gait with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities that are associated with an improper prosthetic fit; (2) a typical gait associated with individuals who are known to have one or more gait abnormalities that are associated with an improper prosthetic fit; and/or (3) the individual's normal gait. This comparison may be done as discussed above or in any other suitable way. In particular embodiments, the gait patterns that the individual's gait patterns are compared with may be modeled, for example, based on previously recorded data for individuals with one or more physical attributes (e.g., height, age, weight, femur length, etc.) that are similar to those of the individual. In various other embodiments, such patterns may be modeled from previously recorded data for users that are not physically similar to the individual.
[00132] In response to determining that the individual has one or more gait patterns that are associated with an improper prosthetic fit, the system may generate an alert indicating that the prosthesis may fit improperly. The system may send this alert electronically, for example, via email, text message, or via a display on a display screen, to the user and/or their physician or other suitable individual.
[00133] In various embodiments, after determining that the individual has an abnormal gait, the system may then determine whether the gait deviation results from an improperly fitting prosthesis or from an injury associated with the individual (e.g., an infected wound adjacent the prosthesis). It is noted that an improper fit of a prosthetic leg may result in any of a number of gait deviations, such as trans-femoral (TF) long prosthetic step, TF excessive lumbar lordosis, TF drop off at end of stance, TF foot slap, TF medial or lateral whips, TF uneven heel rise, etc. While such gait deviations may result from an improper prosthetic fit, they may also manifest from: (1) various improper actions or movements by the amputee while the amputee is wearing the prosthesis; or (2) an injury adjacent the prosthesis. Clinically distinguishing an improper gait caused by a poorly fitting prosthesis from an improper gait caused by improper use of a properly fitted prosthesis may be important in helping the amputee regain proper functionality of the prosthetic.
Use of System to Assess an Individual's Recovery from an Injury or Medical Procedure [00134] In assessing an individual's recovery from an injury or medical procedure, the system may compare the individual's current gait with historical gait information for the individual stored in the database 130B. The historical gait information, in various embodiments, may include gait pattern information taken for the individual at some time in the past (e.g., the recent past) before or after the user suffered the injury or underwent the medical procedure.
[00135] The system may then analyze both sets of gait information to determine whether the individual's gait has become more consistent with the user's normal gait (e.g., fewer abnormalities in gait, more regular, quicker lateral acceleration, etc.). To do this, the system may, in various embodiments, compare the user's current gait information with a normal gait to determine whether the user's gait has become more consistent with a normal gait over time. In other embodiments, the system may compare the most current gait data with other post-procedure or post-injury gait data for the individual to determine whether the user's gait has become more consistent with a normal gait (e.g., the individual's normal gait).
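One way to picture this recovery analysis is as a similarity-over-time series: each post-procedure gait recording is scored against the individual's normal gait, and a rising score suggests recovery. The following Python sketch is illustrative only; the scoring function is a simple correlation, not the disclosed method:

    import numpy as np

    def gait_similarity(a, b):
        """Correlation between two equally sampled gait signals."""
        n = min(len(a), len(b))
        return float(np.corrcoef(a[:n], b[:n])[0, 1])

    def recovery_trend(sessions, normal_gait):
        """Score each post-injury/post-procedure session against the normal
        gait, in chronological order; an increasing sequence suggests that
        the individual's gait is becoming more consistent with normal."""
        return [gait_similarity(s, normal_gait) for s in sessions]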
[00136] Upon analyzing both sets of gait information, the system may generate an appropriate assessment of the user's recovery and/or generate one or more treatment recommendations. The system may, in various embodiments, generate a report that communicates the progress of an individual's recovery. The system may also, or alternatively, generate an alternate treatment plan for the individual, if necessary. For example, a particular generated report may include one or more recommendations with regard to a particular type and length of physical therapy to be performed by the individual, and/or one or more dietary restrictions that the individual should implement to aid recovery and regain muscle tone and strength in the affected limb. The system may then communicate the report to the individual or an appropriate third party.
Weight Loss Compliance Module [00137] Figure 4C is a flow chart of operations performed by an exemplary weight loss compliance module 400C, which may, for example, run on the compliance server 120C, or any suitable computing device (such as the one or more health monitoring devices 156C, a handheld computing device coupled to communicate with the one or more health monitoring devices 156C, or a suitable mobile computing device). In particular embodiments, the weight loss compliance module 400C may assess a wearer's compliance with a weight loss plan (e.g., a dietary plan, an exercise plan, a sleep plan, and/or a medicine regime) and provide recommendations, feedback, and/or supportive messages on how the wearer can improve their compliance with the weight loss plan.
[00138] The system begins, in various embodiments, at Step 405C where the system receives, by at least one processor, at least one signal generated by one or more sensors operatively coupled to the computerized eyewear. In various embodiments, the at least one signal received may include one or more images taken by a forward facing camera coupled to the health monitoring device 156C. In other embodiments, the at least one signal received may include one or more signals from a global positioning system unit that are associated with the location of the wearer of the health monitoring device 156C. In still other embodiments, the at least one signal received may include one or more signals from an olfactory sensor that may be used to determine the smell associated with one or more foods being prepared for ingestion by the wearer of the health monitoring device 156C. In yet other embodiments, the at least one signal received may include one or more audio signals received by a microphone coupled to the health monitoring device 156C.

[00139] In various embodiments, the system receives the signals substantially automatically after the sensor generates the signal. In some embodiments, the system may receive the signal periodically (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may receive the signal every thirty seconds throughout the day. In other embodiments, the system may receive the signal after receiving an indication from the wearer or a third party that the system should receive the signal. For instance, the wearer may speak a voice command to the wearable device requesting that the device track the food being prepared by the wearer. In various embodiments, the system may receive an indication from the wearer or a third party of when to have the system receive the signal. In particular embodiments, the system may receive an indication from the wearer or a third party to have particular data received from a particular sensor at the same time that the system receives second particular data from a second particular sensor. For example, when the system receives one or more images, the system may at least partially in response to receiving the one or more images, also obtain global position data of the wearer, olfactory data associated with food being prepared by the wearer, etc.
[00140] In particular embodiments, the system may store data associated with the received signal in memory (e.g., local memory, remote memory, etc.). In various embodiments, the system may store the data substantially automatically after receiving the signal. In other embodiments, the system may store the data after receiving manual input from the wearer or a third party requesting that the system store the data. In various embodiments, the system may store the data for a specified period of time. For instance, the system may store the data for a day, a month, a year, etc., in, for example, the dietary information database 140C, the exercise information database 142C, the medicine database 144C, and/or the sleep information database 146C. In some embodiments, the system may store the data on any suitable server, database, or device. In other embodiments, the system may store the data on the compliance server 120C. In still other embodiments, the system may store data in an account associated with the wearer. In various embodiments, the system may store the data with a timestamp of when the data was received.
[00141] At Step 410C, the system, at least partially in response to receiving the at least one signal from the one or more sensors, determines, by at least one processor, an identity of a food that the wearer is preparing to ingest. In various embodiments, the at least one signal may be an image that is captured by a forward facing camera. In other embodiments, the at least one signal may be a signal received from a GPS chip that provides the current location of the wearer. In still other embodiments, the at least one signal may be an audio signal received from a microphone.
In yet other embodiments, the signal may be a smell captured by the olfactory sensor.
[00142] In embodiments where the at least one signal is one or more captured images, the system may capture an image of packaging associated with a food that is being prepared by the wearer. In some such embodiments, the system may analyze the image to detect a barcode located on the packaging associated with the food. Once the system detects the barcode, the system may decipher the barcode and search the dietary information database 140C of barcodes to find a matching barcode. Once the system identifies the food associated with the barcode, the system may also obtain from the dietary information database 140C nutritional information (e.g., calories, fat content, protein content, etc.) that is associated with the matching barcode.
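The barcode path described above is essentially a decode-then-lookup operation. The Python sketch below assumes the barcode has already been decoded from the image (a library such as pyzbar could fill that role) and uses a small in-memory dictionary as a hypothetical stand-in for the dietary information database 140C; the barcode value and nutritional figures are invented for illustration:

    # Hypothetical stand-in for the dietary information database 140C.
    NUTRITION_DB = {
        "012345678905": {"name": "hamburger roll", "calories": 120,
                         "fat_g": 2.0, "protein_g": 4.0},
    }

    def lookup_food_by_barcode(barcode):
        """Return the nutritional record for a decoded barcode, or None."""
        return NUTRITION_DB.get(barcode)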
[00143] In other embodiments, the system may analyze the captured one or more images to identify a food located in the captured one or more images. For example, the system may capture an image of a hamburger that the wearer is preparing to eat. The system may search the dietary information database 140C of images of food to find a matching image of a hamburger that is similar (e.g., the same, substantially similar, etc.) to the hamburger that the wearer is preparing. Once a match is determined by the system, the system can identify the food and obtain the nutritional information associated with the matching image stored in the dietary information database 140C. In various embodiments, the system may capture a series of images that show the wearer preparing the hamburger. Thus, in some embodiments, the system may first capture an image of the hamburger as the wearer barbecues the hamburger.
From the first image, the system may identify the hamburger by finding a matching image of a hamburger in the dietary information database 140C. Next, the system may capture one or more images of the user removing a hamburger roll from a package of hamburger rolls. The system may detect a barcode from the image of the packaging of the hamburger rolls and identify the roll and the nutritional information associated with the roll from the barcode associated with the one or more images by searching the dietary information database 140C for a matching barcode. The system may continue to identify various other condiments that the wearer places on the hamburger in a similar fashion.
[00144] In still other embodiments, the system may capture an image of a food that the wearer is preparing to ingest while the system simultaneously receives a GPS signal from a GPS chip contained in the computerized eyewear worn by the wearer. In some such embodiments, the system may determine the location of the wearer from the GPS signal, determine if a restaurant or similar food establishment is located at the location, and simultaneously analyze one or more captured images to identify the food that the wearer is preparing to ingest by, for example, one of the methods described above. In some of these embodiments, the wearer's location and a determination of a particular restaurant or similar food establishment may assist the system in identifying the food that the wearer is preparing to ingest. For example, if the system cannot positively identify the exact food item the wearer is preparing to ingest, the system may display a list of one or more food items associated with the restaurant or similar food establishment on a display associated with the computerized eyewear (e.g., a display coupled to the computerized eyewear, a display on a handheld computing device, etc.) and allow the wearer to select the particular food item from the displayed list of one or more food items.
[00145] In still other embodiments, the system may receive an audio signal from one or more microphones coupled to the computerized eyewear. In some embodiments, the system may convert the received audio signal to text and use the text to search the dietary information database 140C for matching text words to identify the food that the wearer is preparing to ingest.
In some embodiments, the dietary information database 140C may also contain nutritional information associated with each word in the database. For example, the wearer may place the computerized eyewear in a state where it is ready to accept voice input. The wearer may, for example, say "hamburger". The system may convert the received voice command to text and search the dietary information database 140C for a matching word. Each word in the database may contain associated nutritional information. Thus, if the system finds a match for the word "hamburger", the system identifies the food item as a "hamburger" and obtains nutritional information associated with the word "hamburger." In some embodiments, once the system converts the received audio signal into text, the system may seek confirmation of the converted word by the wearer before searching the database of food items.
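The voice path reduces to matching transcribed words against food names with attached nutritional records. In the Python sketch below, the transcript is assumed to come from whatever speech-to-text engine the eyewear uses; the database dictionary is again a hypothetical stand-in for the dietary information database 140C:

    # Hypothetical stand-in: food names mapped to nutritional information.
    FOOD_DB = {
        "hamburger": {"calories": 250, "fat_g": 12.0, "protein_g": 15.0},
    }

    def identify_food_from_text(transcript):
        """Return (food_name, nutrition) for the first recognized food word,
        or (None, None) if no word in the transcript matches the database."""
        for word in transcript.lower().split():
            if word in FOOD_DB:
                return word, FOOD_DB[word]
        return None, None

    # Example: the wearer says "hamburger" after enabling voice input.
    name, nutrition = identify_food_from_text("hamburger")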
[00146] As noted above, in various embodiments, the system may determine a wearer's location using a signal received from the GPS chip in the computerized eyewear or from a GPS chip on a handheld computing device coupled to the computerized eyewear. In some such embodiments, if the system detects that the user is located in a restaurant or other food establishment, the system may begin to monitor the wearer's speech in order to detect one or more food items ordered by the wearer. As the system detects the wearer's speech, the system may convert the speech to text and search the dietary information database 140C based on one or more of: (1) the location of the wearer, (2) the associated name of the restaurant or other similar food establishment, or (3) a detected name of a food item ordered by the wearer. In various embodiments, the system may automatically detect the ordered food item. In other embodiments, once the system identifies an ordered item, the system may request confirmation that the detected item is accurate prior to making a final identification of the food item that the wearer is preparing to ingest.
[00147] In various embodiments, confirmation of the identity of the food item may be made by the wearer orally using the microphone on the computerized eyewear. In other embodiments, the system may request confirmation by displaying the identified food item on a display associated with the computerized eyewear. For example, the computerized eyewear may contain a built-in display that the wearer can view while wearing the computerized eyewear. In other embodiments, the computerized eyewear may be coupled (e.g., wirelessly, wired, etc.) to a handheld computing device that displays the selected food item. In some such embodiments, the system may be configured to allow the user to select a "confirm" or "deny" button displayed on the handheld computing device.
[00148] At Step 415C, the system determines, by at least one processor, a quantity of the food that the wearer is preparing to ingest. In various embodiments where the system detects and reads a barcode located in one or more captured images, the system may determine the serving size associated with the barcode by searching the dietary information database 140C for a matching barcode and nutritional information, which may also include a serving size for the food item. For example, in some instances where the wearer is eating prepackaged food, the matching barcode may indicate that the serving size associated with the packaged food is a single serving (e.g., a candy bar, a frozen meal, etc.). In other embodiments, the matching barcode may indicate that the serving size is a teaspoon, a tablespoon, a cup, etc.
[00149] In various embodiments where the serving size is not a single serving, the system may calculate an estimated volume from the captured one or more images. In other embodiments, the system may capture and analyze one or more images of the food and calculate an estimate of the quantity of the food that the wearer is preparing to ingest. For example, in some embodiments, the system may analyze the captured one or more images to count the number of teaspoons, tablespoons, or cups of a food that the wearer selects. In other embodiments, the wearer may input the quantity of food using an input device (e.g., a food scale) coupled to the computerized eyewear or a handheld computing device coupled to the computerized eyewear, or the wearer may input the quantity using an audio input via the one or more microphones coupled to the computerized eyewear.
[00150] In other embodiments, the system may analyze the captured one or more images to detect a vessel (e.g., bowl, measuring cup, container, etc.) that the wearer is using to measure a quantity of food. In some embodiments, the system may determine the quantity of the selected food based on the identity of the vessel. For example, a set of measuring cups and/or spoons may be color-coded based on the size of each cup or spoon. In these embodiments, the system may analyze the captured one or more images to detect the particular measuring cup and/or spoon being used. In other embodiments, the system may analyze the captured one or more images to detect one or more markings located on the vessel that would allow the system to determine an accurate measurement of the food based on the identity of the vessel and/or the one or more markings on the vessel. In yet other embodiments, the system may be configured to operatively couple (e.g., via Bluetooth, Wi-Fi, physical wiring, etc.) to a measuring device such as a scale, an electronic measuring cup, etc., and use a received signal from the measuring device to determine the quantity of the selected food.
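The vessel-based measurement can be pictured as a lookup from a recognized vessel to a known volume. In the Python sketch below, the image-analysis step that recognizes the vessel is assumed; the color-to-volume table models the color-coded measuring cups described above, and its values are hypothetical:

    # Hypothetical color-coded measuring-cup volumes, in milliliters.
    CUP_VOLUMES_ML = {"red": 250, "blue": 125, "green": 60, "yellow": 15}

    def quantity_from_vessel(vessel_color, scoops=1):
        """Return the total measured volume for a recognized color-coded cup."""
        return CUP_VOLUMES_ML[vessel_color] * scoops

    # Example: two red (250 ml) cups of a food yields 500 ml.
    volume_ml = quantity_from_vessel("red", scoops=2)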
[00151] At Step 420C, the system tracks, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest. In various embodiments, the system may store the identity and quantity of a food that a wearer prepares to ingest in an account associated with the wearer on the system. In some embodiments, the system may allow the user to input when the wearer plans on eating the prepared food (e.g., for breakfast, lunch, dinner, midmorning snack, late afternoon snack, etc.). In other embodiments, the system may determine a date and time that the wearer ingests the prepared food. For example, the system may determine the date, time, and associated meal (e.g., breakfast, lunch, dinner, etc.) and store the information in a database (e.g., dietary information database 140C) or an account on the system associated with the wearer.
In still other embodiments, the system may track the number of calories associated with the prepared food, the nutritional information (e.g., amount of fat, protein, carbohydrates, vitamins, minerals, etc.) associated with the prepared food, and also store this information in the database or account associated with the wearer.

[00152] At step 425C, the system compares, by at least one processor, the identity and quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan. In various embodiments, the weight loss plan may contain one or more of a dietary plan, an exercise plan, a medicine regime, or a sleep plan. In some embodiments, the weight loss plan may be prepared by the wearer. In other embodiments, the weight loss plan may be prepared by a third-party (e.g., a doctor, a trainer, a nutritionist, etc.).
[00153] In various embodiments, the dietary plan for the wearer may consist of at least one of: (A) one or more daily values for: (1) total calorie intake, (2) total fat intake, (3) total protein intake, and (4) total carbohydrate intake; (B) one or more prohibited foods; (C) one or more prohibited vitamins; (D) one or more prohibited minerals; (E) a daily expected weight loss value; (F) a weekly expected weight loss value; or (G) a recommended amount of daily sleep. In various embodiments, the system may compare the amount and quantity of food and its associated nutritional information to the dietary plan values in order to determine if the wearer is complying with the dietary plan. In some embodiments, the system may compare the actual consumed total daily calorie intake to the planned total daily calorie intake. In other embodiments, the system may, in addition to the total daily calorie intake, also compare one or more of the actual daily protein intake, daily fat intake, or daily carbohydrate intake to the planned daily values.
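The daily comparison against the dietary plan is a per-metric check of tracked intake versus planned limits. A minimal Python sketch, with hypothetical plan and intake values, follows:

    def check_dietary_compliance(intake, plan):
        """Return {metric: amount over plan} for each exceeded daily limit."""
        return {k: intake[k] - plan[k]
                for k in plan if k in intake and intake[k] > plan[k]}

    # Hypothetical daily plan and tracked intake.
    daily_plan = {"calories": 2000, "fat_g": 65, "protein_g": 50, "carbs_g": 300}
    daily_intake = {"calories": 2350, "fat_g": 80, "protein_g": 45, "carbs_g": 290}
    overages = check_dietary_compliance(daily_intake, daily_plan)
    # -> {"calories": 350, "fat_g": 15}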
[00154] In various embodiments, the exercise plan may consist of at least one of: (A) a weight loss goal; (B) a target number of daily calories to burn in order to lose a particular amount of weight per week; (C) various suggested exercises to achieve the desired weight loss goal; or (D) one or more recommended daily physical activities (e.g., walking, stretching, etc.). In some such embodiments, the system may be configured to track one or more of the wearer's (1) movements, (2) heart rate, or (3) respiration rate to determine the amount of calories burned by the wearer and the type of exercise performed by the wearer.
[00155] In various embodiments, the system may use signals received from the one or more sensors (e.g., gyroscope, accelerometer, GPS unit, heart rate sensor, etc.) to determine the wearer's movements and calories burned. In some embodiments, the system may determine the wearer's current movements by tracking the distance traveled by the wearer for a particular day.
In still other embodiments, the system may determine the wearer's current movements by tracking the orientation of the wearer using the gyroscope. In particular embodiments, the current movements of the wearer may include actions such as running, jumping, bicycling, weight lifting, sitting, standing, etc. In various embodiments, the system may be configured to determine a total number of calories burned by the wearer for any particular activity based on one or more of: (1) the movements associated with the wearer during the activity, (2) the amount of elapsed time during which the activity is performed, (3) the wearer's heart rate (actual, average, mean, etc.) during the time the activity is performed and/or after completion of the activity, or (4) the wearer's respiration rate during and/or after the activity is performed by the wearer.
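A common way to turn activity type and elapsed time into calories burned is the MET (metabolic equivalent) approximation; a real system would refine this with the heart-rate and respiration inputs described above. The MET values in this Python sketch are illustrative only:

    # Illustrative MET values for a few of the activities mentioned above.
    MET = {"walking": 3.5, "running": 9.8, "bicycling": 7.5, "sitting": 1.3}

    def calories_burned(activity, minutes, weight_kg):
        """kcal ~= MET * 3.5 * weight_kg / 200 * minutes (standard approximation)."""
        return MET[activity] * 3.5 * weight_kg / 200.0 * minutes

    # Example: a 70 kg wearer running for 45 minutes burns roughly 540 kcal.
    kcal = calories_burned("running", 45, 70.0)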
[00156] In various embodiments, the system determines the wearer's current movements, heart rate, and/or respiration rate substantially automatically after receiving the signals from the one or more sensors. In various embodiments, the system may determine the wearer's current movements, heart rate, and/or respiration rate periodically (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may determine the wearer's current movements, heart rate, and/or respiration rate every thirty seconds throughout the day.
In other embodiments, the system may determine the wearer's current movements, heart rate, and/or respiration rate after receiving an indication from the wearer that the system should analyze the signals received from the one or more sensors. For instance, the wearer may speak a voice command to the computerized eyewear requesting that the device analyze the wearer's steps taken, heart rate and/or respiration rate. In other embodiments, the wearer may indicate to the system using a handheld computing device operatively coupled to the computerized eyewear that the system should capture the wearer's current movements, heart rate, and/or respiration rate.
[00157] In various embodiments, the weight loss plan may contain a medicine regime that is designed to support the wearer's weight loss goals. For example, the medicine regime may comprise at least one of: (A) one or more prescription weight loss medicines; (B) one or more over-the-counter weight loss medicines; or (C) one or more over-the-counter supplements (e.g., Chitosan, Chromium Picolinate, etc.). In various embodiments, the medicine regime may also comprise tracking one or more prescription medicines that the wearer is already taking to determine any weight loss or weight gain side effects that may be caused by the one or more prescribed medications taken by the wearer. For example, if the wearer is diabetic, the medicine regime may track the wearer's use of insulin in conjunction with the wearer's food intake, physical activity, and glucose levels to determine any weight loss or weight gain effects on the wearer.

[00158] In some embodiments, the wearer or a third-party (e.g., the wearer's physician, etc.) may manually input a medicine regime into an account associated with the wearer on the system. For example, where a physician prescribes one or more weight loss medicines or supplements as part of the wearer's weight loss plan, the prescribed medicines or supplements may be entered into the wearer's weight loss plan. In other embodiments where the wearer is already taking prescription medications, the wearer may manually enter one or more prescription medicines into the wearer's account on the system or the wearer's account may be linked to a medical records database so that prescribed medicines are automatically linked to the wearer's weight loss plan.
[00159] In still other embodiments, the one or more sensors on the computerized eyewear may capture one or more images of the one or more medicines/supplements that the wearer is preparing to ingest. The system may be configured to analyze the captured image to detect the presence of a label on a medicine/supplement bottle to identify the medicine/supplement and dose being ingested by the wearer. In some embodiments the system may perform optical character recognition techniques on the label to identify the type of medicine/supplement and the dose taken by the wearer. Once the medicine/supplement and/or dose is identified, the system may add the medicine/supplement to the wearer's medicine regime and perform a look-up of the medicine/supplement in a medicine database 144C to determine if the medicine/supplement has any positive or negative side effect on weight loss. Once the system receives or identifies all medicines/supplements taken by the wearer, the system may track the wearer's compliance with the medicine regime.
[00160] In various embodiments, the system may capture one or more images throughout the day and analyze the images to determine one or more of: (1) a type of medicine/supplement taken by the wearer; (2) the time the medicine/supplement is ingested by the wearer; and (3) the dose of medicine/supplement ingested by the wearer. For example, in some embodiments, the system may detect one or more pills in the captured one or more images, compare the one or more detected pills found in the captured one or more images to known images of pills stored in the medicine database 144C, identify the one or more pills by matching the one or more pills from the captured one or more images to the known images of pills stored in the medicine database 144C, and detect the time that the captured one or more images was taken. Based on this information, the system is able to determine if the wearer is complying with the medicine regime prescribed by the wearer's physician, in addition to monitoring the potential side effects of medicines/supplements that the wearer takes while following the weight loss plan.
[00161] In various embodiments, the weight loss plan may include a sleep plan that is designed to support the wearer's weight loss goals. In some such embodiments, the sleep plan for the wearer may consist of at least one of: (1) a recommended amount of sleep; (2) a recommended time for the wearer to go to sleep; (3) a recommended time for the wearer to wake up; and/or (4) a recommended nap time. In various embodiments, the sleep plan may include a position for the wearer to sleep in, a temperature for the room where the wearer is sleeping, and/or a lighting condition for the room where the wearer is sleeping. In particular embodiments, the sleep plan may include advice for the wearer on avoiding certain types of light. In other embodiments, the sleep plan for the wearer may include suggestions for relaxation techniques. In various embodiments, the system may track the amount and quality of sleep the wearer is obtaining to determine if the wearer is complying with the sleep plan. The system may do this, for example, using one or more of the sensors (e.g., motion sensors) described herein. In some embodiments, the system may compare the actual daily sleep of the wearer to the prescribed sleep plan. In particular embodiments, the system may, in addition to tracking the wearer's sleep patterns, track the wearer's hormone levels (e.g., through perspiration), the wearer's glucose levels, and the wearer's activity levels.
[00162] At step 430C, at least partially in response to comparing the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan, the system calculates, using at least one processor, one or more recommendations to assist the wearer in complying with the predetermined weight loss plan. In various embodiments, the system may analyze the wearer's calorie intake, compare the current calorie intake to the predetermined weight loss plan, analyze the wearer's current calories burned, and calculate one or more activities that would allow the wearer to meet the daily goals for the predetermined weight loss plan. In some embodiments, for example, the system may determine that the wearer is preparing to ingest a 500 calorie hamburger. In some such embodiments, the system may recommend that the wearer run for 45 minutes to offset the additional 500 calories.
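The hamburger example above amounts to converting a calorie surplus into minutes of activity. A minimal Python sketch, with a hypothetical per-minute burn rate, follows:

    def minutes_to_offset(excess_kcal, kcal_per_minute):
        """Minutes of a chosen activity needed to burn off excess_kcal."""
        return excess_kcal / kcal_per_minute

    # Example from the text: a 500 kcal hamburger at roughly 11 kcal/minute
    # of running works out to about 45 minutes.
    minutes = minutes_to_offset(500, 11.1)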
[00163] In other embodiments, the system may determine that the wearer is preparing to ingest a food item that contains 1500 mg of sodium. If the predetermined weight loss plan contains a restriction on sodium, the system may recommend that the wearer not eat the food item since the food item contains too much sodium. In still other embodiments, the system may be configured to recommend an alternative food item that is substantially similar to (e.g., the same as, or in the same category as) the food item that the wearer is preparing to ingest. In still other embodiments, the system may be configured to identify a particular food item and its associated nutritional information and then question the wearer's desire to ingest the particular food item. For example, the wearer may visit his local eating establishment and order "the heart attack" hamburger off the menu. Once the system identifies the food item and its associated nutritional information (e.g., calories, sodium content, fat content, etc.), the system may recommend to the wearer that he choose a more nutritious food item since "the heart attack" does not comply with the wearer's weight loss plan.
[00164] In various embodiments, the weight loss plan may also include allergy information associated with the wearer. For example, the wearer's predetermined weight loss plan may include information relating to the wearer's allergy to soy. Thus, once the system identifies that the wearer is ordering "the heart attack" hamburger, which based on the associated nutritional information contains soy products, the system may send a recommendation to the wearer that states "the heart attack burger contains soy; you're allergic to soy; don't eat it!"
[00165] In other embodiments, the system may be configured to monitor the wearer's exercise routine to determine whether the wearer is complying with daily exercise requirements. For example, the system may determine by mid-afternoon that the wearer has only completed 50 percent of the steps required by the wearer's predetermined weight loss plan. In some such embodiments, the system may calculate additional exercise recommendations that the wearer may perform in order to meet the daily recommended steps set forth in the wearer's weight loss plan. In other embodiments, if the system detects that the wearer has been sitting for a large portion of the day, the system may calculate recommended activities that lessen the amount of time that the wearer is in the seated position.
[00166] In other embodiments, if the wearer is engaging in a physical activity, the system may monitor whether the wearer is exercising at the proper intensity based on at least one of the wearer's movements, the wearer's heart rate, the wearer's respiration rate, or the weight loss plan. If the system detects that the wearer is not exercising at a predetermined intensity established in the weight loss plan, the system may recommend that the wearer increase the pace of the wearer's movements.

[00167] In various embodiments, the system may monitor the wearer's compliance with the medicine regime and calculate recommendations to help the wearer comply with the medicine regime. For example, in some embodiments, the system may calculate reminders for the wearer to assist the wearer in taking medicine/supplements. In other embodiments, the system may identify a side effect of a medicine that the wearer is taking and recommend that the wearer monitor any weight gain since a common side effect of the medicine is weight gain.
[00168] In various embodiments, the system may be configured to monitor the wearer's compliance with the sleep plan to determine whether the wearer is complying with the sleep plan. For example, the system may use one or more sensors described herein (e.g., motion sensors) to determine that the wearer has only slept a total of five hours for a particular day when the wearer's sleep plan called for seven hours per day. In some such embodiments, the system may calculate an additional nap time for the wearer in order to meet the daily recommended hours of sleep set forth in the wearer's weight loss plan. In other embodiments, if the system detects that the wearer has been sleeping for a large portion of the day, the system may calculate recommended activities that lessen the amount of time that the wearer is sleeping during the day. In still other embodiments, if the system determines that the wearer's sleep cycles are getting interrupted while the wearer is sleeping, for instance by sleep apnea, the system may recommend that the wearer monitor any weight gain and comply with the wearer's dietary, exercise, and medicinal regimes, since sleep apnea is a common side effect of gaining weight.
[00169] At Step 435C, the system notifies the wearer of one or more recommendations regarding the wearer's compliance with the predetermined weight loss plan. In various embodiments, the system notifies the wearer and/or a third-party of the one or more recommendations calculated by the system by sending a notification to the wearer's and/or the third-party's handheld computing devices. In particular embodiments, the system may notify the wearer and/or the third-party of the one or more recommendations by email or text message. In yet other embodiments, the system notifies the wearer of the one or more recommendations by communicating through a speaker coupled to the computerized eyewear.
[00170] In various embodiments, the system may notify the wearer and/or the third-party of a recommendation substantially immediately (e.g., immediately) after the system calculates the recommendation. In yet other embodiments, the system may notify the wearer and/or the third-party of all recommendations at the end of that day. In still other embodiments, the system may notify the wearer of an exercise recommendation when the wearer provides an indication to the system that the wearer is beginning to exercise.
[00171] In various embodiments, the third-party may be a relative or friend of the wearer so that the third-party may assist the wearer in meeting the requirements of the predetermined weight loss plan. In other embodiments, the third-party may be the wearer's nutritionist and/or physician to allow the nutritionist and/or physician to monitor the wearer's compliance with the predetermined weight loss plan. In still other embodiments, the third-party may be an insurance company that may adjust the wearer's life/health insurance rate(s) based on the wearer's compliance with the predetermined weight loss plan.
[00172] In various embodiments, the system, when executing the Weight Loss Compliance Module 400C, may omit particular steps, perform particular steps in an order other than the order presented above, or perform additional steps not discussed directly above.
Health Monitoring Functionality [00173] Various embodiments of a wearable health monitoring system are described below and may be implemented in any suitable context. Various aspects of the system's functionality may be executed by certain system modules, including a Health Data Acquisition Module 500, an Actionable Data Collection Module 600, a Decision Making Data Acquisition Module 700, and a Processing and Reporting Module 800. These modules are discussed more fully below.
Health Data Acquisition Module [00174] Referring to Figure 5, when executing the Health Data Acquisition Module 500, the system begins, in various embodiments, at Step 510, by receiving basic vitals of a user. In particular embodiments, the system is configured to receive the basic vitals from one or more sensors associated with the wearable health monitoring device, such as any of the one or more sensors discussed above (e.g., such as a heart rate monitor, blood pressure sensor, etc.). In various embodiments, the system is configured to receive basic vitals such as, for example, blood pressure, heart rate, temperature, respiratory rate, blood oxygen levels, blood sugar levels, or any other suitable vital.
[00175] The system continues, at Step 520, by receiving a location of the user. In various embodiments, the system is configured to receive the location of the user based at least in part on global position information determined from the wearable health monitoring device. In other embodiments, the system receives the location in response to the user providing his or her location to the system (e.g., via input on a mobile computing device such as a smartphone or tablet computer). In other embodiments, the system receives the location via voice input by the user, for example, via a microphone associated with the wearable health monitoring device, which may, for example, be configured to use suitable voice recognition and speech identification techniques to determine, based on the user's voice input, the location of the user.
[00176] Continuing at Step 530, the system collects sensory information associated with the user.
The system may be configured to collect the sensory information via, for example, one or more cameras, one or more microphones, one or more accelerometers, or any other suitable sensory perceiving sensors. In particular embodiments, the sensory information may include information about what the user is touching, seeing, hearing, smelling, tasting, etc. In other embodiments, the sensory information may include information about the user's balance, acceleration, proprioception, nociception, interoception, and chronoception.
[00177] Next, the system, at Step 540, retrieves information associated with one or more medications that the user is taking or should be taking. In particular embodiments, the one or more medications the user should be taking may include one or more medications prescribed by a health care professional. In particular embodiments, the system receives information associated with the one or more medications the user is taking by receiving, from a camera associated with the wearable health monitoring device, one or more images of the one or more medications (e.g., or a container or bottle in which the one or more medications are stored) as the user is taking the one or more medications. The system may then use the image to determine the type of medication that the user is taking (by, for example, using OCR techniques to read alphanumeric information from a particular pill, and/or by determining the size, color, and shape of a pill and searching a database of medications to identify the medication based on the pill's size, color, shape, and/or the alphanumeric information printed on the pill).
[00178] In various embodiments, the system is configured to scan one or more machine-readable indicia (e.g., one or more bar codes, one or more QR codes, or any other suitable indicia) on a medicine bottle to determine what medication the user is taking. In other embodiments, the system receives the medication information from the user, for example via voice input or via input on a computing device such as a smartphone, or on the wearable health monitoring device itself.
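By way of non-limiting illustration, the following Python sketch shows one way the attribute-based medication lookup described above could work; the database contents, field names, and matching logic are hypothetical and stand in for whatever pill database a deployed system would query.

```python
# Hypothetical sketch: identify a medication from observed pill attributes
# (shape, color, and any imprint recovered via OCR) against a sample
# in-memory database. Real systems would query a much larger database.
from typing import Optional

PILL_DATABASE = [
    {"name": "Lisinopril 10 mg", "shape": "round", "color": "pink", "imprint": "LUPIN 10"},
    {"name": "Metformin 500 mg", "shape": "oval", "color": "white", "imprint": "G 45"},
]

def identify_pill(shape: str, color: str, imprint: Optional[str] = None) -> list:
    """Return names of database entries matching the observed attributes."""
    matches = []
    for pill in PILL_DATABASE:
        if pill["shape"] == shape and pill["color"] == color:
            # When OCR recovers an imprint, use it to narrow the candidates.
            if imprint is None or pill["imprint"] == imprint:
                matches.append(pill["name"])
    return matches

print(identify_pill("round", "pink"))  # ['Lisinopril 10 mg']
```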
[00179] In various embodiments, the system continues, at Step 550, by retrieving other information about one or more user habits. The one or more user habits may include dietary preferences (e.g., food the user eats), drug use, smoking, exercise frequency, sleep information, etc. The system may, in various embodiments, retrieve the information about the one or more user habits based in part on one or more images taken by an imaging device associated with the wearable health monitoring device (e.g., a front facing camera on a pair of glasses) as the user partakes in the one or more user habits while wearing the wearable health monitoring device.
[00180] In particular embodiments, the system continues, at Step 560, by saving any of the above data and/or information to memory. In various embodiments, the system is configured to save the information to local memory associated with the wearable health monitoring device. In other embodiments, the system may store the information remotely (e.g., in one or more remote servers or other suitable storage medium).
Actionable Data Collection Module [00181] Referring to Figures 6A-6C, when executing the Actionable Data Collection Module 600, the system begins, in various embodiments, at Step 605, by gathering biological information about a user from one or more sensors. The biological information may include, for example, heart rate, temperature, geolocation, proximity to one or more other individuals or to a particular location (e.g., such as a location proximate to a health-related outbreak or risk), humidity, accelerometer data, or other biological information associated with the user.
In various embodiments, the system is configured to receive one or more biological samples from the user, such as, for example, blood, sweat, teardrops, etc. In particular embodiments, the system is configured to receive input (e.g., via a mobile computing device or voice input) describing one or more symptoms the user may be experiencing. In other embodiments, the system is configured to receive biological information based on one or more images of the user (e.g., taken from an imaging device associated with the wearable health monitoring system). The system may, for example, determine from the one or more images that the user looks discolored (e.g., has a face that is flushed or paler than normal).

[00182] The system continues, at Step 610, by gathering external data associated with the user.
The external data may include, for example, calendar data (e.g., data associated with one or more places the user is travelling, data associated with one or more medical appointments, etc.). The user may, for example, have a vacation to a foreign country on their calendar, where the foreign country may require one or more vaccinations or have other health alerts that may be useful to the user. In other embodiments, the system is configured to gather external data such as one or more images of the user (e.g., from one or more social networking websites, from one or more email messages, etc.). In other embodiments, the system is configured to gather data from one or more e-mails sent to or received by the user. For example, the system may ascertain from an e-mail to the user's mother that the user is trying to have a baby or planning to train for a particular type of race or undertake any other action or activity that may warrant consultation with a physician or other medical professional.
[00183] The system continues, in particular embodiments, at Step 615 by retrieving physical data associated with the user. The physical data may include, for example, height, weight, build, body mass index (BMI), etc. In various embodiments, the system may retrieve the physical data, for example, by reading one or more scales as the user is standing on the scale (e.g., via an imaging device associated with the wearable health monitoring system), via input from the user or one or more health professionals measuring the user's physical data, or any other suitable technique. BMI may, for example, be determined based at least in part on the user's retrieved height, weight, age, and one or more other factors.
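As a concrete illustration of the BMI determination mentioned above, BMI is conventionally computed as weight in kilograms divided by the square of height in meters; the example values below are arbitrary.

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """BMI = weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

# Example: a user weighing 80 kg at 1.75 m tall has a BMI of about 26.1.
print(round(body_mass_index(80.0, 1.75), 1))  # 26.1
```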
[00184] Continuing at Step 620, the system retrieves data associated with the user from one or more wearable computing devices, such as, for example, Nike+ FuelBand, FitBit, Jawbone UP, etc. The data retrieved from these devices may include, for example, activity level, number of steps taken, calories burned, or any other suitable data that such devices may track. Next, at Step 625, the system retrieves data from one or more mobile devices. The data may include, for example, data ascertained from one or more text messages (e.g., SMS messages) sent or received by the user, one or more applications used by or downloaded by the user, or any other suitable data from the one or more mobile devices.
[00185] In particular embodiments, the system continues at Step 630 by retrieving data associated with the user from a universal health record. The data may include, for example, information associated with the user's medical history, ailments for which the user has been treated, ailments for which the user is being treated, family health history information, susceptibility toward particular diseases associated with the user based on the user's genealogy, or any other useful data available in a universal health record.
[00186] In various embodiments, at Step 635, the system is configured to monitor one or more sleep patterns of the user. The system may, for example, comprise one or more imaging devices directed toward the user's eyes, which may, for example, determine when the user is asleep or awake based at least in part on whether the user's eyes are open or closed.
The system may utilize one or more accelerometers to determine the user's movement while sleeping, which may, for example, enable the system to determine when the user is experiencing REM
(rapid eye movement) sleep. In various embodiments, the system may monitor, for example, hours of rest, hours spent in REM sleep, a time the user went to bed, a time the user got out of bed, napping behaviors, tossing and turning of the user during sleep, quality of sleep, sleep disturbances, sleep disorders (e.g., sleep apnea, bruxism, night terrors, insomnia, delayed sleep phase disorder, sleepwalking, etc.), elevation and position of the user's head during sleep, or any other suitable sleep related data.
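By way of non-limiting illustration, the sketch below derives two of the sleep measures mentioned above (hours of rest and the longest uninterrupted sleep block) from per-minute eyes-closed classifications; the one-minute sampling interval and the boolean input format are assumptions made for the example.

```python
def sleep_metrics(eyes_closed_per_minute):
    """Summarize per-minute eyes-closed flags into basic sleep measures."""
    total_minutes = sum(eyes_closed_per_minute)
    longest = current = 0
    for closed in eyes_closed_per_minute:
        current = current + 1 if closed else 0  # extend or reset the run
        longest = max(longest, current)
    return {"hours_of_rest": total_minutes / 60.0,
            "longest_block_minutes": longest}

# Example: roughly 8 hours of samples with a single 10-minute disturbance.
night = [True] * 240 + [False] * 10 + [True] * 230
print(sleep_metrics(night))  # {'hours_of_rest': 7.83..., 'longest_block_minutes': 240}
```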
[00187] In particular embodiments, the system continues, at Step 640, by monitoring user eye health. The system may, for example, monitor user eye inflammation, user vision (e.g., by determining whether a user often squints), or any other suitable eye health information, such as, for example, information related to glaucoma, ankylosing spondylitis, high blood pressure, diabetes, sickle cell anemia, jaundice, colon polyps, heart disease, multiple sclerosis, leukemia, brain tumors, etc. The system may monitor eye health, for example, by collecting one or more tear and/or blood samples from the user, or using any other suitable technique described herein.
[00188] Next, at Step 645, the system monitors user compliance with one or more prescribed medical regimens. The system may, for example, monitor whether the user is taking particular prescribed medications using any suitable technique, such as one or more of the techniques described above. The system may be configured to monitor recommended diet and exercise regimens by, for example, utilizing any of the data collected and/or retrieved by the system in the steps above, such as wearable computing device data or data derived from one or more images taken from the imaging device associated with the wearable health monitoring device (e.g., such as by taking images of all food consumed by the user and using the images to determine what food and what quantities of the food the user has consumed).

[00189] At Step 650, the system monitors the user's posture, for example, by using one or more accelerometers, one or more gyroscopes, or any other suitable mechanism.
Similarly, at Step 655, the system is configured to monitor head movements of the user. The system may monitor body position, for example, to monitor mental health. For example, a user suffering from symptoms of depression may exhibit different posture or body language (e.g., slouching) than a user who is not. In various embodiments, tracking of posture and head position may enable the system to ascertain potential spinal cord problems (e.g., scoliosis).
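One hypothetical way to implement the posture monitoring described above is to estimate the tilt of the device's vertical axis relative to gravity from a single accelerometer reading taken while the wearer is momentarily stationary. The axis convention and the slouch threshold below are assumptions for illustration only.

```python
import math

def forward_tilt_degrees(ax: float, ay: float, az: float) -> float:
    """Angle between the device's vertical (z) axis and gravity, in degrees,
    assuming x points forward and the wearer is momentarily stationary."""
    return math.degrees(math.atan2(ax, math.sqrt(ay ** 2 + az ** 2)))

SLOUCH_THRESHOLD_DEG = 20.0  # hypothetical threshold

# A reading of (0, 0, 9.81) m/s^2 is fully upright; sustained readings above
# the threshold could be flagged as slouching.
print(round(forward_tilt_degrees(3.0, 0.0, 9.3), 1))  # 17.9
```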
[00190] Continuing at Step 660, the system provides feedback to the user based at least in part on information retrieved or monitored in Steps 605-655. The system may, for example, recommend changes in behavior such as getting more sleep or eating healthier food, remind users to take one or more medications or vitamin supplements, recommend the user seek the advice of a physician before undertaking or continuing a particular activity, etc. The system may determine which feedback is appropriate by, for example, running one or more recommendation algorithms or scripts using the collected information. In a particular embodiment, one goal of providing the recommendations would be to improve a user's health or lifestyle.
[00191] At Step 665, the system continues by predicting, based at least in part on the collected data, one or more potential medical emergencies that may be likely to befall the user. The system may, for example, utilize any of the information collected above in combination with any other available information to predict and/or diagnose, for example, a heart attack, stroke, cataracts, macular degeneration, cancer, or any other medical emergency or illness that may befall the user. The system then, at Step 670, saves the collected data to the user's medical records for use by the user or by one or more physicians or other medical professionals. Next, at Step 675, the system saves the collected data to memory. In various embodiments, the system is configured to save the information to local memory associated with the wearable health monitoring device. In other embodiments, the system may store the information remotely (e.g., in one or more remote servers or other suitable storage medium).
Decision Making Data Acquisition Module [00192] Referring to Figure 7, when executing the Decision Making Data Acquisition Module 700, the system begins, in various embodiments, at Step 710, by gathering diet information associated with the user. In particular embodiments, the system receives the diet information by receiving, from a camera associated with the wearable health monitoring device, one or more images of food that the user eats (e.g., as the user is preparing and/or consuming the food). In various embodiments, the system is configured to scan one or more machine-readable indicia (e.g., one or more bar codes, one or more QR codes, or any other suitable indicia) on food packaging for food that the user consumes. In other embodiments, the system receives the diet information from the user, for example via voice input or via input on a computing device such as a smartphone, or on the wearable health monitoring device itself. In various embodiments, the diet information comprises food consumed as well as macronutrient information associated with the food (e.g., caloric content; grams of protein, fat, carbohydrates, etc.;
sodium content; vitamin and mineral content; etc.).
[00193] Continuing at Step 720, the system continues by gathering exercise information associated with the user. In various embodiments, the exercise information comprises, for example, one or more particular exercises performed by the user, a number of repetitions of each exercise performed, an amount of weight lifted as part of each exercise, a distance travelled during one or more walks/runs/etc., a number of calories burned during a particular aerobic activity, an amount of time for which a particular activity is performed, etc.
In particular embodiments, the system is configured to gather the exercise information from one or more images (e.g., and/or videos) of the user performing the activity, from one or more wearable computing devices such as those discussed above, etc. In particular embodiments, the system is configured to gather the exercise information from input of the exercise information by the user.
[00194] The system continues, at Step 730, by comparing the diet information and exercise information. The system may, for example, compare caloric intake via the user's diet with caloric expenditure via exercise from the exercise information. The system may, for example, predict a net weight loss and/or gain over a particular period of time based on the user's net caloric intake or output. The system may further compare diet and exercise information to determine risk of potential health issues such as heart disease, obesity, diabetes, etc. The system continues, at Step 740, by providing the comparison to the user. The system may, for example, provide information associated with progress toward a user's weight loss or weight gain goals.
In a particular example, the system may provide the user with information that they may lose a particular amount of weight in a particular number of weeks if they maintain their current diet and exercise regimen.
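By way of non-limiting illustration, the sketch below performs the diet-versus-exercise comparison described above using the widely cited approximation that a cumulative deficit of roughly 7,700 kcal corresponds to about 1 kg of body weight; that conversion factor is a rule of thumb, not a physiological constant.

```python
KCAL_PER_KG = 7700.0  # common approximation for body-weight change

def projected_weight_change_kg(daily_intake_kcal: float,
                               daily_expenditure_kcal: float,
                               days: int) -> float:
    """Positive result = projected gain; negative = projected loss."""
    net_kcal = (daily_intake_kcal - daily_expenditure_kcal) * days
    return net_kcal / KCAL_PER_KG

# Example: a steady 500 kcal/day deficit maintained for four weeks.
print(round(projected_weight_change_kg(2000, 2500, 28), 1))  # -1.8 (kg)
```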
[00195] Next, at Step 750, the system saves the collected data to memory.
In various embodiments, the system is configured to save the information to local memory associated with the wearable health monitoring device. In other embodiments, the system may store the information remotely (e.g., in one or more remote servers or other suitable storage medium).
Processing and Reporting Module [00196] Referring to Figure 8, when executing the Processing and Reporting Module 800, the system begins, in various embodiments, at Step 810, by updating user health information to include information obtained via the Health Data Acquisition Module, Actionable Data Collection Module, and Decision Making Data Acquisition Module described above. In particular embodiments, the user health information includes any suitable health related information and may be stored as part of any suitable record, in any suitable database, and/or on any suitable storage medium.
[00197] The system continues, at Step 820, by creating a health record interface, which may, for example, enable one or more users to access the user health information. The system then displays information to either the user's physician at Step 830 or the user at Step 840. The system may display the information in response to a request to display a particular portion of the user's health information such as, for example, in the form of a user health report on sleep patterns, diet, exercise, medication, health history, blood pressure during a particular period, or any other suitable health information. In various embodiments, the system is further configured to provide user health information to one or more health insurance providers (which may, for example, provide discounts or have requirements about a user's diet and exercise choices), one or more life insurance providers (which may, for example, provide discounts for healthier lifestyles which can be tracked using the system), or any other party that may be interested in any of the information the system is capable of tracking.
Exemplary User Experience For Behavior Pattern Analysis System Independent Living of Elderly [00198] In a particular example of a user using the Behavior Pattern Analysis Module 300, the user may put on the wearable device in the morning and continue to wear the device throughout the day. In various embodiments, the wearable device may be operatively coupled (e.g., via a suitable wireless or wired connection) to a smart phone, a laptop, a desktop computer, an automated dialing apparatus, or any other computing device that can receive signals from the wearable device and either transmit the signals to a central system (e.g., via a wireless or wired telephone network) or analyze the signals and make decisions based on the received signals (e.g., call for help, notify a loved one, etc.). During this time, the system will track the movements of the user using the motion sensor, the accelerometer, the global positioning sensor, the gyroscope, and the front-facing camera. In this example, the user may be an elderly or infirm person that desires to live independently, but the person requires monitoring for events that deviate from the person's normal routine. Thus, by wearing the wearable device throughout the day, the device is able to track the user's movements and create certain patterns based on these movements for the user. The system may then store these patterns in a database while continuing to track the user's movements. Where the system detects that the user has deviated from the previously established pattern, the system may notify the user's physician, for example directly from the wearable device, or via the connected smartphone, computer or the automated dialing apparatus. Such deviations from the previously established pattern may include that the user falls, that the user wanders beyond preset boundaries (e.g., defined by one or more geofences), that the user begins sleeping longer than usual, that the user stops moving, or any other deviation from a previously established pattern of the user's normal routine.
[00199] For example, the user may be an Alzheimer's patient that has lucid moments and moments of memory loss. As has been established as a movement pattern by the wearable device, the patient takes a walk around the block every morning. However, if the patient wanders two blocks over and outside of the user's predetermined geo-fenced area, which is a deviation from the patient's normal pattern of movements, the system may alert the patient's caregiver of the inconsistency between the patient's current actions and the previously established patterns.
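By way of non-limiting illustration, the sketch below shows one way the geofence check described above could be implemented: compute the great-circle distance between the wearer's current GPS fix and the fence center, and flag a deviation when it exceeds the fence radius. The coordinates, radius, and notification action are illustrative.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outside_geofence(position, fence_center, fence_radius_m):
    return haversine_m(*position, *fence_center) > fence_radius_m

home = (38.5816, -121.4944)  # fence center (illustrative coordinates)
if outside_geofence((38.5901, -121.4944), home, 500.0):
    print("Deviation detected: notify caregiver")  # ~945 m from center
```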
Monitor Compliance with Medicine Regime [00200] The system, in a particular example, will also monitor the user's compliance with a medicine regime. In order to establish the user's medicine regime pattern, the user may wear the wearable device to detect when the user takes medicine and what medicine is taken using the front-facing camera. The user may also speak the name of the medicine as the wearable device captures an image of the medicine the user is taking. The system is then able to establish a pattern of the user taking blood pressure medicine every morning after monitoring the user for a week. The system may then monitor the user's current medicine intake to compare the medicines the user takes and the time that the user takes the medicine to the previously established medicine regime pattern. If the user fails to take the blood pressure medicine on a particular morning, the system may notify the user, the user's caregiver, a health care provider, or a third party that the user has deviated from the previously established medicine regime.
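A minimal sketch of the medicine-regime compliance check described above follows; it compares observed intake events against an established daily schedule and returns any missed doses. The schedule, the two-hour tolerance, and the event format are assumptions for illustration.

```python
from datetime import date, datetime, time, timedelta

REGIME = [("blood pressure medicine", time(8, 0))]  # established pattern
TOLERANCE = timedelta(hours=2)

def missed_doses(observed, day):
    """observed: list of (medicine_name, datetime) intake events for `day`."""
    missed = []
    for medicine, scheduled in REGIME:
        due = datetime.combine(day, scheduled)
        taken = any(name == medicine and abs(when - due) <= TOLERANCE
                    for name, when in observed)
        if not taken:
            missed.append(medicine)  # caller may notify user or caregiver
    return missed

# Example: no intake event was recorded this morning.
print(missed_doses([], date(2016, 6, 1)))  # ['blood pressure medicine']
```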
Exemplary User Experience For Gait Analysis [00201] In a particular example, a pair of eyewear with embedded sensors may be used to monitor the user's gait over the course of one or more days (e.g., days, weeks, months, years, etc.). As the sensors measure the movements of the individual's body (e.g., the individual's head, legs, feet, etc.), the system may transmit the related movement data to a remote server where the information is stored in a suitable database. After receiving the data, a central server may process the data to identify one or more gait patterns for the individual. The system may then compare one or more of the individual's gait patterns with one or more known irregular gait patterns to determine whether the individual has an irregular gait pattern as discussed above.
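By way of non-limiting illustration, the sketch below reduces a stream of vertical head-acceleration samples to a simple gait signature (mean interval between relative acceleration peaks and mean peak magnitude) and compares it against a stored template for a known irregular gait. The features, template comparison, and tolerance are illustrative; a deployed system would use richer gait models.

```python
def relative_peaks(signal):
    """Indices of samples greater than both neighbors (local maxima)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]

def gait_signature(signal, sample_hz):
    """(mean peak-to-peak interval in seconds, mean peak magnitude)."""
    peaks = relative_peaks(signal)
    if len(peaks) < 2:
        return None  # not enough steps to characterize the gait
    intervals = [(b - a) / sample_hz for a, b in zip(peaks, peaks[1:])]
    mean_interval = sum(intervals) / len(intervals)
    mean_peak = sum(signal[i] for i in peaks) / len(peaks)
    return (mean_interval, mean_peak)

def matches_template(signature, template, tolerance=0.15):
    """True when every feature is within a relative tolerance of the template."""
    return all(abs(s - t) / t <= tolerance
               for s, t in zip(signature, template))
```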
[00202] The system may be utilized, for example, in the following construct. A patient may present to a physician complaining of weakness and decreased use of one leg.
The physician may perform a routine physical, ask diagnostic questions, and have the patient walk briefly in order to physically demonstrate the purported condition. Upon observing the patient, the doctor may decide that the patient may potentially have a gait abnormality, but the physician cannot isolate the specific abnormality as presented by the patient. The physician may instruct the patient to wear the wearable gait monitoring device over the course of one or more days. During this time, the wearable gait monitoring device would obtain and record information regarding the individual's gait as discussed above.
[00203] The system may then use the information to identify one or more gait pattern irregularities as discussed above and generate a message to the user's treating physician indicating that the individual appears to have an abnormal gait. The system may optionally further display one or more potential medical conditions associated with that gait, e.g., amyotrophic lateral sclerosis, multiple sclerosis, etc. The physician may then meet with the individual to discuss the individual's condition, and/or to order additional testing to establish a particular diagnosis. For example, the physician may review the patient's medical history, presented gait pattern, and possible conditions contributing to the gait abnormality to diagnose and/or to order more tests to aid in the diagnosis of such medical conditions.
[00204] The system may similarly be used to analyze the fit of a particular prosthetic, or a user's recovery from an injury or surgery using similar techniques in combination with one or more of the methods described above.
Exemplary User Experience Monitoring Compliance with a Weight Loss Plan [00205] In a particular example of a wearer using the health monitoring device 156C (in the form of computerized eyewear) and the system 100C to monitor the wearer's compliance with a weight loss plan, the wearer may put on the computerized eyewear in the morning and continue to wear the device throughout the day. In various embodiments, the computerized eyewear may be operatively coupled (e.g., via a suitable wireless or wired connection) to a smart phone, a laptop, a desktop computer, or any other computing device that can receive signals from the computerized eyewear and either transmit the signals to a central system (e.g., via a wireless or wired telephone network, the Internet, etc.) or analyze the signals and make decisions based on the received signals (e.g., identify a food, a quantity of the food, calculate recommendations, etc.). During this time, the system (1) identifies a food item and a quantity of the food item that the wearer is preparing to ingest using the forward facing camera, the GPS
unit, the microphone, etc., (2) tracks the food item and quantity, (3) compares the food item and quantity to a predetermined weight loss plan, (4) tracks the movements of the wearer using the motion sensor, the accelerometer, the global positioning sensor, the gyroscope, and the front-facing camera, (5) tracks the wearer's compliance with a medicine regime, as described in detail above, and/or (6) tracks the wearer's compliance with a sleep plan, as also described in detail above.
[00206] In this example, the wearer may be under doctor's orders to lose weight. Thus, by wearing the computerized eyewear throughout the day, the device is able to capture one or more images of one or more food items that the wearer is preparing to ingest, identify the one or more food items, and compare the nutritional information for the one or more food items to a predetermined weight loss plan set by the doctor. Continuing with this example, if the wearer goes out to eat, the system may receive a GPS signal that can be used to determine the location of the wearer. In addition to determining the location of the wearer, the system may also determine that the wearer is at a particular food establishment (e.g., a restaurant, a food store, etc.) located at the geographical location. Thus, based on the captured one or more images of the food item that the wearer is preparing to eat, the location of the wearer and the name of the food establishment, the system can identify the food item and the nutritional information associated with the food item by comparing the one or more captured images of the food item to images of food items available at the food establishment. Once the system determines a match and identifies the food item, the system may also identify nutritional information associated with the food item from a food information database accessed by the system.
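By way of non-limiting illustration, the sketch below matches a captured food image against images of items offered at the identified food establishment. The feature vectors stand in for whatever image representation the system extracts, and the menu data and similarity threshold are hypothetical.

```python
import math

MENU_FEATURES = {  # hypothetical per-item image feature vectors
    "Blue Bomber hamburger": [0.9, 0.1, 0.3],
    "Chicken Caesar Salad": [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a))
                  * math.sqrt(sum(y * y for y in b)))

def identify_food(captured_features, threshold=0.9):
    """Return the best-matching menu item, or None if nothing is close."""
    best_item, best_score = None, 0.0
    for item, features in MENU_FEATURES.items():
        score = cosine_similarity(captured_features, features)
        if score > best_score:
            best_item, best_score = item, score
    return best_item if best_score >= threshold else None

print(identify_food([0.88, 0.12, 0.31]))  # 'Blue Bomber hamburger'
```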
[00207] In various embodiments, the system tracks the nutritional information associated with the food item and compares this information to the weight loss program. For example, the system may add the calories associated with the food item to a "total calories consumed for the day" value. In some embodiments, if the system determines that the food item will cause the wearer to exceed the total calories allocated to the wearer for the day under the weight loss program, the system may calculate one or more recommendations for the wearer. For example, the one or more recommendations may include a statement to the wearer that says "if you eat this food item, you must perform one or more of the following exercises for 40 minutes to offset the excess calories due to the food item." Another recommendation may be "Instead of eating the 'Blue Bomber hamburger' why not have the Chicken Caesar Salad?" In various embodiments, if the system determines that the wearer ingests the food item, the system may continue to remind the wearer that the wearer must perform additional physical activity to offset the excess calories consumed during the day.
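The offset recommendation in the example above can be sketched as a simple division of the excess calories by a per-activity burn rate; the burn rates below are rough illustrative figures, not clinical values.

```python
BURN_RATE_KCAL_PER_MIN = {  # illustrative approximations
    "walking": 4.0,
    "jogging": 10.0,
    "cycling": 8.0,
}

def offset_recommendations(excess_kcal: float) -> dict:
    """Minutes of each activity needed to offset the excess calories."""
    return {activity: round(excess_kcal / rate)
            for activity, rate in BURN_RATE_KCAL_PER_MIN.items()}

# Example: a food item that puts the wearer 400 kcal over the daily budget.
print(offset_recommendations(400))
# {'walking': 100, 'jogging': 40, 'cycling': 50}
```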
Exemplary User Experience For A Health Monitoring System [00208] In a particular example of a user using the wearable health monitoring system, the system tracks the user through a typical day via a wearable health monitoring device embodied as a pair of eyewear. As may be understood from this disclosure, the system is configured to monitor the user's vitals, health, posture, diet, exercise and other factors throughout the day. When the user wakes up and puts the wearable health monitoring device on, the system takes baseline readings of the user's heart rate, for example, via a touch sensor on the temple portion of the eyewear, and temperature, via a thermometer on the eyewear. The system may then gather data associated with the user's previous night's sleep such as, for example, length of sleep, REM sleep pattern based on eye movement captured via a camera on the eyewear that was positioned to face the user while the user was sleeping, and so on.
[00209] When the user is eating breakfast, the system captures data related to food and beverage intake. The system may, for example, ascertain that the user ate two fried eggs and a piece of toast for breakfast while drinking a glass of orange juice. The system may record information about the nutrition of the food and beverages consumed, for example, from a database of available information (e.g., the average number of calories per egg), or from the packaging in which the food was purchased. The user may then take a morning jog around their neighborhood, during which the eyewear would track the distance travelled by the user, the user's speed and/or elevation change, and the duration of the user's jog. The eyewear may track this information via one or more accelerometers, or using GPS. The system may also continue to track the user's heart rate and body temperature during the run to monitor whether their heart rate, elevation and temperature change are normal for that type of activity.
[00210] After the run, on the user's commute to work, the system may track the time that the user is sedentary and the user's posture while the user is seated driving their vehicle. The device calculates the amount of time spent driving to/from work and other time spent sitting in the vehicle throughout the day. The system then utilizes a pedometer or accelerometer to track the number of steps taken by the user while the user is at work as well as time that the user spends seated at work. After recording and capturing what the user eats for lunch, the wearable health monitoring device reminds the user to take their medication, which the system confirms that the user takes in the proper dosage by using a camera to capture an image of the user taking his two pills.
[00211] The system, throughout the course of the user's day, also tracks the user's exposure, via one or more suitable sensors, to one or more portions of the electromagnetic spectrum. The system tracks UV exposure, for example, and may recommend that the user apply sunscreen during periods of prolonged exposure (e.g., when the user is outside touring a new facility at work).
The system may warn the user, based on a change in the user's skin pigmentation detected by the camera, that the user has had too much sun exposure and should go inside or take other preventative measures to protect from the sun's rays.

[00212] The system may monitor the user's stress level (e.g., from their blood pressure, heart rate, etc.) and associate higher stress levels with particular co-workers with whom the user is interacting during periods of high stress or with particular projects on which the user is working during periods of high stress. The system may store this information to provide later to the user.
[00213] At the end of the day, the system may compile all information gathered throughout the day to provide the user with health updates such as stress prevention and coping techniques, progress toward the user's weight loss goals based on the user's food intake and exercise, etc.
The system may further provide this information to one or more healthcare professionals who may be treating the user.
Other Practical Implementations of the System [00214] In particular embodiments, the system is configured to measure, gather, or retrieve any of the information discussed in this disclosure in response to any particular other information gathered by the system. The system may, for example, track blood sugar levels in response to detecting via a system camera that a user has eaten a particular food (e.g., for a diabetic patient).
In another example, the system may track user eating habits in response to particular detected or measured stimuli such as, for example, stress, sleep deprivation, oversleep, etc. The system may be configured to determine any suitable associations between various measured data and other suitable data (e.g., other data measured by the system). For example, the system may measure change in blood pressure over time in response to determining that the user's activity level (e.g., frequency of exercise) has increased, or alternatively because the user's activity level has decreased.
Conclusion [00215] Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains, having the benefit of the teaching presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims.
Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for the purposes of limitation.

Claims (67)

What is claimed is:
1. A computer-implemented method of monitoring the wellbeing of an individual, the method comprising the steps of:
a. providing a user with computerized eyewear comprising at least one sensor for monitoring the motion of the user;
b. receiving, by a processor, data generated by the at least one sensor;
c. at least partially in response to receiving data generated by the at least one sensor, determining, by a processor, the user's movements using the received data;
d. at least partially in response to determining the user's movements, comparing, by a processor, the user's movements to previously established one or more movement patterns for the user;
e. detecting, by a processor, one or more inconsistencies between the current user's movements as compared to the previously established one or more movement patterns;
and f. at least partially in response to detecting the one or more inconsistencies, notifying, by a processor, at least one recipient selected from a group consisting of: the user or a third party of the detected one or more inconsistencies.
2. The computer-implemented method of claim 1, wherein the at least one sensor comprises at least one sensor selected from a group consisting of:
a. a motion sensor;
b. an accelerometer;
c. a gyroscope;
d. a geomagnetic sensor;
e. a global positioning system sensor;
f. an impact sensor;
g. a microphone;
h. a forward facing camera;
i. a heart rate monitor;
j. a pulse oximeter;
k. a blood alcohol monitor;

l. a respiratory rate sensor; and m. a transdermal sensor.
3. The computer-implemented method of claim 2, wherein the at least one sensor comprises at least one sensor selected from a group consisting of: a motion sensor, an accelerometer, a global positioning sensor, a gyroscope, and a forward facing camera.
4. The computer-implemented method of claim 2, wherein the method further comprises the step of:
a. calculating, by a processor, a number of steps taken by the user in a particular day;
b. at least partially in response to calculating the number of steps, comparing, by a processor, the calculated number of steps taken by the user in the particular day to a predetermined average number of steps taken by the user in a day; and c. at least partially in response to comparing the calculated number of steps to the predetermined average number of steps, notifying the user or a third party if the calculated number of steps in the particular day is less than a predetermined percentage of the predetermined average number of steps taken by the user in a day.
5. The computer-implemented method of claim 2, further comprising the steps of:
a. detecting, by a processor, whether the user moves during a predefined time period; and b. at least partially in response to detecting whether the user moves during the predefined time period, notifying, by a processor, the at least one recipient selected from a group consisting of: the user or a third party if the user does not move during the predefined time period.
6. The computer-implemented method of claim 2, further comprising the steps of:
a. detecting, by a processor, from the received data generated by the at least one sensor if the user experiences a sudden acceleration or sudden impact; and b. at least partially in response to detecting that the user has experienced a sudden acceleration or sudden impact, notifying, by a processor, the user or a third party that the user experienced the sudden acceleration or sudden impact.
7. The computer-implemented method of claim 2, further comprising the steps of:
a. detecting, by a processor, from the received data generated by the at least one sensor: (1) whether the user is breathing; and (2) whether the user's heart is beating;
and b. at least partially in response to determining that the user is not breathing or that the user's heart is not beating, sending a notification to a third party.
8. The computer-implemented method of claim 2, further comprising the steps of:
a. receiving, by a processor, from the user or third party, a medicine regime associated with the user;
b. storing, by a processor, the medicine regime in memory;
c. receiving, by a processor, data generated by a forward facing camera associated with the computerized eyewear;
d. analyzing, by a processor, the received data to determine data selected from a group consisting of one or more:
i. types of medicine taken by the user;
ii. times the medicine is taken by the user; and iii. doses of the medicine taken by the user;
e. at least partially in response to analyzing the received data, comparing, by a processor, the one or more of the types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken to the stored medicine regime for the user;
f. at least partially in response to comparing the one or more of the type of medicine taken, the time the medicine is taken and the dose of medicine taken, identifying, by a processor, one or more inconsistencies between the stored medicine regime, and the one or more types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken; and g. at least partially in response to identifying the one or more inconsistencies between the medicine regime and the one or more of the types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken, sending an alert to the user or a third party of the one or more inconsistencies.
9. The computer-implemented method of claim 8, wherein:
a. the data generated comprises one or more images captured by the forward facing camera;
b. the step of analyzing the received data further comprises:
i. detecting, by a processor, one or more pills in the one or more images;

ii. comparing, by a processor, the one or more detected pills found in the one or more images to one or more known images of pills stored in a database;
iii. identifying, by a processor, the one or more pills by matching the one or more pills from the one or more images to the one or more known images of pills stored in the database; and iv. detecting, by a processor, a time that the one or more images were taken.
10. A computer-implemented method of monitoring the wellbeing of an individual, the method comprising the steps of:
a. providing a user with a computerized wearable comprising at least one sensor for monitoring actions taken by a user;
b. receiving, by a processor, a medicine regime associated with the user;
c. receiving, by a processor, data generated by the at least one sensor;
d. analyzing, by a processor, the received data generated by the at least one sensor to determine data selected from the group consisting of one or more:
i. types of medicine taken by the user;
ii. times the medicine is taken by the user; and iii. doses of medicine taken by the user;
e. at least partially in response to analyzing the received data generated by the at least one sensor, comparing, by a processor, the medicine regime for the user to the determined one or more of the types of medicine taken by the user, the one or more times the medicine is taken by the user, or the one or more doses of medicine taken by the user;
f. detecting, by a processor, one or more inconsistencies between the medicine regime associated with the user and the determined one or more of the types of medicine taken by the user, the one or more times the medicine is taken by the user, or the one or more doses of medicine taken by the user; and g. at least partially in response to detecting one or more inconsistencies, notifying, by a processor, at least one recipient selected from a group consisting of: the user or the third party of the detected inconsistencies.
11. The computer-implemented method of claim 10, wherein the at least one sensor further comprises one or more sensors selected from a group consisting of: a camera, a scanner, a motion sensor, and a microphone.
12. The computer-implemented method of claim 10, wherein the computerized wearable is eyewear.
13. The computer-implemented method of claim 10, wherein:
a. the data generated is one or more images captured by a forward facing camera;
b. the step of analyzing the received data further comprises:
i. detecting, by a processor, one or more pills in the one or more images;
ii. comparing, by a processor, the one or more detected pills found in the one or more images to one or more known images of pills stored in a database;
iii. identifying, by a processor, the one or more pills by matching the one or more pills from the one or more images to the one or more known images of pills stored in the database; and iv. detecting, by a processor, a time that the one or more images were taken.
14. The computer-implemented method of claim 10, further comprising the step of:
a. detecting, by a processor, a level of one or more medicines in the user's bloodstream;
b. comparing, by a processor, the level of the one or more medicines in the user's bloodstream to a predefined level for each of the one or more medicines stored in a database for the user;
and c. at least partially in response to comparing the level of the one or more medicines in the user's bloodstream, notifying the at least one recipient selected from a group consisting of: the user or a third party when the level of the one or more medicines is below the predefined level for each of the one or more medicines.
15. The computer-implemented method of claim 10, further comprising the steps of:
a. determining, by a processor, the user's one or more movements using the received data generated by the at least one sensor;
b. analyzing, by a processor, the received data generated by the at least one sensor to determine the one or more movements associated with the user;
c. at least partially in response to analyzing the received data, comparing, by a processor, the user's one or more movements to previously established one or more movement patterns for the user;

d. detecting, by a processor, one or more inconsistencies between the user's one or more movements as compared to the previously established one or more movement patterns;
and e. at least partially in response to detecting one or more inconsistencies between the current user's one or more movements as compared to previously established one or more movement patterns, notifying, by a processor, at least one recipient selected from a group consisting of: the user or a third party of the detected one or more inconsistencies.
16. Eyewear for monitoring the well-being of an individual, said eyewear comprising:
a. a frame portion that is adapted for supporting at least one lens, a front portion of the frame portion defining a first lateral side and a second lateral side;
b. a first temple attached adjacent the first lateral side, the first temple being mounted to extend, at least selectively, rearwardly away from the frame portion so that a first temple distal portion may engage a first ear of a wearer of the eyewear to at least partially support the eyewear while the eyewear is being worn adjacent the wearer's face;
c. a second temple attached adjacent the second lateral side, the second temple being mounted to extend, at least selectively, rearwardly away from the frame portion so that a second temple distal portion may engage a second ear of the wearer to at least partially support the eyewear while the eyewear is being worn adjacent the wearer's face; and d. at least one well-being monitoring device that is adapted to monitor the well-being of the wearer of the eyewear, wherein:
i. the at least one well-being monitoring device is adapted for tracking the movement of the wearer; and ii. the eyewear is adapted to transmit data regarding the movement of the wearer for use in determining whether the wearer's movement is inconsistent with one or more previously-established movement patterns of the individual.
17. The eyewear of Claim 16, wherein the well-being monitoring device comprises one or more sensors selected from a group consisting of:
a. motion sensors;
b. accelerometers;
c. gyroscopes;

d. geomagnetic sensors;
e. global positioning system sensors;
f. impact sensors;
g. microphones;
h. forward facing cameras;
i. heart rate monitors;
j. pulse oximeters;
k. blood alcohol monitors;
l. respiratory rate sensors; and m. transdermal sensors.
18. The eyewear of Claim 16, wherein the one or more sensors are adapted for tracking the movement of the wearer while the wearer is wearing the eyewear.
19. The eyewear of Claim 16, wherein the one or more movement patterns comprise one or more patterns of physical movement.
20. The eyewear of Claim 16, wherein the one or more movement patterns comprise one or more patterns of movement between one or more rooms within a particular dwelling.
21. A computer-readable medium storing computer-executable instructions for:
a. receiving information obtained from at least one sensor worn adjacent the individual's head;
b. using the information to assess the gait of the individual;
c. analyzing the assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition;
and d. in response to determining that the assessed gait includes the one or more gait patterns, generating an alert that the individual may have the particular medical condition.
22. The computer-readable medium of Claim 21, wherein the at least one sensor is embedded into a pair of glasses worn by the individual.
23. The computer-readable medium of Claim 22, wherein the at least one sensor is a gyroscope.
24. The computer-readable medium of Claim 21, wherein the step of using the information to assess the gait of the individual comprises:

a. using the information to assess one or more movements of the individual's head as the individual ambulates; and b. using the one or more movements of the individual's head to assess the gait of the individual.
25. The computer-readable medium of Claim 21, wherein the step of analyzing the assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition comprises:
a. using the information to assess one or more movements of the individual's head as the individual ambulates;
b. comparing the one or more movements of the individual's head with one or more head movements that are associated with the particular medical condition; and c. at least partially in response to determining that the one or more movements of the individual's head are at least similar to one or more head movements that are associated with the particular medical condition, determining that the assessed gait includes one or more particular gait patterns that are associated with the particular medical condition.
26. The computer-readable medium of Claim 21, wherein the one or more gait patterns is foot drop.
27. The computer-readable medium of Claim 26, wherein the particular medical condition is a medical condition selected from a group consisting of: stroke, amyotrophic lateral sclerosis, muscular dystrophy, Charcot-Marie-Tooth disease, multiple sclerosis, cerebral palsy, hereditary spastic paraplegia, and Friedreich's ataxia.
28. The computer-readable medium of Claim 21, wherein the one or more gait patterns is propulsive gait.
29. The computer-readable medium of Claim 28, wherein the particular medical condition is a medical condition selected from a group consisting of: carbon monoxide poisoning, manganese poisoning, and Parkinson's disease.
30. The computer-readable medium of Claim 21, wherein the one or more gait patterns is waddling gait.
31. The computer-readable medium of Claim 30, wherein the particular medical condition is a medical condition selected from a group consisting of: congenital hip dysplasia, muscular dystrophy, and spinal muscular atrophy.
32. The computer-readable medium of Claim 31, wherein the step of analyzing the assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition comprises:
a. comparing one or more particular gait patterns from the assessed gait of the individual with the one or more gait patterns that are associated with the particular medical condition; and b. in response to determining that the one or more particular gait patterns are at least substantially similar to the one or more gait patterns that are associated with the particular medical condition, determining that the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition.
33. The computer-readable medium of Claim 32, wherein the step of comparing one or more particular gait patterns from the assessed gait of the individual with the one or more gait patterns that are associated with the particular medical condition comprises comparing one or more relative peaks in linear acceleration associated with the assessed gait of the individual with one or more relative peaks in linear acceleration of the one or more gait patterns that are associated with the particular medical condition.
34. A method of monitoring the proper fit of a prosthesis on an individual, said method comprising:
a. receiving information obtained from at least one sensor worn adjacent the individual's head;
b. using the information to assess the gait of the individual;
c. analyzing the assessed gait to determine whether the assessed gait includes one or more gait patterns that are consistent with an improper fit of the prosthesis; and d. in response to determining that the assessed gait includes the one or more gait patterns, generating an alert that the prosthesis may fit the individual improperly.
35. The method of Claim 34, wherein the prosthesis comprises a prosthetic limb.
36. The method of Claim 35, wherein the at least one sensor is embedded into a pair of glasses worn by the individual.
37. The method of Claim 34, wherein the prosthesis comprises a prosthetic foot.
38. A system for monitoring the gait of an individual, said system comprising:
a. a pair of glasses comprising one or more sensors for assessing the gait of the individual;

b. a computer system comprising a processor, the computer system being configured for:
c. analyzing the assessed gait to determine whether the assessed gait includes one or more gait patterns that are consistent with a particular medical state of an individual; and d. in response to determining that the assessed gait includes the one or more gait patterns, generating an alert that communicates the particular medical state.
39. The system of Claim 38, wherein the particular medical state includes an indication of the level of an individual's recovery from a medical procedure.
40. The system of Claim 38, wherein the particular medical state includes an indication of the level of an individual's recovery from a particular injury.
41. The system of Claim 38, wherein the step of analyzing the assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are consistent with a particular medical state of an individual comprises:
a. using the information to assess one or more movements of the individual's head as the individual ambulates;
b. comparing the one or more movements of the individual's head with one or more head movements that are consistent with a particular medical state of an individual; and c. at least partially in response to determining that the one or more movements of the individual's head arc consistent with the particular medical state, deteimining that the assessed gait includes one or more particular gait patterns that are consistent with the particular medical state.
42. A computer-implemented method of monitoring compliance with a weight loss plan by a wearer of computerized eyewear, the method comprising:
a. receiving, by at least one processor, at least one signal generated by one or more sensors operatively coupled to the computerized eyewear;
b. at least partially in response to receiving the at least one signal from the one or more sensors, determining, by at least one processor, an identity of a food that the wearer is preparing to ingest based on the received at least one signal generated by the one or more sensors;
c. determining, by at least one processor, a quantity of the food that the wearer is preparing to ingest based on the received at least one signal generated by the one or more sensors;

d. tracking, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest;
e. comparing, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan; and f. at least partially in response to comparing the identity and the quantity of the food to the predetermined weight loss plan, at least one of:
i. updating, by at least one processor, a database to indicate whether the wearer is compliant with the weight loss plan;
ii. calculating, by at least one processor, one or more recommendations for the wearer associated with the wearer's compliance with the weight loss plan;
iii. notifying, by at least one processor, the wearer of the wearer's compliance with the weight loss plan;
iv. notifying, by at least one processor, a third-party of the wearer's compliance with the weight loss plan;
v. sending, by at least one processor, an alert to the wearer; or vi. sending, by at least one processor, an alert to the wearer regarding the calculated one or more recommendations.
43. The computer-implemented method of claim 42, wherein the one or more sensors comprises at least one sensor selected from a group consisting of:
a. a forward facing camera;
b. a global positioning system unit;
c. an olfactory sensor; and d. a microphone.
44. The computer-implemented method of claim 42, wherein determining the identity of the food further comprises:
a. capturing, by at least one processor, an image by a forward facing camera of a packaging associated with the food;
b. at least partially in response to capturing the image, detecting, by at least one processor, a barcode contained on the packaging; and c. searching, by at least one processor, a database of barcodes to determine a nutritional value associated with the food.
45. The computer-implemented method of claim 44, wherein determining a quantity of the food further comprises, at least partially in response to capturing the image, estimating, by at least one processor, the quantity of food selected by the wearer.
46. The computer-implemented method of claim 44, wherein determining a quantity of the food further comprises:
a. detecting, by at least one processor, in the captured image a vessel used by the wearer to measure the quantity of the food that the wearer is preparing to ingest;
and b. detecting, by at least one processor, at least one marking on the vessel that indicates the quantity of the food placed in the vessel.
47. The computer-implemented method of claim 44, wherein determining a quantity of the food further comprises:
a. receiving, by at least one processor, a signal from a measuring device that is used by the wearer to measure the quantity of the food; and
b. at least partially in response to receiving the signal from the measuring device, determining, by at least one processor, the quantity of the food.
48. The computer-implemented method of claim 47, wherein the measuring device is a wireless scale.
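A minimal sketch of how the measuring-device signal of claims 47-48 might be parsed; the JSON payload with a "grams" field is an assumed format, as real scales use vendor-specific protocols:

```python
import json

def quantity_from_scale_signal(payload: bytes) -> float:
    """Parse a wireless-scale message and return the quantity in grams."""
    message = json.loads(payload.decode("utf-8"))
    return float(message["grams"])

print(quantity_from_scale_signal(b'{"grams": 152.4}'))  # -> 152.4
```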
49. The computer-implemented method of claim 42, wherein tracking the identity and the quantity of the food that the wearer is preparing to ingest further comprises:
a. determining, by at least one processor, a date and a time that the wearer ingests the food;
b. calculating, by at least one processor, a number of calories associated with the food;
c. determining, by at least one processor, a nutritional value associated with the food; and
d. storing the date and time that the wearer ingests the food, the number of calories associated with the food and the nutritional value associated with the food in memory operatively coupled to the computerized eyewear.
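An illustrative sketch of the tracking steps of claim 49; the FoodLogEntry schema and the in-memory list are assumptions standing in for memory coupled to the eyewear:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FoodLogEntry:
    food: str
    calories: int                 # step b: calories for this serving
    nutrition: dict               # step c: e.g. {"protein_g": 5.0}
    ingested_at: datetime = field(default_factory=datetime.now)  # step a

food_log = []                     # step d: persistent store (simplified)

food_log.append(FoodLogEntry("apple", 95, {"fiber_g": 4.4, "sugar_g": 19.0}))
print(food_log[0].ingested_at, food_log[0].calories)
```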
50. The computer-implemented method of claim 42, wherein the predetermined weight loss plan further comprises one or more values selected from a group consisting of:
a. a total daily calorie intake value;
b. a total daily fat intake value;
c. a total daily protein intake value;
d. a total daily carbohydrate intake value;
e. one or more prohibited foods;
f. one or more prohibited nutrients;
g. a weekly weight loss value;
h. a total daily calorie burned value;
i. a total daily amount of sleep value; and
j. one or more daily physical activities.
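One possible, purely illustrative in-memory encoding of the claim-50 plan values; the field names, units, and defaults are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class WeightLossPlan:
    daily_calorie_intake: int = 2000      # (a)
    daily_fat_g: int = 65                 # (b)
    daily_protein_g: int = 90             # (c)
    daily_carbohydrate_g: int = 250       # (d)
    prohibited_foods: set = field(default_factory=set)       # (e)
    prohibited_nutrients: set = field(default_factory=set)   # (f)
    weekly_weight_loss_kg: float = 0.5    # (g)
    daily_calories_burned: int = 500      # (h)
    daily_sleep_hours: float = 8.0        # (i)
    daily_activities: list = field(default_factory=list)     # (j)
```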
51. The computer-implemented method of claim 50, further comprising:
a. at least partially in response to comparing the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan, calculating, by at least one processor, one or more recommendations to assist the wearer in complying with the predetermined weight loss plan; and
b. notifying, by at least one processor, the wearer of the one or more recommendations.
52. The computer-implemented method of claim 51, wherein calculating one or more recommendations to assist the wearer in complying with the predetermined weight loss plan further comprises:
a. determining, by at least one processor, one or more physical activity recommendations;
b. accessing, by at least one processor, one or more daily caloric intake recommendations;
c. comparing, by at least one processor, the one or more physical activity recommendations to the one or more daily caloric intake recommendations;
d. at least partially in response to comparing the one or more physical activity recommendations to the one or more daily caloric intake recommendations, calculating, by at least one processor, a recommended combination of at least one physical activity recommendation and at least one daily caloric intake recommendation to assist the wearer in complying with the predetermined weight loss plan; and
e. notifying, by at least one processor, the wearer of the recommended combination of the at least one physical activity recommendation and the at least one daily caloric intake recommendation.
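A hedged sketch of claim 52's combination logic: pair an activity recommendation with a caloric-intake recommendation so the combined daily deficit meets a target. The option tables and the roughly 7700 kcal-per-kilogram conversion are assumptions of this sketch:

```python
ACTIVITY_OPTIONS = {"30 min walk": 150, "30 min cycling": 300}  # step a (kcal)
INTAKE_OPTIONS = [2000, 1800, 1600]                             # step b (kcal/day)

def recommend(maintenance_kcal, weekly_loss_kg):
    """Steps c-d: pick the first (activity, intake) pair meeting the deficit."""
    needed_daily_deficit = weekly_loss_kg * 7700 / 7
    for activity, burned in ACTIVITY_OPTIONS.items():
        for intake in INTAKE_OPTIONS:
            if maintenance_kcal - intake + burned >= needed_daily_deficit:
                return activity, intake
    return None

print(recommend(maintenance_kcal=2200, weekly_loss_kg=0.5))  # step e: notify
```

Listing the intake options from least to most restrictive means the sketch returns the least disruptive combination that still meets the target.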
53. The computer-implemented method of claim 43, wherein the at least one signal generated by one or more sensors is an audio signal received by the microphone, the method further comprising:
a. converting, by at least one processor, the audio signal to text using speech recognition techniques;
b. at least partially based on the converted text, searching, by at least one processor, a database of food items using the text; and
c. matching, by at least one processor, the converted text to at least one food item in the database of food items to identify the food item the wearer is preparing to ingest.
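An illustrative sketch of steps b and c of claim 53, with speech-to-text (step a) abstracted away and Python's difflib standing in for a production matcher:

```python
import difflib

FOOD_DB = ["grilled chicken salad", "cheeseburger", "oatmeal", "caesar salad"]

def match_food(transcript: str):
    """Return the closest food item for the converted text, if any."""
    hits = difflib.get_close_matches(transcript.lower(), FOOD_DB, n=1, cutoff=0.6)
    return hits[0] if hits else None

print(match_food("ceasar salad"))  # -> "caesar salad" despite the misspelling
```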
54. The computer-implemented method of claim 43, further comprising:
a. receiving, by at least one processor, a signal from the global positioning system unit;
b. determining, by at least one processor, the location of the wearer at least partially based on the received signal; and
c. using, by at least one processor, the location of the wearer in determining the identity of the food that the wearer is preparing to ingest.
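A sketch of claim 54's location step, assuming hypothetical venue coordinates and menus and a 50-meter search radius:

```python
import math

VENUES = {
    (38.5816, -121.4944): ["cheeseburger", "fries"],    # hypothetical diner
    (38.5820, -121.4930): ["sushi roll", "miso soup"],  # hypothetical sushi bar
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def candidate_foods(lat, lon, radius_m=50.0):
    """Step c: restrict food candidates to the menus of nearby venues."""
    for (vlat, vlon), menu in VENUES.items():
        if haversine_m(lat, lon, vlat, vlon) <= radius_m:
            return menu
    return []

print(candidate_foods(38.5816, -121.4945))  # near the diner -> its menu
```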
55. A system for monitoring compliance with a weight loss plan by a wearer of computerized eyewear, comprising:
a. one or more processors;
b. memory operatively coupled to the one or more processors;
wherein the one or more processors is configured to:
i. receive at least one signal generated by one or more sensors operatively coupled to the computerized eyewear;
ii. at least partially in response to receiving the at least one signal from the one or more sensors, determine an identity of a food that the wearer is preparing to ingest;
iii. determine a quantity of the food that the wearer is preparing to ingest;
iv. track the identity and the quantity of the food that the wearer is preparing to ingest;
v. identify the date and time that the food is ingested by the wearer;
vi. compare the identity and the quantity of the food that the wearer ingests to a predetermined weight loss plan; and
vii. at least partially in response to comparing the identity and the quantity of the food to the predetermined weight loss plan, at least one of:
(a) update a database to indicate whether the wearer is compliant with the weight loss plan;
(b) calculate one or more recommendations for the wearer associated with the wearer's compliance with the weight loss plan;
(c) notify the wearer of the wearer's compliance with the weight loss plan;
(d) notify a third party of the wearer's compliance with the weight loss plan;
(e) send an alert to the wearer; or
(f) send an alert to the wearer regarding the calculated one or more recommendations.
56. The system of claim 55, wherein the one or more sensors comprises at least one sensor selected from a group consisting of:
a. a forward facing camera;
b. a global positioning system unit;
c. an olfactory sensor; and
d. a microphone.
57. The system of claim 55, wherein the one or more processors is further configured to:
a. receive a signal from a global positioning system unit;
b. determine the location of the wearer at least partially based on the received signal; and
c. use the location of the wearer to determine the identity of the food that the wearer is preparing to ingest.
58. The system of claim 55, wherein the one or more processors is further configured to:
a. capture one or more images by a forward-facing camera of medicine ingested by the wearer;
b. detect one or more pills in the one or more images;
c. compare the one or more detected pills found in the one or more images to one or more known images of pills stored in memory;
d. at least partially in response to comparing the one or more detected pills to one or more known images of pills stored in memory, determine data selected from a group consisting of one or more:
i. types of medicine taken by the user;
ii. times the medicine is taken by the user; and
iii. doses of medicine taken by the user;
e. determine, at least partially based on the data, if the medicine ingested by the wearer causes a side effect that can affect the wearer's ability to lose weight; and
f. notify either the wearer or a third party of the side effect.
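A deliberately simplified, hypothetical sketch of claim 58; a real system would need a trained pill detector, so each pill image is reduced here to an assumed feature tuple:

```python
KNOWN_PILLS = {
    "drug_a": {"features": (0.9, 0.1, 0.1), "weight_side_effect": True},
    "drug_b": {"features": (0.2, 0.8, 0.3), "weight_side_effect": False},
}

def match_pill(features, tolerance=0.15):
    """Steps c-d: compare detected features to the stored reference images."""
    for name, ref in KNOWN_PILLS.items():
        if all(abs(a - b) <= tolerance for a, b in zip(features, ref["features"])):
            return name
    return None

def check_side_effects(features):
    """Steps e-f: notify if the matched medicine can affect weight loss."""
    name = match_pill(features)
    if name and KNOWN_PILLS[name]["weight_side_effect"]:
        print(f"{name}: may affect the wearer's ability to lose weight")
    return name

check_side_effects((0.88, 0.12, 0.05))  # matches drug_a -> notification
```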
59. The system of claim 58, wherein the one or more processors is further configured to:
a. detect, by a processor, a level of one or more medicines in the wearer's bloodstream based on a signal received from a blood testing sensor operatively coupled to the system;
b. compare the level of the one or more medicines in the wearer's bloodstream to a predefined level for each of the one or more medicines stored in a database for the wearer; and
c. at least partially in response to comparing the level of the one or more medicines in the wearer's bloodstream, notify one of the wearer or the third party when the level of the one or more medicines is below the predefined level for each of the one or more medicines.
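A minimal sketch of claim 59's comparison and notification steps; the medicine names, units, and thresholds are illustrative assumptions:

```python
PREDEFINED_LEVELS = {"medicine_a": 1.0, "medicine_b": 30.0}  # per-wearer database

def check_blood_levels(measured):
    """measured: medicine -> level reported by the blood-testing sensor."""
    for medicine, level in measured.items():
        threshold = PREDEFINED_LEVELS.get(medicine)
        if threshold is not None and level < threshold:  # step b comparison
            print(f"ALERT: {medicine} level {level} below {threshold}")  # step c

check_blood_levels({"medicine_a": 0.4, "medicine_b": 42.0})
```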
60. The system of claim 59, wherein the one or more medicines is selected from a group consisting of:
a. a prescription drug;
b. a nutritional supplement;
c. a prescription weight loss drug; and
d. an over-the-counter weight loss drug.
61. A computer-implemented method of monitoring compliance with a weight loss plan by a wearer of computerized eyewear, the method comprising:
a. receiving, by at least one processor, at least one signal generated by one or more sensors operatively coupled to the computerized eyewear;
b. at least partially in response to receiving the at least one signal from the one or more sensors, determining, by at least one processor, an identity of a food that the wearer is preparing to ingest;
c. determining, by at least one processor, a quantity of the food that the wearer is preparing to ingest;
d. comparing, by at least one processor, the identity and the quantity of the food that the wearer is preparing to ingest to a predetermined weight loss plan;
e. at least partially in response to comparing the identity and the quantity of the food that the wearer is preparing to ingest to the predetermined weight loss plan, calculating, by at least one processor, one or more recommendations to assist the wearer in complying with the predetermined weight loss plan; and
f. notifying, by at least one processor, the wearer of the one or more recommendations.
62. The computer-implemented method of claim 61, wherein the one or more sensors comprises at least one sensor selected from a group consisting of:
a. a forward facing camera;
b. a global positioning system unit;
c. an olfactory sensor; and
d. a microphone.
63. The computer-implemented method of claim 62, wherein determining the identity of the food further comprises:
a. capturing, by at least one processor, an image by the forward facing camera of a packaging associated with the food;
b. at least partially in response to capturing the image, detecting, by at least one processor, a barcode contained on the packaging; and
c. searching, by at least one processor, a database of barcodes to determine the nutritional value associated with the food.
64. The computer-implemented method of claim 63, wherein determining a quantity of the food further comprises:
a. detecting, by at least one processor, in the captured image a vessel used by the wearer to measure the quantity of the food that the wearer is preparing to ingest; and
b. detecting, by at least one processor, at least one marking on the vessel that indicates the quantity of the food placed in the vessel.
65. The computer-implemented method of claim 61, wherein the predetermined weight loss plan further comprises one or more values selected from a group consisting of:
a. a total daily calorie intake value;
b. a total daily fat intake value;
c. a total daily protein intake value;
d. a total daily carbohydrate intake value;
e. one or more prohibited foods;
f. one or more prohibited nutrients;
g. a weekly weight loss value;
h. a total daily amount of sleep value;
i. a total daily calorie burned value; and
j. one or more daily physical activities.
66. The computer-implemented method of claim 65, wherein calculating one or more recommendations to assist the wearer in complying with the predetermined weight loss plan further comprises:
a. determining, by at least one processor, one or more physical activity recommendations;
b. determining, by at least one processor, one or more daily caloric intake recommendations;
c. at least partially in response to determining the one or more physical activity recommendations and the one or more daily caloric intake recommendations, calculating, by at least one processor, a recommended combination of at least one physical activity recommendation and at least one daily caloric intake recommendation; and
d. notifying, by at least one processor, the wearer of the recommended combination of the at least one physical activity recommendation and the at least one daily caloric intake recommendation to assist the wearer in complying with the predetermined weight loss plan.
67. The computer-implemented method of claim 65, wherein calculating one or more recommendations to assist the wearer in complying with the predetermined weight loss plan further comprises:
a. determining, by at least one processor, one or more total daily amounts of sleep recommendations;
b. determining, by at least one processor, one or more daily caloric intake recommendations;
c. comparing, by at least one processor, the one or more total daily amounts of sleep recommendations to the one or more daily caloric intake recommendations;
d. at least partially in response to comparing the one or more total daily amounts of sleep recommendations to the one or more daily caloric intake recommendations, calculating, by at least one processor, a recommended combination of at least one total daily amount of sleep recommendation and at least one daily caloric intake recommendation; and
e. notifying, by at least one processor, the wearer of the recommended combination of the at least one total daily amount of sleep recommendation and the at least one daily caloric intake recommendation to assist the wearer in complying with the predetermined weight loss plan.
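A hedged sketch of claim 67's sleep-and-intake pairing; the premise that a sleep shortfall warrants a proportionally stricter calorie target is an illustrative assumption of this sketch, not a clinical rule:

```python
def recommend_sleep_and_intake(current_sleep_h, base_intake=2000, kcal_per_hour=150):
    """Steps a-d: compare current sleep to a target and tighten the calorie
    budget in proportion to the shortfall."""
    sleep_target_h = 8.0                               # assumed guideline
    shortfall = max(0.0, sleep_target_h - current_sleep_h)
    intake = base_intake - int(shortfall * kcal_per_hour)
    return sleep_target_h, intake

print(recommend_sleep_and_intake(6.0))  # -> (8.0, 1700); step e: notify wearer
```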
CA2953856A 2014-09-05 2015-09-04 System for monitoring health related information for individuals Abandoned CA2953856A1 (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US201462046406P 2014-09-05 2014-09-05
US62/046,406 2014-09-05
US14/550,406 US10448867B2 (en) 2014-09-05 2014-11-21 Wearable gait monitoring apparatus, systems, and related methods
US14/550,406 2014-11-21
US14/562,454 2014-12-05
US14/562,454 US9795324B2 (en) 2014-09-05 2014-12-05 System for monitoring individuals as they age in place
US14/610,589 US20160071423A1 (en) 2014-09-05 2015-01-30 Systems and method for monitoring an individual's compliance with a weight loss plan
US14/610,589 2015-01-30
PCT/US2015/048612 WO2016037091A1 (en) 2014-09-05 2015-09-04 System for monitoring health related information for individuals

Publications (1)

Publication Number Publication Date
CA2953856A1 true CA2953856A1 (en) 2016-03-10

Family

ID=55436366

Family Applications (3)

Application Number Title Priority Date Filing Date
CA2960429A Abandoned CA2960429A1 (en) 2014-09-05 2015-09-04 Computerized replacement temple for standard eyewear
CA2960425A Abandoned CA2960425A1 (en) 2014-09-05 2015-09-04 Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual
CA2953856A Abandoned CA2953856A1 (en) 2014-09-05 2015-09-04 System for monitoring health related information for individuals

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CA2960429A Abandoned CA2960429A1 (en) 2014-09-05 2015-09-04 Computerized replacement temple for standard eyewear
CA2960425A Abandoned CA2960425A1 (en) 2014-09-05 2015-09-04 Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual

Country Status (4)

Country Link
US (12) US10448867B2 (en)
EP (3) EP3189367B1 (en)
CA (3) CA2960429A1 (en)
WO (3) WO2016037091A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4027349A4 (en) * 2019-11-01 2022-11-09 TERUMO Kabushiki Kaisha Image management system, wearable device, image management method, and image management program

Families Citing this family (129)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9900669B2 (en) 2004-11-02 2018-02-20 Pierre Touma Wireless motion sensor system and method
US10831982B2 (en) * 2009-10-14 2020-11-10 Iplcontent, Llc Hands-free presenting device
US9968297B2 (en) * 2012-06-14 2018-05-15 Medibotics Llc EEG glasses (electroencephalographic eyewear)
US11612786B2 (en) * 2012-08-31 2023-03-28 Blue Goji Llc System and method for targeted neurological therapy using brainwave entrainment with passive treatment
US20140142442A1 (en) * 2012-11-19 2014-05-22 Judy Sibille SNOW Audio Feedback for Medical Conditions
US9400549B2 (en) 2013-03-08 2016-07-26 Chi Fai Ho Method and system for a new-era electronic book
CA2910699A1 (en) * 2013-04-30 2014-11-06 Chester WHITE Body impact bracing apparatus
US11918375B2 (en) 2014-09-05 2024-03-05 Beijing Zitiao Network Technology Co., Ltd. Wearable environmental pollution monitor computer apparatus, systems, and related methods
US10617342B2 (en) 2014-09-05 2020-04-14 Vision Service Plan Systems, apparatus, and methods for using a wearable device to monitor operator alertness
US10448867B2 (en) * 2014-09-05 2019-10-22 Vision Service Plan Wearable gait monitoring apparatus, systems, and related methods
FR3028980B1 (en) * 2014-11-20 2017-01-13 Oberthur Technologies METHOD AND DEVICE FOR AUTHENTICATING A USER
US10215568B2 (en) 2015-01-30 2019-02-26 Vision Service Plan Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete
KR20160101497A (en) * 2015-02-17 2016-08-25 삼성전자주식회사 Wearable device and method for operating thereof
US9886633B2 (en) * 2015-02-23 2018-02-06 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
KR102634148B1 (en) 2015-03-16 2024-02-05 매직 립, 인코포레이티드 Methods and system for diagnosing and treating health ailments
US10387173B1 (en) 2015-03-27 2019-08-20 Intuit Inc. Method and system for using emotional state data to tailor the user experience of an interactive software system
US10799118B2 (en) * 2015-03-27 2020-10-13 Intel Corporation Motion tracking using electronic devices
US10169827B1 (en) 2015-03-27 2019-01-01 Intuit Inc. Method and system for adapting a user experience provided through an interactive software system to the content being delivered and the predicted emotional impact on the user of that content
US9930102B1 (en) 2015-03-27 2018-03-27 Intuit Inc. Method and system for using emotional state data to tailor the user experience of an interactive software system
WO2016183020A1 (en) 2015-05-11 2016-11-17 Magic Leap, Inc. Devices, methods and systems for biometric user recognition utilizing neural networks
US10154129B2 (en) * 2015-05-15 2018-12-11 Polar Electro Oy Wearable electronic apparatus
US20160350138A1 (en) * 2015-05-31 2016-12-01 Roya Caroline SALAS Biofeedback system
US10791938B2 (en) 2015-06-14 2020-10-06 Facense Ltd. Smartglasses for detecting congestive heart failure
US11064892B2 (en) 2015-06-14 2021-07-20 Facense Ltd. Detecting a transient ischemic attack using photoplethysmogram signals
US11154203B2 (en) 2015-06-14 2021-10-26 Facense Ltd. Detecting fever from images and temperatures
US11103140B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Monitoring blood sugar level with a comfortable head-mounted device
US10799122B2 (en) 2015-06-14 2020-10-13 Facense Ltd. Utilizing correlations between PPG signals and iPPG signals to improve detection of physiological responses
US11103139B2 (en) 2015-06-14 2021-08-31 Facense Ltd. Detecting fever from video images and a baseline
US20170225033A1 (en) * 2015-06-23 2017-08-10 Ipcomm Llc Method and Apparatus for Analysis of Gait and to Provide Haptic and Visual Corrective Feedback
US10332122B1 (en) 2015-07-27 2019-06-25 Intuit Inc. Obtaining and analyzing user physiological data to determine whether a user would benefit from user support
DK3328276T3 (en) * 2015-07-27 2023-09-18 Massachusetts Inst Technology APPARATUS IN CONNECTION WITH MONITORING METABOLISM
US20170061823A1 (en) * 2015-09-02 2017-03-02 Hello Doctor Ltd. System for tracking and monitoring personal medical data and encouraging to follow personalized condition-based profile and method thereof
US11272864B2 (en) 2015-09-14 2022-03-15 Health Care Originals, Inc. Respiratory disease monitoring wearable apparatus
EP3779996A1 (en) 2015-10-01 2021-02-17 Dnanudge Limited Method, apparatus and system for securely transferring genetic information
US10861594B2 (en) 2015-10-01 2020-12-08 Dnanudge Limited Product recommendation system and method
EP3363360B1 (en) * 2015-10-13 2021-03-24 Alps Alpine Co., Ltd. Walking measurement device, walking measurement method and corresponding computer program
US10014967B2 (en) * 2015-11-23 2018-07-03 Huami Inc. System and method for authenticating a broadcast device using facial recognition
US10096383B2 (en) * 2015-11-24 2018-10-09 International Business Machines Corporation Performing a health analysis using a smart floor mat
US10105095B2 (en) * 2015-11-30 2018-10-23 Oura Health Oy Method and system for defining balance between physical activity and rest
EP3178379A1 (en) * 2015-12-09 2017-06-14 Rythm Method and device for bioelectric physiological signal acquisition and processing
US10610146B1 (en) * 2015-12-21 2020-04-07 Dp Technologies, Inc. Utilizing wearable devices in an internet of things environment
US11094418B2 (en) * 2015-12-31 2021-08-17 Nokia Technologies Oy Optimized biological measurement
WO2017163227A1 (en) * 2016-03-25 2017-09-28 Randolph Andrae User authentication using biometric information
US10251597B2 (en) 2016-04-21 2019-04-09 Viavi Solutions Inc. Health tracking device
KR102427034B1 (en) * 2016-04-22 2022-07-29 바이압틱스 인코포레이티드 Determination of absolute and relative tissue oxygen saturation
US10955269B2 (en) 2016-05-20 2021-03-23 Health Care Originals, Inc. Wearable apparatus
US20170345274A1 (en) * 2016-05-27 2017-11-30 General Scientific Corporation Neck posture recording and warning device
US10002515B2 (en) 2016-06-01 2018-06-19 Tile, Inc. User intervention based on tracking device location
WO2018030734A1 (en) * 2016-08-09 2018-02-15 주식회사 비플렉스 3d simulation method and apparatus
WO2018027253A1 (en) * 2016-08-11 2018-02-15 Bloomfield Lochlan John Health management system and method
WO2018050746A1 (en) 2016-09-14 2018-03-22 F. Hoffmann-La Roche Ag Digital biomarkers for progressing ms
CN106510719B (en) * 2016-09-30 2023-11-28 歌尔股份有限公司 User gesture monitoring method and wearable device
US10539549B2 (en) * 2016-10-13 2020-01-21 Worcester Polytechnic Institute Mobile blood alcohol content and impairment sensing device
CN106377271A (en) * 2016-10-20 2017-02-08 中国矿业大学 Wearable method and wearable device for monitoring and regulating physiology and psychology
US20190244714A1 (en) * 2016-10-20 2019-08-08 Dapapult, Inc. Sickness prediction application system
US10029068B2 (en) 2016-11-01 2018-07-24 Polyvagal Science LLC Methods and systems for reducing sound sensitivities and improving auditory processing, behavioral state regulation and social engagement behaviors
US11062175B2 (en) * 2016-11-22 2021-07-13 Japan Aerospace Exploration Agency System, method, and program for estimating reduced attention state, and storage medium storing the same program
GB201620638D0 (en) 2016-12-05 2017-01-18 Equi+Poise Ltd A gait analysis system
KR20180089803A (en) 2017-02-01 2018-08-09 삼성전자주식회사 Electronic apparatus and method for processing authentication
US11622716B2 (en) * 2017-02-13 2023-04-11 Health Care Originals, Inc. Wearable physiological monitoring systems and methods
US10973446B2 (en) * 2017-02-13 2021-04-13 David Schie Device to extract physiological information and method therefor
EP3579751A1 (en) 2017-02-13 2019-12-18 Starkey Laboratories, Inc. Fall prediction system and method of using same
JP6894252B2 (en) * 2017-02-16 2021-06-30 日本光電工業株式会社 Sensor device and watching device
US20180241973A1 (en) * 2017-02-21 2018-08-23 Janet Newell Video and audio recording system and method
SE541712C2 (en) * 2017-02-22 2019-12-03 Next Step Dynamics Ab Method and apparatus for health prediction
WO2018156992A1 (en) * 2017-02-23 2018-08-30 Miller Charles Robert Iii Device and system for user context-cortical sensing and determination
WO2018182159A1 (en) * 2017-03-28 2018-10-04 문명일 Smart glasses capable of processing virtual object
KR102651253B1 (en) * 2017-03-31 2024-03-27 삼성전자주식회사 An electronic device for determining user's emotions and a control method thereof
CN107049338A (en) * 2017-04-12 2017-08-18 河南工业大学 A kind of medical use mood detection means communicated based on computer
US11559252B2 (en) 2017-05-08 2023-01-24 Starkey Laboratories, Inc. Hearing assistance device incorporating virtual audio interface for therapy guidance
US11589807B2 (en) * 2017-05-11 2023-02-28 The Regents Of The University Of California Biosensor for monitoring eyedrop usage compliance
US10699247B2 (en) 2017-05-16 2020-06-30 Under Armour, Inc. Systems and methods for providing health task notifications
US10685585B2 (en) 2017-06-27 2020-06-16 International Business Machines Corporation Physical activity and dietary based services
US10534203B2 (en) * 2017-07-31 2020-01-14 Snap Inc. Near-field antenna for eyewear
US11373450B2 (en) * 2017-08-11 2022-06-28 Tectus Corporation Eye-mounted authentication system
US10832590B2 (en) * 2017-09-13 2020-11-10 At&T Intellectual Property I, L.P. Monitoring food intake
US11062572B1 (en) * 2017-09-20 2021-07-13 Amazon Technologies, Inc. Visual indicator for head-mounted device
US10073998B1 (en) * 2017-11-29 2018-09-11 The United States Of America As Represented By The Secretary Of The Navy Multifunction wearable object identified glasses for the visually handicapped
US20190167226A1 (en) * 2017-12-04 2019-06-06 International Business Machines Corporation Infant gastrointestinal monitor
US11092998B1 (en) 2018-01-12 2021-08-17 Snap Inc. Eyewear device with fingerprint sensor for user input
CN108089326B (en) * 2018-02-01 2023-12-26 北京七鑫易维信息技术有限公司 Device suitable for being used with glasses
US11183291B2 (en) 2018-02-12 2021-11-23 Zoe Limited Generating personalized nutritional recommendations using predicted values of biomarkers
US11295860B2 (en) 2018-02-12 2022-04-05 Zoe Limited Using at home measures to predict clinical state and improving the accuracy of at home measurements/predictions data associated with circadian rhythm and meal timing
WO2019157450A1 (en) * 2018-02-12 2019-08-15 Cornell University Methods and systems for concussion management using cold stimulus
US11183080B2 (en) 2018-02-12 2021-11-23 Zoe Limited Generating predicted values of biomarkers for scoring food
US11915151B2 (en) 2018-08-27 2024-02-27 Zoe Limited Accuracy of test data outside the clinic
US11348479B2 (en) 2018-05-23 2022-05-31 Zoe Limited Accuracy of measuring nutritional responses in a non-clinical setting
CA3091209C (en) * 2018-03-01 2021-08-31 Polyvagal Science LLC Systems and methods for modulating physiological state
CN108401129A (en) * 2018-03-22 2018-08-14 广东小天才科技有限公司 Video call method, device, terminal based on Wearable and storage medium
US11331003B2 (en) 2018-03-27 2022-05-17 Samsung Electronics Co., Ltd. Context-aware respiration rate determination using an electronic device
JP7019796B2 (en) * 2018-03-30 2022-02-15 株式会社日立製作所 Physical function independence support device and its method
US10424035B1 (en) * 2018-05-16 2019-09-24 Trungram Gyaltrul R. Sherpa Monitoring conditions associated with remote individuals over a data communication network and automatically notifying responsive to detecting customized emergency conditions
US10438479B1 (en) * 2018-05-21 2019-10-08 International Business Machines Corporation Safety enhancement for persons under the care of others
US11054638B2 (en) * 2018-06-13 2021-07-06 Reavire, Inc. Tracking pointing direction of device
US10121355B1 (en) * 2018-06-26 2018-11-06 Capital One Services, Llc Condition-responsive wearable device for sensing and indicating proximity of an article with a specific characteristic
US10922397B2 (en) 2018-07-24 2021-02-16 Dnanudge Limited Method and device for comparing personal biological data of two users
US10582897B2 (en) 2018-07-24 2020-03-10 Dnanudge Limited Method and device for comparing personal biological data of two users
US10722128B2 (en) 2018-08-01 2020-07-28 Vision Service Plan Heart rate detection system and method
US11380215B2 (en) * 2018-08-30 2022-07-05 Kyndryl, Inc. Reward-based ecosystem for tracking nutritional consumption
AU2019339190A1 (en) * 2018-09-12 2021-04-15 Dnanudge Limited Product recommendation system and method
WO2020061209A1 (en) * 2018-09-18 2020-03-26 Biointellisense, Inc. Validation, compliance, and/or intervention with ear device
CN109543546B (en) * 2018-10-26 2022-12-20 复旦大学 Gait age estimation method based on depth sequence distribution regression
WO2020124022A2 (en) 2018-12-15 2020-06-18 Starkey Laboratories, Inc. Hearing assistance system with enhanced fall detection features
US11638563B2 (en) 2018-12-27 2023-05-02 Starkey Laboratories, Inc. Predictive fall event management system and method of using same
DE112020000351A5 (en) * 2019-01-07 2021-10-28 Metralabs Gmbh Neue Technologien Und Systeme Method and system for recording a person's movement sequence
US11150788B2 (en) * 2019-03-14 2021-10-19 Ebay Inc. Augmented or virtual reality (AR/VR) companion device techniques
US10811140B2 (en) 2019-03-19 2020-10-20 Dnanudge Limited Secure set-up of genetic related user account
US10699806B1 (en) 2019-04-15 2020-06-30 Dnanudge Limited Monitoring system, wearable monitoring device and method
US11156856B2 (en) 2019-05-02 2021-10-26 Tom Nepola Eyewear with wearing status detector
US11007406B2 (en) * 2019-05-03 2021-05-18 Xperience Robotics, Inc. Wearable device systems and methods for guiding physical movements
US11663927B2 (en) * 2019-05-10 2023-05-30 Tso-Cheng Chien Food quantum tracking tools and methods related thereto
ES1231504Y (en) * 2019-05-14 2019-09-13 I4Life Innovacion Y Desarrollos S L Multifunctional Unlocking Device
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
US11223939B1 (en) * 2019-07-31 2022-01-11 United Services Automobile Association (Usaa) Environmental conditions monitoring system
US11310322B2 (en) 2019-11-21 2022-04-19 Blackberry Limited Method and system for pairing a chassis and container in an asset tracking system
DE102020102281A1 (en) * 2019-12-19 2021-06-24 USound GmbH Glasses with charging interface
GB2590802A (en) 2020-01-03 2021-07-07 Dnanudge Ltd Method and device for comparing personal biological data of two users
US11948672B2 (en) * 2020-02-27 2024-04-02 Todd Martin Mobile intelligent injury minimization system and method
US10991190B1 (en) 2020-07-20 2021-04-27 Abbott Laboratories Digital pass verification systems and methods
CN112150767B (en) * 2020-09-27 2021-11-23 河南城建学院 Fatigue driving monitoring system based on Internet of things and computer
KR102525485B1 (en) * 2020-12-02 2023-04-25 임지호 Self-authentication wearable device for the visually impaired
CN112562260B (en) * 2020-12-16 2022-08-09 浙江大华技术股份有限公司 Anti-lost method and device
DE202021100976U1 (en) 2021-02-26 2021-05-18 seiwo Technik GmbH System for recording the state of health of a person
DE102021104705A1 (en) 2021-02-26 2022-09-01 seiwo Technik GmbH Method and system for recording the state of health of a person
CN117561022A (en) * 2021-07-02 2024-02-13 研究三角协会 Systems, methods, and apparatus for detecting viral respiratory disease in pre-symptomatic and asymptomatic infected persons
CN113457108B (en) * 2021-07-07 2022-07-15 首都体育学院 Cognitive characterization-based exercise performance improving method and device
US20230028690A1 (en) * 2021-07-23 2023-01-26 Consumer Safety Technology Llc Method and system of deploying a wearable transdermal based vehicle ignition interlock
WO2023021509A1 (en) * 2021-08-17 2023-02-23 Brainwatch Tech Ltd. Methods, systems, and devices for determining a status of brain and/or nerve functions of a patient
US11806078B1 (en) 2022-05-01 2023-11-07 Globe Biomedical, Inc. Tear meniscus detection and evaluation system

Family Cites Families (253)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3505879A (en) 1968-07-31 1970-04-14 Ford Motor Co Electrical generator including components of an automotive vehicle mechanical speedometer
US3548663A (en) 1968-07-31 1970-12-22 Ford Motor Co Electrical generator including components of an automotive vehicle mechanical speedometer
US3972038A (en) 1975-03-28 1976-07-27 Nasa Accelerometer telemetry system
US4100401A (en) 1977-01-13 1978-07-11 Tutt Eugene F Calorie calculator-chronometer
DE2709412A1 (en) 1977-03-04 1978-09-07 Max Baermann Eddy current tachometer with temperature compensation
US4195642A (en) 1978-01-03 1980-04-01 Beehive International Wearable heart rate monitor
GB1593839A (en) 1978-05-26 1981-07-22 Pringle R D Performance testing device
US4434801A (en) 1980-04-30 1984-03-06 Biotechnology, Inc. Apparatus for testing physical condition of a self-propelled vehicle rider
US4407295A (en) 1980-10-16 1983-10-04 Dna Medical, Inc. Miniature physiological monitor with interchangeable sensors
US4855942A (en) 1987-10-28 1989-08-08 Elexis Corporation Pedometer and/or calorie measuring device and method
US4878749A (en) 1988-06-29 1989-11-07 Mcgee James E Protective eyewear with interchangeable decorative frames
US4919530A (en) 1989-01-25 1990-04-24 Hyman Roger L Eyeglass assembly
US5670872A (en) 1992-06-22 1997-09-23 U.S. Philips Corporation System and device with vertical and rotary wheel-velocity-measuring for determining vehicle displacement
US5497143A (en) 1993-06-24 1996-03-05 Casio Computer Co., Ltd. Electronic device for a vehicle
US5422816A (en) 1994-02-22 1995-06-06 Trimble Navigation Limited Portable personal navigation tracking system
US5452480A (en) 1994-04-15 1995-09-26 Electric Eyewear Systems, Inc. Ski goggles
US7386401B2 (en) 1994-11-21 2008-06-10 Phatrat Technology, Llc Helmet that reports impact information, and associated methods
US5585871A (en) 1995-05-26 1996-12-17 Linden; Harry Multi-function display apparatus
US5746501A (en) 1995-09-01 1998-05-05 Chien; Tseng Lu Portable object having a fastening band illuminated by a super thin lighting element
US6183425B1 (en) 1995-10-13 2001-02-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for monitoring of daily activity in terms of ground reaction forces
US5966680A (en) 1996-02-15 1999-10-12 Butnaru; Hanan Motion sickness/vertigo prevention device and method
EP0842635B1 (en) 1996-04-08 2003-09-24 Seiko Epson Corporation Motion prescription support device
US5976083A (en) 1997-07-30 1999-11-02 Living Systems, Inc. Portable aerobic fitness monitor for walking and running
US5891042A (en) 1997-09-09 1999-04-06 Acumen, Inc. Fitness monitoring device having an electronic pedometer and a wireless heart rate monitor
US6381482B1 (en) 1998-05-13 2002-04-30 Georgia Tech Research Corp. Fabric or garment with integrated flexible information infrastructure
US6013007A (en) 1998-03-26 2000-01-11 Liquid Spark, Llc Athlete's GPS-based performance monitor
US5931764A (en) 1998-06-24 1999-08-03 Viztec, Inc. Wearable device with flexible display
US7376238B1 (en) 1998-09-18 2008-05-20 Rivas Technologies International, Inc. Pulse rate, pressure and heart condition monitoring glasses
US6218958B1 (en) 1998-10-08 2001-04-17 International Business Machines Corporation Integrated touch-skin notification system for wearable computing devices
US6532298B1 (en) 1998-11-25 2003-03-11 Iridian Technologies, Inc. Portable authentication device and method using iris patterns
WO2000046581A1 (en) 1999-02-05 2000-08-10 Curtis Instruments, Inc. Shaft sensor for angular velocity, torque, power
WO2001028416A1 (en) 1999-09-24 2001-04-26 Healthetech, Inc. Physiological monitor and associated computation, display and communication unit
US6736759B1 (en) 1999-11-09 2004-05-18 Paragon Solutions, Llc Exercise monitoring system and methods
US6431705B1 (en) 1999-11-10 2002-08-13 Infoeye Eyewear heart rate monitor
US7156809B2 (en) 1999-12-17 2007-01-02 Q-Tec Systems Llc Method and apparatus for health and disease management combining patient data monitoring with wireless internet connectivity
US7454002B1 (en) 2000-01-03 2008-11-18 Sportbrain, Inc. Integrating personal data capturing functionality into a portable computing device and a wireless communication device
US6513532B2 (en) 2000-01-19 2003-02-04 Healthetech, Inc. Diet and activity-monitoring device
JP2001297318A (en) 2000-04-14 2001-10-26 Omron Corp Pedometer
DK1285409T3 (en) 2000-05-16 2005-08-22 Swisscom Mobile Ag Process of biometric identification and authentication
JP2001344352A (en) 2000-05-31 2001-12-14 Toshiba Corp Life assisting device, life assisting method and advertisement information providing method
US6325507B1 (en) 2000-06-02 2001-12-04 Oakley, Inc. Eyewear retention system extending across the top of a wearer's head
DE10043797A1 (en) 2000-09-06 2002-03-28 Daimler Chrysler Ag Integrated traffic monitoring system
US20050054942A1 (en) 2002-01-22 2005-03-10 Melker Richard J. System and method for therapeutic drug monitoring
EP1344092A2 (en) 2000-12-15 2003-09-17 Daniel Rosenfeld Location-based weather nowcast system and method
US20020151810A1 (en) 2001-04-16 2002-10-17 Acumen, Inc. Wrist-based fitness monitoring devices
US6769767B2 (en) 2001-04-30 2004-08-03 Qr Spex, Inc. Eyewear with exchangeable temples housing a transceiver forming ad hoc networks with other devices
JP2003033328A (en) 2001-07-19 2003-02-04 Nippon Seimitsu Sokki Kk Heart rate monitor and method for measuring heart rate
JP2003242584A (en) 2002-02-13 2003-08-29 Seiko Instruments Inc Wearable electronic equipment system and wearable electronic equipment
IL164685A0 (en) 2002-04-22 2005-12-18 Marcio Marc Aurelio Martins Ab Apparatus and method for measuring biologic parameters
US9153074B2 (en) 2011-07-18 2015-10-06 Dylan T X Zhou Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command
GB2396421A (en) 2002-12-16 2004-06-23 Orange Personal Comm Serv Ltd Head-worn device measuring brain and facial muscle activity
CN103300821B (en) 2003-02-26 2016-11-09 马尔西奥·马克·奥雷利奥·马丁斯·阿布雷乌 The instrument of measurement biological parameter and method
US7922321B2 (en) 2003-10-09 2011-04-12 Ipventure, Inc. Eyewear supporting after-market electrical components
US7792552B2 (en) 2003-04-15 2010-09-07 Ipventure, Inc. Eyeglasses for wireless communications
US7581833B2 (en) 2003-10-09 2009-09-01 Ipventure, Inc. Eyewear supporting after-market electrical components
US7500747B2 (en) 2003-10-09 2009-03-10 Ipventure, Inc. Eyeglasses with electrical components
US7255437B2 (en) 2003-10-09 2007-08-14 Howell Thomas A Eyeglasses with activity monitoring
US7192136B2 (en) 2003-04-15 2007-03-20 Howell Thomas A Tethered electrical components for eyeglasses
US8465151B2 (en) 2003-04-15 2013-06-18 Ipventure, Inc. Eyewear with multi-part temple for supporting one or more electrical components
US7500746B1 (en) 2004-04-15 2009-03-10 Ip Venture, Inc. Eyewear with radiation detection system
US7806525B2 (en) 2003-10-09 2010-10-05 Ipventure, Inc. Eyeglasses having a camera
US8109629B2 (en) 2003-10-09 2012-02-07 Ipventure, Inc. Eyewear supporting electrical components and apparatus therefor
US7380936B2 (en) 2003-10-09 2008-06-03 Ipventure, Inc. Eyeglasses with a clock or other electrical component
US20050033200A1 (en) 2003-08-05 2005-02-10 Soehren Wayne A. Human motion identification and measurement system and method
US7059717B2 (en) 2003-08-11 2006-06-13 Bloch Nigel K Eyeglasses with interchangable temple-members
US7677723B2 (en) 2003-10-09 2010-03-16 Ipventure, Inc. Eyeglasses with a heart rate monitor
US10310296B2 (en) 2003-10-09 2019-06-04 Ingeniospec, Llc Eyewear with printed circuit board
US7438410B1 (en) 2003-10-09 2008-10-21 Ip Venture, Inc. Tethered electrical components for eyeglasses
FR2860700B1 (en) 2003-10-10 2005-12-09 Commissariat Energie Atomique CROWN CONTROL DEVICE
CN102670163B (en) 2004-04-01 2016-04-13 威廉·C·托奇 The system and method for controlling calculation device
US8337013B2 (en) 2004-07-28 2012-12-25 Ipventure, Inc. Eyeglasses with RFID tags or with a strap
US8915588B2 (en) 2004-11-02 2014-12-23 E-Vision Smart Optics, Inc. Eyewear including a heads up display
US7793361B2 (en) 2004-11-12 2010-09-14 Nike, Inc. Article of apparel incorporating a separable electronic device
US20060115130A1 (en) 2004-11-29 2006-06-01 Douglas Kozlay Eyewear with biometrics to protect displayed data
US7457434B2 (en) 2005-04-04 2008-11-25 Massachusetts Eye & Ear Infirmary Adaptively focusing extra-ocular vision prostheses
US7400257B2 (en) 2005-04-06 2008-07-15 Rivas Victor A Vital signals and glucose monitoring personal wireless system
EP1877981A4 (en) * 2005-05-02 2009-12-16 Univ Virginia Systems, devices, and methods for interpreting movement
US8979295B2 (en) 2005-05-17 2015-03-17 Michael Waters Rechargeable lighted glasses
US20070052672A1 (en) 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
US20070112287A1 (en) 2005-09-13 2007-05-17 Fancourt Craig L System and method for detecting deviations in nominal gait patterns
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
US9658473B2 (en) 2005-10-07 2017-05-23 Percept Technologies Inc Enhanced optical and perceptual digital eyewear
US7648463B1 (en) 2005-12-15 2010-01-19 Impact Sports Technologies, Inc. Monitoring device, method and system
GB0602127D0 (en) 2006-02-02 2006-03-15 Imp Innovations Ltd Gait analysis
US8188868B2 (en) 2006-04-20 2012-05-29 Nike, Inc. Systems for activating and/or authenticating electronic devices for operation with apparel
US7558622B2 (en) 2006-05-24 2009-07-07 Bao Tran Mesh network stroke monitoring appliance
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US7543934B2 (en) 2006-09-20 2009-06-09 Ipventures, Inc. Eyeglasses with activity monitoring and acoustic dampening
WO2008073806A1 (en) 2006-12-08 2008-06-19 Sabic Innovative Plastics Ip B.V. Active transdermal drug delivery system
US8157730B2 (en) * 2006-12-19 2012-04-17 Valencell, Inc. Physiological and environmental monitoring systems and methods
WO2008089184A2 (en) 2007-01-15 2008-07-24 Deka Products Limited Partnership Device and method for food management
JP2008198028A (en) 2007-02-14 2008-08-28 Sony Corp Wearable device, authentication method and program
EP2149068B1 (en) 2007-04-23 2021-06-09 Huawei Technologies Co., Ltd. Eyewear having human activity monitoring device
US8944590B2 (en) 2010-07-02 2015-02-03 Mitsui Chemicals, Inc. Electronic spectacle frames
WO2008143738A1 (en) 2007-05-18 2008-11-27 Ultimate Balance, Inc. Newtonian physical activity monitor
US7884727B2 (en) 2007-05-24 2011-02-08 Bao Tran Wireless occupancy and day-light sensing
EP2165234A1 (en) 2007-06-07 2010-03-24 Panagiotis Pavlopoulos An eyewear comprising at least one display device
US9254100B2 (en) 2007-09-12 2016-02-09 Cardiac Pacemakers, Inc. Logging daily average metabolic activity using a motion sensor
EP2197363B1 (en) 2007-09-21 2016-11-02 Covidien LP Surgical device
US8448846B2 (en) 2007-11-18 2013-05-28 Intel-Ge Care Innovations Llc Medication recording device
US8202148B2 (en) 2007-12-03 2012-06-19 Julius Young Machine and method for caddying and golf instruction
US20090195747A1 (en) 2008-02-04 2009-08-06 Insua Luisa M Interchangeable eyeglass temples
US20090227853A1 (en) 2008-03-03 2009-09-10 Ravindra Wijesiriwardana Wearable optical pulse plethysmography sensors or pulse oximetry sensors based wearable heart rate monitoring systems
US20120142443A1 (en) 2008-03-17 2012-06-07 Chris Savarese Golf club apparatuses and methods
CN101566874A (en) 2008-04-24 2009-10-28 鸿富锦精密工业(深圳)有限公司 Control device and electronic equipment using same
RU2011106031A (en) 2008-07-18 2012-08-27 Опталерт Пти Лтд (Au) SENSITIVE DEVICE FOR ACTIVE STATE
US8011242B2 (en) 2008-07-29 2011-09-06 Garmin Switzerland Gmbh System and device for measuring and analyzing forces applied by a cyclist on a pedal of a bicycle
US20100042430A1 (en) 2008-08-12 2010-02-18 Irody Inc System and method for collecting and authenticating medication consumption
WO2010027725A1 (en) 2008-08-25 2010-03-11 Tri-Specs, Inc. Fashion eyewear frame that houses circuitry to effect wireless audio communication while providing extraneous background noise cancellation capability
US20100136508A1 (en) 2008-10-23 2010-06-03 Damir Zekhtser Meal Plan Management
WO2010062481A1 (en) 2008-11-02 2010-06-03 David Chaum Near to eye display system and appliance
US8494507B1 (en) 2009-02-16 2013-07-23 Handhold Adaptive, LLC Adaptive, portable, multi-sensory aid for the disabled
US20110054359A1 (en) 2009-02-20 2011-03-03 The Regents of the University of Colorado , a body corporate Footwear-based body weight monitor and postural allocation, physical activity classification, and energy expenditure calculator
EP3357419A1 (en) 2009-02-25 2018-08-08 Valencell, Inc. Light-guiding devices and monitoring devices incorporating same
US20130024211A1 (en) 2009-04-09 2013-01-24 Access Mobility, Inc. Active learning and advanced relationship marketing and health interventions
US20100280336A1 (en) 2009-04-30 2010-11-04 Medtronic, Inc. Anxiety disorder monitoring
US8081082B2 (en) 2009-05-27 2011-12-20 International Business Machines Corporation Monitoring patterns of motion
US20100308999A1 (en) 2009-06-05 2010-12-09 Chornenky Todd E Security and monitoring apparatus
US8253561B2 (en) 2009-06-10 2012-08-28 Betty L. Bowers Medication management apparatus and system
US20100332571A1 (en) 2009-06-30 2010-12-30 Jennifer Healey Device augmented food identification
US10748447B2 (en) 2013-05-24 2020-08-18 Lincoln Global, Inc. Systems and methods providing a computerized eyewear device to aid in welding
US20130009907A1 (en) 2009-07-31 2013-01-10 Rosenberg Ilya D Magnetic Stylus
EP2484281A4 (en) * 2009-09-30 2015-05-06 Mitsubishi Chem Corp Body movement signal information processing method, information processing system and information processing device
US8303311B2 (en) 2009-09-30 2012-11-06 Forest Carl A Sport personal coach system
JP5504810B2 (en) 2009-10-06 2014-05-28 オムロンヘルスケア株式会社 Walking posture determination device, control program, and control method
US8605165B2 (en) 2010-10-06 2013-12-10 Ai Cure Technologies Llc Apparatus and method for assisting monitoring of medication adherence
US8290558B1 (en) 2009-11-23 2012-10-16 Vioptix, Inc. Tissue oximeter intraoperative sensor
FR2953284A1 (en) 2009-12-02 2011-06-03 Movea Sa SYSTEM AND METHOD FOR DRIVER ASSISTANCE OF BIOMECHANIC DRIVE VEHICLE COMPRISING AT LEAST ONE WHEEL
US8634701B2 (en) 2009-12-04 2014-01-21 Lg Electronics Inc. Digital data reproducing apparatus and corresponding method for reproducing content based on user characteristics
US20110169932A1 (en) 2010-01-06 2011-07-14 Clear View Technologies Inc. Wireless Facial Recognition
IT1397737B1 (en) 2010-01-18 2013-01-24 Giovanni Saggio EQUIPMENT AND METHOD OF DETECTION, TRAINING AND TRAINING OF MOTOR ACTIVITIES
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US20120206485A1 (en) 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor triggered user movement control of ar eyepiece facilities
US8964298B2 (en) 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
EP2539759A1 (en) 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20110224505A1 (en) 2010-03-12 2011-09-15 Rajendra Padma Sadhu User wearable portable communicative device
US9652965B2 (en) 2010-03-12 2017-05-16 Rajendra Padma Sadhu System and method for transmitting alerts and notifications to a user
US8568313B2 (en) 2010-03-12 2013-10-29 Rajendra Padma Sadhu User wearable portable communication device for collection and transmission of physiological data
WO2011143655A1 (en) 2010-05-14 2011-11-17 Advitech, Inc. System and method for prevention and control of the effects of spatial disorientation
WO2011158965A1 (en) 2010-06-17 2011-12-22 日本電気株式会社 Sensitivity evaluation system, sensitivity evaluation method, and program
US8531355B2 (en) 2010-07-23 2013-09-10 Gregory A. Maltz Unitized, vision-controlled, wireless eyeglass transceiver
US20120029367A1 (en) 2010-07-31 2012-02-02 Hobeika Hind Louis Heart rate waterproof measuring apparatus
US9247212B2 (en) 2010-08-26 2016-01-26 Blast Motion Inc. Intelligent motion capture element
US8594971B2 (en) 2010-09-22 2013-11-26 Invensense, Inc. Deduced reckoning navigation without a constraint relationship between orientation of a sensor platform and a direction of travel of an object
US9241635B2 (en) 2010-09-30 2016-01-26 Fitbit, Inc. Portable monitoring devices for processing applications and processing analysis of physiological conditions of a user associated with the portable monitoring device
US10216893B2 (en) 2010-09-30 2019-02-26 Fitbit, Inc. Multimode sensor devices
US8738323B2 (en) 2010-09-30 2014-05-27 Fitbit, Inc. Methods and systems for metrics analysis and interactive rendering, including events having combined activity and location information
US8849610B2 (en) 2010-09-30 2014-09-30 Fitbit, Inc. Tracking user physical activity with multiple devices
US8615377B1 (en) 2010-09-30 2013-12-24 Fitbit, Inc. Methods and systems for processing social interactive data and sharing of tracked activity associated with locations
US8738321B2 (en) 2010-09-30 2014-05-27 Fitbit, Inc. Methods and systems for classification of geographic locations for tracked activity
US8762102B2 (en) 2010-09-30 2014-06-24 Fitbit, Inc. Methods and systems for generation and rendering interactive events having combined activity and location information
US8712724B2 (en) 2010-09-30 2014-04-29 Fitbit, Inc. Calendar integration methods and systems for presentation of events having combined activity and location information
EP2439580A1 (en) 2010-10-01 2012-04-11 Ophtimalia Data exchange system
KR101346661B1 (en) 2010-11-15 2014-02-06 부경대학교 산학협력단 Cosmetic composition for preventing skin aging comprising chitooligosaccharides
JP2012113627A (en) 2010-11-26 2012-06-14 Terumo Corp Portable terminal, calorie estimation method, and calorie estimation program
US9113793B2 (en) 2010-12-10 2015-08-25 Rohm Co., Ltd. Pulse wave sensor
US20120169990A1 (en) 2011-01-05 2012-07-05 Burnstein Tracey E Electronic eyewear and footwear
US20120191016A1 (en) * 2011-01-25 2012-07-26 Harris Corporation Gait based notification and control of portable devices
US20120203310A1 (en) 2011-02-04 2012-08-09 Pugh Randall B Spectacles for light therapy
US9785242B2 (en) 2011-03-12 2017-10-10 Uday Parshionikar Multipurpose controllers and methods
US9317660B2 (en) 2011-03-31 2016-04-19 Adidas Ag Group performance monitoring system and method
US8510166B2 (en) 2011-05-11 2013-08-13 Google Inc. Gaze tracking system
US8911087B2 (en) 2011-05-20 2014-12-16 Eyefluence, Inc. Systems and methods for measuring reactions of head, eyes, eyelids and pupils
US9256711B2 (en) 2011-07-05 2016-02-09 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display
US8184067B1 (en) 2011-07-20 2012-05-22 Google Inc. Nose bridge sensor
US9342610B2 (en) 2011-08-25 2016-05-17 Microsoft Technology Licensing, Llc Portals: registered objects as virtualized, personalized displays
TWI455342B (en) 2011-08-30 2014-10-01 Nat Univ Tsing Hua Solar cell with selective emitter structure and manufacturing method thereof
KR20130025675A (en) 2011-09-02 2013-03-12 삼성전자주식회사 User health monitoring system which comprises 3d glasses and display apparatus, and display apparatus and control method thereof
US8941560B2 (en) 2011-09-21 2015-01-27 Google Inc. Wearable computer with superimposed controls and instructions for external device
US20150057512A1 (en) 2011-11-16 2015-02-26 Rijuven Corporation Wearable heart failure monitor patch
US20130138413A1 (en) 2011-11-24 2013-05-30 Auckland Uniservices Limited System and Method for Determining Motion
US20140315162A1 (en) 2011-12-09 2014-10-23 Joel Ehrenkranz System and methods for monitoring food consumption
US8540583B2 (en) 2011-12-30 2013-09-24 Nike, Inc. System for tracking a golf ball and displaying an enhanced image of the golf ball
US9141194B1 (en) 2012-01-04 2015-09-22 Google Inc. Magnetometer-based gesture sensing with a wearable device
US9529197B2 (en) 2012-03-21 2016-12-27 Google Inc. Wearable device with input and output structures
US9737261B2 (en) 2012-04-13 2017-08-22 Adidas Ag Wearable athletic activity monitoring systems
US9504414B2 (en) 2012-04-13 2016-11-29 Adidas Ag Wearable athletic activity monitoring methods and systems
US20130307670A1 (en) 2012-05-15 2013-11-21 Jonathan E. Ramaci Biometric authentication system
US9763592B2 (en) 2012-05-25 2017-09-19 Emotiv, Inc. System and method for instructing a behavior change in a user
US9001427B2 (en) 2012-05-30 2015-04-07 Microsoft Technology Licensing, Llc Customized head-mounted display device
US20130329183A1 (en) 2012-06-11 2013-12-12 Pixeloptics, Inc. Adapter For Eyewear
US9599632B2 (en) 2012-06-22 2017-03-21 Fitbit, Inc. Fitness monitoring device with altimeter
US9005129B2 (en) 2012-06-22 2015-04-14 Fitbit, Inc. Wearable heart rate monitor
US8948832B2 (en) 2012-06-22 2015-02-03 Fitbit, Inc. Wearable heart rate monitor
US9035970B2 (en) 2012-06-29 2015-05-19 Microsoft Technology Licensing, Llc Constraint based information inference
US9579048B2 (en) 2012-07-30 2017-02-28 Treefrog Developments, Inc Activity monitoring system with haptic feedback
CA2880434A1 (en) 2012-07-30 2014-02-06 Treefrog Developments, Inc. Athletic monitoring
WO2014021602A2 (en) 2012-07-31 2014-02-06 인텔렉추얼디스커버리 주식회사 Wearable electronic device and method for controlling same
US9720231B2 (en) 2012-09-26 2017-08-01 Dolby Laboratories Licensing Corporation Display, imaging system and controller for eyewear display device
JP6021582B2 (en) 2012-10-24 2016-11-09 オリンパス株式会社 Glasses-type wearable device and front part of glasses-type wearable device
US9498128B2 (en) 2012-11-14 2016-11-22 MAD Apparel, Inc. Wearable architecture and methods for performance monitoring, analysis, and feedback
ITMI20121957A1 (en) 2012-11-16 2014-05-17 Marco Carrara GLASSES WITH HIGH FLEXIBILITY OF USE
US10045718B2 (en) 2012-11-22 2018-08-14 Atheer, Inc. Method and apparatus for user-transparent system control using bio-input
US20140218281A1 (en) 2012-12-06 2014-08-07 Eyefluence, Inc. Systems and methods for eye gaze determination
GB2496064B (en) 2012-12-31 2015-03-11 Nicholas Jamie Marston Video camera shooting glasses
ITMI20130024A1 (en) 2013-01-10 2014-07-11 Marco Carrara METHOD OF ACQUISITION AND TREATMENT OF HEART RATE DATA
US9520638B2 (en) 2013-01-15 2016-12-13 Fitbit, Inc. Hybrid radio frequency / inductive loop antenna
US20140204334A1 (en) 2013-01-18 2014-07-24 William Anthony Stoll Bio-sensors in eyeglasses
US9370302B2 (en) 2014-07-08 2016-06-21 Wesley W. O. Krueger System and method for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment
US9848776B2 (en) 2013-03-04 2017-12-26 Hello Inc. Methods using activity manager for monitoring user activity
US9500464B2 (en) 2013-03-12 2016-11-22 Adidas Ag Methods of determining performance information for individuals and sports objects
WO2014144918A2 (en) 2013-03-15 2014-09-18 Percept Technologies, Inc. Enhanced optical and perceptual digital eyewear
US10268276B2 (en) 2013-03-15 2019-04-23 Eyecam, LLC Autonomous computing and telecommunications head-up displays glasses
US20140276096A1 (en) * 2013-03-15 2014-09-18 Bonutti Research, Inc. Systems and methods for use in diagnosing a medical condition of a patient
JP2016515897A (en) 2013-03-15 2016-06-02 パーセプト テクノロジーズ, インコーポレイテッドPercept Technologies, Inc. Improved optical and perceptual digital eyewear
US9341526B2 (en) 2013-04-01 2016-05-17 Saris Cycling Group, Inc. System for speed-based power calculation
DE102013207064A1 (en) 2013-04-19 2014-10-23 Bayerische Motoren Werke Aktiengesellschaft Method for selecting an information source for display on data glasses
US20140324459A1 (en) 2013-04-30 2014-10-30 Hti Ip, L.L.C Automatic health monitoring alerts
US10930174B2 (en) 2013-05-24 2021-02-23 Lincoln Global, Inc. Systems and methods providing a computerized eyewear device to aid in welding
US20140375470A1 (en) 2013-06-20 2014-12-25 Chester Charles Malveaux Wearable networked and standalone biometric sensor system to record and transmit biometric data for multiple applications
US10512407B2 (en) 2013-06-24 2019-12-24 Fitbit, Inc. Heart rate data collection
ES2530421B1 (en) 2013-07-30 2015-07-09 Ion Eyewear, S.L. ACCESSORY GLASSES FOR MOBILE DEVICES AND PC'S
FR3009270B1 (en) 2013-07-31 2016-09-09 Michelin & Cie DEVICE AND METHOD FOR CONTROLLING THE POWER OF ASSISTANCE OF A POWER-ASSISTED BICYCLE
ES2576489T3 (en) 2013-08-02 2016-07-07 Essilor International (Compagnie Générale d'Optique) A method to control a programmable ophthalmic lens device
US9704412B2 (en) 2013-08-26 2017-07-11 John Andrew Wells Biometric data gathering
US20150065889A1 (en) 2013-09-02 2015-03-05 Life Beam Technologies Ltd. Bodily worn multiple optical sensors heart rate measuring device and method
US20150148632A1 (en) 2013-11-26 2015-05-28 David Alan Benaron Calorie Monitoring Sensor And Method For Cell Phones, Smart Watches, Occupancy Sensors, And Wearables
TWI548438B (en) 2013-12-20 2016-09-11 Dyaco International Inc. Exercise device providing symmetry index
US9595181B2 (en) 2013-12-20 2017-03-14 Invensense, Inc. Wearable device assisting smart media application and vice versa
US9947012B2 (en) 2013-12-26 2018-04-17 Intel Corporation Secure transactions using a personal device
WO2015095924A1 (en) 2013-12-27 2015-07-02 Koonung Heights Pty Ltd A biofeedback, stress management and cognitive enhancement system
US9579060B1 (en) 2014-02-18 2017-02-28 Orbital Research Inc. Head-mounted physiological signal monitoring system, devices and methods
CA2939922A1 (en) 2014-02-24 2015-08-27 Brain Power, Llc Systems, environment and methods for evaluation and management of autism spectrum disorder using a wearable data collection device
WO2015127119A2 (en) 2014-02-24 2015-08-27 Sony Corporation Body position optimization and bio-signal feedback for smart wearable devices
US9031812B2 (en) * 2014-02-27 2015-05-12 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device
FR3019346B1 (en) 2014-03-31 2016-07-08 Withings METHOD FOR COUNTING STEPS PERFORMED BY A USER
CN105893721A (en) 2014-05-13 2016-08-24 陈威宇 Adaptive skin care information prompt system and adaptive skin care prompt method
US20150332149A1 (en) 2014-05-15 2015-11-19 Red Lozenge, Inc. Tracking behavior and goal achievement
US10478127B2 (en) 2014-06-23 2019-11-19 Sherlock Solutions, LLC Apparatuses, methods, processes, and systems related to significant detrimental changes in health parameters and activating lifesaving measures
TWI530276B (en) 2014-07-08 2016-04-21 PixArt Imaging Inc. Biometric detection module with denoising function and biometric detection method thereof
WO2016017997A1 (en) 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US20160041404A1 (en) 2014-08-08 2016-02-11 Marchon Eyewear, Inc. Eyewear with interchangeable temples and brow bar
HK1203120A2 (en) 2014-08-26 2015-10-16 高平 A gait monitor and a method of monitoring the gait of a person
US10448867B2 (en) 2014-09-05 2019-10-22 Vision Service Plan Wearable gait monitoring apparatus, systems, and related methods
US20160066848A1 (en) 2014-09-05 2016-03-10 Vision Service Plan Wearable environmental pollution monitor computer apparatus, systems, and related methods
US10617342B2 (en) 2014-09-05 2020-04-14 Vision Service Plan Systems, apparatus, and methods for using a wearable device to monitor operator alertness
US20160117937A1 (en) 2014-10-27 2016-04-28 Bloom Technologies NV System and method for providing biometric and context based messaging
US9566033B2 (en) 2014-11-03 2017-02-14 Phillip Bogdanovich Garment system with electronic components and associated methods
KR102313220B1 (en) 2015-01-09 2021-10-15 삼성전자주식회사 Wearable device and method for controlling thereof
EP3250108A1 (en) 2015-01-30 2017-12-06 Koninklijke Philips N.V. Photoplethysmography apparatus
US20160223577A1 (en) 2015-01-30 2016-08-04 Vision Service Plan Systems and methods for tracking motion of a bicycle or other vehicles
US10349887B1 (en) 2015-06-14 2019-07-16 Facense Ltd. Blood pressure measuring smartglasses
US10398328B2 (en) 2015-08-25 2019-09-03 Koninklijke Philips N.V. Device and system for monitoring of pulse-related information of a subject
US9726904B1 (en) 2015-09-29 2017-08-08 Snap Inc. Eyewear with conductive temple joint
FR3043245B1 (en) 2015-11-03 2017-10-27 Stmicroelectronics Rousset METHOD FOR READING AN EEPROM MEMORY AND CORRESPONDING DEVICE
US9610476B1 (en) 2016-05-02 2017-04-04 Bao Tran Smart sport device
US20170255029A1 (en) 2016-03-03 2017-09-07 Vision Service Plan Systems and methods for charging eyewear
US20180064399A1 (en) 2016-09-07 2018-03-08 Heptagon Micro Optics Pte. Ltd. Imaging systems including multi-tap demodulation pixels for biometric measurements
EP3299871A1 (en) 2016-09-22 2018-03-28 Essilor International A wearing detection module for spectacle frame
US20180206735A1 (en) 2017-01-26 2018-07-26 Microsoft Technology Licensing, Llc Head-mounted device for capturing pulse data
US10874305B2 (en) 2018-01-15 2020-12-29 Microsoft Technology Licensing, Llc Sensor device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4027349A4 (en) * 2019-11-01 2022-11-09 TERUMO Kabushiki Kaisha Image management system, wearable device, image management method, and image management program

Also Published As

Publication number Publication date
WO2016037091A1 (en) 2016-03-10
US10448867B2 (en) 2019-10-22
US20170245757A1 (en) 2017-08-31
US20160070122A1 (en) 2016-03-10
US20160070121A1 (en) 2016-03-10
US10694981B2 (en) 2020-06-30
EP3189371A1 (en) 2017-07-12
EP3148435A1 (en) 2017-04-05
US20200046260A1 (en) 2020-02-13
EP3189367A4 (en) 2018-05-30
WO2016037120A1 (en) 2016-03-10
US9649052B2 (en) 2017-05-16
EP3189371A4 (en) 2018-05-30
US9795324B2 (en) 2017-10-24
CA2960429A1 (en) 2016-03-10
EP3189367B1 (en) 2024-04-10
US10188323B2 (en) 2019-01-29
US20160066820A1 (en) 2016-03-10
US20190298228A1 (en) 2019-10-03
EP3148435A4 (en) 2018-01-17
US20160071390A1 (en) 2016-03-10
WO2016037117A1 (en) 2016-03-10
US20180042523A1 (en) 2018-02-15
US20160066829A1 (en) 2016-03-10
EP3189367A1 (en) 2017-07-12
US20160071423A1 (en) 2016-03-10
CA2960425A1 (en) 2016-03-10
EP3148435B1 (en) 2023-10-18
US10542915B2 (en) 2020-01-28
US20190159700A1 (en) 2019-05-30
US20160066847A1 (en) 2016-03-10
US10307085B2 (en) 2019-06-04

Similar Documents

Publication Title
EP3148435B1 (en) System for monitoring health related information for individuals
US20210365815A1 (en) Artificial intelligence and/or virtual reality for activity optimization/personalization
JP7446295B2 (en) Automatic detection of physical behavioral events and corresponding adjustment of drug delivery systems
US20230004580A1 (en) Data tagging
US9442100B2 (en) Caloric intake measuring system using spectroscopic and 3D imaging analysis
US9536449B2 (en) Smart watch and food utensil for monitoring food consumption
ES2253393T3 (en) SYSTEM FOR MONITORING HEALTH, WELLNESS AND EXERCISE
CN109887568B (en) Health management system based on doctor's advice
US20160034764A1 (en) Wearable Imaging Member and Spectroscopic Optical Sensor for Food Identification and Nutrition Modification
US20140347491A1 (en) Smart Watch and Food-Imaging Member for Monitoring Food Consumption
CN104834827A (en) Health data processing method and system based on terminal equipment and cloud computing storage
CN103529684A (en) Intelligent health watch for automatically measuring and recording health data and intelligent health system
US20200359913A1 (en) System, apparatus, and methods for remote health monitoring
CN114616627A (en) Automatic detection of physical behavioral events and corresponding adjustment of medication dispensing systems
WO2021070472A1 (en) Information processing device, information processing system, and information processing method
JP7286093B2 (en) Eating-related proposal device and eating-related proposal system
Long et al. An Evaluation of Smartwatch Contribution in Improving Human Health
CN117854679A (en) Pre-intervention diet behavior management system and management method thereof
Gaspar et al. A Review of Monitoring and Assisted Therapeutic Technology for AAL Applications
Ding Algorithms Embedded Personal Medical Device For Smart Healthcare

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20161229

FZDE Discontinued

Effective date: 20210831
