US20120029345A1 - Noninvasive diagnostic system - Google Patents

Noninvasive diagnostic system

Info

Publication number
US20120029345A1
Authority
US
United States
Prior art keywords
data
bone
transducer
musculoskeletal
patient
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/196,701
Inventor
Mohamed R. Mahfouz
Ray C. Wasielewski
Richard Komistek
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Joint Vue LLC
Original Assignee
Joint Vue LLC
Application filed by Joint Vue LLC filed Critical Joint Vue LLC
Priority to US13/196,701 priority Critical patent/US20120029345A1/en
Assigned to JOINT VUE, LLC. Assignment of assignors' interest (see document for details). Assignors: KOMISTEK, RICHARD; MAHFOUZ, MOHAMED R.; WASIELEWSKI, RAY C.
Publication of US20120029345A1 publication Critical patent/US20120029345A1/en
Priority to US13/841,632 priority patent/US20130211259A1/en
Priority to US13/841,402 priority patent/US9642572B2/en
Priority to US15/478,148 priority patent/US11004561B2/en
Priority to US17/181,372 priority patent/US20210193313A1/en
Current legal status: Abandoned

Classifications

    • G16H 40/63 ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • A61B 5/0004 Remote monitoring of patients using telemetry, characterised by the type of physiological signal transmitted
    • A61B 5/002 Monitoring the patient using a local or closed circuit, e.g. in a room or building
    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/1036 Measuring load distribution, e.g. podologic studies
    • A61B 5/1038 Measuring plantar pressure during gait
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1114 Tracking parts of the body
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1127 Measuring movement using a particular sensing technique, using markers
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/4528 Evaluating or diagnosing the musculoskeletal system: joints
    • A61B 5/4533 Evaluating or diagnosing the musculoskeletal system: ligaments
    • A61B 5/4585 Evaluating the knee
    • A61B 5/6807 Sensor mounted on worn items: footwear
    • A61B 5/6828 Sensor specially adapted to be attached to a specific body part: leg
    • A61B 5/6831 Means for maintaining contact with the body: straps, bands or harnesses
    • A61B 5/6833 Means for maintaining contact with the body using adhesives: adhesive patches
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, involving training the classification device
    • A61B 5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B 5/7435 Displaying user selection data, e.g. icons in a graphical user interface
    • A61B 8/0858 Detecting organic movements or changes involving measuring tissue layers, e.g. skin, interfaces
    • A61B 8/0875 Detecting organic movements or changes for diagnosis of bone
    • A61B 8/4227 Probe positioning or attachment using holders characterised by straps, belts, cuffs or braces
    • A61B 8/4254 Determining the position of the probe using sensors mounted on the probe
    • A61B 8/4263 Determining the position of the probe using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • A61B 8/5223 Processing of medical diagnostic data for extracting a diagnostic or physiological parameter
    • G06N 3/084 Neural-network learning methods: backpropagation, e.g. using gradient descent
    • G16H 50/50 ICT specially adapted for medical diagnosis, simulation or data mining, for simulation or modelling of medical disorders
    • A61B 2034/105 Computer-aided planning or simulation of surgical operations: modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2048 Surgical navigation tracking techniques using an accelerometer or inertia sensor
    • A61B 2034/2051 Surgical navigation tracking techniques: electromagnetic tracking systems
    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0204 Acoustic sensors
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 2562/0247 Pressure sensors
    • A61B 2562/046 Arrangements of multiple sensors of the same type in a matrix array
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/112 Gait analysis
    • A61B 5/389 Electromyography [EMG]
    • A61B 8/4236 Probe positioning or attachment using holders characterised by adhesive patches
    • A61B 8/4472 Wireless probes
    • A61B 8/4477 Using several separate ultrasound transducers or probes
    • A61B 8/466 Displaying means adapted to display 3D data
    • A61B 8/467 Interfacing with the operator or the patient characterised by special input means
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/5238 Processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

Definitions

  • the present invention relates to devices and methods for evaluating a physiological condition of a musculoskeletal system and, more particularly, to evaluating the physiological condition of bodily joints.
  • the knee joint 50 is functionally controlled by a mechanical system governed by three unique types of forces: (1) active forces resulting from motion, such as those resulting from a muscle flexing or relaxing; (2) constraining forces that constrain motion, such as those resulting from ligaments being in tension; and (3) interaction forces that resist motion, such as those acting upon bones.
  • the soft tissue in the knee joint 50 (e.g., cartilage and the meniscus)
  • Knee joint motions are stabilized primarily by five ligaments which restrict and regulate the relative motion between the femur 52 , the tibia 54 , and the patella 56 .
  • These ligaments are the anterior cruciate ligament (“ACL”) 58 , the posterior cruciate ligament (“PCL”) 60 , the medial collateral ligament (“MCL”) 62 , the lateral collateral ligament (“LCL”) 64 , and the patellar ligament 66 .
  • An injury to any one of these ligaments 58 - 66 or other soft-tissue structures may cause detectable changes in knee kinematics and the creation of detectable vibrations, each of which may be representative of the type of knee joint injury and/or the severity of the injury.
  • a device for acquiring data and diagnosing a musculoskeletal injury in accordance with one embodiment of the present invention includes a semi-flexible housing, at least one ultrasonic transducer, a positional localizer, and a transmission system.
  • the semi-flexible housing is positioned proximate a portion of the musculoskeletal system of a patient and supports the at least one ultrasonic transducer and the positional localizer.
  • the at least one ultrasonic transducer is configured to acquire ultrasonic data indicative of a bone surface.
  • the positional localizer is positioned at a select location relative to the at least one ultrasonic transducer and tracks movement of the housing.
  • the transmission system transmits the ultrasonic data of the at least one ultrasonic transducer and the movement data of the positional localizer to a data analyzer for analysis and diagnosis.
  • Another embodiment of the present invention is directed to a method of diagnosing a musculoskeletal injury.
  • the method includes creating a 3D model of a portion of the musculoskeletal system of a patient.
  • feature data is acquired by a sensor that is positioned proximate the injured portion of the musculoskeletal system.
  • the feature data is compared, by a neural network, to a database of feature data.
  • a dataset within the database of feature data is representative of the musculoskeletal injury. Then, based on the comparison, a diagnosis is returned.
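As an illustration of this comparison step, the following is a minimal Python sketch of a small neural network trained by backpropagation on a toy database of labelled feature vectors; the feature names, values, and labels are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch: classify a joint "feature vector" (kinematic and
# vibration features) against a database of labelled examples with a small
# neural network trained by backpropagation.
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, hidden=8, lr=0.1, epochs=2000):
    """Train a one-hidden-layer network with sigmoid units via backprop."""
    n_in, n_out = X.shape[1], y.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, n_out)); b2 = np.zeros(n_out)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                      # hidden activations
        out = sig(h @ W2 + b2)                    # class scores
        d_out = (out - y) * out * (1 - out)       # output-layer error
        d_h = (d_out @ W2.T) * h * (1 - h)        # hidden-layer error
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    return lambda x: sig(sig(x @ W1 + b1) @ W2 + b2)

# Toy "database": [max flexion angle (deg), AP translation (mm), vibration RMS]
# with one-hot labels [healthy, ACL-deficient]; the numbers are made up.
X = np.array([[120, 4.0, 0.2], [125, 5.0, 0.1], [110, 12.0, 0.9], [105, 14.0, 1.1]], float)
scale = X.max(axis=0)
y = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], float)

predict = train_mlp(X / scale, y)
new_patient = np.array([108, 13.0, 1.0]) / scale
print("P(healthy), P(ACL-deficient) =", predict(new_patient))
```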
  • Still another embodiment of the present invention is directed to a diagnostic system for diagnosing a musculoskeletal injury.
  • the system includes a 3D model reconstruction module that acquires structural data indicative of a bone surface. The bone is within a portion of the musculoskeletal system of a patient.
  • the 3D model reconstruction module constructs a patient-specific model from the structural data.
  • the system further includes a kinematic tracking module that acquires movement data while the portion of the musculoskeletal system is articulated.
  • a vibroarthography module acquires vibration data generated by the articulation.
  • the structural data, the movement data, and the vibration data are received and analyzed by an intelligent diagnosis module in order to determine injury type.
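To illustrate how structural, movement, and vibration data might be gathered into one record for such a diagnosis module, here is a minimal Python sketch; the field names and the screening rule are illustrative assumptions only, not the patent's method.

```python
# Hypothetical sketch: a combined examination record passed to a diagnosis step.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ExamRecord:
    bone_points_mm: List[Tuple[float, float, float]]  # bone-surface points from the ultrasound scan
    flexion_angles_deg: List[float]                    # knee flexion over the recorded motion
    vibration_rms_g: float                             # RMS level of the vibroarthography signal

def screen(record: ExamRecord) -> str:
    """Toy rule standing in for the trained diagnosis module."""
    if record.vibration_rms_g > 0.8 and max(record.flexion_angles_deg) < 110:
        return "refer: abnormal vibration with restricted flexion"
    return "no abnormality flagged by this simple rule"

rec = ExamRecord(bone_points_mm=[(10.0, 2.0, 35.0)],
                 flexion_angles_deg=[0, 45, 95],
                 vibration_rms_g=1.1)
print(screen(rec))
```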
  • FIG. 1 is a side elevational view of a posterior portion of a knee joint with a 90° flexion.
  • FIG. 2 is a side elevational view of the knee joint of FIG. 1 but with the knee joint fully extended.
  • FIG. 3 is a side elevational view of an anterior portion of the knee joint in FIG. 1 .
  • FIG. 4 is a flow chart illustrating a method of determining a type of knee injury in accordance with one embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a diagnostic system in accordance with one embodiment of the present invention.
  • FIG. 6 is another schematic diagram of the diagnostic system of FIG. 5 .
  • FIG. 7 is a schematic view of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 8 is a side elevational view of a vibration detection module in accordance with one embodiment of the present invention.
  • FIG. 9 is a side elevational view of an exemplary shoe having a sensor array, for a shoe module, in accordance with one embodiment of the present invention.
  • FIG. 9A is an exemplary wireless transmitter for use with the shoe module of FIG. 9 .
  • FIG. 9B is an enlarged view of one exemplary positional sensor of the shoe module of FIG. 9 .
  • FIG. 10 is a schematic view of an ultrasound transducer wand for use with the diagnostic system in accordance with one embodiment of the present invention.
  • FIG. 11 is a diagrammatic view of an ultra wide band transmitter in accordance with one embodiment of the present invention.
  • FIG. 12 is a diagrammatic view of an ultra wide band receiver in accordance with one embodiment of the present invention.
  • FIG. 13 is a Cartesian coordinate system depicting an ultra wide band positioning system in accordance with one embodiment of the present invention.
  • FIG. 14 is a diagrammatic view comparing one embodiment of an ultra wide band positioning system to a global positioning system.
  • FIG. 15 illustrates the error in detecting a position along each of the x-, y-, and z-axes and with respect to a sequentially acquired series of data points.
  • FIG. 16 is an exemplary screen capture of a user interface of the diagnostic system of FIG. 5 .
  • FIG. 17 is a side elevational view of a leg with a knee brace in accordance with another embodiment of the present invention.
  • FIG. 18 is a side elevational view of a leg with a knee brace in accordance with another embodiment of the present invention.
  • FIG. 19 is an individual transducer tracking sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 20 is an inter-transducers mechanical link sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 21 is a rotating transducer sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 22 is a diagrammatic representation of an inertia based localizer circuit in accordance with one embodiment of the present invention.
  • FIG. 23 is a diagrammatic representation of an alternate individual transducer tracking sub-brace circuit architecture in accordance with one embodiment of the present invention.
  • FIG. 24 is a diagrammatic representation of a high voltage circuit for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 25 is a diagrammatic representation of the circuit layout of the high voltage circuit of FIG. 24 .
  • FIG. 26 is a diagrammatic representation of a high voltage multiplexer for use with a sub-brace of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 27 is a diagrammatic representation of a receiving circuit for use with a sub-brace of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 28 is a diagrammatic representation of a diagnostic system in accordance with an embodiment of the present invention.
  • FIG. 29 is a flow chart illustrating one method of using the diagnostic system of FIG. 28 .
  • FIGS. 30A-30C illustrate various kinematic feature vectors acquired from a knee joint moving through a range of motion.
  • FIGS. 31A-31C illustrate feature vectors of a femoral position with respect to the tibia.
  • FIG. 32 illustrates average medial and lateral femoral condyle positions during a deep knee bend of a patient having an anterior cruciate ligament deficit.
  • FIG. 33 is a diagrammatic representation of a neural network classifier in accordance with one embodiment of the present invention.
  • FIG. 34 is a diagrammatic representation of a construction of a neural network.
  • the exemplary embodiments of the present invention are illustrated and described below to encompass diagnosis of bodily abnormalities and, more particularly, devices and methods for evaluating the physiological condition of the musculoskeletal system (such as joints) to discern whether abnormalities exist and the extent of any abnormalities.
  • the exemplary embodiments discussed below are merely examples and may be reconfigured without departing from the scope and spirit of the present invention.
  • the exemplary embodiments, as discussed below may include optional steps, methods, and features that one of ordinary skill should recognize as not being a requisite to fall within the scope of the present invention.
  • the exemplary embodiments disclosed herein are described with respect to diagnosing a knee joint injury. Nevertheless, the exemplary embodiments may be utilized to diagnose other injuries of the musculoskeletal system (such as a hip joint injury or a bone fracture), as the knee joint 50 ( FIG. 1 ) is merely exemplary to facilitate an understanding of the embodiments disclosed.
  • the method 70 includes constructing a 3D model of the knee joint 50 (Block 72 ), which may include the detection of motion sound (Block 74 ) as well as tracking the kinematics (Block 76 ).
  • the detected sound and tracked kinematics are automatically analyzed (Block 78 ) and the knee injury recognized based upon the analysis (Block 80 ).
  • FIG. 5 illustrates a first exemplary diagnostic system 82 for implementing the method 70 of FIG. 4 .
  • the diagnostic system 82 includes four modules: (1) a pulse echo A-mode ultrasound based 3D model reconstruction (“PEAUMR”) module 84 ( FIG. 6 ) for constructing a patient-specific 3D model of the patient's knee joint 50 ( FIG. 1 ); (2) a joint kinematics tracking (“JKT”) module 86 for tracking the kinematics of the knee joint 50 ( FIG. 1 ) using the patient-specific 3D model of the knee joint 50 ( FIG. 1 ) from the PEAUMR module 84 ; (3) a vibroarthography (“VA”) module 88 for capturing sounds emanating from the knee joint 50 ( FIG. 1 ); and (4) an intelligent diagnosis (“ID”) module 90 for identifying a likely diagnosis of the knee joint 50 ( FIG. 1 ) using the kinematic data and the vibration data.
  • Each of these four modules 84 - 90 is described in further detail below.
  • a foot module 92 ( FIG. 6 ) may also be used with the JKT module 86 for providing dynamic force data, also described in detail below.
  • the diagnostic system 82 is usable with or without the VA module 88 .
  • the present invention may be used to mathematically describe the relative motion of the bones 52 , 54 , 56 in the patient's knee joint 50 as such motion is tracked on a 3D patient-specific bone model.
  • the bone model and motion may be compared with a database of mathematical descriptions of joint motion.
  • the database could contain mathematical descriptions of healthy or clinically undesirable joint motion.
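A minimal sketch, in Python, of one way such a mathematical description of relative bone motion could be produced from tracked poses; the pose values and the choice of a flexion angle as the extracted feature are illustrative assumptions, not the patent's specific formulation.

```python
# Hypothetical sketch: describe relative knee motion as the pose of the femur
# expressed in the tibia's coordinate frame at each tracked time step.
import numpy as np

def relative_pose(R_femur, p_femur, R_tibia, p_tibia):
    """Femur pose in the tibia frame, given both poses in the world frame."""
    R_rel = R_tibia.T @ R_femur
    p_rel = R_tibia.T @ (p_femur - p_tibia)
    return R_rel, p_rel

def rot_x(deg):
    a = np.deg2rad(deg)
    return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])

# Tibia fixed at the origin; femur flexing from 0 to 90 degrees about x.
flexion_curve = []
for angle in range(0, 91, 15):
    R_rel, _ = relative_pose(rot_x(angle), np.array([0, 0, 0.4]), np.eye(3), np.zeros(3))
    # Recover the flexion angle from the relative rotation about the x-axis.
    flexion_curve.append(np.degrees(np.arctan2(R_rel[2, 1], R_rel[1, 1])))

print(flexion_curve)   # a feature vector that could be compared against a database
```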
  • the interaction between bodily tissue (e.g., bone against cartilage or bone against bone) produces sounds and vibrations as the joint moves through a range of motion.
  • the VA module 88 with the diagnostic system 82 utilizes those sounds, such as vibrations, exhibited by the joint during a range of motion to diagnose the condition of the joint without requiring an invasive procedure or subjecting the patient to radiation.
  • FIG. 6 provides still further details of the diagnostic system 82 .
  • the modules 84 - 92 may output the acquired data to a computer 96 for data processing by way of, for example, a neural network 98 .
  • the data processing may provide one or more of a visual output, an audible output, and a diagnosis by way of a visual display 100 .
  • the VA module 88 is shown and comprises a plurality of accelerometers (three are shown 120 a , 120 b , 120 c ) that are utilized to detect sound, specifically, vibrations occurring as a result of motion of the knee joint 50 .
  • the accelerometers 120 a , 120 b , 120 c are mounted directly to the skin or external tissue surface of the patient, as skin-mounted sensors 119 , in order to detect sounds from bone and soft tissue interaction.
  • An intervening adhesive may be utilized between the accelerometers 120 a , 120 b , 120 c and the patient's skin.
  • the VA module 88 includes one accelerometer 120 a mounted on the medial side of the knee joint 50 , a second accelerometer 120 b mounted on the lateral side of the knee joint 50 , and a third accelerometer 120 c mounted on the front side of the knee joint 50 , proximate the patella 56 ( FIG. 3 ).
  • the accelerometers 120 a , 120 b , 120 c are mounted to the patient so that each lies along a common plane 121 , though this is not required. It should also be understood, however, that any number of accelerometers 120 a , 120 b , 120 c may be utilized to detect sounds generated by the patient's knee joint 50 .
  • Each accelerometer 120 a , 120 b , 120 c is in communication with one or more signal conditioning circuits or electronics 122 .
  • the accelerometers 120 a , 120 b , 120 c are operative to detect sound, specifically vibrations, and output the sound detected in the form of frequency data (measured in Hertz) to the conditioning circuits 122 .
  • This frequency data is processed by the conditioning circuits 122 and communicated to the computer 96 as digital frequency data.
  • the conditioning circuits 122 may include a clock 123 to time stamp the frequency data generated. As will be discussed in more detail below, correlating the frequency data with the time stamp provides a constant against which all of the detected data can be compared on a relative scale.
  • the first accelerometer 120 a on the medial side of the knee joint 50 detects vibrations generated primarily by the interactions between the medial condyle 110 ( FIG. 1 ) of the femur 52 against the medial cartilage 112 ( FIG. 1 ) on top of the medial portion of the tibia 54 .
  • the second accelerometer 120 b on the lateral side of the knee joint 50 detects vibrations generated primarily by the interactions between the lateral condyle 114 ( FIG. 1 ) of the femur 52 against the lateral cartilage 116 ( FIG. 1 ) on top of the lateral portion of the tibia 54 .
  • the third accelerometer 120 c on the front of the knee joint 50 , proximate the patella 56 ( FIG. 3 ), detects vibrations generated primarily by the interactions between the patella 56 and the femur 52 .
  • the resulting data output by the accelerometers 120 a , 120 b , 120 c may then be wirelessly transmitted to the computer 96 via a wireless transmitter 124 , such as an ultra-wide band transmitter, and utilized in combination with data from the other modules to ascertain the appropriate diagnosis.
  • FIG. 8 illustrates one example of a plurality of thin film accelerometers (four are shown, 120 a , 120 b , 120 c , 120 d ) that are suitable for detecting the vibrations produced by motion of the knee joint 50 .
  • Thin film accelerometers 120 a , 120 b , 120 c , 120 d may be used in lieu of sound sensors because of their better performance and lower susceptibility to noise.
  • the thin film accelerometers 120 a , 120 b , 120 c , 120 d may also be used as a localizer and include the same circuitry.
  • the accelerometers 120 a , 120 b , 120 c , 120 d are attached to the patient so the outputs may be amplified, digitized, and sent wirelessly to the computer 96 as described below.
  • the foot module 92 (also referred to as the contact force module (“CFM”)) is shown and includes a plurality of pressure sensors 130 that are utilized to detect pressure or a contact force occurring at the bottom of the foot (not shown) when the knee joint 50 ( FIG. 1 ) is moved through a range of motion under a loaded condition.
  • the foot module 92 detects pressure data at the bottom of the foot when the foot is partially or fully in contact with the ground.
  • the pressure sensors 130 are incorporated into an insole 132 of a shoe 134 that conforms to the general shape of a patient's foot. Because humans have different sized feet, the insoles 132 may be incrementally sized to accommodate humans with differently sized feet or to accommodate a particular type of shoe 134 (or lack thereof) needed for a particular activity.
  • the pressure sensors 130 may be arranged in a grid-shaped pattern on the insole 132 , which may include a series of rows and columns.
  • the pressure sensors 130 are exposed to the underside of a patient's foot so that the location and amplitude (or amount) of the contact forces applied by the foot to the shoe 134 , by way of the insole 132 , may be measured.
  • the location of the pressures and the relative amount of pressures provides information relevant to diagnosis of injury. For example, the detected pressures of a patient with a limp caused by a knee joint injury would differ from the detected pressures of a patient with a healthy knee joint and a normal gait.
  • each sensor 130 may include a capacitor having a deformable dielectric between two electrode plates. Changes in the pressure applied to the plates cause a strain, or deformation, of the dielectric medium. Thus, a pressure applied to the capacitive sensor 130 changes the spacing between the plates and the measured capacitance.
  • the capacitive sensors 130 are arrayed across the area of pressure measurement to provide discrete pressure data points corresponding to strains/deformations at the various locations of the array. These strains/deformations are used to find the stresses, and thus the compressive forces, and to calculate pressure data having units of force per unit area (N/m²) as a function of time.
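The following Python sketch shows the parallel-plate relationship implied here, C = ε0·εr·A/d, used to recover strain and pressure from a single capacitance reading; all material constants are assumed values, not taken from the patent.

```python
# Hypothetical sketch: recover applied pressure from a capacitive sensor cell
# using the parallel-plate model and a linear-elastic dielectric (stress = E * strain).
EPS0 = 8.854e-12     # vacuum permittivity, F/m
EPS_R = 3.0          # relative permittivity of the dielectric (assumed)
AREA = 1.0e-4        # plate area, m^2 (1 cm^2, assumed)
D0 = 1.0e-3          # unloaded plate spacing, m (assumed)
E_MOD = 5.0e5        # Young's modulus of the dielectric, Pa (assumed)

def pressure_from_capacitance(c_measured):
    """Return (strain, pressure_Pa) for one sensor cell."""
    d = EPS0 * EPS_R * AREA / c_measured    # current plate spacing
    strain = (D0 - d) / D0                  # compressive strain of the dielectric
    return strain, E_MOD * strain           # stress = pressure on the cell

c_rest = EPS0 * EPS_R * AREA / D0           # capacitance with no load
c_loaded = c_rest * 1.25                    # 25 % rise in capacitance under load
strain, p = pressure_from_capacitance(c_loaded)
print(f"strain = {strain:.3f}, pressure = {p:.0f} Pa")
```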
  • the sensors 130 in the grid-shape enable positioning of each detected pressure from each of the sensors 130 relative to another sensor 130 .
  • the resultant data, which includes a two-dimensional map of the pressure sensors 130 , is either stored on the computer 96 or stored locally with the sensors 130 .
  • the resulting data may be wirelessly transmitted to the computer 96 via a wireless transmitter 136 , such as an ultra-wide band transmitter.
  • the computer 96 is operative to generate data tying detected pressure to position, specifically the position of one pressure sensor 130 with respect to another.
  • the foot module 92 provides data reflecting precisely what pressures are exerted at what location.
  • the computer 96 may include an internal clock 97 to associate a time of which the pressure is applied with the pressure data generated by the pressure sensors 130 . Accordingly, the diagnostic system 82 not only knows how much pressure was exerted and the location where the pressure was applied, but also has time data indicating the duration of the applied pressures. Again, by tying the pressure data generated by the pressure sensors 130 to time, the pressure data can be correlated with the sound data generated by the VA module 88 using a common time scale. As a result, the diagnostic system 82 may evaluate how pressures exhibited at the bottom of the foot change as a function of time, along with how the vibrational data changes during the same time.
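A minimal Python sketch of putting two time-stamped streams onto a common time base so pressure and vibration samples can be compared at the same instants; the sample rates and waveforms are invented for illustration.

```python
# Hypothetical sketch: align time-stamped insole pressure samples with
# accelerometer (vibration) samples by interpolating onto one common clock.
import numpy as np

t_pressure = np.arange(0.0, 2.0, 1 / 100)           # 100 Hz insole samples (assumed rate)
pressure = 600 + 200 * np.sin(2 * np.pi * 1.0 * t_pressure)   # fake heel-load trace

t_vibration = np.arange(0.0, 2.0, 1 / 2000)          # 2 kHz accelerometer samples (assumed rate)
vibration = np.sin(2 * np.pi * 150 * t_vibration)    # fake joint-vibration trace

# Common clock: keep the coarser (pressure) time stamps and interpolate the
# vibration envelope onto them.
vib_envelope = np.abs(vibration)
vib_on_common = np.interp(t_pressure, t_vibration, vib_envelope)

# Each row now pairs pressure and vibration measured at the "same" instant.
combined = np.column_stack([t_pressure, pressure, vib_on_common])
print(combined[:5])
```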
  • FIGS. 4-6 and 10 illustrate the details of the JKT module 86 , which comprises an ultrasound creation and positioning submodule 140 , an ultrasound registration submodule 142 , and an ultrasound dynamic movement submodule 144 .
  • each submodule 140 , 142 , 144 includes an A-mode ultrasound transducer to generate sound and to detect reflected sound, wherein the reflected sound is representative of the structure, position, and acoustical impedance of the knee joint 50 ( FIG. 1 ).
  • Commercially available transducers include, for example, an immersion unfocused 3.5 MHz transducer, such as those available from Olympus Corp. (Tokyo, Japan).
  • the exemplary embodiments employ ultrasound transducers generally and, more specifically, A-mode ultrasound transducers that generate sound pulses and detect sound reflected at boundaries between tissues having different acoustic impedances. The magnitude of the reflected sound and the time delay are utilized to determine the distance between the ultrasound transducer and the tissue interface.
  • the A-mode ultrasound transducers are utilized to detect the interface between bone and the surrounding soft tissue so that the location of the bone surface may be determined. Because the operation of ultrasound transducers (including the A-mode ultrasound transducers) is well known to those skilled in the art, a detailed discussion of the operation of ultrasound transducers in general, and A-mode ultrasound transducers specifically, has been omitted only for purposes of brevity.
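For reference, the distance follows from the round-trip echo delay as d = c·t/2. A minimal Python sketch, assuming a typical average speed of sound in soft tissue (the constant is an assumption, not a value from the patent):

```python
# Hypothetical sketch: convert an A-mode echo delay into a transducer-to-bone
# distance, d = c * t / 2 (the factor of 2 accounts for the round trip).
C_TISSUE = 1540.0    # m/s, commonly used average speed of sound in soft tissue

def echo_depth_mm(delay_s: float) -> float:
    """One-way depth to the reflecting interface, in millimetres."""
    return C_TISSUE * delay_s / 2.0 * 1000.0

# A bone echo arriving 26 microseconds after the pulse sits about 20 mm deep.
print(echo_depth_mm(26e-6))   # -> ~20.0 mm
```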
  • the ultrasound creation and positioning submodule 140 as shown in FIG. 10 comprises one or more A-mode ultrasound transducers 150 fixedly mounted to a wand 152 .
  • the wand 152 further includes at least one positioning device 170 .
  • the ultrasound creation and positioning submodule 140 is physically separate from the ultrasound registration submodule 142 ( FIG. 6 ) and the ultrasound dynamic movement submodule 144 ( FIG. 6 ), the latter two of which are mounted to a rigid knee brace 220 schematically illustrated in FIG. 17 . In this fashion, the ultrasound creation and positioning submodule 140 is repositionable with respect to the rigid knee brace 220 ( FIG. 17 ) and adapted to place one or more of its A-mode ultrasound transducers 150 in contact with the patient's epidermis, proximate the knee joint 50 ( FIG. 1 ). It should be noted, however, that the knee brace 220 ( FIG. 17 ) does not have to be rigid, other than the linkages between certain components. Moreover, the knee joint 50 ( FIG. 1 ) may be scanned by the ultrasound wand 152 before positioning the brace 220 ( FIG. 17 ) thereon.
  • One of the functions of the ultrasound creation and positioning submodule 140 is to generate an electrical signal that is representative of the ultrasonic wave detected by the transducers 150 as the wand 152 moves over the patient's epidermis, proximate the knee joint 50 ( FIG. 1 ).
  • the ultrasound transducers 150 receive the ultrasonic wave reflected from the bone-tissue interface and generate the electrical signal based upon the magnitude of that reflected wave.
  • the magnitude of the electrical signal and the delay between the generation of the ultrasonic wave by the ultrasound transducer 150 and the detection of the reflected ultrasonic wave by the ultrasound transducer 150 are indicative of the distance to the bone underneath the transducer 150 .
  • distance data alone is not particularly useful; therefore, one or more positioning devices 170 are used to provide a 3D coordinate system, one example of which is shown in FIG. 11 .
  • the positioning devices 170 of the ultrasound creation and positioning submodule 140 are fixedly mounted to the wand 152 and may include any of a number of positioning devices 170 .
  • the wand 152 may include one or more optical devices (as the positioning devices 170 ) that are configured to generate, detect, and/or reflect pulses of light. These pulses of light interact with a corresponding detector or light generator to discern the position of the wand 152 , in 3D space, and with respect to a fixed or reference position.
  • One such device includes a light detector configured to detect pulses of light emitted from light emitters having known positions. The light detector detects the light and sends a representative signal to the computer 96 or otherwise a controller (not shown) of the light detector.
  • the computer 96 is also provided the time at which the light pulses were emitted by the optical devices 170 . In this manner, the computer 96 determines the position of the wand 152 relative to the known positions of the detectors. Because the ultrasound transducer 150 and the optical devices 170 are fixedly mounted to the wand 152 , the position of the ultrasound transducers 150 with respect to the position of the optical devices 170 is known. Similarly, because the ultrasound transducers 150 are generating signals representative of the straight line distance between the transducers 150 and the bone-tissue interface, and the position of the transducers 150 with respect to the optical devices 170 is known, the position of the bone-tissue interface with respect to the optical devices 170 may be determined.
  • As the wand 152 moves over the patient's epidermis, the optical devices 170 generate data that is determined, by the computer 96 , to represent that the relative position of the optical devices 170 with respect to the light detectors has changed in the 3D coordinate system.
  • This change in the position of the optical devices 170 may be easily correlated to the position of the bone-tissue interface, in 3D, because the position of the bone-tissue interface relative to the ultrasound transducers 150 , as well as the position of the optical devices 170 with respect to the ultrasound transducers 150 are known.
  • the 3D position data may be used in combination with the fixed position data (the known offset of the ultrasound transducers 150 with respect to the optical devices 170 ) and with the distance data generated in response to the signals received from the ultrasound transducers 150 to generate composite data.
  • the composite data may, in turn, be used to create a plurality of 3D points representing a plurality of distinct points on the surface of the bone, along the bone-tissue interface. As will be discussed in more detail below, these 3D points are utilized in conjunction with a default bone model to generate a virtual, 3D representation of the patient's bone.
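A minimal Python sketch of forming one such 3D bone-surface point by combining a transducer depth reading with the wand pose reported by the localizer; the transducer offset, beam direction, and pose values are illustrative assumptions.

```python
# Hypothetical sketch: turn an A-mode depth reading into a 3D bone-surface point.
import numpy as np

def bone_surface_point(depth_m, t_offset, t_axis, R_wand, p_wand):
    """Map a depth along the transducer axis (wand frame) into world coordinates.

    t_offset : transducer face position in the wand frame (m)
    t_axis   : unit beam direction in the wand frame
    R_wand   : 3x3 rotation of the wand frame in the world frame (from the localizer)
    p_wand   : wand origin in the world frame (m)
    """
    point_wand = t_offset + depth_m * np.asarray(t_axis)   # point in the wand frame
    return R_wand @ point_wand + p_wand                    # point in the world frame

# Example: 20 mm echo depth, transducer 30 mm from the wand origin, wand tilted
# 30 degrees about z and located at (0.10, 0.05, 0.02) m in the room frame.
th = np.deg2rad(30)
R = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
p = np.array([0.10, 0.05, 0.02])
print(bone_surface_point(0.020, np.array([0.03, 0.0, 0.0]), [0.0, 0.0, -1.0], R, p))
```

Repeating this for every pulse as the wand sweeps the joint yields the cloud of 3D points that is later fit to the default bone model.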
  • the positioning devices 170 may comprise one or more inertial measurement units (“IMUs”).
  • IMUs are known to those skilled in the art and include accelerometers, gyroscopes, and magnetometers that work together to determine the position of the IMUs in a 3D coordinate system. Because the A-mode ultrasound transducer 150 and the IMUs 170 are fixedly mounted to the wand 152 , the position of the ultrasound transducers 150 with respect to the position of the IMUs 170 is known.
  • Because the ultrasound transducers 150 are generating signals representative of the straight line distance between the transducers 150 and the bone-tissue interface and the position of the transducers 150 with respect to the IMUs 170 is known, the position of the bone-tissue interface with respect to the IMUs 170 may be determined.
  • the IMUs 170 generate data that is determined, by the computer 96 , to represent that the relative position of the IMUs 170 has changed in the 3D coordinate system.
  • This change in the position of the IMUs 170 may be easily correlated to the position of the bone-tissue interface in 3D because the position of the bone tissue interface relative to the ultrasound transducer 150 is known, as is also the position of the IMUs 170 with respect to the ultrasound transducers 150 . Accordingly, the 3D position data may be used in combination with the fixed position data (distance data for the position of the ultrasound transducers 150 with respect to the IMUs 170 ) for the ultrasound transducers 150 in combination with the distance data generated in response to the signals received from the ultrasound transducers 150 to generate the composite data as described above.
  • the positioning devices 170 may still alternatively comprise one or more ultra-wide band (UWB) transmitters.
  • UWB transmitters are known to those skilled in the art, but the use of UWB transmitters and receivers for millimeter resolution 3D positioning is novel.
  • one or more UWB transmitters 170 are fixedly mounted to the wand 152 and configured to sequentially transmit UWB signals to three or more UWB receivers 172 having known positions in a 3D coordinate system.
  • This embodiment of the positioning device 170 is comprised of active tags or transmitters 170 that are tracked by the UWB receivers 172 .
  • the system architecture of the UWB transmitter 170 is shown in FIG. 11 .
  • a low noise system clock (“crystal clock”) 174 triggers a baseband UWB pulse generator 176 (for instance a step recovery diode (“SRD”) pulse generator).
  • the baseband pulse from the baseband UWB pulse generator 176 is upconverted by a local oscillator 178 via a double balanced wideband mixer (not shown).
  • the upconverted signal is amplified and filtered (“bandpass filter” 180 ).
  • the signal is transmitted, via an omnidirectional antenna 182 , to the computer 96 ( FIG. 6 ).
  • the UWB signal may travel through an indoor channel where significant multipath and path-loss effects cause noticeable signal degradation.
  • the UWB receiver 172 architecture in accordance with one embodiment is shown in FIG. 12 .
  • the signal is received via a directional UWB antenna 184 and is filtered and amplified (“bandpass filter” 186 ), downconverted by a local oscillator 187 , and low-pass filtered (“LPF”) 188 .
  • a sub-sampling mixer 190 triggered by a second low noise system clock (“crystal clock”) 192 is used to time-extend the pulse by a factor of about 1,000 to about 100,000. This effectively reduces the bandwidth of the UWB pulse and allows sampling by a conventional analog-to-digital converter (“ADC”) 194 .
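A short back-of-the-envelope sketch, in Python, of why time-extending the pulse helps: stretching by a factor k divides the pulse bandwidth by k, lowering the sample rate a conventional ADC must support. The pulse bandwidth and stretch factor below are assumed values.

```python
# Hypothetical sketch: effect of equivalent-time (time-extended) sampling on a UWB pulse.
pulse_bandwidth_hz = 2.0e9        # ~2 GHz baseband UWB pulse (assumed)
stretch_factor = 50_000           # within the ~1,000-100,000x range given in the text

effective_bandwidth = pulse_bandwidth_hz / stretch_factor   # bandwidth after stretching
nyquist_rate = 2 * effective_bandwidth                      # minimum ADC sample rate
print(f"effective bandwidth: {effective_bandwidth/1e3:.0f} kHz")
print(f"minimum ADC sample rate: {nyquist_rate/1e3:.0f} kHz")
```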
  • Each UWB transmitter 170 and receiver 172 is in communication with the computer 96 . Accordingly, the computer 96 detects each time the UWB transmitter 170 transmits a UWB signal, as well as the time at which the UWB signal was transmitted. Similarly, the computer 96 detects the position of each of the UWB receivers 172 in the 3D coordinate system, as well as the time at which the UWB signal was received. The final time-difference-of-arrival (“TDOA”) calculation, via a UWB positioning system 183 , is shown in FIG. 13 .
  • At least four base receivers 172 are needed to localize the 3D position of the UWB transmitter 170 (“Tag”).
  • the geometry of the receivers Rx 1 , Rx 2 , Rx 3 , Rx 4 has important ramifications on the achievable 3D accuracy through what is known as geometric position dilution of precision (“PDOP”).
  • a combination of novel filtering techniques, high sample rates, robustness to multipath interference, accurate digital ranging algorithms, low phase noise local oscillators, and high integrity microwave hardware is needed to achieve millimeter-range accuracy (e.g., ranging from about 5 mm to about 7 mm in 3D, in real time).
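  • To make the TDOA idea concrete, the sketch below solves for a tag position from four receivers with a simple Gauss-Newton iteration; it is an illustration only (the receiver layout, noise-free timing, and solver details are assumptions) and does not reproduce the leading-edge ranging or filtering techniques described above.

```python
import numpy as np

C = 299_792_458.0  # propagation speed of the UWB signal (speed of light), m/s

def tdoa_solve(rx, toa, x0, iters=20):
    """Estimate the tag position from times of arrival at N >= 4 known receivers.

    Receiver 0 is used as the reference for the time differences.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(rx - x, axis=1)                     # modeled tag-to-receiver distances
        res = C * (toa[1:] - toa[0]) - (d[1:] - d[0])          # measured minus modeled range differences
        J = (x - rx[1:]) / d[1:, None] - (x - rx[0]) / d[0]    # Jacobian of the modeled differences
        dx, *_ = np.linalg.lstsq(J, res, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

# Four receivers at known positions and a simulated tag (ideal, noise-free arrival times).
rx = np.array([[0, 0, 0], [4, 0, 0], [0, 4, 0], [0, 0, 3]], dtype=float)
tag = np.array([1.2, 2.5, 0.8])
toa = np.linalg.norm(rx - tag, axis=1) / C
print(tdoa_solve(rx, toa, x0=[2.0, 2.0, 1.0]))   # converges to approximately [1.2, 2.5, 0.8]
```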
  • An analogy of the UWB positioning system 183 to a GPS system 185 is shown in FIG. 14 .
  • FIG. 15 shows actual experimental errors in each of the x-, y-, and z-coordinates for detecting the position of the UWB transmitter 170 in 3D space and in real-time for over 1000 samples while the transmitter 170 is moving freely within the 3D space.
  • Because the A-mode ultrasound transducers 150 and the UWB transmitters 170 are fixedly mounted to the wand 152 , the position of the ultrasound transducers 150 with respect to the position of the UWB transmitters 170 is known. Similarly, because the ultrasound transducers 150 generate signals representative of the straight-line distance between the ultrasound transducers 150 and the bone-tissue interface, and the position of the ultrasound transducers 150 with respect to the UWB transmitters 170 is known, the position of the bone-tissue interface with respect to the UWB transmitters 170 may be determined. In other words, as the wand 152 moves over the patient's epidermis, the UWB transmitters 170 transmit UWB signals that are correspondingly received by the UWB receivers 172 .
  • the UWB signals are processed by the computer 96 in order to discern whether the relative position of the UWB transmitters 170 has changed in the 3D coordinate system, as well as the extent of any such change.
  • This change in 3D position of the UWB transmitters 170 can be easily correlated to the position of the bone-tissue interface in 3D because the position of the bone relative to the ultrasound transducer 150 and the position of the UWB transmitters 170 with respect to the ultrasound transducer 150 are known.
  • the UWB 3D position data may be used in combination with the fixed position data (distance data for the position of the ultrasound transducers 150 with respect to the UWB transmitters 170 ) to generate the composite data as described above.
  • the wand 152 is repositioned over the skin of the patient, proximate to the knee joint 50 ( FIG. 1 ) while the knee joint 50 ( FIG. 1 ) is bent. Bending the patient's knee joint 50 ( FIG. 1 ) during data acquisition enables the creation of a 3D series of points for each of the bones of the knee joint 50 ( FIG. 1 ) (the distal femur 52 , the proximal tibia 54 , and the patella 56 ).
  • the data from the transducers 150 is transmitted to a wireless transmitter 200 mounted to the wand 152 . The wireless transmitter 200 receives the data from the transducers 150 and transmits the data via a wireless link to the computer 96 .
  • an internal power supply (not shown) may be provided.
  • the internal power supply comprises one or more rechargeable batteries.
  • A transformation is needed to convert the position data from a reference coordinate frame to a world frame of reference.
  • a linear movement of the ultrasound transducer 150 may be described in terms of the following quantities (one possible form of the corresponding position update is sketched after the definitions below):
  • s(n+1) is the position of the ultrasound transducer 150 at a current state
  • s(n) is the position from a previous state
  • v(n+1) is the instantaneous velocity of the current state
  • v(n) is the velocity from the previous state
  • a(n) is the detected acceleration
  • dt is the sampling time interval.
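  • A minimal sketch of the position update implied by these definitions, assuming the standard constant-acceleration form over each sampling interval (the exact form used by the system is not reproduced here):

```python
import numpy as np

def dead_reckon_step(s_n, v_n, a_n, dt):
    """One dead-reckoning step: integrate the detected acceleration into velocity and position."""
    v_next = v_n + a_n * dt                        # v(n+1) = v(n) + a(n)*dt
    s_next = s_n + v_n * dt + 0.5 * a_n * dt ** 2  # s(n+1) = s(n) + v(n)*dt + 0.5*a(n)*dt^2
    return s_next, v_next

s, v = np.zeros(3), np.zeros(3)
a = np.array([0.0, 0.1, 0.0])            # assumed constant acceleration of 0.1 m/s^2 along y
for _ in range(100):                     # 100 samples at dt = 10 ms (one second total)
    s, v = dead_reckon_step(s, v, a, dt=0.01)
print(s, v)                              # s ~ [0, 0.05, 0] m, v ~ [0, 0.1, 0] m/s
```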
  • the orientation of the ultrasound transducer 150 may be described by using a gravity-based accelerometer (for example, the ADXL330 from Analog Devices) and extracting the tilting information from each of a pair of orthogonal axes.
  • the acceleration output on each of the x-, y-, or z-axes is due to gravity and is equal to a_i = (V_out,i − V_off) / S, where:
  • a_i is the acceleration of the ultrasound transducer 150 along each of the x-, y-, or z-axes
  • V_out,i is the voltage output on each of the x-, y-, or z-axes
  • V_off is the offset voltage
  • S is the sensitivity of the accelerometer.
  • the orientation does not require information from the previous state once the accelerometer is calibrated.
  • the static calibration requires the resultant sum of accelerations from each of the three axes to equal 1 g (where g is the nominal acceleration due to gravity at the Earth's surface at sea level, defined to be precisely 9.80665 m/s², approximately 32.174 ft/s²).
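  • A minimal sketch of the voltage-to-acceleration relationship and the static calibration check described above; the 1.5 V offset and 300 mV/g sensitivity are nominal ADXL330-class figures assumed for illustration:

```python
import numpy as np

def volts_to_accel_g(v_out, v_off=1.5, sensitivity_v_per_g=0.300):
    """Implements a_i = (V_out,i - V_off) / S, returning acceleration in g."""
    return (np.asarray(v_out) - v_off) / sensitivity_v_per_g

# Static calibration check: at rest, the resultant of the three axes should be about 1 g.
a_xyz = volts_to_accel_g([1.50, 1.65, 1.77])       # example outputs for a tilted, stationary sensor
resultant = np.linalg.norm(a_xyz)
print(f"resultant = {resultant:.3f} g")
assert abs(resultant - 1.0) < 0.05, "static calibration check failed"
```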
  • an orientation sensor that provides yaw, pitch and roll information of the bodily tissue in question may be used.
  • One such orientation sensor may be the commercially-available model IDG-300 from InvenSense (Sunnyvale, Calif.).
  • the orientation of the ultrasound transducer 150 may then be resolved by using, for example, a direction cosine matrix transformation:
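  • A minimal sketch of a direction cosine matrix built from yaw, pitch, and roll and applied to a sensor-frame vector; the Z-Y-X rotation convention and the beam-axis choice are assumptions for illustration:

```python
import numpy as np

def direction_cosine_matrix(yaw, pitch, roll):
    """Rotation from the sensor frame to the world frame (Z-Y-X convention, angles in radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

# Resolve the transducer's beam axis (assumed to be +x in the sensor frame) into the world frame.
R = direction_cosine_matrix(np.radians(30), np.radians(10), np.radians(0))
print(R @ np.array([1.0, 0.0, 0.0]))
```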
  • the PEAUMR module 84 constructs a 3D model of the patient's knee joint 50 ( FIG. 6 ) from a set of transcutaneously acquired 3D data points (obtained using the tracked pulse echo A-mode ultrasound transducer 150 ) that, in total, are representative of the shape of the bone-tissue interface and therefore of each bone's surface.
  • software residing on the computer 96 may request a series of inputs from the user to adapt the diagnostic system 82 to equipment specific devices and the particular portion of the musculoskeletal anatomy to be modeled.
  • a menu 204 on a user interface 206 may be presented for the user to select the type of digitizer, which may include, without limitation, ultrasound.
  • the user may actuate buttons 205 a , 205 b to connect to or disconnect from the digitizer, respectively.
  • the set of points is generated, numerically recorded, viewable in a data window 210 , and ultimately utilized by the software to conform a selected bone model to the patient's actual bone shape. Consequently, the wand 152 is repositioned over the bones (the distal femur 52 , the patella 56 , the proximal tibia 54 ) for approximately 30 seconds so that the discrete points typify the topography of the bone. Repositioning the wand 152 over the bone in question for a longer duration results in more 3D points being generated, which increases the resolution and improves the accuracy of the patient-specific bone model.
  • a partial range of motion of the knee joint 50 ( FIG. 1 ) while repositioning the wand 152 over the knee joint ( FIG. 1 ) aids in scanning additional portions of the bone in question for new 3D points that may have been obscured by other bones in another range of motion position.
  • Before, during, or after the ultrasound data is acquired, the software provides various drop-down menus allowing the software to load a bone model 208 that is roughly the same shape as the patient's bone.
  • When the computer 96 receives the ultrasound data, software on the computer 96 interprets the A-mode ultrasound transducer data and constructs a 3D map having discrete 3D points corresponding to points on the surface of the scanned bone. That is, the shape of the patient's bone is reconstructed in virtual space, using a set of points outlining the surface of the patient's bone as acquired by the tracked ultrasound transducer 150 ( FIG. 10 ). The set of points is applied to atlas-based deformable model software to reconstruct the patient-specific 3D model.
  • the computer 96 may include a database having a plurality of bone models of various portions of the musculoskeletal system, for example, the femur 52 , the tibia 54 , and the patella 56 , that are classified and selectable in a menu 212 , for example based upon ethnicity, gender, height ranges, the side of the body, and so forth. Each of these classifications is accounted for in a drop-down menu of the software so that the model initially chosen by the software most closely approximates the body of the patient.
  • the computer 96 uses either a default bone model or the selected bone model as a starting point for construction of the ultimate patient-specific, virtual bone model.
  • the default bone model may be a generalized average, as the morphing algorithms use statistical knowledge of a wide database population of bones for a very accurate model.
  • the selected bone model expedites computation. For example, in the case of generating a patient-specific model of the femur 52 where the patient is a 53 year old, Caucasian male, who is six feet tall, a default femoral bone model is selected based upon the classification of Caucasian males having an age between 50-60, and a height ranging from 5′10′′ to 6′2′′.
  • the computer 96 superimposes the 3D points onto the default bone model and, thereafter, carries out a deformation process so that the bone model exhibits the 3D bone points detected during the signal acquisition.
  • the deformation process also makes use of statistical knowledge of the bone shape based upon reference bones of a wide population.
  • the resulting bone model is a patient-specific, virtual 3D model of the patient's actual bone. The foregoing process is repeated for each bone comprising the specific joint to create patient-specific, virtual 3D models of the patient's anatomy.
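  • The statistical atlas-based deformation itself is not reproduced here; the toy sketch below only illustrates the general idea of conforming a template bone model to the acquired surface points (a centroid alignment followed by pulling vertices toward their nearest measured points), and every function and parameter is an assumption:

```python
import numpy as np

def fit_template_to_points(template_vertices, acquired_points, steps=10, step_size=0.5):
    """Toy illustration of conforming a template bone model to acquired surface points.

    This is NOT the statistical atlas deformation described above; it simply aligns
    centroids and then iteratively moves each template vertex a fraction of the way
    toward its nearest acquired point.
    """
    v = np.array(template_vertices, dtype=float)
    p = np.array(acquired_points, dtype=float)
    v += p.mean(axis=0) - v.mean(axis=0)                 # crude rigid alignment (centroids only)
    for _ in range(steps):
        d = np.linalg.norm(v[:, None, :] - p[None, :, :], axis=2)
        nearest = p[np.argmin(d, axis=1)]                # nearest acquired point for each vertex
        v += step_size * (nearest - v)                   # pull vertices toward the measured surface
    return v

# Tiny example: a coarse "template" pulled onto a slightly larger measured surface.
template = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
measured = template * 1.1 + 0.05
print(fit_template_to_points(template, measured))
```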
  • the JKT module 86 may be configured to track the kinematics of the knee joint 50 ( FIG. 1 ) and display the kinematics on the patient-specific 3D bone model generated by the PEAUMR module 84 using, for example, one or more bone motion tracking braces 220 .
  • the bone motion tracking brace 220 includes pulse echo A-mode ultrasound transducers 222 to transcutaneously localize the bone-tissue interface and derive a set of points outlining each bone's surface.
  • the brace 220 includes a plurality of A-mode ultrasound transducers 222 fixedly mounted to the knee brace 220 .
  • there are at least two A-mode ultrasound transducers 222 (i.e., a “transducer group” 222 a , 222 b ) fixedly mounted to the knee brace 220 for tracking of the tibia 54 ( FIG. 1 ) and the femur 52 ( FIG. 1 ).
  • the knee brace 220 includes at least six ultrasound transducers 222 in order to track the two primary bones 52 , 54 ( FIG. 1 ) of the knee joint 50 .
  • Each transducer group 222 a , 222 b includes a rigid, mechanical connection linking the transducers 222 and the positioning devices 224 to the knee brace 220 . In this manner, the relative positions of the transducers 222 with respect to one another do not change.
  • a first transducer group 222 a at least partially circumscribes a distal portion of the femur 52 ( FIG. 1 ); while a second transducer group 222 b at least partially circumscribes a proximal portion of the tibia 54 ( FIG. 1 ); and an optional third transducer group (not shown) overlies the patella 56 ( FIG. 3 ) if patella kinematics are desired.
  • the ultrasound registration submodule 142 is accordingly configured to provide a plurality of static reference points for each bone as the bone is moved through a range of motion.
  • Each ultrasound transducer 222 is tracked using an accelerometer or a sensor-specific localizer (or any other appropriate inertial sensor). The tracking may then be used to generate localized bone points from the outputs of the ultrasound transducers 222 and to virtually display bone movement on the 3D model while the knee joint 50 ( FIG. 1 ) is taken through the range of motion.
  • the ultrasound dynamic movement submodule 144 comprises a plurality of positioning devices 224 that are configured to feed information to the computer 96 regarding the 3D position of each transducer group 222 a , 222 b of the ultrasound registration submodule 142 .
  • the positioning devices 224 may comprise light detectors operative to detect pulses of light emitted from light emitters having known positions. The light detectors 224 detect the light and transmit representative signals to control circuitry (not shown) associated with the knee brace 220 . The knee brace 220 transmits this information to the computer 96 , which also knows when the light pulses were emitted as a function of time and position.
  • the computer 96 may determine the position of the transducers 222 in the 3D coordinate system. Because the ultrasound transducers 222 and the optical devices 224 are fixedly mounted to the knee brace 220 , the position of the ultrasound transducers 222 with respect to the position of the optical devices 224 is known. Similarly, because the ultrasound transducers 222 are generating signals representative of the straight-line distance between each of the ultrasound transducers 222 and the bone-tissue interface beneath, and the position of the ultrasound transducers 222 with respect to the optical devices 224 is known, the position of the bone-tissue interface with respect to the optical devices 224 may be easily determined. In other words, as the knee joint 50 ( FIG. 1 ) is moved,
  • the optical devices 224 generate data from which the computer 96 determines that the relative position of the optical devices 224 has changed in the 3D coordinate system.
  • This change in the position of the optical devices 224 may be easily correlated to the position of the bone in question in 3D because the position of the bone relative to the ultrasound transducer groups 222 a , 222 b is known, as is the position of the optical devices 224 with respect to the ultrasound transducer groups 222 a , 222 b .
  • the optical devices 224 generate data that is used in combination with the fixed position data (distance data for the position of the ultrasound transducers 222 ) to generate the composite data.
  • the composite data may, in turn, be used to create a dynamically moving map of the bone on the patient-specific 3D model.
  • the positioning devices 224 may be comprised of one or more IMUs. Because the ultrasound transducers 222 and the IMUs 224 are fixedly mounted to the knee brace 220 , the relative positions between the ultrasound transducers 222 and the IMUs 224 are known. Similarly, because the ultrasound transducers 222 are generating signals representative of the straight-line distance between the transducer 222 and the bone-tissue interface, and the position of the transducers 222 with respect to the IMUs 224 is known, the position of the bone with respect to the IMUs 224 may be easily determined. In other words, as the knee joint 50 ( FIG. 1 ) is moved,
  • the IMUs 224 generate data from which the computer 96 determines that the position of the IMUs 224 has changed.
  • This change in the position of the IMUs 224 may be easily correlated to the position of the bone in 3D because the position of the bone relative to the ultrasound transducer groups 222 a , 222 b is known, as well as the position of the IMUs 224 with respect to the ultrasound transducer groups 222 a , 222 b .
  • any movement of the IMUs 224 in space means that the knee brace 220 has also moved in space, and by continuing to track the distance data provided by each IMU 224 , the movement of the bone may be correspondingly tracked.
  • IMU tracking of the bone movements requires a static registration between the IMUs 224 and an initial known body position (such as standing).
  • the IMUs 224 enable measurement of the relative motion between different bones via their corresponding ultrasound transducer group data and the IMU data.
  • the IMUs 224 may be used alone or in conjunction with other positioning devices 170 ( FIG. 10 ), such as those described in detail above. In this scenario, the IMU position is updated at a certain interval with the absolute position provided by the additional positioning system to minimize error. Therefore the two positioning systems act together as one positioning system.
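  • A minimal sketch of the combined positioning idea described above: the dead-reckoned IMU estimate accumulates drift, and a periodic absolute fix (e.g., from the UWB or optical system) is blended in to bound the error; the blending weight, update interval, and noise figures are assumptions:

```python
import numpy as np

def fuse_position(imu_estimate, absolute_fix, alpha=0.8):
    """Blend the drifting IMU estimate with an absolute fix (complementary-style update)."""
    return alpha * np.asarray(absolute_fix) + (1.0 - alpha) * np.asarray(imu_estimate)

rng = np.random.default_rng(0)
true_pos = np.zeros(3)
est = np.zeros(3)
for step in range(1, 201):
    true_pos = true_pos + np.array([0.001, 0.0, 0.0])                      # actual motion per sample
    est = est + np.array([0.001, 0.0, 0.0]) + rng.normal(0.0, 1e-4, 3)     # IMU step with small drift
    if step % 20 == 0:                                                     # periodic absolute update
        est = fuse_position(est, true_pos + rng.normal(0.0, 5e-3, 3))
print("final error (m):", np.linalg.norm(est - true_pos))
```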
  • the positioning devices 224 of the brace 220 may alternatively be comprised of one or more ultra wide band (UWB) transmitters.
  • UWB transmitters 224 are fixedly mounted to the brace 220 and operable to transmit sequential UWB signals to three or more UWB receivers (not shown) having known positions in the 3D coordinate system.
  • Each UWB transmitter 224 is in communication with the computer 96 , as are the plurality of UWB receivers (not shown). Accordingly, the computer 96 detects each time the UWB transmitter transmits a UWB signal, as well as the time at which the UWB signal was transmitted.
  • the computer 96 detects the position of each of the UWB receivers (not shown) in the 3D coordinate system, as well as the time at which the UWB signal was received. The computer 96 may then use custom digital signal processing algorithms to accurately locate the leading edge of the received UWB pulse based on the position of each UWB receiver (not shown), the time when each UWB signal was received, and the time that the UWB signal was transmitted. The position may then be determined by the TDOA calculation as was described with reference to FIG. 13 . Again, because the ultrasound transducers 222 do not move with respect to the knee brace 220 , any movement of the transducers 222 in space means that the brace 220 has moved.
  • the movement of the knee brace 220 is tracked using the computer 96 in combination with the UWB transmitters 224 and the UWB receivers (not shown). Similarly, because the UWB transmitters 224 maintain a fixed orientation relative to the ultrasound transducers 222 , changes in the position of the UWB transmitters 224 in the 3D coordinate system may correspondingly be used to track the movement of each bone.
  • the brace 220 may include a transmitter 228 , such as a UWB transmitter, in communication with the ultrasound transducer 222 to facilitate wireless communication of data to the computer 96 .
  • If the UWB transmitter 228 is also utilized as the positioning device 224 , a dedicated transmitter 228 is unnecessary, as the UWB transmitters 224 could also function to send the ultrasound data directly to the computer 96 over a wireless link.
  • the transmitter 228 and a field programmable gate array design enable the computations to be carried out on a real-time basis. For example, as the patient's knee joint 50 ( FIG. 1 ) is bent while wearing the brace 220 , the ultrasound data is immediately transmitted to the computer 96 , which, in real time, calculates and displays the position and movement of each bone with the 3D patient-specific bone model.
  • FIG. 18 illustrates a knee brace 230 in accordance with another embodiment of the present invention.
  • the knee brace 230 has a first sub-brace 232 positioned at the distal portion of the femur 52 , a second sub-brace 234 positioned at the proximal end of the tibia 54 , and a third sub-brace 236 positioned at the patella 56 ( FIG. 3 ).
  • the sub-braces 232 , 234 , 236 include a plurality of transducers mounted thereto. Each transducer is responsible for determining the location of a point on the surface of the bone during movement of the knee joint 50 .
  • the sub-braces 232 , 234 , 236 reduce the occurrence of problems of locating and tracking the bone using ultrasound data when the motion of the bone relative to the skin is small compared to the gross joint motion. There are at least three approaches disclosed herein for tracking the motion of the ultrasound transducers themselves.
  • FIG. 19 illustrates the first approach commonly referred to herein as an “ITT” (individual transducer tracking) approach.
  • each transducer 238 in the sub-brace 232 has an associated tracking module 240 to individually track each transducer 238 .
  • the transducers 238 may be supported by a flexible length of strap.
  • the second approach, commonly referred to herein as an “ITML” (inter-transducer mechanical link) approach ( FIG. 20 ), involves the transducers 242 being connected to each other by movable mechanical links 244 .
  • Each mechanical link 244 includes length and angle sensors 246 that allow for detection of the movement of the transducers 242 relative to one another and the relative translational motions of the links 244 . Every two links 244 are connected by a pivot pin 248 that allows rotation and translation of the links 244 relative to each other.
  • the length and angle sensors 246 are mounted to at least one link 244 and proximate to the pivot pin 248 to allow for detection of the angle between adjacent links 244 .
  • the ITML approach uses fewer localizers than the ITT approach of FIG. 19 .
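  • A minimal two-dimensional sketch of how the ITML length and angle sensor readings could be chained to locate each transducer relative to a tracked origin; the planar simplification and the sensor conventions are assumptions:

```python
import numpy as np

def transducer_positions(link_lengths, joint_angles, origin=(0.0, 0.0)):
    """Chain the length/angle readings of the mechanical links (planar simplification).

    link_lengths -- length sensor readings for each link
    joint_angles -- angle sensor readings at each pivot pin, relative to the previous link
    Returns the 2D position of the far end of each link, starting from a tracked origin.
    """
    positions = []
    x, y = origin
    heading = 0.0
    for length, theta in zip(link_lengths, joint_angles):
        heading += theta                       # accumulate relative pivot angles
        x += length * np.cos(heading)
        y += length * np.sin(heading)
        positions.append((x, y))
    return positions

# Three 4 cm links wrapping partway around the limb (lengths and angles assumed).
print(transducer_positions([0.04, 0.04, 0.04], np.radians([0, 25, 25])))
```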
  • the third approach commonly referred to herein as a “RT” (Rotating Transducer) approach, involves using a single ultrasound transducer 250 that is mounted to a carriage 252 .
  • the carriage 252 traverses along a track 254 , located on the inner circumference 256 of the sub-brace 249 .
  • the carriage 252 may be moved along the track 254 by a string loop 258 that is wrapped around the drive shaft (not shown) of a motor 260 .
  • When the transducer 250 reaches the motor 260 , the rotation direction of the motor 260 is changed and the transducer 250 moves in the opposite direction.
  • a tracking module 262 such as an inertia-based localizer is mounted to the transducer 250 to track its motion. As the transducer 250 rotates within the inner circumference 256 of the sub-brace 249 , it collects data as to the bone-tissue interface.
  • the RT approach includes the advantage of lower cost than the stationary transducer designs and higher accuracy, due to the greater number of localized bone surface points for each tracking step, while maintaining mechanical flexibility.
  • a localizer 270 for tracking each ultrasound transducer 238 ( FIG. 19 ) mounted to the sub-brace 232 ( FIG. 19 ) is shown.
  • the localizer 270 comprises a plurality of nodes 272 , with each node 272 comprising a CMOS accelerometer and a temperature sensor (not shown) for thermal drift compensation. Each node 272 is integrated to minimize noise and distortion.
  • the outputs of the accelerometers 272 regarding the x-, y-, and z-coordinates and the temperature sensors (not shown) are directed to a multiplexer 274 (“MUX”) that multiplexes the signals. Multiplexed outputs are amplified by an amplifier 276 (“AMP”), and then directed to an ADC 278 .
  • the digital conversion of the signal may be performed within or outside the accelerometers 272 . The output digital signals may then be directed to a wireless transmitter 280 by way of a parallel-input/serial-output device 282 .
  • the electronic architecture includes a high voltage amplifier circuit 286 (“HV IX AMP”) feeding a high voltage multiplexer circuit (“HV MUX”) 288 to excite each ultrasound transducer 238 ; the multiplexer thereby acts as an analog switch.
  • the echo signals from each transducer 238 are multiplexed pursuant to a logic control directing the opening of the switches in the MUX 290 at precise intervals.
  • An exemplary logic control is the MSP430, available from Texas Instruments, Inc. (Dallas, Tex.).
  • the output from the MUX 290 is amplified by a low noise AMP 292 (“LNA”), and the signal is conditioned using a conditioning circuit (for example, a time-gain-control (“TGC”) circuit 294 and a band-pass filter (“BPF”) 296 ) and digitized using an ADC 298 .
  • Electric power to the foregoing components is supplied by way of a battery 300 , which also supplies power to a wireless transmitter module 302 .
  • the wireless transmitter module 302 utilizes a universal asynchronous receiver/transmitter (“UART”) protocol.
  • the wireless transmitter module 302 includes a wireless transmitter circuit 304 receiving the output from a first-in, first-out (“FIFO”) buffer (not shown) of the ADC 298 by way of a serial interface 306 .
  • An output from the wireless transmitter circuit 304 is conveyed using a serial link coupled to an antenna 308 .
  • Signals conveyed through the antenna 308 are broadcast for reception by a wireless receiver (not shown) coupled to a controller (not shown) or the computer 96 ( FIG. 6 ).
  • an exemplary high voltage circuit 310 is shown and may be used to trigger and generate the excitation energy for a piezoelectric crystal in the ultrasound transducer 238 ( FIG. 19 ).
  • Exemplary high voltage circuits 310 for use in this embodiment may include, without limitation, the pulser integrated circuit (HV379) available from Supertex, Inc. (Sunnyvale, Calif.).
  • an exemplary high voltage multiplexer 312 is shown and may be used to trigger and excite multiple piezoelectric transducers 238 ( FIG. 19 ) without increasing the number of high voltage circuits 310 ( FIG. 24 ).
  • Exemplary high voltage multiplexers 312 for use in this embodiment may include, without limitation, the high voltage multiplexer (HV2221) available from Supertex, Inc. (Sunnyvale, Calif.).
  • the advantage of using a high voltage multiplexer 312 is the ability to use CMOS level control circuitry, thereby making the control logic compatible with virtually any microcontroller or field programmable gate array that is commercially-available.
  • an exemplary receiving circuit 314 which comprises the MUX 290 , the LNA 292 , the TGC 294 , the BPF 296 , and the ADC 298 is shown and may be utilized to receive the echo signals from each transducer 238 .
  • Exemplary receiving circuits 314 for use in this embodiment include, without limitation, the AD9271 8-channel ultrasound receiving integrated circuit, available from Analog Devices, Inc. (Norwood, Mass.).
  • Referring to FIGS. 28 and 29 , one method 316 of using X-ray fluoroscopy and in-vivo measurements of dynamic knee kinematics, as described above, for understanding the effects of joint injuries and diseases and for evaluating the outcome of surgical procedures is described.
  • six degrees of freedom (“DOF”) are determined for the knee joint 50 ( FIG. 1 ) and include the position and orientation of each bone comprising the knee joint 50 ( FIG. 1 ).
  • the accuracy of this method 316 is within 1° of rotation and 1 mm of translation (except for translations that are parallel to the viewing plane).
  • Implementation of the method 316 includes joint movement visualization via the 3D model reconstruction with A-mode ultrasound system, as described previously.
  • the method 316 also measures the vibrations produced to accurately localize the vibrational center and to determine the cause of the vibrations' occurrence.
  • Vibrations generated through the interactions of implant components, bones, and/or soft tissues result from an induced driving force that leads to a dynamic response.
  • the driving force may be associated with knee-ligament instability, bone properties, and conditions.
  • In a normal, intact knee joint 50 ( FIG. 1 ), characteristic kinematic and vibrational behavior is exhibited. When degeneration or damage occurs to the knee joint 50 ( FIG. 1 ), both the kinematic and vibrational characteristics change. This alteration, for each type of injury or degeneration, leads to distinct changes (or a signature) that may be captured by the kinematic and vibration methods described herein.
  • FIGS. 28-34 illustrate a diagnostic system 320 configured to perform the method 316 in accordance with one embodiment of the present invention.
  • the diagnostic system 320 includes the ID module 90 configured to diagnose soft tissue and bone injuries.
  • a first patient having a normal knee joint and a second patient having an anterior cruciate ligament deficit (“ACLD”) may exhibit a similar pattern of posterior femoral translation during progressive knee flexion; however, the first and second patients exhibit different axial rotation patterns at 30° of knee flexion.
  • the ID module 90 includes three stages: (1) a first stage that involves data analysis, (2) a second stage that includes sending the data to a neural network for detecting an injury, and (3) a third stage that classifies or determines severity of a detected injury.
  • the first stage includes acquisition of kinematic feature vectors, using multiple physiological measurements taken from the patient while the patient moves the knee joint 50 ( FIG. 1 ) through a range of motion.
  • Exemplary measurements may include, without limitation, medial condyle anteroposterior (“MAP”) motion and lateral condyle anteroposterior (“LAP”) motion.
  • the MAP and LAP motions pertain to the anterior-posterior (“AP”) distance of the medial and lateral condyle points 110 , 114 ( FIG. 1 ), respectively, relative to a tibial geometric center.
  • Other exemplary measurements may include lateral shear interferometer (“LSI”) measurement of the distance associated with the lateral femoral condyle 114 ( FIG. 1 ) and a corresponding medial shear interferometer measurement ( FIGS. 30A-30C ).
  • Feature vectors may also include the femoral position with respect to the tibia, which is defined by three Euler angles 340 and three translation components, together with the vibrational signal 342 and force data 344 . Examples of these vectors are shown in FIGS. 31A-31C , respectively.
  • FIG. 32 is a graphical representation 346 showing the average medial and lateral condyle positions during a deep knee bend activity for the second patient having ACLD.
  • the feature vectors that are extracted from the kinematic and vibration analyses are output to the neural network 98 ( FIG. 6 ) for determining the injury, as described in greater detail below.
  • FIG. 33 illustrates one embodiment of a neural network classifier 322 having multiple binary outputs 323 a , 323 b , 323 c , 323 d , i.e., each output is either a “1” or a “0,” wherein a “1” corresponds to “yes” and a “0” corresponds to “no.”
  • each output 323 a , 323 b , 323 c , 323 d represents the response of the neural network 98 ( FIG. 6 ) to a particular injury type.
  • one output 323 b may represent the response for ACLD, wherein its state will be “1” if an ACL injury is detected, and “0” otherwise.
  • the neural network 98 ( FIG. 6 ) and the classifier 322 may be significantly more or less sophisticated, depending on the underlying model of the joint in question.
  • FIG. 34 illustrates one embodiment of a construction 325 of the neural network 98 ( FIG. 6 ).
  • the construction 325 includes formulating a supervised classifier using a training set 324 of the kinematic and vibration data corresponding to a dataset 326 of normal and injured knee joints.
  • the neural network 98 ( FIG. 6 ) is trained with the training set 324 of vectors, wherein each vector consists of data (sound 328 , kinematic 330 , and force 332 ) collected from the knee joint 50 ( FIG. 1 ).
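  • A minimal sketch of a supervised classifier with multiple binary outputs trained on feature vectors of sound, kinematic, and force data; the scikit-learn network, its size, and the synthetic labels are assumptions for illustration and are not the patent's actual neural network or training data:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic feature vectors: [vibration features | kinematic features | force features].
# Entirely fabricated data, used only to show the multi-output "1"/"0" encoding.
X = rng.normal(size=(200, 12))
y = np.zeros((200, 4), dtype=int)          # columns: normal, ACLD, osteoarthritis, other
y[X[:, 0] > 0.5, 1] = 1                    # pretend one vibration feature indicates ACLD
y[y.sum(axis=1) == 0, 0] = 1               # everything else labeled "normal"

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)                              # multi-label training: each output is 1 or 0

new_case = rng.normal(size=(1, 12))
print(clf.predict(new_case))               # e.g. [[1 0 0 0]] -> "normal", no injury detected
```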
  • Fluoroscopy data 333 may be used to calculate the kinematics. While fluoroscopy data 333 is highly accurate, it requires the patient to remain within the small working volume of the fluoroscope unit and subjects the patient to ionizing radiation for a prolonged period of time. For most dynamic activities where the joints are loaded, such as running, jumping, or other dynamic activities, fluoroscopy is an unacceptable alternative. Therefore, use of fluoroscopy data 333 is not required.
  • EMG electrodes 337 may also be utilized as a data input for the computer 96 ( FIG. 6 ) and the neural network 98 ( FIG. 6 ).
  • EMG electrodes 337 are mounted to the surface of the skin proximate the muscles adjacent the knee joint 50 ( FIG. 1 ) to monitor the electrical signal transmitted to the muscles in order to provide relevant data of a muscle injury or disorder.
  • the neural network 98 may be used to classify new cases and categorize an injury type using these kinematic 330 , vibration 328 , and force 332 data.
  • accommodating the desired types and classifications of injuries necessarily includes training the neural network 98 ( FIG. 6 ) on these very types and classifications.
  • Exemplary types and classifications of injuries to mammalian knee joints include, without limitation, osteoarthritic conditions, soft tissue damage, and abnormal growths.
  • the neural network 98 ( FIG. 6 ) needs to be trained to differentiate between normal and abnormal knee conditions.
  • data acquired from the knee joint 50 ( FIG. 1 ) are compiled and input as a testing set 327 to the trained neural network 334 .
  • the trained neural network 334 then diagnoses the condition of the knee joint 50 ( FIG. 1 ), and returns one of the outputs 323 a , 323 b , 323 c , 323 d.
  • a knee brace in accordance with an embodiment of the present invention may be worn by a patient for an extended period of time while performing normal activities.
  • the patient may wear a device incorporating components of at least one of the JKT module 86 ( FIG. 6 ), the VA module 88 ( FIG. 6 ), and the foot module 92 ( FIG. 6 ) during activities that are not reproducible in the office (for example, weight lifting, racquet ball, etc.) and that elicit the pain or patient's symptoms.
  • the patient may turn the device on immediately prior to the activity and/or the patient may mark the onset of the pain or symptoms when it occurs. This enables analysis of the data from a few seconds before the marked time to see what abnormal sounds or joint kinematics were occurring.
  • Data may be stored on a portable hard drive (or any other portable storage device) and then may be downloaded to exemplary systems for analysis.
  • the data can be wirelessly transmitted and stored in a computer. It can also be stored on a miniature memory drive if field data is desired. If the occurrence of the pain is more random, some embodiments of the devices may continuously acquire data, although continuously monitoring devices may require a larger data storage capacity.
  • embodiments may be easily adapted to other joints of the musculoskeletal system of a mammalian animal.
  • embodiments may be adapted for use on hips, ankles, toes, spines, shoulders, elbows, wrists, fingers, and temporomandibular joints.

Abstract

A device for acquiring data and diagnosing a musculoskeletal injury. The device includes a semi-flexible housing, at least one ultrasonic transducer, a positional localizer, and a transmission system. The semi-flexible housing is positioned proximate a portion of the musculoskeletal system of a patient and supports the at least one ultrasonic transducer and the positional localizer. The at least one ultrasonic transducer is configured to acquire an ultrasonic data indicative of a bone surface. The positional localizer is positioned at a select location relative to the at least one ultrasonic transducer and tracks movement of the housing. The transmission system transmits the ultrasonic data of the at least one ultrasonic transducer and the movement data of the positional localizer to a data analyzer for analysis and diagnosis.

Description

    RELATED APPLICATIONS
  • The present application claims the filing benefit of co-pending PCT Patent Application No. PCT/US2010/022939, filed on Feb. 2, 2010, and is a Continuation-In-Part of co-pending U.S. patent application Ser. No. 12/364,267, filed on Feb. 2, 2009, the disclosures of both applications are hereby incorporated by reference herein in their entirety.
  • FIELD OF THE INVENTION
  • The present invention relates to devices and methods for evaluating a physiological condition of a musculoskeletal system and, more particularly, to evaluating the physiological condition of bodily joints.
  • BACKGROUND OF THE INVENTION
  • In humans, the knee joint 50, as shown in FIGS. 1, 2, and 3, is functionally controlled by a mechanical system governed by three unique types of forces: (1) active forces resulting from motion, such as those resulting from a muscle flexing or relaxing; (2) constraining forces that constrain motion, such as those resulting from ligaments being in tension; and (3) interaction forces that resist motion, such as those acting upon bones. In addition to these three types of forces, the soft tissue in the knee joint 50 (e.g., cartilage and the meniscus) produce a dampening effect distributing the compressive loads acting on the knee joint 50.
  • Knee joint motions are stabilized primarily by five ligaments which restrict and regulate the relative motion between the femur 52, the tibia 54, and the patella 56. These ligaments are the anterior cruciate ligament (“ACL”) 58, the posterior cruciate ligament (“PCL”) 60, the medial collateral ligament (“MCL”) 62, the lateral collateral ligament (“LCL”) 64, and the patellar ligament 66. An injury to any one of these ligaments 58-66 or other soft-tissue structures may cause detectable changes in knee kinematics and the creation of detectable vibrations, each of which may be representative of the type of knee joint injury and/or the severity of the injury. These visual (knee kinematics) and auditory (vibrations) changes are produced as the bones 52, 54, 56 move in a distorted kinematic pattern and differ significantly from the look and sound of a properly balanced knee joint 50 moving through the same range and types of motion.
  • Conventionally, knee vibration has been detected using microphones with or without stethoscope equipment and correlated with clinical data regarding various joint problems. However, microphones and stethoscopes cannot reliably detect frequencies, especially those experiencing strong interference from noise. Also, the signal clearance can be substantially influenced by skin friction. It is desirable, therefore, to provide a diagnostic tool that compares patient-specific data with kinematic data while providing visual feedback to clinicians.
  • SUMMARY OF THE INVENTION
  • While the present invention will be described in connection with certain embodiments, it will be understood that the present invention is not limited to these embodiments. To the contrary, this invention includes all alternatives, modifications, and equivalents as may be included within the spirit and scope of the present invention.
  • A device for acquiring data and diagnosing a musculoskeletal injury in accordance with one embodiment of the present invention includes a semi-flexible housing, at least one ultrasonic transducer, a positional localizer, and a transmission system. The semi-flexible housing is positioned proximate a portion of the musculoskeletal system of a patient and supports the at least one ultrasonic transducer and the positional localizer. The at least one ultrasonic transducer is configured to acquire an ultrasonic data indicative of a bone surface. The positional localizer is positioned at a select location relative to the at least one ultrasonic transducer and tracks movement of the housing. The transmission system transmits the ultrasonic data of the at least one ultrasonic transducer and the movement data of the positional localizer to a data analyzer for analysis and diagnosis.
  • Another embodiment of the present invention is directed to a method of diagnosing a musculoskeletal injury. The method includes creating a 3D model of a portion of the musculoskeletal system of a patient. Feature data is acquired by a sensor that is positioned proximate the portion of the musculoskeletal system having the injury. The feature data is compared, by a neural network, to a database of feature data. A dataset within the database of feature data is representative of the musculoskeletal injury. Then, based on the comparison, a diagnosis is returned.
  • Still another embodiment of the present invention is directed to a diagnostic system for diagnosing a musculoskeletal injury. The system includes a 3D model reconstruction module that acquires structural data indicative of a bone surface. The bone is within a portion of the musculoskeletal system of a patient. The 3D model reconstruction module constructs a patient-specific model from the structural data. The system further includes a kinematic tracking module that acquires movement data while the portion of the musculoskeletal system is articulated. A vibroarthography module acquires vibration data generated by the articulation. The structural data, the movement data, and the vibration data are received and analyzed by an intelligent diagnosis module in order to determine injury type.
  • The above and other objects and advantages of the present invention shall be made apparent from the accompanying drawings and the description thereof.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present invention and, together with a general description of the invention given above, and the detailed description of the embodiments given below, serve to explain the principles of the present invention.
  • FIG. 1 is a side elevational view of a posterior portion of a knee joint with a 90° flexion.
  • FIG. 2 is a side elevational view of the knee joint of FIG. 1 but with the knee joint fully extended.
  • FIG. 3 is a side elevational view of an anterior portion of the knee joint in FIG. 1.
  • FIG. 4 is a flow chart illustrating a method of determining a type of knee injury in accordance with one embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a diagnostic system in accordance with one embodiment of the present invention.
  • FIG. 6 is another schematic diagram of the diagnostic system of FIG. 5.
  • FIG. 7 is a schematic view of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 8 is a side elevational view of a vibration detection module in accordance with one embodiment of the present invention.
  • FIG. 9 is a side elevational view of an exemplary shoe having a sensor array, for a shoe module, in accordance with one embodiment of the present invention.
  • FIG. 9A is an exemplary wireless transmitter for use with the shoe module of FIG. 9.
  • FIG. 9B is an enlarged view of one exemplary positional sensor of the shoe module of FIG. 9.
  • FIG. 10 is a schematic view of an ultrasound transducer wand for use with the diagnostic system in accordance with one embodiment of the present invention.
  • FIG. 11 is a diagrammatic view of an ultra wide band transmitter in accordance with one embodiment of the present invention.
  • FIG. 12 is a diagrammatic view of an ultra wide band receiver in accordance with one embodiment of the present invention.
  • FIG. 13 is a Cartesian coordinate system depicting an ultra wide band positioning system in accordance with one embodiment of the present invention.
  • FIG. 14 is a diagrammatic view comparing one embodiment of an ultra wide band positioning system to a global positioning system.
  • FIG. 15 illustrates the error in detecting a position along each of the x-, y-, and z-axes and with respect to a sequentially acquired series of data points.
  • FIG. 16 is an exemplary screen capture of a user interface of the diagnostic system of FIG. 5.
  • FIG. 17 is a side elevational view of a leg with a knee brace in accordance with another embodiment of the present invention.
  • FIG. 18 is a side elevational view of a leg with a knee brace in accordance with another embodiment of the present invention.
  • FIG. 19 is an individual transducer tracking sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 20 is an inter-transducers mechanical link sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 21 is a rotating transducer sub-brace for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 22 is a diagrammatic representation of an inertia based localizer circuit in accordance with one embodiment of the present invention.
  • FIG. 23 is a diagrammatic representation of an alternate individual transducer tracking sub-brace circuit architecture in accordance with one embodiment of the present invention.
  • FIG. 24 is a diagrammatic representation of a high voltage circuit for use with a knee brace in accordance with one embodiment of the present invention.
  • FIG. 25 is a diagrammatic representation of the circuit layout of the high voltage circuit of FIG. 24.
  • FIG. 26 is a diagrammatic representation of a high voltage multiplexer for use with a sub-brace of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 27 is a diagrammatic representation of a receiving circuit for use with a sub-brace of a knee brace in accordance with one embodiment of the present invention.
  • FIG. 28 is a diagrammatic representation of a diagnostic system in accordance with an embodiment of the present invention.
  • FIG. 29 is a flow chart illustrating one method of using the diagnostic system of FIG. 28.
  • FIGS. 30A-30C illustrate various kinematic feature vectors acquired from a knee joint moving through a range of motion.
  • FIGS. 31A-31C illustrate feature vectors of a femoral position with respect to the tibia.
  • FIG. 32 illustrates average medial and lateral femoral condyle positions during a deep knee bend of a patient having an anterior cruciate ligament deficit.
  • FIG. 33 is a diagrammatic representation of a neural network classifier in accordance with one embodiment of the present invention.
  • FIG. 34 is a diagrammatic representation of a construction of a neural network.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The exemplary embodiments of the present invention are illustrated and described below to encompass diagnosis of bodily abnormalities and, more particularly, devices and methods for evaluating the physiological condition of the musculoskeletal system (such as joints) to discern whether abnormalities exist and the extent of any abnormalities. Of course, it will be apparent to those of ordinary skill in the art that the exemplary embodiments discussed below are merely examples and may be reconfigured without departing from the scope and spirit of the present invention. However, for clarity and precision, the exemplary embodiments, as discussed below, may include optional steps, methods, and features that one of ordinary skill should recognize as not being a requisite to fall within the scope of the present invention. By way of example, the exemplary embodiments disclosed herein are described with respect to diagnosing a knee joint injury. Nevertheless, the exemplary embodiments may be utilized to diagnose other injuries of the musculoskeletal system (such as a hip joint injury or a bone fracture), as the knee joint 50 (FIG. 1) is merely exemplary to facilitate an understanding of the embodiments disclosed.
  • Turning now to the figures and in particular to FIG. 4, with reference also to FIG. 1, a low level exemplary process flow for a method 70 of determining a type of knee joint injury in accordance with one embodiment of the present invention is described. Still more particularly, the method 70 includes constructing a 3D model of the knee joint 50 (Block 72), which may include the detection of motion sound (Block 74) as well as tracking the kinematics (Block 76). The detected sound and tracked kinematics are automatically analyzed (Block 78) and the knee injury recognized based upon the analysis (Block 80).
  • FIG. 5 illustrates a first exemplary diagnostic system 82 for implementing the method 70 of FIG. 4. The diagnostic system 82 includes four modules: (1) a pulse echo A-mode ultrasound based 3D model reconstruction (“PEAUMR”) module 84 (FIG. 6) for constructing a patient-specific 3D-model of the patient's knee joint 50 (FIG. 1); (2) a joint kinematics tracking (“JKT”) module 86 for tracking the kinematics of the knee joint 50 (FIG. 1) using the patient-specific 3D model of the knee joint 50 (FIG. 1) from the PEAUMR module 84; (3) a vibroarthography (“VA”) module 88 for capturing sounds emanating from the knee joint 50 (FIG. 1) while in motion; and (4) an intelligent diagnosis (“ID”) module 90 for identifying a likely diagnosis of the knee joint 50 (FIG. 1) using the kinematic data and the vibration data. Each of these four modules 84-90 is described in further detail below. If desired, a foot module 92 (FIG. 6) may be included with the JKT module 86 for providing dynamic force data, also described in detail below.
  • It will be understood by those of skill in the art that the diagnosis system 82 is usable with or without the use of the VA module 88. For example, the present invention may be used to mathematically describe the relative motion of the bones 52, 54, 56 in the patient's knee joint 50 as such motion is tracked on a 3D-patient specific bone model. The bone model and motion may be compared with a database of mathematical descriptions of joint motion. The database could contain mathematical descriptions of healthy or clinically undesirable joint motion.
  • As will be discussed in more detail hereafter, the interaction between bodily tissue (e.g., bone against cartilage or bone against bone) in a dynamic environment creates certain vibrations that are indicative of the condition or state of health of the joint. Even the healthiest and youngest joints create vibrations. However, joints that exhibit degradation, whether through wear or injury, will exhibit vibrations that are much more pronounced and amplified as compared to those of a healthy joint. The VA module 88 with the diagnostic system 82 utilizes those sounds, such as vibrations, exhibited by the joint during a range of motion to diagnose the condition of the joint without requiring an invasive procedure or subjecting the patient to radiation.
  • FIG. 6 provides still further details of the diagnostic system 82. The modules 84-92 may output the acquired data to a computer 96 for data processing by way of, for example, a neural network 98. The data processing, as will be discussed in more detail below, may provide one or more of a visual output, an audible output, and a diagnosis by way of a visual display 100.
  • Referring still to FIGS. 4-6, and now also FIG. 7, the VA module 88 is shown and comprises a plurality of accelerometers (three are shown 120 a, 120 b, 120 c) that are utilized to detect sound, specifically, vibrations occurring as a result of motion of the knee joint 50. In this exemplary VA module 88, the accelerometers 120 a, 120 b, 120 c are mounted directly to the skin or external tissue surface of the patient, as skin-mounted sensor 119 s, in order to detect sounds from bone and soft tissue interaction. An intervening adhesive may be utilized between the accelerometers 120 a, 120 b, 120 c. In the context of the knee joint 50, the VA module 88 includes one accelerometer 120 a mounted on the medial side of the knee joint 50, a second accelerometer 120 b mounted on the lateral side of the knee joint 50, and a third accelerometer 120 c mounted on the front side of the knee joint 50, proximate the patella 56 (FIG. 3). As illustrated, the accelerometers 120 a, 120 b, 120 c are mounted to the patient so that each lies along a common plane 121, though this is not required. It should also be understood, however, that any number of accelerometers 120 a, 120 b, 120 c may be utilized to detect sounds generated by the patient's knee joint 50.
  • Each accelerometer 120 a, 120 b, 120 c is in communication with one or more signal conditioning circuits or electronics 122. The accelerometers 120 a, 120 b, 120 c are operative to detect sound, specifically vibrations, and output the sound detected in the form of frequency data (measured in Hertz) to the conditioning circuits 122. This frequency data is processed by the conditioning circuits 122 and communicated to the computer 96 as digital frequency data. While the accelerometers 120 a, 120 b, 120 c are generating frequency data, the conditioning circuits 122 may include a clock 123 to time stamp the frequency data generated. As will be discussed in more detail below, correlating the frequency data with the time stamp provides a constant against which all of the detected data can be compared on a relative scale.
  • The first accelerometer 120 a on the medial side of the knee joint 50 detects vibrations generated primarily by the interactions between the medial condyle 110 (FIG. 1) of the femur 52 against the medial cartilage 112 (FIG. 1) on top of the medial portion of the tibia 54. Similarly, the second accelerometer 120 b on the lateral side of the knee joint 50 detects vibrations generated primarily by the interactions between the lateral condyle 114 (FIG. 1) of the femur 52 against the lateral cartilage 116 (FIG. 1) on top of the lateral portion of the tibia 54. The third accelerometer 120 c on the front of the knee joint 50, proximate the patella 56 (FIG. 3), detects vibrations generated primarily by the interactions between the femur 52 against the patella 56 (FIG. 3). The resulting data output by the accelerometers 120 a, 120 b, 120 c may then be wirelessly transmitted to the computer 96 via a wireless transmitter 124, such as an ultra-wide band transmitter, and utilized in combination with data from the other modules to ascertain the appropriate diagnosis.
  • FIG. 8 illustrates one example of a plurality of thin film accelerometers (four are shown, 120 a, 120 b, 120 c, 120 d) that are suitable for detecting the vibrations produced by motion of the knee joint 50. Thin film accelerometers 120 a, 120 b, 120 c, 120 d may be used in lieu of sound sensors because of better performance and less noise susceptibility. The thin film accelerometers 120 a, 120 b, 120 c, 120 d may also be used as a localizer and include the same circuitry. The accelerometers 120 a, 120 b, 120 c, 120 d are attached to the patient so that the outputs may be amplified, digitized, and sent wirelessly to the computer 96 as described below.
  • With reference now to FIGS. 4-6 and 9, the foot module 92 (also referred to as the contact force module (“CFM”)) is shown and includes a plurality of pressure sensors 130 that are utilized to detect pressure or a contact force occurring at the bottom of the foot (not shown) when the knee joint 50 (FIG. 1) is moved through a range of motion under a loaded condition. In other words, as the patient walks, jogs, runs, etc., the foot module 92 detects pressure data at the bottom of the foot when the foot is partially or fully in contact with the ground. In exemplary form, the pressure sensors 130 are incorporated into an insole 132 of a shoe 134 that conforms to the general shape of a patient's foot. Because humans have different sized feet, the insoles 132 may be incrementally sized to accommodate humans with differently sized feet or to accommodate a particular type of shoe 134 (or lack thereof) needed for a particular activity.
  • The pressure sensors 130 may be arranged in a grid-shaped pattern on the insole 132, which may include a series of rows and columns. The pressure sensors 130 are exposed to the underside of a patient's foot so that the location and amplitude (or amount) of the contact forces applied by the foot to the shoe 134, by way of the insole 132, may be measured. As will be discussed in more detail hereafter, the location of the pressures and the relative amount of pressures provides information relevant to diagnosis of injury. For example, the detected pressures of a patient with a limp caused by a knee joint injury would differ from the detected pressures of a patient with a healthy knee joint and a normal gait.
  • In one embodiment, each sensor 130 may include a capacitor having a deformable dielectric between two electrode plates. Changes in the pressure applied to the plates cause a strain, or deformation, of the dielectric medium. Thus, a pressure applied to the capacitive sensor 130 changes the spacing between the plates and the measured capacitance. The capacitive sensors 130 are arrayed across the area of pressure measurement to provide discrete pressure data points corresponding to strains/deformations at the various locations of the array. These strains/deformations are used to find the stresses and thus the compressive forces, and to calculate the output of pressure data having units of force per unit area and time (i.e., N/(m²·s)).
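  • A back-of-the-envelope sketch of the parallel-plate relationship implied above (C = εA/d); all of the sensor dimensions and the dielectric constant below are assumptions, and converting the resulting strain to pressure would additionally require the dielectric's elastic properties:

```python
EPS0 = 8.854e-12          # vacuum permittivity, F/m
EPS_R = 3.0               # assumed relative permittivity of the deformable dielectric
AREA = 25e-6              # assumed plate area (5 mm x 5 mm), m^2
GAP0 = 100e-6             # assumed unloaded plate spacing, 100 um

def capacitance(gap_m):
    """Parallel-plate capacitance for the current plate spacing."""
    return EPS0 * EPS_R * AREA / gap_m

def gap_from_capacitance(c_farads):
    """Invert the same relationship: measured capacitance -> plate spacing."""
    return EPS0 * EPS_R * AREA / c_farads

c_unloaded = capacitance(GAP0)
c_loaded = capacitance(80e-6)            # applied pressure compresses the dielectric by 20 um
strain = (GAP0 - gap_from_capacitance(c_loaded)) / GAP0
print(f"{c_unloaded * 1e12:.2f} pF -> {c_loaded * 1e12:.2f} pF, dielectric strain = {strain:.2f}")
```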
  • The sensors 130 in the grid-shape enable positioning of each detected pressure from each of the sensors 130 relative to another sensor 130. The resultant data, which includes a two-dimensional map of the pressure sensors 130, is either stored on the computer 96 or stored locally with the sensors 130. The resulting data may be wirelessly transmitted to the computer 96 via a wireless transmitter 136, such as an ultra-wide band transmitter. Using the 2D map of the sensors 130 stored on the computer 96 in combination with the received sensor pressure data, the computer 96 is operative to generate data tying detected pressure to position, specifically the position of one pressure sensor 130 with respect to another.
  • By tying amounts of compressive force to the positions at which they are applied, the foot module 92 provides data reflecting precisely what pressures are exerted at what location. In addition, the computer 96 may include an internal clock 97 to associate the time at which the pressure is applied with the pressure data generated by the pressure sensors 130. Accordingly, the diagnostic system 82 not only knows how much pressure was exerted and the location where the pressure was applied, but also has time data indicating the duration of the applied pressures. Again, by tying the pressure data generated by the pressure sensors 130 to time, the pressure data can be correlated with the sound data generated by the VA module 88 using a common time scale. As a result, the diagnostic system 82 may evaluate how pressures exhibited at the bottom of the foot change as a function of time, along with how the vibrational data changes during the same time.
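  • By way of illustration only, a minimal sketch of placing the pressure data and the vibration (sound) data on a common time scale is shown below; the sample rates and signal contents are illustrative assumptions.

```python
# Illustrative sketch only: resampling the pressure stream and the vibration
# stream onto one shared clock so the two can be compared sample by sample.
# The sample rates and waveforms are assumed example values.
import numpy as np


def align_on_common_timescale(t_pressure, pressure, t_vibration, vibration, dt=0.001):
    """Interpolate both streams onto a common time base with step `dt` seconds."""
    t_start = max(t_pressure[0], t_vibration[0])
    t_end = min(t_pressure[-1], t_vibration[-1])
    t_common = np.arange(t_start, t_end, dt)
    p = np.interp(t_common, t_pressure, pressure)    # pressure on the shared clock
    v = np.interp(t_common, t_vibration, vibration)  # vibration on the shared clock
    return t_common, p, v


if __name__ == "__main__":
    t_p = np.linspace(0.0, 1.0, 100)     # assumed 100 Hz pressure samples
    t_v = np.linspace(0.0, 1.0, 5000)    # assumed 5 kHz vibration samples
    t, p, v = align_on_common_timescale(t_p, np.sin(t_p), t_v, np.cos(40 * t_v))
    print(t.shape, p.shape, v.shape)     # identical lengths on the shared clock
```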
  • FIGS. 4-6 and 10 illustrate the details of the JKT module 86, which comprises an ultrasound creation and positioning submodule 140, an ultrasound registration submodule 142, and an ultrasound dynamic movement submodule 144. Specifically, each submodule 140, 142, 144 includes an A-mode ultrasound transducer to generate sound and to detect reflected sound, wherein the reflected sound is representative of the structure, position, and acoustical impedance of the knee joint 50 (FIG. 1). Commercially-available transducers may include, for example, an immersion unfocused 3.5 MHz transducer, such as those that are available from Olympus Corp. (Tokyo, Japan). Those skilled in the art are familiar with the operation of ultrasound transducers generally and, more specifically, an A-mode ultrasound transducer that generates sound pulses and detects sound that is reflected at tissue boundaries of tissues having different acoustic impedances. The magnitude of the reflected sound and the time delay are utilized to determine the distance between the ultrasound transducer and the tissue interface.
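  • By way of illustration only, the pulse-echo conversion from time delay to depth may be sketched as follows, assuming the conventional soft-tissue average sound speed of approximately 1540 m/s (a value not specified herein):

```python
# Illustrative sketch only: converting an A-mode pulse-echo time delay into a
# depth estimate for the bone-tissue interface. The 1540 m/s figure is the
# conventional soft-tissue average speed of sound, assumed for the example.
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s


def echo_depth(time_of_flight_s: float) -> float:
    """One-way depth of the reflecting interface beneath the transducer, in metres.

    The pulse travels to the interface and back, so the depth is half of the
    round-trip distance.
    """
    return SPEED_OF_SOUND_TISSUE * time_of_flight_s / 2.0


if __name__ == "__main__":
    # A 26 microsecond round trip corresponds to roughly 20 mm of tissue.
    print(f"{echo_depth(26e-6) * 1000:.1f} mm")
```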
  • In the illustrated embodiment, the A-mode ultrasound transducers are utilized to detect the interface between bone and the surrounding soft tissue so that the location of the bone surface may be determined. Because the operation of ultrasound transducers (including the A-mode ultrasound transducers) is well known to those skilled in the art, a detailed discussion of the operation of ultrasound transducers in general, and A-mode ultrasound transducers specifically, has been omitted only for purposes of brevity.
  • The ultrasound creation and positioning submodule 140 as shown in FIG. 10 comprises one or more A-mode ultrasound transducers 150 fixedly mounted to a wand 152. The wand 152 further includes at least one positioning device 170. In this exemplary embodiment, the ultrasound creation and positioning submodule 140 is physically separate from the ultrasound registration submodule 142 (FIG. 6) and the ultrasound dynamic movement submodule 144 (FIG. 6), the latter two of which are mounted to a rigid knee brace 220 schematically illustrated in FIG. 17. In this fashion, the ultrasound creation and positioning submodule 140 is repositionable with respect to the rigid knee brace 220 (FIG. 17) and adapted to place one or more of its A-mode ultrasound transducers 150 in contact with the patient's epidermis, proximate the knee joint 50 (FIG. 1). It should be noted, however, that the knee brace 220 (FIG. 17) does not have to be rigid, other than the linkages between certain components. Moreover, the knee joint 50 (FIG. 1) may be scanned by the ultrasound wand 152 before positioning the brace 220 (FIG. 17) thereon.
  • One of the functions of the ultrasound creation and positioning submodule 140 is to generate an electrical signal that is representative of the ultrasonic wave detected by the transducers 150 as the wand 152 moves over the patient's epidermis, proximate the knee joint 50 (FIG. 1). The ultrasound transducers 150 detect the ultrasonic wave reflected from the bone-tissue interface and produce an electrical signal corresponding to its magnitude. As discussed previously, the magnitude of the electrical signal and the delay between the generation of the ultrasonic wave by the ultrasound transducer 150 and the detection of the reflected ultrasonic wave by the ultrasound transducer 150 are indicative of the distance to the bone underneath the transducer 150. But distance data alone is not particularly useful; therefore, one or more positioning devices 170 are used to provide a 3D coordinate system, one example of which is shown in FIG. 11.
  • The positioning devices 170 of the ultrasound creation and positioning submodule 140 are fixedly mounted to the wand 152 and may include any of a number of positioning devices 170. For example, the wand 152 may include one or more optical devices (as the positioning devices 170) that are configured to generate, detect, and/or reflect pulses of light. These pulses of light interact with a corresponding detector or light generator to discern the position of the wand 152, in 3D space, and with respect to a fixed or reference position. One such device includes a light detector configured to detect pulses of light emitted from light emitters having known positions. The light detector detects the light and sends a representative signal to the computer 96 or otherwise a controller (not shown) of the light detector. The computer 96 is also provided the time at which the light pulses were emitted by the optical devices 170. In this manner, the computer 96 determines the position of the wand 152 relative to the known positions of the detectors. Because the ultrasound transducer 150 and the optical devices 170 are fixedly mounted to the wand 152, the position of the ultrasound transducers 150 with respect to the position of the optical devices 170 is known. Similarly, because the ultrasound transducers 150 are generating signals representative of the straight line distance between the transducers 150 and the bone-tissue interface, and the position of the transducers 150 with respect to the optical devices 170 is known, the position of the bone-tissue interface with respect to the optical devices 170 may be determined. In other words, as the wand 152 moves over the patient's epidermis, the optical devices 170 generate data that is determined, by the computer 96, to represent that the relative position of the optical devices 170 with respect to the light detectors has changed in the 3D coordinate system. This change in the position of the optical devices 170 may be easily correlated to the position of the bone-tissue interface, in 3D, because the position of the bone-tissue interface relative to the ultrasound transducers 150, as well as the position of the optical devices 170 with respect to the ultrasound transducers 150, are known. Accordingly, the 3D position data may be used in combination with the fixed position data (distance data for the position of the ultrasound transducers 150 with respect to the optical devices 170) and with the distance data generated in response to the signals received from the ultrasound transducers 150 to generate composite data. The composite data may, in turn, be used to create a plurality of 3D points representing a plurality of distinct points on the surface of the bone, along the bone-tissue interface. As will be discussed in more detail below, these 3D points are utilized in conjunction with a default bone model to generate a virtual, 3D representation of the patient's bone.
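  • By way of illustration only, a simplified sketch of forming one such composite 3D point from a tracked wand pose and a single A-mode depth reading is shown below; the transducer offset and beam direction in the wand frame are assumed calibration values introduced for the example.

```python
# Illustrative sketch only: placing one bone-surface point in the world frame
# from a tracked wand pose plus a single A-mode depth reading. The transducer
# offset and beam direction in the wand frame are assumed calibration values.
import numpy as np

TRANSDUCER_OFFSET_WAND = np.array([0.0, 0.0, 0.05])   # assumed offset in the wand frame, m
BEAM_DIRECTION_WAND = np.array([0.0, 0.0, 1.0])       # assumed unit beam axis in the wand frame


def bone_surface_point(R_world_wand, t_world_wand, depth_m):
    """World-frame point where the ultrasound beam meets the bone surface."""
    p_wand = TRANSDUCER_OFFSET_WAND + depth_m * BEAM_DIRECTION_WAND  # point in wand frame
    return R_world_wand @ p_wand + t_world_wand                      # transform to world frame


if __name__ == "__main__":
    R = np.eye(3)                       # example: wand axes aligned with world axes
    t = np.array([0.10, 0.20, 0.30])    # example wand position reported by the localizer, m
    print(bone_surface_point(R, t, depth_m=0.018))
```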
  • Alternatively, the positioning devices 170 may comprise one or more inertial measurement units (“IMUs”). IMUs are known to those skilled in the art and include accelerometers, gyroscopes, and magnetometers that work together to determine the position of the IMUs in a 3D coordinate system. Because the A-mode ultrasound transducer 150 and the IMUs 170 are fixedly mounted to the wand 152, the position of the ultrasound transducers 150 with respect to the position of the IMUs 170 is known. Similarly, because the ultrasound transducers 150 are generating signals representative of the straight line distance between the transducers 150 and the bone-tissue interface and the position of the transducers 150 with respect to the IMUs 170 is known, the position of the bone-tissue interface with respect to the IMUs 170 may be determined. In other words, as the wand 152 moves over the patient's epidermis, the IMUs 170 generate data that is determined, by the computer 96, to represent that the relative position of the IMUs 170 has changed in the 3D coordinate system. This change in the position of the IMUs 170 may be easily correlated to the position of the bone-tissue interface in 3D because the position of the bone-tissue interface relative to the ultrasound transducer 150 is known, as is the position of the IMUs 170 with respect to the ultrasound transducers 150. Accordingly, the 3D position data may be used in combination with the fixed position data (distance data for the position of the ultrasound transducers 150 with respect to the IMUs 170) and with the distance data generated in response to the signals received from the ultrasound transducers 150 to generate the composite data as described above.
  • Referring now also to FIGS. 11-12, the positioning devices 170 may still alternatively comprise one or more ultra-wide band (UWB) transmitters. UWB transmitters are known to those skilled in the art, but the use of UWB transmitters and receivers for millimeter resolution 3D positioning is novel. In that regard, one or more UWB transmitters 170 are fixedly mounted to the wand 152 and configured to sequentially transmit UWB signals to three or more UWB receivers 172 having known positions in a 3D coordinate system. This embodiment of the positioning device 170 comprises active tags or transmitters 170 that are tracked by the UWB receivers 172. The system architecture of the UWB transmitter 170 is shown in FIG. 11, where a low noise system clock (“crystal clock”) 174 triggers a baseband UWB pulse generator 176 (for instance a step recovery diode (“SRD”) pulse generator). The baseband pulse from the baseband UWB pulse generator 176 is upconverted by a local oscillator 178 via a double balanced wideband mixer (not shown). The upconverted signal is amplified and filtered (“bandpass filter” 180). Finally the signal is transmitted, via an omnidirectional antenna 182, to the computer 96 (FIG. 6). The UWB signal may travel through an indoor channel where significant multipath and path-loss effects cause noticeable signal degradation.
  • The UWB receiver 172 architecture in accordance with one embodiment is shown in FIG. 12. The signal is received via a directional UWB antenna 184 and is filtered and amplified (“bandpass filter” 186), downconverted by a local oscillator 187, and low-pass filtered (“LPF”) 188. A sub-sampling mixer 190 triggered by a second low noise system clock (“crystal clock”) 192 is used to time-extend the pulse by a factor of about 1,000 to about 100,000. This effectively reduces the bandwidth of the UWB pulse and allows sampling by a conventional analog-to-digital converter (“ADC”) 194.
  • Each UWB transmitter 170 and receiver 172 is in communication with the computer 96. Accordingly, the computer 96 detects each time the UWB transmitter 170 transmits a UWB signal, as well as the time at which the UWB signal was transmitted. Similarly, the computer 96 detects the position of each of the UWB receivers 172 in the 3D coordinate system, as well as the time at which the UWB signal was received. The final time-difference-of-arrival (“TDOA”) calculation, via a UWB positioning system 183, is shown in FIG. 13.
  • Referring now to FIG. 13, for the TDOA calculation, at least four base receivers 172 (Rx1, Rx2, Rx3, Rx4) are needed to localize the 3D position of the UWB transmitter 170 (“Tag”). The geometry of the receivers Rx1, Rx2, Rx3, Rx4 has important ramifications on the achievable 3D accuracy through what is known as geometric position dilution of precision (“PDOP”). A combination of novel filtering techniques, high sample rates, robustness to multipath interference, accurate digital ranging algorithms, low phase noise local oscillators, and high integrity microwave hardware are needed to achieve millimeter range accuracy (e.g. ranging from about 5 mm to about 7 mm in 3D real-time). An analogy of the UWB positioning system 183 to a GPS system 185 is shown in FIG. 14.
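  • By way of illustration only, a minimal sketch of a TDOA solution with four receivers is shown below; it uses a generic Gauss-Newton refinement rather than the custom filtering and ranging algorithms referenced above, and the receiver layout is an assumption for the example.

```python
# Illustrative sketch only: solving for a tag position from time-difference-of-
# arrival measurements at four fixed receivers with a generic Gauss-Newton
# refinement. The receiver layout and noiseless measurements are assumed.
import numpy as np

C = 299_792_458.0  # propagation speed of the UWB pulse, m/s


def tdoa_solve(rx, tdoa, iters=25):
    """Estimate the tag position from receiver positions `rx` (N x 3) and
    arrival-time differences `tdoa` (N-1,) taken relative to receiver 0."""
    x = rx.mean(axis=0)                            # start from the receiver centroid
    meas = C * np.asarray(tdoa)                    # measured range differences, m
    for _ in range(iters):
        d = np.linalg.norm(rx - x, axis=1)         # distance to each receiver
        resid = (d[1:] - d[0]) - meas              # predicted minus measured differences
        # Jacobian of each range difference with respect to the tag position.
        J = (x - rx[1:]) / d[1:, None] - (x - rx[0]) / d[0]
        x = x - np.linalg.lstsq(J, resid, rcond=None)[0]
    return x


if __name__ == "__main__":
    rx = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 3.0]])
    tag = np.array([1.2, 2.3, 0.7])                          # ground-truth position
    d = np.linalg.norm(rx - tag, axis=1)
    print(tdoa_solve(rx, (d[1:] - d[0]) / C))                # recovers ~[1.2, 2.3, 0.7]
```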
  • FIG. 15 shows actual experimental errors in each of the x-, y-, and z-coordinates for detecting the position of the UWB transmitter 170 in 3D space and in real-time for over 1000 samples while the transmitter 170 is moving freely within the 3D space.
  • Because the A-mode ultrasound transducers 150 and the UWB transmitters 170 are fixedly mounted to the wand 152, the position of the ultrasound transducers 150 with respect to the position of the UWB transmitters 170 is known. Similarly, because the ultrasound transducers 150 are generating signals representative of the straight line distance between the ultrasound transducers 150 and the bone-tissue interface and the position of the ultrasound transducers 150 with respect to the UWB transmitters 170 is known, the position of the bone-tissue interface with respect to the UWB transmitters 170 may be determined. In other words, as the wand 152 moves over the patient's epidermis, the UWB transmitters 170 transmit UWB signals that are correspondingly received by the UWB receivers 172. These UWB signals are processed by the computer 96 in order to discern whether the relative position of the UWB transmitters 170 has changed in the 3D coordinate system, as well as the extent of such a change. This change in 3D position of the UWB transmitters 170 can be easily correlated to the position of the bone-tissue interface in 3D because the position of the bone relative to the ultrasound transducer 150 and the position of the UWB transmitters 170 with respect to the ultrasound transducer 150 are known. Accordingly, the UWB 3D position data may be used in combination with the fixed position data (distance data for the position of the ultrasound transducers 150) to generate the composite data as described above.
  • Regardless of the positioning device 170 utilized with the ultrasound creation and positioning submodule 140, the wand 152 is repositioned over the skin of the patient, proximate to the knee joint 50 (FIG. 1) while the knee joint 50 (FIG. 1) is bent. Bending the patient's knee joint 50 (FIG. 1) during data acquisition enables the creation of a 3D series of points for each of the bones of the knee joint 50 (FIG. 1) (the distal femur 52, the proximal tibia 54, and the patella 56). Thus, as the wand 152 is repositioned, the data from the transducer 150 is transmitted to a wireless transmitter 200 mounted to the wand 152. When the wireless transmitter 200 receives the data from the transducers 150, the transmitter 200 transmits the data via a wireless link to the computer 96.
  • In order to power the devices on-board the wand 152, an internal power supply (not shown) may be provided. In one embodiment, the internal power supply comprises one or more rechargeable batteries.
  • A transformation is needed to convert the position data from a local reference coordinate frame to a world frame of reference. According to one embodiment of the present invention, a linear movement of the ultrasound transducer 150 may be described by:

  • $v(n+1) = v(n) + a(n)\,dt$   Equation 1

  • $s(n+1) = s(n) + v(n)\,dt + 0.5\,a(n)\,dt^2$   Equation 2
  • where s(n+1) is the position of the ultrasound transducer 150 at the current state, s(n) is the position from the previous state, v(n+1) is the instantaneous velocity of the current state, v(n) is the velocity from the previous state, a(n) is the detected acceleration, and dt is the sampling time interval. The previous equations describe the dynamic motion and positioning of a point in 3D Euclidean space. Additional information is needed to describe 3D orientation and motion.
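  • By way of illustration only, Equations 1 and 2 may be applied to a sampled acceleration trace as in the following sketch; the sample interval and acceleration values are assumptions for the example.

```python
# Illustrative sketch only: propagating transducer velocity and position along
# one axis from sampled acceleration, per Equations 1 and 2. The sample
# interval and acceleration trace are assumed example values.
def integrate_motion(accels, dt=0.001, v0=0.0, s0=0.0):
    """Apply Equations 1 and 2 to a sequence of acceleration samples."""
    v, s = v0, s0
    positions = []
    for a in accels:
        s = s + v * dt + 0.5 * a * dt ** 2   # Equation 2: position update
        v = v + a * dt                       # Equation 1: velocity update
        positions.append(s)
    return positions


if __name__ == "__main__":
    # A constant 1 m/s^2 over one second should end near 0.5 m.
    print(integrate_motion([1.0] * 1000)[-1])
```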
  • The orientation of the ultrasound transducer 150 may be described by using a gravity-based accelerometer (for example, the ADXL-330 from Analog Devices) and extracting the tilting information from each of a pair of orthogonal axes. The acceleration output on each of the x-, y-, or z-axes is due to gravity and is equal to the following:

  • $A_i = (V_{out,i} - V_{off}) / S$   Equation 3
  • where Ai is the acceleration of the ultrasound transducer 150 along the corresponding x-, y-, or z-axis, Vout,i is the voltage output on that axis, Voff is the offset voltage, and S is the sensitivity of the accelerometer. The pitch, roll, and yaw may thus be calculated as:
  • $\rho = \arctan\left(\frac{A_x}{\sqrt{A_y^2 + A_z^2}}\right)$   Equation 4

  • $\phi = \arctan\left(\frac{A_y}{\sqrt{A_x^2 + A_z^2}}\right)$   Equation 5

  • $\theta = \arctan\left(\frac{\sqrt{A_x^2 + A_y^2}}{A_z}\right)$   Equation 6
  • where pitch is ρ (the x-axis relative to the ground), roll is φ (the y-axis relative to the ground), and yaw is θ (the z-axis relative to the ground). Since the accelerometer is gravity-based, the orientation does not require information from the previous state once the accelerometer is calibrated. The static calibration requires the resultant sum of accelerations from each of the three axes to equal 1 g (where g is the nominal acceleration due to gravity at the Earth's surface at sea level, defined to be precisely 9.80665 m/s² (approximately 32.174 ft/s²)). Alternatively, an orientation sensor that provides yaw, pitch and roll information of the bodily tissue in question may be used. One such orientation sensor may be the commercially-available model IDG-300 from InvenSense (Sunnyvale, Calif.). The orientation of the ultrasound transducer 150 may then be resolved by using, for example, a direction cosine matrix transformation:

  • $\begin{bmatrix} X_2 \\ Y_2 \\ Z_2 \end{bmatrix} = \begin{bmatrix} C\theta\,C\phi & C\theta\,S\phi\,S\rho - S\theta\,C\rho & C\theta\,S\phi\,C\rho + S\theta\,S\rho \\ S\theta\,C\phi & S\theta\,S\phi\,S\rho + C\theta\,C\rho & S\theta\,S\phi\,C\rho - C\theta\,S\rho \\ -S\phi & C\phi\,S\rho & C\phi\,C\rho \end{bmatrix} \begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \end{bmatrix}$   Equation 7
  • where C represents cosine and S represents sine.
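  • By way of illustration only, Equations 3 through 7 may be exercised as in the following sketch; the offset voltage and sensitivity are assumed values, and the direction cosine matrix is composed as Rz(θ)·Ry(φ)·Rx(ρ), consistent with Equation 7.

```python
# Illustrative sketch only: exercising Equations 3 through 7. The offset
# voltage and sensitivity are assumed example values for a gravity-referenced
# accelerometer, and the direction cosine matrix is composed as
# Rz(theta) @ Ry(phi) @ Rx(rho).
import numpy as np

V_OFF = 1.5        # assumed zero-g offset voltage, V
SENSITIVITY = 0.3  # assumed sensitivity, V per g


def axis_acceleration(v_out: float) -> float:
    """Equation 3: acceleration (in g) on one axis from its output voltage."""
    return (v_out - V_OFF) / SENSITIVITY


def tilt_angles(ax: float, ay: float, az: float):
    """Equations 4-6 (arctan2 is used for numerical robustness)."""
    rho = np.arctan2(ax, np.hypot(ay, az))     # Equation 4
    phi = np.arctan2(ay, np.hypot(ax, az))     # Equation 5
    theta = np.arctan2(np.hypot(ax, ay), az)   # Equation 6
    return rho, phi, theta


def direction_cosine_matrix(rho: float, phi: float, theta: float) -> np.ndarray:
    """Equation 7: rotation composed from rho (about x), phi (about y), theta (about z)."""
    cr, sr = np.cos(rho), np.sin(rho)
    cp, sp = np.cos(phi), np.sin(phi)
    cy, sy = np.cos(theta), np.sin(theta)
    return np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])


if __name__ == "__main__":
    a = [axis_acceleration(v) for v in (1.5, 1.5, 1.8)]  # level pose: gravity on z only
    print(tilt_angles(*a))                               # (0.0, 0.0, 0.0)
    print(direction_cosine_matrix(*tilt_angles(*a)))     # identity matrix
```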
  • Referring again to FIGS. 4-6, and now also to FIG. 16, the PEAUMR module 84 is described in greater detail. The PEAUMR module 84 constructs a 3D model of the patient's knee joint 50 (FIG. 6) by converting a transcutaneously acquired set of 3D data points (obtained using the tracked pulse echo A-mode ultrasound transducer 150) that, in total, are representative of the shape of the bone-tissue interface and therefore of each bone's surface.
  • Before the patient data is acquired, software residing on the computer 96 may request a series of inputs from the user to adapt the diagnostic system 82 to equipment specific devices and the particular portion of the musculoskeletal anatomy to be modeled. For example, a menu 204 on a user interface 206 may be presented for the user to select the type of digitizer, which may include, without limitation, ultrasound. After the type of digitizer is selected, the user may actuate buttons 205 a, 205 b to connect to or disconnect from the digitizer, respectively.
  • As the wand 152 moves over the patient's epidermis, the set of points is generated, numerically recorded, viewable in a data window 210, and ultimately utilized by the software to conform a selected bone model to the patient's actual bone shape. Consequently, the wand 152 is repositioned over the bones (the distal femur 52, the patella 56, the proximal tibia 54) for approximately 30 seconds so that the discrete points typify the topography of the bone. Repositioning the wand 152 over the bone in question for a longer duration results in more 3D points being generated, which increases the resolution and improves the accuracy of the patient-specific bone model. A partial range of motion of the knee joint 50 (FIG. 1) while repositioning the wand 152 over the knee joint (FIG. 1) aids in scanning additional portions of the bone in question for new 3D points that may have been obscured by other bones in another range of motion position.
  • Before, during, or after the ultrasound data is acquired, the software provides various drop-down menus allowing a bone model 208 to be loaded that is roughly the same shape as the patient's bone. When the computer 96 receives the ultrasound data, software on the computer 96 interprets the A-mode ultrasound transducer data and constructs a 3D map having discrete 3D points corresponding to points on the surface of the scanned bone. That is, the shape of the patient's bone is reconstructed in virtual space, using a set of points outlining the surface of the patient's bone as acquired by the tracked ultrasound transducer 150 (FIG. 10). The set of points is applied to atlas-based deformable model software to reconstruct the patient-specific 3D model.
  • More specifically, the computer 96 may include a database having a plurality of bone models of various portions of the musculoskeletal system, for example, the femur 52, the tibia 54, and the patella 56, that are classified and selectable in a menu 212, for example based upon ethnicity, gender, height ranges, the side of the body, and so forth. Each of these classifications is accounted for in a drop-down menu of the software so that the model initially chosen by the software most closely approximates the body of the patient.
  • For mapping each bone, the computer 96 uses either a default bone model or the selected bone model as a starting point for construction of the ultimate patient-specific, virtual bone model. The default bone model may be a generalized average, as the morphing algorithms use statistical knowledge of a wide database population of bones to produce a very accurate model. The selected bone model expedites computation. For example, in the case of generating a patient-specific model of the femur 52 where the patient is a 53-year-old Caucasian male who is six feet tall, a default femoral bone model is selected based upon the classification of Caucasian males having an age between 50-60 and a height ranging from 5′10″ to 6′2″. In this manner, selection of the appropriate default bone model more quickly achieves an accurate patient-specific, virtual bone model because the number of iterations between the patient's actual bone (typified by the 3D map of bone points) and the default bone model is reduced. Nevertheless, in view of the model bones taking into account numerous traits of the patient (ethnicity, gender, bone modeled, and body side of the bone), it is quite possible to construct an accurate patient-specific 3D model with as few as 150 data points, which typically may be acquired by repositioning the wand 152 over each bone for approximately 30 seconds. Ultrasound is not affected by whether the patient has a prosthetic implant.
  • After the appropriate bone model is selected, the computer 96 superimposes the 3D points onto the default bone model and, thereafter, carries out a deformation process so that the bone model exhibits the 3D bone points detected during the signal acquisition. The deformation process also makes use of statistical knowledge of the bone shape based upon reference bones of a wide population. After the deformation process is complete, the resulting bone model is a patient-specific, virtual 3D model of the patient's actual bone. The foregoing process is repeated for each bone comprising the specific joint to create patient-specific, virtual 3D models of the patient's anatomy.
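  • By way of illustration only, the sketch below shows a heavily simplified stand-in for the model-fitting step: a rigid best-fit alignment of the template model to the scanned points followed by a crude relaxation of each vertex toward its nearest scanned point. The actual deformation process relies on statistical shape knowledge from the reference population rather than this nearest-point pull; the point sets here are synthetic assumptions.

```python
# Illustrative sketch only: a heavily simplified stand-in for fitting a
# template bone model to scanned surface points. A rigid (Kabsch) alignment is
# followed by pulling each vertex part of the way toward its nearest scanned
# point; the actual deformation relies on statistical shape knowledge. The
# point sets below are synthetic assumptions.
import numpy as np


def rigid_align(source, target):
    """Best-fit R, t such that target ≈ source @ R.T + t (points in correspondence)."""
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, ct - R @ cs


def deform_toward_points(vertices, scan_points, pull=0.5):
    """Pull each template vertex part of the way toward its nearest scan point."""
    deformed = vertices.copy()
    for i, v in enumerate(vertices):
        nearest = scan_points[np.argmin(np.linalg.norm(scan_points - v, axis=1))]
        deformed[i] = v + pull * (nearest - v)
    return deformed


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    template = rng.random((200, 3))                           # stand-in template vertices
    angle = np.pi / 8
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    scan = template @ R_true.T + np.array([0.01, 0.02, 0.0])  # simulated scanned points
    R, t = rigid_align(template, scan)
    aligned = template @ R.T + t
    print(np.allclose(aligned, scan, atol=1e-8))              # True: rigid pose recovered
    print(deform_toward_points(aligned, scan).shape)          # (200, 3)
```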
  • Referring back to FIGS. 4-7, and now also FIG. 17, the JKT module 86 may be configured to track the kinematics of the knee joint 50 (FIG. 1) and display the kinematics on the patient-specific 3D bone model generated by the PEAUMR module 84 using, for example, one or more bone motion tracking braces 220. Generally, the bone motion tracking brace 220 includes pulse echo A-mode ultrasound transducers 222 to transcutaneously localize the bone-tissue interface and derive a set of points outlining each bone's surface.
  • Turning specifically to FIG. 17, the brace 220 includes a plurality of A-mode ultrasound transducers 222 fixedly mounted to the knee brace 220. Specifically, in the context of a knee joint 50, there are at least two groups of A-mode ultrasound transducers 222 (i.e., “transducer groups” 222 a, 222 b) fixedly mounted to the knee brace 220 for tracking of the tibia 54 (FIG. 1) and the femur 52 (FIG. 1). In other words, the knee brace 220 includes at least six ultrasound transducers 222 in order to track the two primary bones 52, 54 (FIG. 1) of the knee joint 50. Each transducer group 222 a, 222 b includes a rigid, mechanical connection linking the transducers 222 and the positioning devices 224 to the knee brace 220. In this manner, the relative positions of the transducers 222 with respect to one another do not change. A first transducer group 222 a at least partially circumscribes a distal portion of the femur 52 (FIG. 1); a second transducer group 222 b at least partially circumscribes a proximal portion of the tibia 54 (FIG. 1); and an optional third transducer group (not shown) overlies the patella 56 (FIG. 3) if patella kinematics are desired. The ultrasound registration submodule 142 is accordingly configured to provide a plurality of static reference points for each bone as the bone is moved through a range of motion.
  • Each ultrasound transducer 222 is tracked using an accelerometer or a sensor-specific localizer (or any other appropriate inertial sensor). The tracking may then be used to generate localized bone points from the outputs of the ultrasound transducers 222 and to virtually display bone movement on the 3D model while the knee joint 50 (FIG. 1) is taken through the range of motion.
  • Referring to FIGS. 6 and 17, the ultrasound dynamic movement submodule 144 comprises a plurality of positioning devices 224 that is configured to feed information to the computer 96 regarding the 3D position of each transducer group 222 a, 222 b of the ultrasound registration submodule 142. In exemplary form, the positioning devices 224 may comprise light detectors operative to detect pulses of light emitted from light emitters having known positions. The light detectors 224 detect the light and transmit representative signals to control circuitry (not shown) associated with the knee brace 220. The knee brace 220 transmits this information to the computer 96, which also knows when the light pulses were emitted as a function of time and position. In this manner, the computer 96 may determine the position of the transducers 222 in the 3D coordinate system. Because the ultrasound transducers 222 and the optical devices 224 are fixedly mounted to the knee brace 220, the position of the ultrasound transducers 222 with respect to the position of the optical devices 224 is known. Similarly, because the ultrasound transducers 222 are generating signals representative of the straight line distance between each of the ultrasound transducers 222 and the bone-tissue interface beneath, and the position of the ultrasound transducers 222 with respect to the optical devices 224 is known, the position of the bone-tissue interface with respect to the optical devices 224 may be easily determined. In other words, as the knee joint 50 (FIG. 1) is moved, and correspondingly so too is the knee brace 220, the optical devices 224 generate data from which the computer 96 determines that the relative position of the optical devices 224 has changed in the 3D coordinate system. This change in the position of the optical devices 224 may be easily correlated to the position of the bone in question in 3D because the position of the bone relative to the ultrasound transducer groups 222 a, 222 b is known, as is the position of the optical devices 224 with respect to the ultrasound transducer groups 222 a, 222 b. Accordingly, the optical devices 224 generate data that is used in combination with the fixed position data (distance data for the position of the ultrasound transducers 222) to generate the composite data. The composite data may, in turn, be used to create a dynamically moving map of the bone on the patient-specific 3D model.
  • Alternatively, the positioning devices 224 may be comprised of one or more IMUs. Because the ultrasound transducers 222 and the IMUs 224 are fixedly mounted to the knee brace 220, the relative positions between the ultrasound transducers 222 and the IMUs 224 are known. Similarly, because the ultrasound transducers 222 are generating signals representative of the straight line distance between the transducer 222 and the bone-tissue interface, and the position of the transducers 222 with respect to the IMUs 224 is known, the position of the bone with respect to the IMUs 224 may be easily determined. In other words, as the knee joint 50 (FIG. 1) with the knee brace 220 moves, the IMUs 224 generate data that is determined, by the computer 96, as a change in the position of the IMUs 224. This change in the position of the IMUs 224 may be easily correlated to the position of the bone in 3D because the position of the bone relative to the ultrasound transducer groups 222 a, 222 b is known, as well as the position of the IMUs 224 with respect to the ultrasound transducer groups 222 a, 222 b. By way of example, because the ultrasound transducers 222 do not move with respect to the knee brace 220, any movement of the IMUs 224 in space means that the knee brace 220 has also moved in space, and by continuing to track the distance data provided by each IMU 224, the movement of the bone may be correspondingly tracked. IMU tracking of the bone movements requires a static registration between the IMUs 224 and an initial known body position (such as standing). The IMUs 224 enable measurement of the relative motion between different bones via their corresponding ultrasound transducer group data and the IMU data. The IMUs 224 may be used alone or in conjunction with other positioning devices 170 (FIG. 10), such as those described in detail above. In this scenario, the IMU position is updated at a certain interval with the absolute position provided by the additional positioning system to minimize error. Therefore the two positioning systems act together as one positioning system.
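  • By way of illustration only, the periodic correction of an IMU-propagated position by a slower absolute positioning system may be sketched as follows; the update interval, blending gain, and drift values are assumptions for the example.

```python
# Illustrative sketch only: periodically correcting an IMU dead-reckoned
# position with a slower absolute fix (optical or UWB). The update interval,
# blending gain, and drift are assumed example values; a single axis is shown.
def fuse_positions(imu_positions, absolute_fixes, update_every=100, gain=0.8):
    """Blend a high-rate IMU track with sparse absolute position fixes."""
    fused, bias, fix_idx = [], 0.0, 0
    for i, p in enumerate(imu_positions):
        if i % update_every == 0 and fix_idx < len(absolute_fixes):
            # Pull the current estimate toward the absolute fix.
            bias += gain * (absolute_fixes[fix_idx] - (p + bias))
            fix_idx += 1
        fused.append(p + bias)
    return fused


if __name__ == "__main__":
    drifting = [0.001 * i + 0.02 for i in range(500)]        # IMU track with 2 cm bias
    fixes = [0.001 * i for i in range(0, 500, 100)]          # absolute fixes every 100 samples
    print(fuse_positions(drifting, fixes)[::100])            # corrected values near the fixes
```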
  • As was described previously with respect to the wand 152, the positioning devices 224 of the brace 220 may alternatively be comprised of one or more ultra wide band (UWB) transmitters. In that regard, one or more UWB transmitters 224 are fixedly mounted to the brace 220 and operable to transmit sequential UWB signals to three or more UWB receivers (not shown) having known positions in the 3D coordinate system. Each UWB transmitter 224 is in communication with the computer 96, as are the plurality of UWB receivers (not shown). Accordingly, the computer 96 detects each time the UWB transmitter transmits a UWB signal, as well as the time at which the UWB signal was transmitted. Similarly, the computer 96 detects the position of each of the UWB receivers (not shown) in the 3D coordinate system, as well as the time at which the UWB signal was received. The computer 96 may then use custom digital signal processing algorithms to accurately locate the leading edge of the received UWB pulse based on the position of each UWB receiver (not shown), the time when each UWB signal was received, and the time that the UWB signal was transmitted. The position may then be determined by the TDOA calculation as was described with reference to FIG. 13. Again, because the ultrasound transducers 222 do not move with respect to the knee brace 220, any movement of the transducers 222 in space means that the brace 220 has moved. The movement of the knee brace 220 is tracked using the computer 96 in combination with the UWB transmitters 224 and the UWB receivers (not shown). Similarly, because the UWB transmitters 224 are in a fixed orientation with respect to the ultrasound transducers 222, a change in the position of the UWB transmitters 224 in the 3D coordinate system may correspondingly be used to track the movement of each bone.
  • In order to communicate information from the submodules 142, 144 to the computer 96, the brace 220 may include a transmitter 228, such as a UWB transmitter, in communication with the ultrasound transducer 222 to facilitate wireless communication of data to the computer 96. It should be noted that if UWB transmitter 228 is also utilized as the positioning devices 224, a dedicated transmitter 228 is unnecessary as the UWB transmitters 224 could function to also send ultrasound data directly to the computer 96 over a wireless link.
  • It should be understood that use of the transmitter 228 and a field programmable gate array design enables the computations to be carried out on a real-time basis. For example, as the patient's knee joint 50 (FIG. 1) is bent while wearing the brace 220, the ultrasound data is immediately transmitted to the computer 96, which, in real-time, calculates and displays the position and movement of each bone with the 3D patient-specific bone model.
  • FIG. 18 illustrates a knee brace 230 in accordance with another embodiment of the present invention. The knee brace 230 has a first sub-brace 232 positioned at the distal portion of the femur 52, a second sub-brace 234 positioned at the proximal end of the tibia 54, and a third sub-brace 236 positioned at the patella 56 (FIG. 3). The sub-braces 232, 234, 236 include a plurality of transducers mounted thereto. Each transducer is responsible for determining the location of a point on the surface of the bone during movement of the knee joint 50. The sub-braces 232, 234, 236 reduce the occurrence of problems of locating and tracking the bone using ultrasound data when the motion of the bone relative to the skin is small compared to the gross joint motion. There are at least three approaches disclosed herein for tracking the motion of the ultrasound transducers themselves.
  • FIG. 19 illustrates the first approach commonly referred to herein as an “ITT” (individual transducer tracking) approach. In FIG. 19, each transducer 238 in the sub-brace 232 has an associated tracking module 240 to individually track each transducer 238. Using the ITT approach, the transducers 238 may be supported by a flexible length of strap.
  • Referencing FIG. 20, a sub-brace 241 according to the second approach is shown. The second approach, commonly referred to herein as an “ITML” (Inter-Transducers Mechanical Links) approach, involves the transducers 242 being connected to each other by movable mechanical links 244. Each mechanical link 244 includes length and angle sensors 246 that allow for detection of the movement of the transducers 242 relative to one another and the relative translational motions of the links 244. Every two links 244 are connected by a pivot pin 248 that allows rotation and translation of the links 244 relative to each other. The length and angle sensors 246 are mounted to at least one link 244 and proximate to the pivot pin 248 to allow for detection of the angle between adjacent links 244. The ITML approach uses fewer localizers than the ITT approach of FIG. 19.
  • Referring now to FIG. 21, a sub-brace 249 according to the third approach is shown. The third approach, commonly referred to herein as a “RT” (Rotating Transducer) approach, involves using a single ultrasound transducer 250 that is mounted to a carriage 252. The carriage 252 traverses along a track 254, located on the inner circumference 256 of the sub-brace 249. For example, the carriage 252 may be moved along the track 254 by a string loop 258 that is wrapped around the drive shaft (not shown) of a motor 260. When the transducer 250 reaches the motor 260, the rotation direction of the motor 260 is changed and the transducer 250 moves in the opposite direction.
  • A tracking module 262 such as an inertia-based localizer is mounted to the transducer 250 to track its motion. As the transducer 250 rotates within the inner circumference 256 of the sub-brace 249, it collects data as to the bone-tissue interface. By using a single transducer 250, the RT approach offers lower cost than the stationary transducer designs and higher accuracy, due to the greater number of localized bone surface points for each tracking step, while maintaining mechanical flexibility.
  • Referring to FIG. 22, a localizer 270 for tracking each ultrasound transducer 238 (FIG. 19) mounted to the sub-brace 232 (FIG. 19) is shown. The localizer 270 comprises a plurality of nodes 272, with each node 272 comprising a CMOS accelerometer and a temperature sensor (not shown) for thermal drift compensation. Each node 272 is integrated to minimize noise and distortion. The outputs of the accelerometers 272 regarding the x-, y-, and z-coordinates and the temperature sensors (not shown) are directed to a multiplexer 274 (“MUX”) that multiplexes the signals. Multiplexed outputs are amplified by an amplifier 276 (“AMP”) and then directed to an ADC 278. The digital conversion of the signal may be performed within or outside the accelerometers 272. The resulting digital signals may then be directed to a wireless transmitter 280 by way of a parallel input/serial output device 282.
  • In FIG. 23 a design alternative for the sub-brace 232 is shown. The electronic architecture includes a high voltage amplifier circuit 286 (“HV IX AMP”) feeding a high voltage multiplex circuit (“HV MUX”) 288 to excite each ultrasound transducer 238, the HV MUX 288 thereby acting as an analog switch. The echo signals from each transducer 238 are multiplexed pursuant to a logic control directing the opening of the switches in the MUX 290 at precise intervals. An exemplary logic control is the MSP430, available from Texas Instruments, Inc. (Dallas, Tex.). The output from the MUX 290 is amplified by a low noise AMP 292 (“LNA”), and the signal is conditioned using a conditioning circuit (for example, a time-gain-control (“TGC”) circuit 294 and a band-pass filter (“BPF”) 296) and digitized using an ADC 298. Electric power to the foregoing components is supplied by way of a battery 300, which also supplies power to a wireless transmitter module 302. In exemplary form, the wireless transmitter module 302 utilizes a universal asynchronous receiver/transmitter (“UART”) protocol. The wireless transmitter module 302 includes a wireless transmitter circuit 304 receiving the output from a first in-first out (“FIFO”) buffer (not shown) of the ADC 298 by way of a serial interface 306. An output from the wireless transmitter circuit 304 is conveyed using a serial link coupled to an antenna 308. Signals conveyed through the antenna 308 are broadcast for reception by a wireless receiver (not shown) coupled to a controller (not shown) or the computer 96 (FIG. 6).
  • Referring now to FIGS. 24 and 25, an exemplary high voltage circuit 310 is shown and may be used to trigger and generate the excitation energy for a piezoelectric crystal in the ultrasound transducer 238 (FIG. 19). Exemplary high voltage circuits 310 for use in this embodiment may include, without limitation, the pulser integrated circuit (HV379) available from Supertex, Inc. (Sunnyvale, Calif.).
  • Referencing FIG. 26, an exemplary high voltage multiplexer 312 is shown and may be used to trigger and excite multiple piezoelectric transducers 238 (FIG. 19) without increasing the number of high voltage circuits 310 (FIG. 24). Exemplary high voltage multiplexers 312 for use in this embodiment may include, without limitation, the high voltage multiplexer (HV2221) available from Supertex, Inc. (Sunnyvale, Calif.). The advantage of using a high voltage multiplexer 312 is the ability to use CMOS level control circuitry, thereby making the control logic compatible with virtually any microcontroller or field programmable gate array that is commercially-available.
  • Referring to FIGS. 23 and 27, an exemplary receiving circuit 314, which comprises the MUX 290, the LNA 292, the TGC 294, the BPF 296, and the ADC 298, is shown and may be utilized to receive the echo signals from each transducer 238. Exemplary receiving circuits 314 for use in this embodiment include, without limitation, the AD9271 8-channel ultrasound receiving integrated circuit, available from Analog Devices, Inc. (Norwood, Mass.).
  • With reference now to FIGS. 28 and 29, one method 316 of using X-ray fluoroscopy and in-vivo measurements of dynamic knee kinematics, as described above, for understanding the effects of joint injuries and diseases and for evaluating the outcome of surgical procedures is described. In the particular illustrated embodiment, and using the two aforementioned techniques, six degrees of freedom (“DOF”) are determined for the knee joint 50 (FIG. 1) and include the position and orientation of each bone comprising the knee joint 50 (FIG. 1). The accuracy of this method 316 is within 1° of rotation and 1 mm of translation (except for translations that are parallel to the viewing plane).
  • Implementation of the method 316 includes joint movement visualization via the 3D model reconstruction with A-mode ultrasound system, as described previously. The method 316 also measures the vibrations produced to accurately localize the vibrational center and to determine the cause of the vibrations' occurrence.
  • Interpretation of the vibration and kinematic data is a complicated task involving an in-depth understanding of data acquisition, training data sets, signal analysis, as well as the mechanical system characteristics. Vibrations generated through the interactions of implant components, bones, and/or soft tissues result from an induced driving force leading to a dynamic response. The driving force may be associated with knee-ligament instability and with bone properties and conditions. A normal intact knee joint 50 (FIG. 1) will have a distinct pattern of motion and vibrational characteristics. Once degeneration or damage occurs to the knee joint 50 (FIG. 1), both the kinematic and vibrational characteristics change. This alteration, for each type of injury or degeneration, leads to distinct changes (or a signature) that may be captured by the kinematic and vibration methods described herein.
  • FIGS. 28-34 illustrate a diagnostic system 320 configured to perform the method 316 in accordance with one embodiment of the present invention. The diagnostic system 320 includes the ID module 90 configured to diagnose soft tissue and bone injuries. For example, a first patient having a normal knee joint and a second patient having an anterior cruciate ligament deficit (“ACLD”) may exhibit a similar pattern of posterior femoral translation during progressive knee flexion; however, the first and second patients exhibit different axial rotation patterns at 30° of knee flexion. Accordingly, the ID module 90 includes three stages: (1) a first stage that involves data analysis, (2) a second stage that includes sending the data to a neural network for detecting an injury, and (3) a third stage that classifies or determines severity of a detected injury.
  • The first stage includes acquisition of kinematic feature vectors, using multiple physiological measurements taken from the patient while the patient moves the knee joint 50 (FIG. 1) through a range of motion. Exemplary measurements may include, without limitation, medial condyle anteroposterior (“MAP”) motion and lateral condyle anteroposterior (“LAP”) motion. The MAP and LAP motions pertain to the anterior-posterior (“AP”) distance of the medial and lateral condyle points 110, 114 (FIG. 1) relative to a tibia geometric center. Other exemplary measurements may include a lateral superior/inferior (“LSI”) measurement of the distance between the lateral femoral condyle 114 (FIG. 1) and the lateral tibial plateau 321 (FIG. 3), and a medial superior/inferior (“MSI”) measurement of the distance between the medial femoral condyle 110 (FIG. 1) and the medial tibial plateau 321 (FIG. 3), which together comprise the superior/inferior (“S/I”) distance of the lateral and medial condyle points 114, 110 (FIG. 1) to a tibial plane, as shown in FIGS. 30A-30C.
  • Feature vectors may also include the femoral position with respect to the tibia, which is defined by three Euler angles 340 and three translation components, together with the vibrational signal 342 and force data 344. Examples of these vectors are shown in FIGS. 31A-31C, respectively. FIG. 32 is a graphical representation 346 showing the average medial and lateral condyle positions during a deep knee bend activity for the second patient having ACLD. The feature vectors that are extracted from the kinematic and vibration analyses are output to the neural network 98 (FIG. 6) for determining the injury, as described in greater detail below.
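  • By way of illustration only, one such feature vector may be assembled as in the following sketch; the particular vibration summary statistics are assumptions chosen for the example rather than the features prescribed herein.

```python
# Illustrative sketch only: assembling one feature vector from kinematic,
# vibration, and force data before it is passed to the classifier. The
# particular vibration summary statistics are assumed for the example.
import numpy as np


def build_feature_vector(euler_angles, translations, vibration, forces):
    """Concatenate kinematic, vibration, and force features into one vector."""
    vib = np.asarray(vibration, dtype=float)
    vibration_summary = [vib.mean(), vib.std(), np.abs(vib).max()]  # assumed summary
    return np.concatenate([
        np.asarray(euler_angles, dtype=float),   # three Euler angles
        np.asarray(translations, dtype=float),   # three translation components
        vibration_summary,                       # summary of the vibration signal
        np.asarray(forces, dtype=float),         # contact-force values
    ])


if __name__ == "__main__":
    fv = build_feature_vector([2.0, -5.5, 12.0], [1.2, 0.4, -3.1],
                              np.random.randn(1000), [620.0, 655.0])
    print(fv.shape)   # (11,)
```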
  • FIG. 33 illustrates one embodiment of a neural network classifier 322 having multiple binary outputs 323 a, 323 b, 323 c, 323 d, i.e., each output is either a “1” or “0,” wherein a “1” corresponds to “yes” and a “0” corresponds to “no.” In this neural network classifier 322, each output 323 a, 323 b, 323 c, 323 d represents the response of the neural network 98 (FIG. 6) to a particular injury type. For example, one output 323 b may represent the response for ACLD, wherein its state will be “1” if an ACL injury is detected, and “0” otherwise. Obviously, the neural network 98 (FIG. 6) and the classifier 322 may be significantly more or less sophisticated, depending on the underlying model of the joint in question.
  • FIG. 34 illustrates one embodiment of a construction 325 of the neural network 98 (FIG. 6). The construction 325 includes formulating a supervised classifier using a training set 324 of the kinematic and vibration data corresponding to a dataset 326 of normal and injured knee joints. The neural network 98 (FIG. 6) is trained with the training set 324 of vectors, wherein each vector consists of data (sound 328, kinematic 330, and force 332) collected from the knee joint 50 (FIG. 1).
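  • By way of illustration only, the supervised training and subsequent classification may be sketched with a small multilayer perceptron as follows; the synthetic feature vectors, labels, and network size are assumptions for the example and do not reflect the actual training set 324.

```python
# Illustrative sketch only: supervised training of a small multilayer
# perceptron whose binary outputs flag injury types, followed by classifying a
# new feature vector. The synthetic data, labels, and network size are assumed.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the training set: 200 feature vectors of length 11
# with two binary labels (e.g. "ACL deficient?" and "osteoarthritic?").
X_train = rng.normal(size=(200, 11))
y_train = np.column_stack([
    (X_train[:, 0] + X_train[:, 3] > 0).astype(int),   # assumed rule for label 1
    (X_train[:, 5] - X_train[:, 8] > 0).astype(int),   # assumed rule for label 2
])

# Each output of the trained network is an independent yes/no flag.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# A new feature vector is classified into the trained injury types.
new_case = rng.normal(size=(1, 11))
print(clf.predict(new_case))   # e.g. [[1 0]]
```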
  • Fluoroscopy data 333 may be used to calculate the kinematics. While fluoroscopy data 333 is highly accurate, it requires the patient to remain within the small working volume of the fluoroscope unit and subjects the patient to ionizing radiation for a prolonged period of time. For most dynamic activities where the joints are loaded, such as running or jumping, fluoroscopy is an unacceptable alternative. Therefore, use of fluoroscopy data 333 is not required.
  • It should further be noted that electromyography (“EMG”) electrodes 337 (FIG. 6) may also be utilized as a data input for the computer 96 (FIG. 6) and the neural network 98 (FIG. 6). In this fashion, one or more EMG electrodes 337 (FIG. 6) are mounted to the surface of the skin proximate the muscles adjacent the knee joint 50 (FIG. 1) to monitor the electrical signal transmitted to the muscles in order to provide relevant data of a muscle injury or disorder.
  • Once the neural network 98 (FIG. 6) is trained, it may be used to classify new cases and categorize an injury type using these kinematic 330, vibration 328, and force 332 data. Those skilled in the art will readily understand that the types and classifications desired to be accommodated by the neural network 98 (FIG. 6) necessarily require training the neural network 98 (FIG. 6) on these very types of classifications. Exemplary types and classifications of injuries to mammalian knee joints include, without limitation, osteoarthritic conditions, soft tissue damage, and abnormal growths. Likewise, the neural network 98 (FIG. 6) needs to be trained to differentiate between normal and abnormal knee conditions.
  • Referring again to FIG. 29, for a new patient 326, acquired vibrational, kinematic, and force features 328, 330, 332 of the knee joint 50 (FIG. 1) are compiled and input as a testing set 327 to the trained neural network 334. The trained neural network 334 then diagnoses the condition of the knee joint 50 (FIG. 1) and returns one of the outputs 323 a, 323 b, 323 c, 323 d.
  • Although not shown, some embodiments of the method may be adapted so that the testing set 327 is acquired outside of a clinical setting. For example, a knee brace in accordance with an embodiment of the present invention may be worn by a patient for an extended period of time while performing normal activities. For example, the patient may wear a device incorporating components of at least one of the JKT module 86 (FIG. 6), the VA module 88 (FIG. 6), and the foot module 92 (FIG. 6) during activities that are not reproducible in the office (for example, weight lifting, racquet ball, etc.) and that elicit the pain or the patient's symptoms. In some embodiments, the patient may turn the device on immediately prior to the activity and/or the patient may mark the onset of the pain or symptoms when it occurs. This enables analysis of the data from a few seconds before the marked time to see what abnormal sounds or joint kinematics were occurring.
  • Data may be stored on a portable hard drive (or any other portable storage device) and then may be downloaded to exemplary systems for analysis. The data can be wirelessly transmitted and stored in a computer. It can also be stored with a miniature memory drive if field data is desired. If the occurrence of the pain is more random, some embodiments of the devices may continuously acquire data. Continuously monitoring devices, however, may require a larger data storage capacity.
  • It is understood that while the exemplary embodiments have been described herein with respect to the knee joint 50 (FIG. 1), those skilled in the art will readily understand that the aforementioned embodiments may be easily adapted to other joints of the musculoskeletal system of a mammalian animal. For example, embodiments may be adapted for use on hips, ankles, toes, spines, shoulders, elbows, wrists, fingers, and temporomandibular joints.
  • While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in some detail, they are not intended to restrict or in any way limit the scope of the disclosed invention. Additional advantages and modifications will readily appear to those skilled in the art. The various features of the present invention may be used alone or in any combination depending on the needs and preferences of the user. This has been a description of the present invention, along with methods of practicing the present invention as currently known.

Claims (26)

1. A device for acquiring data and diagnosing a musculoskeletal injury, the device comprising:
a semi-flexible housing configured to be positioned proximate a portion of the musculoskeletal system of a patient;
at least one ultrasonic transducer operably coupled to the housing and configured to acquire an ultrasonic data indicative of a bone surface;
a positional localizer operably coupled to the housing at a select location relative to the at least one ultrasonic transducer, the positional localizer configured to track movement of the housing; and
a transmission system operably coupled to the housing and configured to transmit the ultrasonic data from the at least one ultrasonic transducer and the movement data from the positional localizer to a data analyzer for analysis and diagnosis.
2. The device of claim 1, wherein the device is a brace configured to surround the portion of the musculoskeletal system for diagnosis.
3. The device of claim 2, wherein the brace is a knee brace for diagnosing a knee injury, the knee brace further comprising:
a first ultrasonic transducer positioned proximate the distal femur; and
a second ultrasonic transducer positioned proximate the proximal tibia.
4. The device of claim 3, wherein the first ultrasonic transducer, the second ultrasonic transducer, or both is comprised of an individual transducer tracking unit.
5. The device of claim 3, wherein the first ultrasonic transducer, the second ultrasonic transducer, or both is comprised of an inter-transducer mechanical link unit.
6. The device of claim 3, wherein the first ultrasonic transducer, the second ultrasonic transducer, or both is comprised of a rotating transducer unit.
7. The device of claim 1, wherein the positional localizer is an optical sensor device, an inertial measurement unit device, an ultra-wide band sensor device, or a combination thereof.
8. The device of claim 1, further comprising:
a vibrational sensor operably coupled to the housing and configured to acquire a vibration signal generated during movement of the portion of the musculoskeletal system.
9. The device of claim 8, wherein the vibrational sensor comprises at least one accelerometer.
10. A method of diagnosing a musculoskeletal injury, the method comprising:
creating a 3D model of a portion of the musculoskeletal system of a patient;
acquiring a feature data with a sensor positioned proximate the portion of the musculoskeletal system while the portion is articulated;
comparing, with a neural network, the acquired feature data with a database of feature data, wherein the database of feature data includes a dataset representative of the musculoskeletal injury; and
returning a diagnosis based on the comparing.
11. The method of claim 10, further comprising:
positioning a sensor proximate the portion of the musculoskeletal system;
operating the sensor to acquire the feature data; and
transferring the acquired feature data to the neural network.
12. The method of claim 11, wherein the sensor is an ultrasound transducer and the feature data includes an ultrasonic signal indicative of a bone surface, the method further comprising:
tracking a position of the ultrasound transducer relative to the portion of the musculoskeletal system.
13. The method of claim 10, where creating a 3D model further comprises:
acquiring structural data indicative of a surface of a bone within the portion of the musculoskeletal system; and
morphing a general bone model in accordance with the structural data.
14. The method of claim 13, wherein the structural data includes an ultrasonic signal, a computerized tomography data, a fluoroscopy data, or a combination thereof.
15. The method of claim 10, wherein the feature data includes a vibrational data, a kinematic data, a contact force data, or a combination thereof.
16. The method of claim 15, wherein the feature data comprises the vibrational data and the kinematic data, the vibrational data being time-synchronized with the kinematic data.
17. The method of claim 10, wherein comparing the acquired feature data further comprises:
training the neural network with a plurality of datasets, wherein at least one of the plurality of datasets is the dataset representative of the musculoskeletal injury.
18. The method of claim 10, wherein the feature data includes a shear measurement, at least one Euler angle, a translational component, a force data, or a combination thereof.
19. The method of claim 10, further comprising:
displaying the returned diagnosis, the 3D model, the acquired feature data, or a combination thereof on a user interface.
20. A diagnostic system for diagnosing a musculoskeletal injury, the diagnostic system comprising:
a 3D model reconstruction module configured to acquire a structural data indicative of a bone surface within a portion of the musculoskeletal system of a patient and to construct a patient-specific model from the structural data;
a kinematics tracking module configured to acquire a movement data while the portion of the musculoskeletal system is articulated;
a vibroarthography module configured to acquire a vibration data generated during the articulation; and
an intelligent diagnosis module configured to receive and analyze the structural data, the movement data, and the vibration data and to determine an injury type from the analysis.
21. The diagnostic system of claim 20, wherein the 3D model reconstruction module further comprises:
an ultrasound transducer configured to acquire an ultrasonic signal indicative of the bone surface;
a position sensor having a select location relative to the ultrasound transducer, the position sensor configured to track movement of the portion of the musculoskeletal system; and
a statistical bone atlas comprising a plurality of bone models, wherein at least one of the plurality of bone models is morphed in accordance with the ultrasonic signal.
22. The diagnostic system of claim 20, wherein the kinematics tracking module further comprises:
a brace configured to be positioned proximate the portion of the musculoskeletal system;
at least one ultrasonic transducer operably coupled to the brace and configured to acquire an ultrasonic data indicative of the bone surface;
a positional localizer operably coupled to the brace at a select location relative to the at least one ultrasonic transducer, the positional localizer configured to track movement of the brace; and
a transmission system operably coupled to the brace and configured to transmit the ultrasonic data from the at least one ultrasonic transducer and the movement data from the positional localizer to the intelligent diagnosis module.
23. The diagnostic system of claim 20, wherein the vibroarthography module further comprises:
at least one vibrational sensor positioned proximate the portion of the musculoskeletal system; and
a transmission system configured to transmit the vibration data from the at least one vibrational sensor to the intelligent diagnosis module.
24. The diagnostic system of claim 20, wherein the intelligent diagnosis module further comprises:
a neural network configured to compare the movement data, the vibration data, or both to a database comprising movement, vibrational, and injury data, wherein the database includes the movement data or the vibration data and an associated musculoskeletal injury type;
at least one transformation configured to transfer an acquired data to a virtual data; and
a statistical atlas comprising a plurality of bone models, wherein at least one of the plurality of bone models is morphed in accordance with the structural data to construct the patient-specific model.
25. The diagnostic system of claim 20, further comprising:
a contact force module configured to acquire a pressure data while the portion of the musculoskeletal system is articulated.
26. The diagnostic system of claim 25, wherein the contact force module comprises:
a shoe insole configured to be positioned on a foot of a patient;
a plurality of pressure sensors operably coupled to the shoe insole and arranged in a pattern; and
a transmission system operably coupled to the shoe insole and configured to transmit the pressure data from the plurality of pressure sensors to the intelligent diagnosis module.
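
Claims 10, 17 and 24 recite comparing acquired feature data against a database of injury-labeled feature data with a neural network. The sketch below is only illustrative of that idea, not the patent's implementation: it assumes hypothetical fixed-length feature vectors (e.g., vibrational and kinematic summary statistics) and binary injury labels, and uses a tiny single-hidden-layer network as a stand-in for whatever network the actual system would employ.

```python
"""Illustrative sketch of the neural-network comparison of claims 10, 17 and 24."""
import numpy as np

rng = np.random.default_rng(0)

def train_network(features, labels, hidden=8, lr=0.1, epochs=2000):
    """Train a one-hidden-layer sigmoid network on the feature database."""
    x = np.asarray(features, dtype=float)
    y = np.asarray(labels, dtype=float).reshape(-1, 1)
    w1 = rng.normal(scale=0.5, size=(x.shape[1], hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))   # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))   # predicted injury probability
        grad_out = (p - y) / len(x)                # cross-entropy output gradient
        w2 -= lr * h.T @ grad_out
        b2 -= lr * grad_out.sum(axis=0)
        grad_h = grad_out @ w2.T * h * (1.0 - h)
        w1 -= lr * x.T @ grad_h
        b1 -= lr * grad_h.sum(axis=0)
    return w1, b1, w2, b2

def diagnose(feature_vector, params):
    """Return a diagnosis for one acquired feature vector."""
    w1, b1, w2, b2 = params
    h = 1.0 / (1.0 + np.exp(-(np.asarray(feature_vector) @ w1 + b1)))
    p = float(1.0 / (1.0 + np.exp(-(h @ w2 + b2))))
    return ("musculoskeletal injury suspected" if p > 0.5 else "no injury detected"), p

# Hypothetical database: 4-element feature vectors with known injury labels.
database = rng.normal(size=(40, 4))
injury_labels = (database[:, 0] + database[:, 2] > 0).astype(float)
params = train_network(database, injury_labels)
print(diagnose(database[0], params))
```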
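Claims 13 and 21 describe morphing a general bone model (a statistical bone atlas) in accordance with ultrasound-derived structural data. A minimal sketch of one way such a morph could be posed is given below, assuming (hypothetically) that the atlas supplies a mean surface plus PCA-style shape modes and that the ultrasound echoes have already been converted to 3-D bone-surface points with known vertex correspondences.

```python
"""Illustrative sketch of atlas-based bone morphing (claims 13 and 21)."""
import numpy as np

def morph_to_measurements(mean_shape, modes, observed_idx, observed_pts):
    """Solve for shape-mode weights that best fit sparse ultrasound surface points.

    mean_shape   : (N, 3) mean atlas surface
    modes        : (M, N, 3) shape modes
    observed_idx : indices of atlas vertices seen by the ultrasound probe
    observed_pts : (len(observed_idx), 3) measured bone-surface points
    """
    # Restrict the linear shape model to the observed vertices and flatten.
    A = modes[:, observed_idx, :].reshape(len(modes), -1).T      # (3k, M)
    b = (observed_pts - mean_shape[observed_idx]).reshape(-1)    # (3k,)
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Apply the fitted weights to morph the full patient-specific model.
    return mean_shape + np.tensordot(weights, modes, axes=1), weights

# Tiny synthetic example: 100-vertex "bone", 3 shape modes, 12 observed points.
rng = np.random.default_rng(1)
mean_shape = rng.normal(size=(100, 3))
modes = rng.normal(size=(3, 100, 3)) * 0.1
true_w = np.array([0.8, -0.3, 0.5])
observed_idx = rng.choice(100, size=12, replace=False)
observed_pts = (mean_shape + np.tensordot(true_w, modes, axes=1))[observed_idx]
patient_model, w = morph_to_measurements(mean_shape, modes, observed_idx, observed_pts)
print("recovered mode weights:", np.round(w, 3))
```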
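Claim 16 requires the vibrational data to be time-synchronized with the kinematic data. One simple way to illustrate that requirement, assuming both sensors time-stamp samples on a shared clock but at different rates, is to resample both streams onto one common time base; the sketch below does exactly that and is not intended as the patent's synchronization method.

```python
"""Illustrative sketch of the time synchronization recited in claim 16."""
import numpy as np

def synchronize(t_vib, vib, t_kin, kin, rate_hz=200.0):
    """Resample vibration and kinematic samples onto a shared, uniform clock."""
    t0 = max(t_vib[0], t_kin[0])
    t1 = min(t_vib[-1], t_kin[-1])
    t_common = np.arange(t0, t1, 1.0 / rate_hz)
    vib_sync = np.interp(t_common, t_vib, vib)
    kin_sync = np.interp(t_common, t_kin, kin)
    return t_common, vib_sync, kin_sync

# Hypothetical streams: 1 kHz joint vibration, 100 Hz knee flexion angle.
t_vib = np.arange(0.0, 2.0, 0.001)
vib = np.sin(2 * np.pi * 30 * t_vib) * np.exp(-t_vib)
t_kin = np.arange(0.05, 2.0, 0.01)
flexion = 45 + 30 * np.sin(2 * np.pi * 0.5 * t_kin)
t, vib_sync, flex_sync = synchronize(t_vib, vib, t_kin, flexion)
print(len(t), vib_sync.shape, flex_sync.shape)
```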
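Claim 18 lists Euler angles and translational components among the feature data. As a small worked example, assuming the kinematics tracking module delivers a relative bone pose as a homogeneous 4x4 transform, the pose can be split into three Euler angles and a translation; the ZYX (yaw-pitch-roll) convention used here is one common choice, not necessarily the one the patent intends.

```python
"""Illustrative extraction of Euler-angle and translation features (claim 18)."""
import numpy as np

def pose_to_features(T):
    """Split a 4x4 pose into ZYX Euler angles (degrees) and a translation."""
    R, t = T[:3, :3], T[:3, 3]
    pitch = np.arcsin(-R[2, 0])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([yaw, pitch, roll]), t

# Example pose: 20-degree flexion about x plus a 5 mm translation along x.
angle = np.radians(20.0)
T = np.eye(4)
T[:3, :3] = np.array([[1, 0, 0],
                      [0, np.cos(angle), -np.sin(angle)],
                      [0, np.sin(angle), np.cos(angle)]])
T[:3, 3] = [5.0, 0.0, 0.0]
euler_deg, translation = pose_to_features(T)
print(euler_deg, translation)
```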
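Claims 25 and 26 add a contact force module built from a shoe insole with an arranged pattern of pressure sensors. As one illustration of the kind of quantity such a module could feed to the intelligent diagnosis module, the sketch below computes a center of pressure from a hypothetical rectangular grid of insole readings; the grid size and spacing are assumptions, not values from the patent.

```python
"""Illustrative center-of-pressure computation for the insole of claims 25-26."""
import numpy as np

def center_of_pressure(pressure_grid, spacing_mm=10.0):
    """Pressure-weighted centroid of a (rows, cols) grid of insole readings."""
    p = np.asarray(pressure_grid, dtype=float)
    total = p.sum()
    if total <= 0:
        return None  # no contact detected
    rows, cols = np.indices(p.shape)
    cop_x = (cols * p).sum() / total * spacing_mm   # medio-lateral, mm
    cop_y = (rows * p).sum() / total * spacing_mm   # heel-to-toe, mm
    return cop_x, cop_y

# Hypothetical 8x4 sensor grid with most load under the forefoot.
grid = np.zeros((8, 4))
grid[6:8, 1:3] = [[120.0, 140.0], [150.0, 160.0]]
print(center_of_pressure(grid))
```
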
US13/196,701 2009-02-02 2011-08-02 Noninvasive diagnostic system Abandoned US20120029345A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/196,701 US20120029345A1 (en) 2009-02-02 2011-08-02 Noninvasive diagnostic system
US13/841,632 US20130211259A1 (en) 2009-02-02 2013-03-15 Determination of joint condition based on vibration analysis
US13/841,402 US9642572B2 (en) 2009-02-02 2013-03-15 Motion Tracking system with inertial-based sensing units
US15/478,148 US11004561B2 (en) 2009-02-02 2017-04-03 Motion tracking system with inertial-based sensing units
US17/181,372 US20210193313A1 (en) 2009-02-02 2021-02-22 Motion Tracking System with Inertial-Based Sensing Units

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US12/364,267 US8444564B2 (en) 2009-02-02 2009-02-02 Noninvasive diagnostic system
PCT/US2010/022939 WO2010088696A1 (en) 2009-02-02 2010-02-02 Noninvasive diagnostic system
WOUS/PCT2010/022939 2010-02-02
US13/196,701 US20120029345A1 (en) 2009-02-02 2011-08-02 Noninvasive diagnostic system

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US12/364,267 Continuation-In-Part US8444564B2 (en) 2009-02-02 2009-02-02 Noninvasive diagnostic system
PCT/US2010/022939 Continuation-In-Part WO2010088696A1 (en) 2009-02-02 2010-02-02 Noninvasive diagnostic system

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US13/841,632 Continuation-In-Part US20130211259A1 (en) 2009-02-02 2013-03-15 Determination of joint condition based on vibration analysis
US13/841,402 Continuation-In-Part US9642572B2 (en) 2009-02-02 2013-03-15 Motion Tracking system with inertial-based sensing units

Publications (1)

Publication Number Publication Date
US20120029345A1 (en) 2012-02-02

Family

ID=42396088

Family Applications (8)

Application Number Title Priority Date Filing Date
US12/364,267 Active 2030-11-26 US8444564B2 (en) 2009-02-02 2009-02-02 Noninvasive diagnostic system
US13/196,701 Abandoned US20120029345A1 (en) 2009-02-02 2011-08-02 Noninvasive diagnostic system
US13/841,402 Active 2030-06-06 US9642572B2 (en) 2009-02-02 2013-03-15 Motion Tracking system with inertial-based sensing units
US13/898,092 Active 2032-10-16 US11342071B2 (en) 2009-02-02 2013-05-20 Noninvasive diagnostic system
US15/478,148 Active US11004561B2 (en) 2009-02-02 2017-04-03 Motion tracking system with inertial-based sensing units
US17/181,372 Pending US20210193313A1 (en) 2009-02-02 2021-02-22 Motion Tracking System with Inertial-Based Sensing Units
US17/704,376 Active US11776686B2 (en) 2009-02-02 2022-03-25 Noninvasive diagnostic system
US18/238,338 Active US11935648B1 (en) 2009-02-02 2023-08-25 Noninvasive diagnostic system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/364,267 Active 2030-11-26 US8444564B2 (en) 2009-02-02 2009-02-02 Noninvasive diagnostic system

Family Applications After (6)

Application Number Title Priority Date Filing Date
US13/841,402 Active 2030-06-06 US9642572B2 (en) 2009-02-02 2013-03-15 Motion Tracking system with inertial-based sensing units
US13/898,092 Active 2032-10-16 US11342071B2 (en) 2009-02-02 2013-05-20 Noninvasive diagnostic system
US15/478,148 Active US11004561B2 (en) 2009-02-02 2017-04-03 Motion tracking system with inertial-based sensing units
US17/181,372 Pending US20210193313A1 (en) 2009-02-02 2021-02-22 Motion Tracking System with Inertial-Based Sensing Units
US17/704,376 Active US11776686B2 (en) 2009-02-02 2022-03-25 Noninvasive diagnostic system
US18/238,338 Active US11935648B1 (en) 2009-02-02 2023-08-25 Noninvasive diagnostic system

Country Status (5)

Country Link
US (8) US8444564B2 (en)
EP (2) EP2391971B1 (en)
JP (3) JP5723788B2 (en)
CA (5) CA2751422C (en)
WO (1) WO2010088696A1 (en)

Cited By (112)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110092804A1 (en) * 2006-02-27 2011-04-21 Biomet Manufacturing Corp. Patient-Specific Pre-Operative Planning
US20110132094A1 (en) * 2008-07-22 2011-06-09 The University Of Tokyo Ultrasonic probe support device
US20130110011A1 (en) * 2010-06-22 2013-05-02 Stephen J. McGregor Method of monitoring human body movement
US20130185310A1 (en) * 2012-01-16 2013-07-18 Emovi Inc. Method and system for human joint treatment plan and personalized surgery planning using 3-d kinematics, fusion imaging and simulation
US20130217998A1 (en) * 2009-02-02 2013-08-22 Jointvue, Llc Motion Tracking System with Inertial-Based Sensing Units
US20140159959A1 (en) * 2012-07-11 2014-06-12 Digimarc Corporation Body-worn phased-array antenna
WO2014122544A1 (en) * 2013-02-11 2014-08-14 Koninklijke Philips N.V. Ultrasound imaging system and method
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9076212B2 (en) 2006-05-19 2015-07-07 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US20150313546A1 (en) * 2004-03-05 2015-11-05 Depuy International Limited Orthopaedic monitoring system, methods and apparatus
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US20160015319A1 (en) * 2013-03-07 2016-01-21 The Regents Of The University Of California System for health monitoring on prosthetic and fixation devices
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
WO2016007936A3 (en) * 2014-07-10 2016-03-17 Mahfouz Mohamed R Bone reconstruction and orthopedic implants
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US20160234369A1 (en) * 2015-02-06 2016-08-11 Samsung Electronics Co., Ltd. Multi-purpose device including mobile terminal and sensing device using radio-wave based sensor module
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US20160291860A1 (en) * 2013-10-08 2016-10-06 Sony Computer Entertainment Inc. Information processing device
EP2967440A4 (en) * 2013-03-15 2016-10-12 Jointvue Llc Determination of joint condition based on vibration analysis
US9468538B2 (en) 2009-03-24 2016-10-18 Biomet Manufacturing, Llc Method and apparatus for aligning and securing an implant relative to a patient
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
CN106170705A (en) * 2013-12-09 2016-11-30 穆罕默德·R·马赫福兹 Skeletal reconstruction and Orthopeadic Surgery implant
WO2016191753A1 (en) 2015-05-27 2016-12-01 Georgia Tech Research Corporation Wearable technologies for joint health assessment
WO2016191813A1 (en) * 2015-06-01 2016-12-08 Latey Penelope Jane Foot muscle biofeedback unit
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9526437B2 (en) 2012-11-21 2016-12-27 i4c Innovations Inc. Animal health and wellness monitoring using UWB radar
US9539013B2 (en) 2006-02-27 2017-01-10 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9572590B2 (en) 2006-10-03 2017-02-21 Biomet Uk Limited Surgical instrument
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
EP3171286A1 (en) * 2015-11-17 2017-05-24 Universitat De València, Estudi General Methods for determining an identifier for use in methods for diagnosing haemophilic arthropathy, methods and apparatus for diagnosing
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9734589B2 (en) 2014-07-23 2017-08-15 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9743940B2 (en) 2011-04-29 2017-08-29 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US9757238B2 (en) 2011-06-06 2017-09-12 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9782141B2 (en) 2013-02-01 2017-10-10 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US20170312099A1 (en) * 2016-04-28 2017-11-02 medFit Beratungs-und Beteiligungsges.m.B.H. Dynamic Ligament Balancing System
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9993344B2 (en) 2006-06-09 2018-06-12 Biomet Manufacturing, Llc Patient-modified implant
US10004462B2 (en) 2014-03-24 2018-06-26 Kineticor, Inc. Systems, methods, and devices for removing prospective motion correction from medical imaging scans
WO2018162808A1 (en) * 2017-01-10 2018-09-13 Braindex S.A.S Physiological sensor for near-infrared spectroscopy at various depths
WO2018213749A1 (en) * 2017-05-18 2018-11-22 Smith & Nephew, Inc. Systems and methods for determining the position and orientation of an implant for joint replacement surgery
US10149617B2 (en) 2013-03-15 2018-12-11 i4c Innovations Inc. Multiple sensors for monitoring health and wellness of an animal
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US20190014174A1 (en) * 2010-12-17 2019-01-10 Amazon Technologies, Inc. Personal Remote Storage for Purchased Electronic Content Items
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
CN109801278A (en) * 2019-01-21 2019-05-24 燕山大学 A kind of high-speed slide electrical contact movement pair surface damage classifying method
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
EP3359048A4 (en) * 2015-10-08 2019-07-17 Decision Sciences Medical Company, LLC Acoustic orthopedic tracking system and methods
EP3552538A1 (en) 2013-03-15 2019-10-16 Joint Vue, LLC Motion tracking system with inertial-based sensing units
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US20200022641A1 (en) * 2012-05-17 2020-01-23 Alan N. Schwartz Localization Of The Parathyroid
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
US10631231B2 (en) * 2012-10-22 2020-04-21 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
WO2020101569A1 (en) 2018-11-14 2020-05-22 Precision Medical Pte Ltd Method and device for measuring anatomical movement of a joint
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US10743838B2 (en) 2015-02-25 2020-08-18 Decision Sciences Medical Company, LLC Acoustic signal transmission couplants and coupling mediums
US10918333B2 (en) 2017-11-30 2021-02-16 Bruin Biometrics, Llc Implant evaluation using acoustic emissions
US10993699B2 (en) 2011-10-28 2021-05-04 Decision Sciences International Corporation Spread spectrum coded waveforms in ultrasound diagnostics
US11096661B2 (en) 2013-09-13 2021-08-24 Decision Sciences International Corporation Coherent spread-spectrum coded waveforms in synthetic aperture image formation
US11154274B2 (en) 2019-04-23 2021-10-26 Decision Sciences Medical Company, LLC Semi-rigid acoustic coupling articles for ultrasound diagnostic and treatment applications
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
US20220183591A1 (en) * 2020-12-16 2022-06-16 Polar Electro Oy Biomechanical modelling of motion measurements
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
US11520043B2 (en) 2020-11-13 2022-12-06 Decision Sciences Medical Company, LLC Systems and methods for synthetic aperture ultrasound imaging of an object
WO2022266254A1 (en) * 2021-06-16 2022-12-22 Kinisi Inc Wearable imaging system for measuring bone displacement
US11813049B2 (en) 2013-12-09 2023-11-14 Techmah Medical Llc Bone reconstruction and orthopedic implants
US11862348B2 (en) * 2013-03-13 2024-01-02 Blue Belt Technologies, Inc. Systems and methods for using generic anatomy models in surgical planning
US11877870B2 (en) 2019-08-05 2024-01-23 Consultation Semperform Inc Systems, methods and apparatus for prevention of injury

Families Citing this family (215)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7559931B2 (en) 2003-06-09 2009-07-14 OrthAlign, Inc. Surgical orientation system and method
AU2007351804B2 (en) * 2007-04-19 2013-09-05 Mako Surgical Corp. Implant planning using captured joint motion information
ES2683029T3 (en) 2008-07-24 2018-09-24 OrthAlign, Inc. Joint replacement systems
AU2009291743B2 (en) 2008-09-10 2015-02-05 Orthalign, Inc Hip surgery systems and methods
US8647287B2 (en) 2008-12-07 2014-02-11 Andrew Greenberg Wireless synchronized movement monitoring apparatus and system
US9364291B2 (en) * 2008-12-11 2016-06-14 Mako Surgical Corp. Implant planning using areas representing cartilage
US20100331733A1 (en) * 2009-06-30 2010-12-30 Orthosensor Sensing device and method for an orthopedic joint
US8826733B2 (en) 2009-06-30 2014-09-09 Orthosensor Inc Sensored prosthetic component and method
US8714009B2 (en) 2010-06-29 2014-05-06 Orthosensor Inc. Shielded capacitor sensor system for medical applications and method
US9462964B2 (en) 2011-09-23 2016-10-11 Orthosensor Inc Small form factor muscular-skeletal parameter measurement system
US8701484B2 (en) 2010-06-29 2014-04-22 Orthosensor Inc. Small form factor medical sensor structure and method therefor
US8720270B2 (en) 2010-06-29 2014-05-13 Ortho Sensor Inc. Prosthetic component for monitoring joint health
US9259179B2 (en) 2012-02-27 2016-02-16 Orthosensor Inc. Prosthetic knee joint measurement system including energy harvesting and method therefor
US8707782B2 (en) 2009-06-30 2014-04-29 Orthosensor Inc Prosthetic component for monitoring synovial fluid and method
US8679186B2 (en) 2010-06-29 2014-03-25 Ortho Sensor Inc. Hermetically sealed prosthetic component and method therefor
US8118815B2 (en) 2009-07-24 2012-02-21 OrthAlign, Inc. Systems and methods for joint replacement
US10869771B2 (en) 2009-07-24 2020-12-22 OrthAlign, Inc. Systems and methods for joint replacement
EP2525740A4 (en) * 2010-01-21 2016-01-20 Orthalign Inc Systems and methods for joint replacement
US20130079675A1 (en) 2011-09-23 2013-03-28 Orthosensor Insert measuring system having an internal sensor assembly
US10512451B2 (en) 2010-08-02 2019-12-24 Jointvue, Llc Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US20130144135A1 (en) * 2011-08-02 2013-06-06 Mohamed R. Mahfouz Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US20120123252A1 (en) * 2010-11-16 2012-05-17 Zebris Medical Gmbh Imaging apparatus for large area imaging of a body portion
US20140128689A1 (en) * 2011-04-06 2014-05-08 Northeastern University Joint sensor devices and methods
WO2013025613A1 (en) * 2011-08-12 2013-02-21 Jointvue, Llc 3-d ultrasound imaging device and methods
HUP1100471A2 (en) * 2011-08-30 2013-04-29 Bay Zoltan Alkalmazott Kutatasi Koezhasznu Nonprofit Kft Method and instrument for detecting equilibrium and intelligent insole suitable for monitoring walking parameters
US9414940B2 (en) 2011-09-23 2016-08-16 Orthosensor Inc. Sensored head for a measurement tool for the muscular-skeletal system
US9839374B2 (en) 2011-09-23 2017-12-12 Orthosensor Inc. System and method for vertebral load and location sensing
US8911448B2 (en) 2011-09-23 2014-12-16 Orthosensor, Inc Device and method for enabling an orthopedic tool for parameter measurement
CA3194212A1 (en) 2011-10-14 2013-04-18 Jointvue, Llc Real-time 3-d ultrasound reconstruction of knee and its implications for patient specific implants and 3-d joint injections
US11247040B2 (en) 2011-11-15 2022-02-15 Neurometrix, Inc. Dynamic control of transcutaneous electrical nerve stimulation therapy using continuous sleep detection
US10112040B2 (en) 2011-11-15 2018-10-30 Neurometrix, Inc. Transcutaneous electrical nerve stimulation using novel unbalanced biphasic waveform and novel electrode arrangement
US10335595B2 (en) * 2011-11-15 2019-07-02 Neurometrix, Inc. Dynamic control of transcutaneous electrical nerve stimulation therapy using continuous sleep detection
EP2780073B1 (en) 2011-11-15 2017-09-13 NeuroMetrix, Inc. Apparatus for relieving pain using transcutaneous electrical nerve stimulation
US11259744B2 (en) 2011-11-15 2022-03-01 Neurometrix, Inc. Transcutaneous electrical nerve stimulator with automatic detection of leg orientation and leg motion for enhanced sleep analysis, including enhanced transcutaneous electrical nerve stimulation (TENS) using the same
US9675801B2 (en) 2011-11-15 2017-06-13 Neurometrix, Inc. Measuring the “on-skin” time of a transcutaneous electrical nerve stimulator (TENS) device in order to minimize skin irritation due to excessive uninterrupted wearing of the same
US10279179B2 (en) 2013-04-15 2019-05-07 Neurometrix, Inc. Transcutaneous electrical nerve stimulator with automatic detection of user sleep-wake state
NL2008437C2 (en) * 2012-01-19 2013-07-22 Clinical Graphics B V Process to generate a computer-accessible medium comprising information on the functioning of a joint.
EP3466365A1 (en) 2012-02-07 2019-04-10 Joint Vue, LLC Three-dimensional guided injection device and methods
US9271675B2 (en) 2012-02-27 2016-03-01 Orthosensor Inc. Muscular-skeletal joint stability detection and method therefor
US9844335B2 (en) 2012-02-27 2017-12-19 Orthosensor Inc Measurement device for the muscular-skeletal system having load distribution plates
US9622701B2 (en) 2012-02-27 2017-04-18 Orthosensor Inc Muscular-skeletal joint stability detection and method therefor
JP2015517361A (en) 2012-05-18 2015-06-22 オースアライン・インコーポレイテッド Apparatus and method for knee arthroplasty
US9649160B2 (en) 2012-08-14 2017-05-16 OrthAlign, Inc. Hip replacement navigation system and method
US20140135744A1 (en) 2012-11-09 2014-05-15 Orthosensor Inc Motion and orientation sensing module or device for positioning of implants
US10314733B2 (en) * 2012-12-20 2019-06-11 Elwha Llc Sensor-based control of active wearable system
US9345609B2 (en) 2013-01-11 2016-05-24 Elwha Llc Position sensing active torso support
ES2672296T3 (en) * 2012-12-31 2018-06-13 Mako Surgical Corporation Alignment systems using an ultrasonic probe
US11793424B2 (en) 2013-03-18 2023-10-24 Orthosensor, Inc. Kinetic assessment and alignment of the muscular-skeletal system and method therefor
US9408557B2 (en) 2013-03-18 2016-08-09 Orthosensor Inc. System and method to change a contact point of the muscular-skeletal system
US10940311B2 (en) 2013-03-29 2021-03-09 Neurometrix, Inc. Apparatus and method for button-free control of a wearable transcutaneous electrical nerve stimulator using interactive gestures and other means
EP2978488B1 (en) 2013-03-29 2021-04-14 GSK Consumer Healthcare S.A. Detecting cutaneous electrode peeling using electrode-skin impedance
US10420666B2 (en) 2013-04-08 2019-09-24 Elwha Llc Apparatus, system, and method for controlling movement of an orthopedic joint prosthesis in a mammalian subject
US9439797B2 (en) * 2013-04-08 2016-09-13 Elwha Llc Apparatus, system, and method for controlling movement of an orthopedic joint prosthesis in a mammalian subject
JP2016515463A (en) 2013-04-15 2016-05-30 ニューロメトリックス・インコーポレーテッド Transcutaneous electrical nerve stimulation device that automatically detects the user's sleep / wake state
US9417091B2 (en) * 2013-05-13 2016-08-16 The Johns Hopkins University System and method for determining and correcting field sensors errors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
FR3012314B1 (en) * 2013-10-30 2018-08-17 Voice DEVICE FOR EVALUATING THE MOBILITY OF AN ARTICULATION
GB2519987B (en) 2013-11-04 2021-03-03 Imperial College Innovations Ltd Biomechanical activity monitoring
WO2015095383A1 (en) 2013-12-17 2015-06-25 The Regents Of The University Of California Diagnostic knee arthrometer for detecting acl structural changes
WO2015137131A1 (en) * 2014-03-12 2015-09-17 古野電気株式会社 Ultrasound diagnostic device and ultrasound diagnostic method
US10274509B1 (en) 2014-04-09 2019-04-30 Inertialwave Inertial motion tracking device
US10993639B2 (en) * 2014-04-25 2021-05-04 Massachusetts Institute Of Technology Feedback method and wearable device to monitor and modulate knee adduction moment
EP3811891A3 (en) * 2014-05-14 2021-05-05 Stryker European Holdings I, LLC Navigation system and processor arrangement for tracking the position of a work target
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9880632B2 (en) 2014-06-19 2018-01-30 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US9423318B2 (en) * 2014-07-29 2016-08-23 Honeywell International Inc. Motion detection devices and systems
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US11234616B2 (en) * 2014-08-29 2022-02-01 Bionic Skins LLC Mechanisms and methods for the design and fabrication of a mechanical interface between a wearable device and a human body segment
US9950194B2 (en) 2014-09-09 2018-04-24 Mevion Medical Systems, Inc. Patient positioning system
WO2016044830A1 (en) * 2014-09-19 2016-03-24 Think Surgical, Inc. System and process for ultrasonic determination of long bone orientation
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
CN107106088A (en) * 2014-11-04 2017-08-29 欧斯泰奥德萨格里克科技公司 The method of integrated sensor and effector in customization three-dimensional correction device
CN107003751A (en) * 2014-12-06 2017-08-01 马月有限公司 Gesture recognition system for controlling electronics controlled plant
US9648457B2 (en) * 2014-12-16 2017-05-09 Intel Corporation Multi-signal geometric location sensing for access control
US10363149B2 (en) 2015-02-20 2019-07-30 OrthAlign, Inc. Hip replacement navigation system and method
US20190117128A1 (en) * 2015-03-19 2019-04-25 Meloq Ab Method and device for anatomical angle measurement
US11684260B2 (en) 2015-03-23 2023-06-27 Tracpatch Health, Inc. System and methods with user interfaces for monitoring physical therapy and rehabilitation
US10582891B2 (en) 2015-03-23 2020-03-10 Consensus Orthopedics, Inc. System and methods for monitoring physical therapy and rehabilitation of joints
US11272879B2 (en) 2015-03-23 2022-03-15 Consensus Orthopedics, Inc. Systems and methods using a wearable device for monitoring an orthopedic implant and rehabilitation
JP6964067B2 (en) * 2015-03-23 2021-11-10 コンセンサス オーソペディックス インコーポレイテッド Orthopedic implant and rehabilitation monitoring system
EP3274912B1 (en) * 2015-03-26 2022-05-11 Biomet Manufacturing, LLC System for planning and performing arthroplasty procedures using motion-capture data
EP3289434A1 (en) 2015-04-30 2018-03-07 Google LLC Wide-field radar-based gesture recognition
CN107430444B (en) 2015-04-30 2020-03-03 谷歌有限责任公司 RF-based micro-motion tracking for gesture tracking and recognition
KR102229658B1 (en) 2015-04-30 2021-03-17 구글 엘엘씨 Type-agnostic rf signal representations
EP3297520B1 (en) * 2015-05-18 2022-11-02 Vayu Technology Corp. Devices for measuring human gait and related methods of use
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US20170225033A1 (en) * 2015-06-23 2017-08-10 Ipcomm Llc Method and Apparatus for Analysis of Gait and to Provide Haptic and Visual Corrective Feedback
CA2934366A1 (en) * 2015-06-30 2016-12-30 Ulterra Drilling Technologies, L.P. Universal joint
WO2017007518A1 (en) * 2015-07-07 2017-01-12 Obma Padraic R Noninvasive medical monitoring device, system and method
WO2017055551A1 (en) * 2015-09-30 2017-04-06 Koninklijke Philips N.V. Ultrasound apparatus and method for determining a medical condition of a subject
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
JP6449753B2 (en) * 2015-11-05 2019-01-09 国立大学法人佐賀大学 Joint inflammation detection device
CN105468896B (en) * 2015-11-13 2017-06-16 上海逸动医学科技有限公司 Joint motions detecting system and method
CN105902274B (en) * 2016-04-08 2017-08-25 上海逸动医学科技有限公司 Knee joint dynamic assessment method and system
EP3386393B1 (en) * 2015-12-08 2021-05-05 Kneevoice, Inc. Assessing joint condition using acoustic sensors
US10467534B1 (en) * 2015-12-09 2019-11-05 Roger Brent Augmented reality procedural system
AU2016369607B2 (en) * 2015-12-16 2019-06-20 Techmah Medical Llc IMU calibration
US10463279B2 (en) * 2016-02-19 2019-11-05 Trustees Of Dartmouth College Movement monitoring systems and methods
WO2017151683A1 (en) * 2016-02-29 2017-09-08 Mahfouz Mohamed R Connected healthcare environment
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
JP2017202236A (en) * 2016-05-13 2017-11-16 花王株式会社 Gait analysis method and gait analysis device
US10285456B2 (en) 2016-05-16 2019-05-14 Google Llc Interactive fabric
US10314514B2 (en) * 2016-05-29 2019-06-11 Ankon Medical Technologies (Shanghai) Co., Ltd. System and method for using a capsule device
WO2017205983A1 (en) * 2016-06-02 2017-12-07 Bigmotion Technologies Inc. Systems and methods for walking speed estimation
US10078377B2 (en) 2016-06-09 2018-09-18 Microsoft Technology Licensing, Llc Six DOF mixed reality input by fusing inertial handheld controller with hand tracking
CN107510466B (en) * 2016-06-15 2022-04-12 中慧医学成像有限公司 Three-dimensional imaging method and system
EP3474725A1 (en) 2016-06-24 2019-05-01 Surgical Sensors BVBA Integrated ligament strain measurement
WO2017220173A1 (en) * 2016-06-24 2017-12-28 Surgical Sensors Bvba Integrated ligament strain measurement
US20170367644A1 (en) * 2016-06-27 2017-12-28 Claris Healthcare Inc. Apparatus and Method for Monitoring Rehabilitation from Joint Surgery
WO2018013708A1 (en) 2016-07-13 2018-01-18 Neurometrix, Inc. Apparatus and method for automated compensation of transcutaneous electrical nerve stimulation for temporal fluctuations such as circadian rhythms
WO2018022602A1 (en) 2016-07-25 2018-02-01 Ctrl-Labs Corporation Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
CN110337269B (en) 2016-07-25 2021-09-21 脸谱科技有限责任公司 Method and apparatus for inferring user intent based on neuromuscular signals
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
WO2020112986A1 (en) 2018-11-27 2020-06-04 Facebook Technologies, Inc. Methods and apparatus for autocalibration of a wearable electrode sensor system
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US20180028109A1 (en) * 2016-07-27 2018-02-01 Andrew TESNOW System and method for a wearable knee injury prevention
EP3508120B1 (en) 2016-08-30 2022-06-08 Fujitsu Limited Information processing device, information processing system and information processing method
CN106447713B (en) * 2016-08-31 2019-05-28 北京维盛视通科技有限公司 Method for automatic measurement and device based on cloud manikin
JP6738249B2 (en) * 2016-09-09 2020-08-12 花王株式会社 Gait analysis method and gait analysis device
JP6738250B2 (en) * 2016-09-09 2020-08-12 花王株式会社 Gait analysis method and gait analysis device
WO2018051898A1 (en) * 2016-09-14 2018-03-22 Cyberdyne株式会社 Device for producing knee joint correction tool, method for producing knee joint correction tool, device for assisting knee joint treatment, and method for assissting knee joint treatment
WO2018081795A1 (en) * 2016-10-31 2018-05-03 Zipline Medical, Inc. Systems and methods for monitoring physical therapy of the knee and other joints
FI127689B (en) * 2016-11-07 2018-12-14 Oulun Yliopisto Arrangement for knee diagnostics
WO2018085822A1 (en) 2016-11-07 2018-05-11 Synergistic Biosensors, LLC Systems and methods for monitoring implantable devices for detection of implant failure utilizing wireless in vivo micro sensors
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
CN110418659B (en) 2016-12-23 2023-12-19 纽诺麦斯股份有限公司 "Smart" electrode assemblies for Transcutaneous Electrical Nerve Stimulation (TENS)
US10120455B2 (en) * 2016-12-28 2018-11-06 Industrial Technology Research Institute Control device and control method
CN106725598B (en) * 2016-12-28 2023-09-12 苏州科技城医院 Heart ultrasonic system based on multiple percutaneous ultrasonic transducers and imaging method
WO2018165448A1 (en) * 2017-03-08 2018-09-13 Obma Padraic A method for identifying human joint characteristics
JP7344122B2 (en) 2017-03-14 2023-09-13 オースアライン・インコーポレイテッド Systems and methods for measuring and balancing soft tissue
AU2018236220A1 (en) 2017-03-14 2019-09-26 OrthAlign, Inc. Hip replacement navigation systems and methods
GB2560909B (en) * 2017-03-27 2020-12-02 270 Vision Ltd Movement sensor
EP3606459A1 (en) * 2017-04-07 2020-02-12 Orthosoft Inc. Non-invasive system and method for tracking bones
US11058877B2 (en) 2017-05-30 2021-07-13 Neurometrix, Inc. Apparatus and method for the automated control of transcutaneous electrical nerve stimulation based on current and forecasted weather conditions
EP3618715A4 (en) * 2017-06-19 2021-02-17 Mohamed R. Mahfouz Surgical navigation of the hip using fluoroscopy and tracking sensors
US11000229B2 (en) * 2017-08-03 2021-05-11 Orthini, LLC Systems, methods, and apparatuses for integrating a body joint rehabilitation regimen with a wearable movement capture device operable in conjunction with a cloud based computing environment
AU2018332792A1 (en) 2017-09-14 2020-05-07 Howmedica Osteonics Corp. Non-symmetrical insert sensing system and method therefor
USD865986S1 (en) 2017-09-21 2019-11-05 Neurometrix, Inc. Transcutaneous electrical nerve stimulation device strap
GB201716123D0 (en) * 2017-10-03 2017-11-15 Virtualclinic Direct Ltd Data capture device
CN112040858A (en) 2017-10-19 2020-12-04 脸谱科技有限责任公司 System and method for identifying biological structures associated with neuromuscular source signals
AU2018366306B2 (en) 2017-11-07 2024-02-01 Djo, Llc Brace having integrated remote patient monitoring technology and method of using same
CN108030512A (en) * 2017-12-21 2018-05-15 福州大学 A kind of supersonic array measuring method of ankle arthrosis degree of injury
CN108245164B (en) * 2017-12-22 2021-03-26 北京精密机电控制设备研究所 Human body gait information acquisition and calculation method for wearable inertial device
US11110281B2 (en) 2018-01-04 2021-09-07 Cardiac Pacemakers, Inc. Secure transdermal communication with implanted device
CN108175381A (en) * 2018-01-10 2018-06-19 中山大学附属第医院 A kind of knee joint endoprosthesis surface damage detecting system and its application method
US10706693B1 (en) * 2018-01-11 2020-07-07 Facebook Technologies, Llc. Haptic device for creating vibration-, pressure-, and shear-based haptic cues
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
WO2019147928A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation Handstate reconstruction based on multiple inputs
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
CN112074870A (en) 2018-01-25 2020-12-11 脸谱科技有限责任公司 Visualization of reconstructed hand state information
US10504286B2 (en) 2018-01-25 2019-12-10 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
WO2019147958A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
WO2019152566A1 (en) * 2018-01-30 2019-08-08 The Regents Of The University Of California Systems and methods for subject specific kinematic mapping
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
CN112469469A (en) 2018-05-25 2021-03-09 脸谱科技有限责任公司 Method and apparatus for providing sub-muscular control
WO2019241701A1 (en) 2018-06-14 2019-12-19 Ctrl-Labs Corporation User identification and authentication with neuromuscular signatures
EP3810013A1 (en) * 2018-06-19 2021-04-28 Tornier, Inc. Neural network for recommendation of shoulder surgery type
CN112272537A (en) 2018-06-20 2021-01-26 科马医疗有限责任公司 Method and apparatus for knee joint surgery with inertial sensors
US11510737B2 (en) 2018-06-21 2022-11-29 Mako Surgical Corp. Patella tracking
JP2021528643A (en) * 2018-06-22 2021-10-21 イーエヌデータクト ゲーエムベーハーiNDTact GmbH Sensor placement structure, use of sensor placement structure, and how to detect solid-borne sound
US20210128247A1 (en) * 2018-06-26 2021-05-06 Australian Institute of Robotic Orthopaedics Pty Ltd Implant fit analysis
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
GB2574074B (en) 2018-07-27 2020-05-20 Mclaren Applied Tech Ltd Time synchronisation
WO2020036958A1 (en) 2018-08-13 2020-02-20 Ctrl-Labs Corporation Real-time spike detection and identification
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
EP3853698A4 (en) 2018-09-20 2021-11-17 Facebook Technologies, LLC Neuromuscular text entry, writing and drawing in augmented reality systems
WO2020069181A1 (en) 2018-09-26 2020-04-02 Ctrl-Labs Corporation Neuromuscular control of physical objects in an environment
CN112822992A (en) 2018-10-05 2021-05-18 脸谱科技有限责任公司 Providing enhanced interaction with physical objects using neuromuscular signals in augmented reality environments
JP7132816B2 (en) * 2018-10-10 2022-09-07 大和ハウス工業株式会社 Joint condition determination system
US11510035B2 (en) 2018-11-07 2022-11-22 Kyle Craig Wearable device for measuring body kinetics
CN109859592B (en) * 2018-11-14 2020-12-08 华中科技大学 Soft tissue injury simulation test device
US11883661B2 (en) 2018-12-07 2024-01-30 Neurometrix, Inc. Intelligent determination of therapeutic stimulation intensity for transcutaneous electrical nerve stimulation
KR102550854B1 (en) * 2018-12-13 2023-07-04 삼성전자주식회사 Method and device for assisting walking
CN111374674B (en) * 2018-12-29 2023-02-10 西安思博探声生物科技有限公司 Knee joint movement information processing equipment
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
WO2020187753A1 (en) * 2019-03-20 2020-09-24 Optima Molliter Srl Orthosis brace with monitoring system
US11537702B2 (en) * 2019-05-13 2022-12-27 Cardiac Pacemakers, Inc. Implanted medical device authentication based on comparison of internal IMU signal to external IMU signal
US11911213B2 (en) 2019-06-03 2024-02-27 General Electric Company Techniques for determining ultrasound probe motion
US11497452B2 (en) * 2019-06-20 2022-11-15 The Hong Kong Polytechnic University Predictive knee joint loading system
KR102251925B1 (en) * 2019-07-18 2021-05-13 경상국립대학교 산학협력단 Apparatus and application for predicting musculoskeletal disorders
US11812978B2 (en) 2019-10-15 2023-11-14 Orthosensor Inc. Knee balancing system using patient specific instruments
GB2588236B (en) 2019-10-18 2024-03-20 Mclaren Applied Ltd Gyroscope bias estimation
GB2588237B (en) * 2019-10-18 2023-12-27 Mclaren Applied Ltd Joint axis direction estimation
US10842415B1 (en) * 2019-10-25 2020-11-24 Plethy, Inc. Devices, systems, and methods for monitoring and assessing gait, stability, and/or balance of a user
WO2021090921A1 (en) * 2019-11-08 2021-05-14 国立大学法人大阪大学 System, program, and method for measuring jaw movement of subject
CN110772262B (en) * 2019-12-05 2020-12-29 广东电网有限责任公司 Comfort evaluation method for human body tower-climbing posture
AU2020402763A1 (en) * 2019-12-09 2022-06-16 OrthAlign, Inc. Cup alignment systems and methods
US10863928B1 (en) 2020-01-28 2020-12-15 Consensus Orthopedics, Inc. System and methods for monitoring the spine, balance, gait, or posture of a patient
AU2021261683A1 (en) * 2020-04-20 2022-12-01 Formus Labs Limited Surgical system
US11832934B1 (en) 2020-05-04 2023-12-05 Qingbin Zheng Joint monitoring
US20210378853A1 (en) * 2020-06-09 2021-12-09 National Cheng Kung University Wearable interface for intelligent health promotion service system
US20230270376A1 (en) * 2020-07-07 2023-08-31 The General Hospital Corporation Evaluating the stability of a joint in the foot and ankle complex via weight-bearing medical imaging
CN113143256B (en) * 2021-01-28 2023-09-26 上海电气集团股份有限公司 Gait feature extraction method, lower limb evaluation and control method, device and medium
CN112754516B (en) * 2021-02-07 2022-04-08 河南省肿瘤医院 Intelligent bowel sound positioning and collecting device
KR20230154257A (en) * 2021-03-07 2023-11-07 리퀴드 와이어 인크. Devices, systems, and methods for monitoring and characterizing user's actions via flexible circuitry
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
IT202100017267A1 (en) * 2021-06-30 2022-12-30 Scuola Superiore Santanna POSITIONING DEVICE FOR ULTRASONIC PROBE
CN117881370A (en) * 2021-08-30 2024-04-12 西门子工业软件有限公司 Method and system for determining joints in a virtual kinematic device
DE102021124873A1 (en) * 2021-09-27 2023-03-30 Aesculap Ag Medical technology system and method
WO2023148427A1 (en) * 2022-02-03 2023-08-10 Aikoa Technologies Oy Method for training computing arrangement to provide prognosis of progression of tissue condition
WO2023235859A2 (en) * 2022-06-02 2023-12-07 New York Society For The Relief Of The Ruptured And Crippled, Maintaining The Hospital For Special Surgery Method for mechanical phenotyping of knees

Family Cites Families (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3881164A (en) * 1973-09-13 1975-04-29 Commw Of Australia Cross array ultrasonic transducer
US4016750B1 (en) * 1975-11-06 1994-04-05 Stanford Research Inst Ultrasonic imaging method and apparatus
US5488952A (en) * 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
US4476873A (en) * 1982-09-03 1984-10-16 Medtronic, Inc. Ultrasound scanning system for skeletal imaging
FR2694881B1 (en) * 1992-07-31 1996-09-06 Univ Joseph Fourier METHOD FOR DETERMINING THE POSITION OF AN ORGAN.
US6005916A (en) * 1992-10-14 1999-12-21 Techniscan, Inc. Apparatus and method for imaging with wavefields using inverse scattering techniques
US5413116A (en) * 1993-06-24 1995-05-09 Bioresearch Method and apparatus for diagnosing joints
US5394875A (en) * 1993-10-21 1995-03-07 Lewis; Judith T. Automatic ultrasonic localization of targets implanted in a portion of the anatomy
US5826578A (en) * 1994-05-26 1998-10-27 Curchod; Donald B. Motion measurement apparatus
US5919149A (en) * 1996-03-19 1999-07-06 Allum; John H. Method and apparatus for angular position and velocity based determination of body sway for the diagnosis and rehabilitation of balance and gait disorders
US5806521A (en) * 1996-03-26 1998-09-15 Sandia Corporation Composite ultrasound imaging apparatus and method
US5771310A (en) * 1996-12-30 1998-06-23 Shriners Hospitals For Children Method and apparatus for recording three-dimensional topographies
US6213958B1 (en) 1996-08-29 2001-04-10 Alan A. Winder Method and apparatus for the acoustic emission monitoring detection, localization, and classification of metabolic bone disease
US7468075B2 (en) * 2001-05-25 2008-12-23 Conformis, Inc. Methods and compositions for articular repair
US6205411B1 (en) * 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
JPH10243937A (en) * 1997-03-07 1998-09-14 Mitsubishi Heavy Ind Ltd Joint labile measurement device
US6120453A (en) * 1997-11-17 2000-09-19 Sharp; William A. Three-dimensional ultrasound system based on the coordination of multiple ultrasonic transducers
US6231585B1 (en) 1997-11-20 2001-05-15 Medivas, Llc Device for stabilizing a treatment site and method of use
AU3102199A (en) * 1998-03-20 1999-10-11 Barbara Ann Karmanos Cancer Institute Multidimensional detection and characterization of pathologic tissues
US6280387B1 (en) * 1998-05-06 2001-08-28 Siemens Medical Systems, Inc. Three-dimensional tissue/flow ultrasound imaging system
US7184814B2 (en) 1998-09-14 2007-02-27 The Board Of Trustees Of The Leland Stanford Junior University Assessing the condition of a joint and assessing cartilage loss
US7239908B1 (en) * 1998-09-14 2007-07-03 The Board Of Trustees Of The Leland Stanford Junior University Assessing the condition of a joint and devising treatment
EP0991015B1 (en) * 1998-09-29 2004-12-01 Koninklijke Philips Electronics N.V. Method for processing ultrasonic medical images of bone structures, and an apparatus for computer assisted surgery
JP2000251078A (en) * 1998-12-22 2000-09-14 Atr Media Integration & Communications Res Lab Method and device for estimating three-dimensional posture of person, and method and device for estimating position of elbow of person
US6106464A (en) * 1999-02-22 2000-08-22 Vanderbilt University Apparatus and method for bone surface-based registration of physical space with tomographic images and for guiding an instrument relative to anatomical sites in the image
JP4636696B2 (en) * 1999-04-20 2011-02-23 アーオー テクノロジー アクチエンゲゼルシャフト Device for percutaneous acquisition of 3D coordinates on the surface of a human or animal organ
WO2001032114A1 (en) 1999-11-02 2001-05-10 Wizcare Ltd. Skin-gripper
ES2279799T3 (en) * 2000-01-29 2007-09-01 Paul E. Thomson DETECTION AND QUANTIFICATION OF INFLAMMATION OF ARTICULATIONS AND FABRIC.
US6904123B2 (en) * 2000-08-29 2005-06-07 Imaging Therapeutics, Inc. Methods and devices for quantitative analysis of x-ray images
US6537233B1 (en) * 2000-11-06 2003-03-25 University Technologies International Inc. Auditory display of knee joint vibration signals
US6561991B2 (en) * 2000-12-19 2003-05-13 The Research Foundation Of The State University Of New York (Suny) Non-invasive method and system of quantifying human postural stability
CA2333224A1 (en) * 2001-01-31 2002-07-31 University Technologies International Inc. Non-invasive diagnostic method and apparatus for musculoskeletal systems
WO2003079882A2 (en) 2002-03-25 2003-10-02 Ramot At Tel Aviv University Ltd. Method and system for determining a risk of ulcer onset
US7117026B2 (en) * 2002-06-12 2006-10-03 Koninklijke Philips Electronics N.V. Physiological model based non-rigid image registration
US7981057B2 (en) * 2002-10-11 2011-07-19 Northrop Grumman Guidance And Electronics Company, Inc. Joint motion sensing to make a determination of a positional change of an individual
ATE484231T1 (en) * 2003-01-07 2010-10-15 Imaging Therapeutics Inc DEVICE FOR PREDICTING MUSCLE/SKELETAL DISEASES
US7660623B2 (en) * 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
JP2004264060A (en) * 2003-02-14 2004-09-24 Akebono Brake Ind Co Ltd Error correction method in attitude detector, and action measuring instrument using the same
JP3932360B2 (en) * 2003-03-04 2007-06-20 独立行政法人産業技術総合研究所 Landmark extraction apparatus and landmark extraction method
US20050043660A1 (en) * 2003-03-31 2005-02-24 Izex Technologies, Inc. Orthoses
WO2005007217A2 (en) * 2003-07-10 2005-01-27 Neurocom International, Inc. Apparatus and method for characterizing contributions of forces associated with a body part of a subject
US7454242B2 (en) * 2003-09-17 2008-11-18 Elise Fear Tissue sensing adaptive radar imaging for breast tumor detection
US20050093859A1 (en) 2003-11-04 2005-05-05 Siemens Medical Solutions Usa, Inc. Viewing direction dependent acquisition or processing for 3D ultrasound imaging
US8265728B2 (en) * 2003-11-26 2012-09-11 University Of Chicago Automated method and system for the evaluation of disease and registration accuracy in the subtraction of temporally sequential medical images
EP1722705A2 (en) * 2004-03-10 2006-11-22 Depuy International Limited Orthopaedic operating systems, methods, implants and instruments
JP4455118B2 (en) * 2004-03-30 2010-04-21 独立行政法人科学技術振興機構 Delivery diagnosis support program, recording medium storing the program, and delivery diagnosis support method and apparatus.
US7678052B2 (en) 2004-04-13 2010-03-16 General Electric Company Method and apparatus for detecting anatomic structures
US7483732B2 (en) 2004-04-15 2009-01-27 Boston Scientific Scimed, Inc. Magnetic resonance imaging of a medical device and proximate body tissue
JP4411384B2 (en) * 2004-04-15 2010-02-10 独立行政法人放射線医学総合研究所 Diagnostic system
DE102004026525A1 (en) * 2004-05-25 2005-12-22 Aesculap Ag & Co. Kg Method and device for the non-invasive determination of prominent structures of the human or animal body
US20060052727A1 (en) 2004-09-09 2006-03-09 Laurence Palestrant Activity monitoring device and weight management method utilizing same
CN100573589C (en) * 2004-09-09 2009-12-23 皇家飞利浦电子股份有限公司 The system that is used for the three-dimensional imaging of movable joint
US20060161052A1 (en) * 2004-12-08 2006-07-20 Perception Raisonnement Action En Medecine Computer assisted orthopaedic surgery system for ligament graft reconstruction
US20060245627A1 (en) * 2005-02-08 2006-11-02 Kouki Nagamune Noninvasive dynamic analysis system and method of use thereof
GB0504172D0 (en) * 2005-03-01 2005-04-06 King S College London Surgical planning
JP4304341B2 (en) * 2005-03-17 2009-07-29 国立大学法人 新潟大学 Three-dimensional shape measurement device and socket design device for prosthetic limbs based on the measurement data
US20100100011A1 (en) 2008-10-22 2010-04-22 Martin Roche System and Method for Orthopedic Alignment and Measurement
KR101258912B1 (en) 2005-06-06 2013-04-30 인튜어티브 서지컬 인코포레이티드 Laparoscopic ultrasound robotic surgical system
CA2616700A1 (en) 2005-08-09 2007-02-15 Gil Zwirn High resolution radio frequency medical imaging and therapy system
US8092398B2 (en) * 2005-08-09 2012-01-10 Massachusetts Eye & Ear Infirmary Multi-axis tilt estimation and fall remediation
GB2435614A (en) 2006-03-01 2007-09-05 Samuel George Transducer holder for maintaining signal-receiving contact with a patient's body
US7949386B2 (en) * 2006-03-21 2011-05-24 A2 Surgical Computer-aided osteoplasty surgery system
US8165659B2 (en) * 2006-03-22 2012-04-24 Garrett Sheffer Modeling method and apparatus for use in surgical navigation
US8676293B2 (en) * 2006-04-13 2014-03-18 Aecc Enterprises Ltd. Devices, systems and methods for measuring and evaluating the motion and function of joint structures and associated muscles, determining suitability for orthopedic intervention, and evaluating efficacy of orthopedic intervention
US20080009722A1 (en) 2006-05-11 2008-01-10 Constantine Simopoulos Multi-planar reconstruction for ultrasound volume data
US7578799B2 (en) * 2006-06-30 2009-08-25 Ossur Hf Intelligent orthosis
US7769422B2 (en) * 2006-09-29 2010-08-03 Depuy Products, Inc. Apparatus and method for monitoring the position of an orthopaedic prosthesis
WO2008074151A1 (en) * 2006-12-20 2008-06-26 Mcmaster University System and method of assessing the condition of a joint
US20080194997A1 (en) * 2007-02-08 2008-08-14 Rehabilitation Institute Of Chicago System and method for diagnosing and treating patellar maltracking and malalignment
US20080221487A1 (en) * 2007-03-07 2008-09-11 Motek Bv Method for real time interactive visualization of muscle forces and joint torques in the human body
EP1970005B1 (en) * 2007-03-15 2012-10-03 Xsens Holding B.V. A system and a method for motion tracking using a calibration unit
US7920731B2 (en) 2007-03-27 2011-04-05 Siemens Medical Solutions Usa, Inc. Bleeding detection using a blanket ultrasound device
US8444651B2 (en) * 2007-05-14 2013-05-21 Queen's University At Kingston Patient-specific surgical guidance tool and method of use
US8089417B2 (en) * 2007-06-01 2012-01-03 The Royal Institution For The Advancement Of Learning/Mcgill University Microwave scanning system and miniaturized microwave antenna
US8771188B2 (en) * 2007-06-20 2014-07-08 Perception Raisonnement Action En Medecine Ultrasonic bone motion tracking system
JP5061281B2 (en) * 2007-08-20 2012-10-31 Hiroshima University Knee joint rotation angle measuring device
EP2194836B1 (en) * 2007-09-25 2015-11-04 Perception Raisonnement Action En Medecine Apparatus for assisting cartilage diagnostic and therapeutic procedures
JP5416900B2 (en) 2007-11-22 2014-02-12 Toshiba Corporation Ultrasonic diagnostic apparatus and puncture support control program
WO2009117832A1 (en) * 2008-03-25 2009-10-01 Orthosoft Inc. Tracking system and method
US8377073B2 (en) * 2008-04-21 2013-02-19 Ray Wasielewski Method of designing orthopedic implants using in vivo data
US8456236B2 (en) 2008-05-19 2013-06-04 Hittite Microwave Norway As Multiple input variable gain amplifier
US20100125229A1 (en) * 2008-07-11 2010-05-20 University Of Delaware Controllable Joint Brace
US8444564B2 (en) 2009-02-02 2013-05-21 Jointvue, Llc Noninvasive diagnostic system
JP5377166B2 (en) 2009-09-01 2013-12-25 Furuno Electric Co., Ltd. Ultrasound bone analyzer
US20110125016A1 (en) 2009-11-25 2011-05-26 Siemens Medical Solutions Usa, Inc. Fetal rendering in medical diagnostic ultrasound
US8979758B2 (en) 2010-06-29 2015-03-17 Orthosensor Inc Sensing module for orthopedic load sensing insert device
EP3213682B1 (en) 2010-08-02 2020-03-04 Jointvue, LLC Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US20130144135A1 (en) 2011-08-02 2013-06-06 Mohamed R. Mahfouz Method and apparatus for three dimensional reconstruction of a joint using ultrasound
WO2013025613A1 (en) 2011-08-12 2013-02-21 Jointvue, Llc 3-d ultrasound imaging device and methods
CA3194212A1 (en) 2011-10-14 2013-04-18 Jointvue, Llc Real-time 3-d ultrasound reconstruction of knee and its implications for patient specific implants and 3-d joint injections
WO2014150780A2 (en) 2013-03-15 2014-09-25 Jointvue, Llc Determination of joint condition based on vibration analysis
WO2014150961A1 (en) 2013-03-15 2014-09-25 Jointvue, Llc Motion tracking system with inertial-based sensing units

Cited By (208)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10582896B2 (en) * 2004-03-05 2020-03-10 Depuy International Limited Orthopaedic monitoring system, methods and apparatus
US20150313546A1 (en) * 2004-03-05 2015-11-05 Depuy International Limited Orthopaedic monitoring system, methods and apparatus
US11576616B2 (en) 2004-03-05 2023-02-14 Depuy International Limited Orthopaedic monitoring system, methods and apparatus
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US10426492B2 (en) 2006-02-27 2019-10-01 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US10390845B2 (en) 2006-02-27 2019-08-27 Biomet Manufacturing, Llc Patient-specific shoulder guide
US10743937B2 (en) 2006-02-27 2020-08-18 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US20110092804A1 (en) * 2006-02-27 2011-04-21 Biomet Manufacturing Corp. Patient-Specific Pre-Operative Planning
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9700329B2 (en) 2006-02-27 2017-07-11 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9539013B2 (en) 2006-02-27 2017-01-10 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US10206695B2 (en) 2006-02-27 2019-02-19 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US11534313B2 (en) 2006-02-27 2022-12-27 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9913734B2 (en) 2006-02-27 2018-03-13 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9345548B2 (en) * 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US10507029B2 (en) 2006-02-27 2019-12-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9138175B2 (en) 2006-05-19 2015-09-22 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9076212B2 (en) 2006-05-19 2015-07-07 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US10869611B2 (en) 2006-05-19 2020-12-22 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9867549B2 (en) 2006-05-19 2018-01-16 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US10206697B2 (en) 2006-06-09 2019-02-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US10893879B2 (en) 2006-06-09 2021-01-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US11576689B2 (en) 2006-06-09 2023-02-14 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9993344B2 (en) 2006-06-09 2018-06-12 Biomet Manufacturing, Llc Patient-modified implant
US9572590B2 (en) 2006-10-03 2017-02-21 Biomet Uk Limited Surgical instrument
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US11554019B2 (en) 2007-04-17 2023-01-17 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8453511B2 (en) * 2008-07-22 2013-06-04 The University Of Tokyo Ultrasonic probe support device
US20110132094A1 (en) * 2008-07-22 2011-06-09 The University Of Tokyo Ultrasonic probe support device
US20130217998A1 (en) * 2009-02-02 2013-08-22 Jointvue, Llc Motion Tracking System with Inertial-Based Sensing Units
US9642572B2 (en) * 2009-02-02 2017-05-09 Joint Vue, LLC Motion tracking system with inertial-based sensing units
US9468538B2 (en) 2009-03-24 2016-10-18 Biomet Manufacturing, Llc Method and apparatus for aligning and securing an implant relative to a patient
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9839433B2 (en) 2009-08-13 2017-12-12 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US10052110B2 (en) 2009-08-13 2018-08-21 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US11324522B2 (en) 2009-10-01 2022-05-10 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US10893876B2 (en) 2010-03-05 2021-01-19 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8821417B2 (en) * 2010-06-22 2014-09-02 Stephen J. McGregor Method of monitoring human body movement
US20130110011A1 (en) * 2010-06-22 2013-05-02 Stephen J. McGregor Method of monitoring human body movement
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US10098648B2 (en) 2010-09-29 2018-10-16 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US11234719B2 (en) 2010-11-03 2022-02-01 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US20190014174A1 (en) * 2010-12-17 2019-01-10 Amazon Technologies, Inc. Personal Remote Storage for Purchased Electronic Content Items
US10931754B2 (en) * 2010-12-17 2021-02-23 Amazon Technologies, Inc. Personal remote storage for purchased electronic content items
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US9743935B2 (en) 2011-03-07 2017-08-29 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
US10251690B2 (en) 2011-04-19 2019-04-09 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9743940B2 (en) 2011-04-29 2017-08-29 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US9757238B2 (en) 2011-06-06 2017-09-12 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9687261B2 (en) 2011-06-13 2017-06-27 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9668747B2 (en) 2011-07-01 2017-06-06 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US11253269B2 (en) 2011-07-01 2022-02-22 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US10663553B2 (en) 2011-08-26 2020-05-26 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9603613B2 (en) 2011-08-31 2017-03-28 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9439659B2 (en) 2011-08-31 2016-09-13 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US10456205B2 (en) 2011-09-29 2019-10-29 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US11406398B2 (en) 2011-09-29 2022-08-09 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US10426549B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US10842510B2 (en) 2011-10-27 2020-11-24 Biomet Manufacturing, Llc Patient specific glenoid guide
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US11298188B2 (en) 2011-10-27 2022-04-12 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9936962B2 (en) 2011-10-27 2018-04-10 Biomet Manufacturing, Llc Patient specific glenoid guide
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US11602360B2 (en) 2011-10-27 2023-03-14 Biomet Manufacturing, Llc Patient specific glenoid guide
US10426493B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US11957516B2 (en) 2011-10-28 2024-04-16 Decision Sciences International Corporation Spread spectrum coded waveforms in ultrasound diagnostics
US11596388B2 (en) 2011-10-28 2023-03-07 Decision Sciences International Corporation Spread spectrum coded waveforms in ultrasound diagnostics
US10993699B2 (en) 2011-10-28 2021-05-04 Decision Sciences International Corporation Spread spectrum coded waveforms in ultrasound diagnostics
US20130185310A1 (en) * 2012-01-16 2013-07-18 Emovi Inc. Method and system for human joint treatment plan and personalized surgery planning using 3-d kinematics, fusion imaging and simulation
US9286355B2 (en) * 2012-01-16 2016-03-15 Emovi Inc. Method and system for human joint treatment plan and personalized surgery planning using 3-D kinematics, fusion imaging and simulation
US9827106B2 (en) 2012-02-02 2017-11-28 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US20200022641A1 (en) * 2012-05-17 2020-01-23 Alan N. Schwartz Localization Of The Parathyroid
US20200107770A1 (en) * 2012-05-17 2020-04-09 Alan N. Schwartz Localization of the parathyroid
US20140159959A1 (en) * 2012-07-11 2014-06-12 Digimarc Corporation Body-worn phased-array antenna
US9564682B2 (en) * 2012-07-11 2017-02-07 Digimarc Corporation Body-worn phased-array antenna
US11064423B2 (en) 2012-10-22 2021-07-13 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
US10631231B2 (en) * 2012-10-22 2020-04-21 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
US11825401B2 (en) 2012-10-22 2023-11-21 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
US10070627B2 (en) 2012-11-21 2018-09-11 i4c Innovations Inc. Animal health and wellness monitoring using UWB radar
US11317608B2 (en) 2012-11-21 2022-05-03 i4c Innovations Inc. Animal health and wellness monitoring using UWB radar
US9526437B2 (en) 2012-11-21 2016-12-27 i4c Innovations Inc. Animal health and wellness monitoring using UWB radar
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9597201B2 (en) 2012-12-11 2017-03-21 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9607377B2 (en) 2013-01-24 2017-03-28 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9779502B1 (en) 2013-01-24 2017-10-03 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10339654B2 (en) 2013-01-24 2019-07-02 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US10653381B2 (en) 2013-02-01 2020-05-19 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9782141B2 (en) 2013-02-01 2017-10-10 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
CN104968280A (en) * 2013-02-11 2015-10-07 Koninklijke Philips N.V. Ultrasound imaging system and method
WO2014122544A1 (en) * 2013-02-11 2014-08-14 Koninklijke Philips N.V. Ultrasound imaging system and method
US20160015319A1 (en) * 2013-03-07 2016-01-21 The Regents Of The University Of California System for health monitoring on prosthetic and fixation devices
US11712201B2 (en) 2013-03-07 2023-08-01 The Regents Of The University Of California System for health monitoring on prosthetic and fixation devices
US11617591B2 (en) 2013-03-11 2023-04-04 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US10441298B2 (en) 2013-03-11 2019-10-15 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9700325B2 (en) 2013-03-12 2017-07-11 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US10426491B2 (en) 2013-03-13 2019-10-01 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US11191549B2 (en) 2013-03-13 2021-12-07 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US11862348B2 (en) * 2013-03-13 2024-01-02 Blue Belt Technologies, Inc. Systems and methods for using generic anatomy models in surgical planning
US10376270B2 (en) 2013-03-13 2019-08-13 Biomet Manufacturing, Llc Universal acetabular guide and associated hardware
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
EP2967440A4 (en) * 2013-03-15 2016-10-12 Jointvue Llc Determination of joint condition based on vibration analysis
US10149617B2 (en) 2013-03-15 2018-12-11 i4c Innovations Inc. Multiple sensors for monitoring health and wellness of an animal
EP3552538A1 (en) 2013-03-15 2019-10-16 Joint Vue, LLC Motion tracking system with inertial-based sensing units
US11096661B2 (en) 2013-09-13 2021-08-24 Decision Sciences International Corporation Coherent spread-spectrum coded waveforms in synthetic aperture image formation
US11607192B2 (en) 2013-09-13 2023-03-21 Decision Sciences International Corporation Coherent spread-spectrum coded waveforms in synthetic aperture image formation
US20160291860A1 (en) * 2013-10-08 2016-10-06 Sony Computer Entertainment Inc. Information processing device
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
AU2019202571B2 (en) * 2013-12-09 2020-10-01 Mohamed R. Mahfouz Bone reconstruction and orthopedic implants
CN106170705A (en) * 2013-12-09 2016-11-30 Mohamed R. Mahfouz Bone reconstruction and orthopedic implants
US11813049B2 (en) 2013-12-09 2023-11-14 Techmah Medical Llc Bone reconstruction and orthopedic implants
EP3080619A4 (en) * 2013-12-09 2017-08-30 Mohamed R. Mahfouz Bone reconstruction and orthopedic implants
EP3800476A1 (en) * 2013-12-09 2021-04-07 Mohamed R. Mahfouz Surgical navigation system
AU2014363945B2 (en) * 2013-12-09 2019-04-04 Techmah Medical Llc Bone reconstruction and orthopedic implants
US10004462B2 (en) 2014-03-24 2018-06-26 Kineticor, Inc. Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
EP3166487A4 (en) * 2014-07-10 2018-04-11 Mohamed R. Mahfouz Bone reconstruction and orthopedic implants
EP3747388A1 (en) * 2014-07-10 2020-12-09 Mohamed R. Mahfouz Surgical navigation
US10575955B2 (en) 2014-07-10 2020-03-03 Mohamed R. Mahfouz Hybrid surgical tracking system
WO2016007936A3 (en) * 2014-07-10 2016-03-17 Mahfouz Mohamed R Bone reconstruction and orthopedic implants
US9734589B2 (en) 2014-07-23 2017-08-15 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10438349B2 (en) 2014-07-23 2019-10-08 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US11100636B2 (en) 2014-07-23 2021-08-24 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US11026699B2 (en) 2014-09-29 2021-06-08 Biomet Manufacturing, Llc Tibial tubercule osteotomy
US10335162B2 (en) 2014-09-29 2019-07-02 Biomet Sports Medicine, Llc Tibial tubercle osteotomy
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US20160234369A1 (en) * 2015-02-06 2016-08-11 Samsung Electronics Co., Ltd. Multi-purpose device including mobile terminal and sensing device using radio-wave based sensor module
US10887440B2 (en) * 2015-02-06 2021-01-05 Samsung Electronics Co., Ltd. Multi-purpose device including mobile terminal and sensing device using radio-wave based sensor module
US10425519B2 (en) * 2015-02-06 2019-09-24 Samsung Electronics Co., Ltd. Multi-purpose device including mobile terminal and sensing device using radio-wave based sensor module
US11191521B2 (en) 2015-02-25 2021-12-07 Decision Sciences Medical Company, LLC Acoustic signal transmission couplants and coupling mediums
US11839512B2 (en) 2015-02-25 2023-12-12 Decision Sciences Medical Company, LLC Acoustic signal transmission couplants and coupling mediums
US10743838B2 (en) 2015-02-25 2020-08-18 Decision Sciences Medical Company, LLC Acoustic signal transmission couplants and coupling mediums
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
EP3302243A4 (en) * 2015-05-27 2019-01-09 Georgia Tech Research Corporation Wearable technologies for joint health assessment
WO2016191753A1 (en) 2015-05-27 2016-12-01 Georgia Tech Research Corporation Wearable technologies for joint health assessment
US11039782B2 (en) 2015-05-27 2021-06-22 Georgia Tech Research Corporation Wearable technologies for joint health assessment
WO2016191813A1 (en) * 2015-06-01 2016-12-08 Latey Penelope Jane Foot muscle biofeedback unit
US10925622B2 (en) 2015-06-25 2021-02-23 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US11801064B2 (en) 2015-06-25 2023-10-31 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10660541B2 (en) 2015-07-28 2020-05-26 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10426429B2 (en) 2015-10-08 2019-10-01 Decision Sciences Medical Company, LLC Acoustic orthopedic tracking system and methods
EP3359048A4 (en) * 2015-10-08 2019-07-17 Decision Sciences Medical Company, LLC Acoustic orthopedic tracking system and methods
US11737726B2 (en) 2015-10-08 2023-08-29 Decision Sciences Medical Company, LLC Acoustic orthopedic tracking system and methods
EP3171286A1 (en) * 2015-11-17 2017-05-24 Universitat de València, Estudi General Methods for determining an identifier for use in methods for diagnosing haemophilic arthropathy, methods and apparatus for diagnosing
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US20170312099A1 (en) * 2016-04-28 2017-11-02 medFit Beratungs-und Beteiligungsges.m.B.H. Dynamic Ligament Balancing System
US20200375760A1 (en) * 2016-04-28 2020-12-03 Mit Entwicklungs Gmbh Dynamic ligament balancing system
US10722385B2 (en) * 2016-04-28 2020-07-28 medFit Beratungs-und Beteiligungsges.m.B.H. Dynamic ligament balancing system
WO2018162808A1 (en) * 2017-01-10 2018-09-13 Braindex S.A.S Physiological sensor for near-infrared spectroscopy at various depths
US11311194B2 (en) * 2017-01-10 2022-04-26 Braindex S.A.S Physiological sensor for near-infrared spectroscopy at different depths
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
WO2018213749A1 (en) * 2017-05-18 2018-11-22 Smith & Nephew, Inc. Systems and methods for determining the position and orientation of an implant for joint replacement surgery
US11547353B2 (en) 2017-11-30 2023-01-10 Bbi Medical Innovations, Llc Implant evaluation using acoustic emissions
US10918333B2 (en) 2017-11-30 2021-02-16 Bruin Biometrics, Llc Implant evaluation using acoustic emissions
EP3880074A4 (en) * 2018-11-14 2022-07-27 Precix Pte Ltd Method and device for measuring anatomical movement of a joint
CN113556974A (en) * 2018-11-14 2021-10-26 Precix Pte Ltd Method and device for measuring anatomical movement of a joint
WO2020101569A1 (en) 2018-11-14 2020-05-22 Precision Medical Pte Ltd Method and device for measuring anatomical movement of a joint
CN109801278A (en) * 2019-01-21 2019-05-24 Yanshan University Surface damage classification method for high-speed sliding electrical contact motion pairs
US11154274B2 (en) 2019-04-23 2021-10-26 Decision Sciences Medical Company, LLC Semi-rigid acoustic coupling articles for ultrasound diagnostic and treatment applications
US11877870B2 (en) 2019-08-05 2024-01-23 Consultation Semperform Inc Systems, methods and apparatus for prevention of injury
US11520043B2 (en) 2020-11-13 2022-12-06 Decision Sciences Medical Company, LLC Systems and methods for synthetic aperture ultrasound imaging of an object
US20220183591A1 (en) * 2020-12-16 2022-06-16 Polar Electro Oy Biomechanical modelling of motion measurements
WO2022266254A1 (en) * 2021-06-16 2022-12-22 Kinisi Inc Wearable imaging system for measuring bone displacement

Also Published As

Publication number Publication date
US20220215947A1 (en) 2022-07-07
US20130217998A1 (en) 2013-08-22
JP2015109972A (en) 2015-06-18
US11935648B1 (en) 2024-03-19
US11342071B2 (en) 2022-05-24
CA2977574C (en) 2019-07-23
US9642572B2 (en) 2017-05-09
US20240079125A1 (en) 2024-03-07
US20170296115A1 (en) 2017-10-19
CA3170396C (en) 2023-04-18
US11004561B2 (en) 2021-05-11
CA3049975C (en) 2022-10-11
CA3170396A1 (en) 2010-08-05
US11776686B2 (en) 2023-10-03
JP5723788B2 (en) 2015-05-27
CA3049975A1 (en) 2010-08-05
EP3968220A1 (en) 2022-03-16
CA2751422A1 (en) 2010-08-05
US20100198067A1 (en) 2010-08-05
WO2010088696A1 (en) 2010-08-05
JP2016202974A (en) 2016-12-08
EP2391971B1 (en) 2021-11-10
EP2391971A1 (en) 2011-12-07
US8444564B2 (en) 2013-05-21
US20130253379A1 (en) 2013-09-26
US20210193313A1 (en) 2021-06-24
JP6005715B2 (en) 2016-10-12
CA2751422C (en) 2017-10-17
CA3192190A1 (en) 2010-08-05
CA2977574A1 (en) 2010-08-05
JP2012516719A (en) 2012-07-26
EP2391971A4 (en) 2015-10-21
JP6404286B2 (en) 2018-10-10

Similar Documents

Publication Publication Date Title
US11935648B1 (en) Noninvasive diagnostic system
Aminian et al. Capturing human motion using body‐fixed sensors: outdoor measurement and clinical applications
US20130211259A1 (en) Determination of joint condition based on vibration analysis
US9924921B1 (en) System for mapping joint performance
CN106572821A (en) Systems and methods for measuring performance parameters related to artificial orthopedic joints
EP2967440A2 (en) Determination of joint condition based on vibration analysis
EP1458289A2 (en) Method of calibration for the representation of knee kinematics and harness for use therewith
JPWO2006085387A1 (en) Non-invasive moving body analysis system and method of use thereof
Bloomfield et al. Proposal and validation of a knee measurement system for patients with osteoarthritis
US20210153804A1 (en) Joint analysis probe
US20230377714A1 (en) Devices, systems, and methods for optimizing medical procedures and outcomes
US20220401079A1 (en) Wearable Imaging System for Measuring Bone Displacement
WO2023099936A1 (en) A wearable to assess knee joint integrity using non-contact acoustic sensors
Klets Subject-specific musculoskeletal modeling of the lower extremities in persons with unilateral cerebral palsy
McGinnis et al. Feasibility of a Novel, Conformal, Wearable Sensor System for Longitudinal Patient Monitoring

Legal Events

Date Code Title Description
AS Assignment

Owner name: JOINT VUE, LLC., OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MAHFOUZ, MOHAMED R.;KOMISTEK, RICHARD;WASIELEWSKI, RAY C.;SIGNING DATES FROM 20110804 TO 20110830;REEL/FRAME:027092/0062

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION