US20110196235A1 - Ultrasound imaging system and method for providing assistance in an ultrasound imaging system - Google Patents

Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Info

Publication number
US20110196235A1
US20110196235A1 (application US 12/988,730)
Authority
US
United States
Prior art keywords
video clip
ultrasound
ultrasound image
imaging system
demonstration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/988,730
Inventor
Allan Dunbar
Sicco Schets
Fateh Mohammed
Hiba Arbash
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eZono AG
Original Assignee
eZono AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by eZono AG
Assigned to EZONO AG reassignment EZONO AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ARBASH, HIBA, MOHAMMED, FATEH, DUNBAR, ALLAN, SHETS, SICCO
Assigned to EZONO AG reassignment EZONO AG CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF THE LAST NAME OF INVENTOR SICCO SCHETS PREVIOUSLY RECORDED ON REEL 025394 FRAME 0407. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: ARBASH, HIBA, MOHAMMED, FATEH, DUNBAR, ALLAN, Schets, Sicco
Publication of US20110196235A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B 23/286: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation

Definitions

  • the invention relates to an ultrasound imaging system according to the pre-characterizing clause of claim 1, in particular for use in medical examinations. It further relates to a method of providing assistance to the user of an ultrasound imaging system according to the pre-characterizing clause of claim 11, in particular when a medical examination is performed.
  • the U.S. Pat. No. 6,488,629 B1 discloses an ultrasound imaging system and a method for helping the user of the system to find the correct cut-plane when acquiring an ultrasound image.
  • the screen of the ultrasound scanner is divided into two parts, one of which shows the live image currently acquired and the other shows a previously recorded reference image for comparison.
  • the reference image is cycled in a loop and synchronised with the live image based on the patient's ECG signal.
  • a user interface for creating and organizing ultrasound imaging protocols.
  • the ultrasound imaging protocol guides the sonographer through each view of an examination and specifies the types of images and measurements he or she should take during the examination.
  • the user interface displays graphical representations of ultrasound images characteristic for steps of the protocol and allows for manipulation of these graphical representations in order to change the respective protocol step.
  • the revised protocol can then be saved in the system.
  • a device for real-time location and tracking in ultrasound imaging of anatomical landmarks of the heart is known.
  • a set of parameter values is generated that represent the movement of the cardiac structure over time. From these parameters, information about the position of the anatomical landmark is extracted and an indication of the position is overlaid onto the ultrasound image.
  • the invention further aims to provide an improved method of assisting the user of an ultrasound imaging system, in particular when a medical examination is performed.
  • the expression “examination procedure” does not only encompass purely diagnostic procedures but also therapeutic or surgical procedures that involve ultrasound imaging, e.g. for guidance as may be the case in the performance of a regional anaesthesia.
  • the live ultrasound image is an ultrasound image created from the current ultrasound signal received by the ultrasound scanner during the examination of the patient with the ultrasound imaging system.
  • a demonstration video clip is an animated sequence of images demonstrating the performance of a step in the examination procedure.
  • the present invention advantageously can instruct the user as to how he should perform a certain step of the procedure while at the same time allowing him or her to implement what he is shown. It is an achievable advantage of the invention that the user requires less experience in the use of the ultrasound system and in the performance of a given procedure to be able to perform this procedure and come to a diagnosis. Thus, less training may be required, thereby saving costs.
  • the ultrasound imaging system according to the invention can itself be used as a training tool both for introductory training and for continued training.
  • an experienced user may quickly learn a new procedure or a new diagnosis.
  • the user may also rehearse procedures or diagnoses he has already learned.
  • training may be achieved with less or even entirely without the involvement of an instructor, thereby significantly reducing training costs.
  • the advantage can be achieved that users who have been trained on a different system can easily get acquainted with the system according to the invention.
  • the system according to the invention may be used by less qualified personnel, thereby expanding the market for medical ultrasound imaging.
  • the technology can be made available to a greater number of patients, increasing the quality of health care provided to these patients.
  • a demonstration video clip, such as the primary demonstration video clip and the secondary demonstration video clip discussed further below, preferably shows the position, orientation, and motion of a probe of the ultrasound scanner relative to the patient's anatomy in the respective step of the examination procedure.
  • the demonstration video clip may for example be a recording of an examination that has been performed on a real subject or a phantom. Alternatively, it may be the recording of a simulation using an anatomic model.
  • the demonstration video clip may also be generated in real-time inside or outside the ultrasound imaging system, preferably by the assistance means. If the clip is pre-recorded, it may be stored in a memory, e.g. inside the ultrasound imaging system, on a separate memory device, or on a remote service facility.
  • the assistance means can in addition to the at least one primary demonstration video clip provide at least one ultrasound image video clip, which corresponds to one of the at least one primary demonstration video clips.
  • the demonstration video clip corresponds to the ultrasound image video clip in that the ultrasound image video clip represents the animated ultrasound image that results or would result from performing the procedural step as demonstrated in the demonstration video clip.
  • the video display is functionally connected with the assistance means in order to present the ultrasound image video clip simultaneously with the primary demonstration video clip and the live ultrasound image.
  • the ultrasound image video clip which corresponds to the primary demonstration video clip is presented simultaneously with the live ultrasound image and the primary demonstration video clip.
  • this embodiment of the invention can help the user to reproduce the procedural steps and interpret the live ultrasound image. It is believed, without prejudice, that this is at least partly due to the fact that the user can on one hand compare the live ultrasound image with the ultrasound image video clip, and on the other hand his own handling of the ultrasound probe (which results in the live ultrasound image) with the demonstration of the procedural step in the primary demonstration video clip (which corresponds to the ultrasound image video clip).
  • the ultrasound image video clip may be the ultrasound image that has been obtained in the examination of a real subject or a phantom or it may be the result of a simulation based on an anatomic model.
  • the assistance means comprises a synchroniser for synchronising the presentation of the ultrasound image video clip with the presentation of the demonstration video clip, so that the primary demonstration video clip runs in synchrony with the ultrasound image video clip.
  • Synchrony between a demonstration video clip and the ultrasound image video clip means that at any moment the ultrasound image video clip shows the ultrasound image that results or would result from the position and orientation of the ultrasound scanning probe in the demonstration video clip at the respective moment.
  • thus, as the probe moves in the demonstration video clip, the ultrasound image in the ultrasound image video clip changes correspondingly; the same preferably applies if the patient is moved or manipulated in a way that would affect the ultrasound image.
  • the synchronised presentation of the performance of the procedural step and the corresponding ultrasound image can make it even easier for the user to understand the relationship between his manipulation of the scanning probe and the resulting ultrasound image.
  • the user can e.g. see what motion helps in a particular procedural step to proceed from a landmark to a target feature or to identify a certain anatomic structure.
  • nerves can be distinguished from bones by the fact that the nerve ultrasound image is stronger at some incident angles than others, while the image of bones is the same at all angles. Proper manipulation of the probe and interpretation of the ultrasound image can thus be facilitated.
  • synchronisation will preferably involve ensuring that the presentation of these video clips starts synchronously on the video display. It may also involve adjusting the speed at which one or both of the video clips are presented.
  • the assistance means in addition to the primary demonstration video clip can provide at least one secondary demonstration video clip, which corresponds to one of the at least one primary demonstration video clips.
  • the video display is functionally connected with the assistance means in order to present the secondary demonstration video clip simultaneously with the primary demonstration video clip, the live ultrasound image, and, preferably, the ultrasound image video clip.
  • a corresponding secondary demonstration video clip is shown on the video display.
  • the primary demonstration video clip and the corresponding secondary demonstration video clip show different views of the same step of the examination. This embodiment of the invention can improve the comprehensibility of the demonstration of the procedural step and can make it easier for the user to understand the relationship between the motion of the probe and the resulting ultrasound image as shown in the ultrasound image video clip.
  • one demonstration video clip shows the step of the examination procedure in a perspective view.
  • one demonstration video clip shows the step of the examination procedure in a cross-sectional view.
  • one of the first and the second demonstration video clips shows the perspective view while the other shows the cross-sectional view.
  • the perspective view can be semi-transparent so that the user sees not only the patient's skin but also the underlying tissue, including bones, blood vessels, nerves and other organs.
  • the cross-sectional plane essentially coincides with the scanning plane of the ultrasound image video clip. Hence, if the scanning plane changes over the course of the ultrasound image video clip, so does the cross-sectional plane of the cross-sectional view. Due to the provision of both a perspective view and a cross-sectional view together with the ultrasound image video clip, the user can more easily move his or her anatomical perception from the 3-dimensional space in which the probe is manipulated to the 2-dimensional imaging slice.
  • the synchroniser synchronises the presentation of the secondary demonstration video clip with the presentations of the primary demonstration video clip and, preferably, the ultrasound image video clip.
  • the secondary demonstration video clip runs in synchrony with the primary demonstration video clip and the ultrasound image video clip. This can further help the user to understand the relationship between the motion of the probe and the resulting ultrasound image as shown in the ultrasound image video clip.
  • the imaging slice from which the ultrasound image of the corresponding ultrasound image video clip results or would result is indicated. This can assist the user in interpreting the ultrasound image video clip.
  • the imaging slice may for example be illustrated in the form of a quadrangular frame or a shaded area adjacent to the representation of the probe.
  • text is displayed on the video display, e.g. in one or more text fields.
  • the text may e.g. explain in more detail how to handle the probe of the ultrasound scanner or how to interpret the ultrasound image and derive a diagnosis.
  • the text field(s) may be adjacent to one or more of the locations on the video display where the demonstration video clips or the ultrasound image video clips are presented.
  • the scanner's imaging settings are adjusted by providing the scanner with a set of image parameters.
  • the set of image parameters preferably includes at least one of a group of parameters comprising imaging depth, number of focal zones, and imaging mode. Examples for imaging modes are B-mode, C-mode, M-mode, PW-Doppler-mode and CW-Doppler-mode.
  • the assistance means can provide at least one set of image parameter values corresponding to at least one of the demonstration video clips and/or ultrasound image video clips. More preferably, the assistance means is functionally connected with the ultrasound scanner to adjust the ultrasound scanner's imaging settings according to the set of image parameter values that corresponds to the demonstration video clip and/or ultrasound image video clip to be presented on the video display.
  • This embodiment of the invention can help users who are inexperienced or unfamiliar with the image parameters and their effect on the image quality to obtain a good image. Moreover, by reducing the amount of image manipulation which the user has to perform, it allows users to concentrate on performing the procedure and obtaining the diagnosis rather than dealing with the technical details of the image generation.
  • the user may thus save valuable time that he or she would otherwise require to adjust the settings of the ultrasound scanner.
  • this embodiment of the invention may help to avoid sub-optimal images that result when, in order to save time, the user always leaves the parameters at some typical values.
  • the set of image parameter values that corresponds to the ultrasound image video clip is chosen so that the live ultrasound image obtained is similar to the image shown in the ultrasound image video clip.
  • the image parameters are essentially those that result or would result in the image shown in the ultrasound image video clip.
  • the assistance means can provide at least one procedure demonstration set comprising several primary demonstration video clips, each demonstrating a different step in the examination procedure.
  • the assistance means is functionally connected with the video display to provide the primary demonstration video clips for presentation on the video display.
  • the procedure demonstration set preferably comprises several ultrasound image video clips, each clip corresponding to a different one of the primary demonstration video clips, and the assistance means is functionally connected with the video display in order to present the ultrasound image video clips simultaneously with the primary demonstration video clips in the same order.
  • the synchroniser can synchronise the presentation of each ultrasound image video clip with the presentation of its corresponding primary demonstration video clip.
  • a preferred procedure demonstration set comprises several secondary demonstration video clips, each corresponding to a different one of the primary demonstration video clips, and the assistance means is functionally connected with the video display in order to present the secondary demonstration video clips simultaneously with the primary demonstration video clips in the same order.
  • the synchroniser can synchronise the presentation of each secondary demonstration video clip with the presentation of its corresponding primary demonstration video clip.
  • the present invention also encompasses embodiments of the invention which differ from the above embodiments contemplating procedure demonstration sets of several video clips in that some of the primary demonstration video clips are replaced by demonstration still images. Similarly, some or all of the corresponding ultrasound video clips and/or the secondary demonstration video clips may be replaced by ultrasound still images or demonstration still images, respectively.
  • These alternative embodiments of the invention recognize the fact that some of the procedural steps may not involve motions relevant to the demonstration of the procedural step and may therefore be represented by still images.
  • These embodiments of the invention may advantageously save storage and computing resources.
  • if a primary demonstration video clip is replaced by a still image, the corresponding ultrasound image video clip, if any, is replaced by a corresponding still image and the corresponding secondary demonstration video clip, if any, is equally replaced by a still image.
  • the preferred procedure demonstration set will comprise one primary demonstration video clip or still image for each step of the examination procedure.
  • a procedure generally comprises at least the steps of (1) positioning the patient and finding a first landmark detectable with the ultrasound scanner, and (2) navigating with the scanner's probe from the landmark to the target. Further steps may e.g. include (3) studying the target and (4) inserting a needle.
  • the primary demonstration video clip or still image presented on the video screen changes, at an appropriate moment, to another primary demonstration video clip or still image in which the performance of another step in the examination procedure is demonstrated.
  • the ultrasound image video clip or still image preferably changes to another ultrasound image video clip or still image which corresponds to the other primary demonstration video clip or still image and/or the secondary demonstration video clip or still image changes to another secondary demonstration video clip or still image which corresponds to the other primary demonstration video clip or still image.
  • the primary demonstration video clip changes synchronously with the ultrasound image video clip and/or the secondary demonstration video clip.
  • the ultrasound imaging system comprises a step control that is functionally connected with the assistance means in order to provide, upon operation of the step control by the user of the ultrasound system, another primary demonstration video clip or still image (and, if any, the corresponding ultrasound image video clip or still image and/or secondary demonstration video clip or still image) for presentation on the video display.
  • the step control may e.g. comprise several virtual buttons on the video display that the user can operate to choose the step he wishes to view next.
  • the primary demonstration video clip presented on the video screen may change to a subsequent primary demonstration video clip or still image in which the performance of a subsequent, preferably the next, step in the examination procedure is demonstrated.
  • upon operation of the step control by the user, the assistance means also provides the ultrasound image video clip or still image and/or the secondary demonstration video clip corresponding to the respective primary demonstration video clip or still image.
  • the assistance means can provide several procedure demonstration sets, each set demonstrating steps of a different examination procedure.
  • the ultrasound imaging system comprises a procedure control that is functionally connected with the assistance means in order to select, upon operation of the procedure control by the user of the ultrasound imaging system, from the several procedure demonstration sets the set to be provided for presentation on the video display.
  • the user can select one of several procedures and then go through the steps of the procedure as described above.
  • a functional connection between components of the ultrasound imaging system preferably is a communication between these components, involving an exchange of data, either in one direction or in both directions.
  • such communication may simply involve the exchange of one or more parameter values from one software component, e.g. a software component which controls a processing unit to implement the step control, to another software component, e.g. a software component that controls the same processing unit to implement the assistance means. It may, however, also involve the exchange of one or more electric, magnetic, microwave, or optical signals from one hardware component, e.g. the hardware component controlled by the assistance-means software, to another hardware component, e.g. the video-display hardware.
  • FIG. 1 shows a block diagram of an ultrasound imaging system according to the present invention
  • FIG. 2 shows a screen of a user interface presented on the video display of an ultrasound imaging system according to the invention, the screen comprising several controls and a live ultrasound image;
  • FIG. 3 shows another screen of the user interface presented on the video display, which screen comprises a procedure control
  • FIG. 4 shows yet another screen of the user interface presented on the video display, which screen comprises a step control and presentations of the first demonstration video clip, the second demonstration video clip, the ultrasound image video clip and the live ultrasound image.
  • An embodiment of the ultrasound imaging system 1 according to the invention is illustrated in FIG. 1 by means of a simplified block diagram.
  • the system comprises an ultrasound scanner 2 with a probe 3, transmit electronics 4, receive electronics 5, and a control unit 6.
  • a user of the ultrasound imaging system 1 can bring the probe 3 into contact with a patient in order to obtain a live ultrasound image 7 of an imaging slice defined by the position and orientation of the probe 3.
  • the probe 3 contains transducers for generating ultrasound signals and receiving reflections of these ultrasound signals from the body of the patient.
  • the transducers for ultrasound generation and reception can be provided by an array of piezo elements.
  • opto-electrical ultrasound sensors may be used for the reception of the reflected ultrasound signals.
  • the probe 3 is connected via a flexible cable with the rest of the ultrasound system 1 so that it can be easily manipulated by the user during the examination.
  • the transmit electronics 4 comprise multiple channels that are connected directly or via one or more multiplexers to the ultrasound generating elements of the probe 3, for example piezo elements. Electric pulses are generated in each individual channel, and the relative timing of the pulses can be varied accurately enough to perform transmit focusing in the lateral direction at different depths.
  • the transmit electronics are implemented as a mixed-signal electronics board or an ASIC, and include high-voltage pulsers for generating the electric pulses.
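  • As an illustration of the transmit focusing just described, the following Python sketch computes per-channel firing delays so that the pulses from all elements arrive at an on-axis focal point simultaneously. The element pitch, channel count and speed of sound are assumed example values, not taken from the patent.

```python
# Hypothetical sketch: per-channel transmit delays for lateral focusing at a
# chosen depth, as performed by the multi-channel transmit electronics (4).
# Element pitch, speed of sound and focal depth are assumed example values.
import math

SPEED_OF_SOUND_M_S = 1540.0     # nominal speed of sound in soft tissue
ELEMENT_PITCH_M = 0.3e-3        # assumed spacing between piezo elements
NUM_CHANNELS = 64               # assumed number of transmit channels


def transmit_delays_s(focal_depth_m: float) -> list[float]:
    """Delay for each channel so all pulses arrive at the focus together.

    Channels near the centre of the aperture fire last because their path
    to the focal point is shortest.
    """
    centre = (NUM_CHANNELS - 1) / 2.0
    # Geometric path length from each element to the on-axis focal point.
    paths = [
        math.hypot((i - centre) * ELEMENT_PITCH_M, focal_depth_m)
        for i in range(NUM_CHANNELS)
    ]
    longest = max(paths)
    # Outer elements (longest path) fire first, i.e. with zero delay.
    return [(longest - p) / SPEED_OF_SOUND_M_S for p in paths]


# Example: delays for a focal zone at 30 mm depth.
delays = transmit_delays_s(0.030)
```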
  • the electric signals generated by the receiving piezo elements or opto-electrical ultrasound sensors in response to the reflected ultrasound signals are processed by the receive electronics 5 .
  • Processing includes the amplification of the analogue signals coming from the receiving transducers and the conversion into digital signals.
  • the receive electronics 5 comprise a mixed signal electronics board or ASIC, which includes amplifiers and analogue-to-digital converters.
  • the control unit 6 controls the operation of the ultrasound scanner 2.
  • the control unit's 6 tasks include beam-forming and the processing of the digital signals received from the receive electronics 5.
  • the control unit 6 comprises a digital signal processor (DSP), e.g. in the form of a field programmable gate array (FPGA).
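  • The beam-forming performed by the control unit 6 is not specified further in the patent; the sketch below shows one common form, delay-and-sum beamforming of the digitised channel data, under assumed values for the array pitch and sampling rate.

```python
# Hypothetical sketch of delay-and-sum beamforming that the control unit (6)
# could perform on the digitised channel data delivered by the receive
# electronics (5). Array geometry and sampling rate are assumed values.
import numpy as np

SPEED_OF_SOUND_M_S = 1540.0
ELEMENT_PITCH_M = 0.3e-3
SAMPLE_RATE_HZ = 40e6


def beamform_scanline(channel_data: np.ndarray, depths_m: np.ndarray) -> np.ndarray:
    """Sum the channel signals with depth-dependent delays for one scanline.

    channel_data: shape (num_channels, num_samples), one row per element.
    depths_m:     depths along the scanline at which to form the output.
    """
    num_channels, num_samples = channel_data.shape
    centre = (num_channels - 1) / 2.0
    line = np.zeros(len(depths_m))
    for i in range(num_channels):
        lateral = (i - centre) * ELEMENT_PITCH_M
        # Two-way travel time: straight down to the depth, back to element i.
        travel_s = (depths_m + np.hypot(lateral, depths_m)) / SPEED_OF_SOUND_M_S
        sample_idx = np.clip((travel_s * SAMPLE_RATE_HZ).astype(int), 0, num_samples - 1)
        line += channel_data[i, sample_idx]
    return line
```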
  • the ultrasound imaging system 1 further comprises a video display 8 which is in communication with the scanner's 2 control unit 6 via a processing unit 9 in order to present a live ultrasound image 7 based on the ultrasound signals received by the probe 3.
  • the processing unit 9 comprises a CPU and a GPU and performs its tasks at least partially by means of a software program. It performs scan-conversion and image processing functions to prepare the signals coming from the control unit 6 for presentation on the video display 8.
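  • As a rough illustration of the scan-conversion step performed by the processing unit 9, the following sketch resamples beamformed scanlines from a polar (angle, depth) grid onto a Cartesian pixel grid using a nearest-neighbour lookup. The sector geometry and grid sizes are assumptions for illustration only.

```python
# Hypothetical sketch of scan conversion: sector-scan data on a polar grid is
# mapped onto a Cartesian pixel grid before display on the video display (8).
import numpy as np


def scan_convert(polar_img: np.ndarray, sector_deg: float,
                 max_depth_m: float, out_px: int = 512) -> np.ndarray:
    """polar_img has shape (num_scanlines, num_depth_samples)."""
    num_lines, num_samples = polar_img.shape
    half_sector = np.radians(sector_deg) / 2.0
    # Cartesian pixel coordinates (x lateral, z depth), origin at the probe.
    x = np.linspace(-max_depth_m * np.sin(half_sector),
                    max_depth_m * np.sin(half_sector), out_px)
    z = np.linspace(0.0, max_depth_m, out_px)
    xx, zz = np.meshgrid(x, z)
    radius = np.hypot(xx, zz)
    angle = np.arctan2(xx, zz)
    # Map each pixel back to the nearest (scanline, sample) pair.
    line_idx = np.round((angle + half_sector) / (2 * half_sector) * (num_lines - 1))
    samp_idx = np.round(radius / max_depth_m * (num_samples - 1))
    valid = (np.abs(angle) <= half_sector) & (radius <= max_depth_m)
    out = np.zeros((out_px, out_px))
    li = np.clip(line_idx, 0, num_lines - 1).astype(int)
    si = np.clip(samp_idx, 0, num_samples - 1).astype(int)
    out[valid] = polar_img[li[valid], si[valid]]
    return out
```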
  • the assistance means in the present example is software-implemented in the processing unit 9 so that, amongst other things, the processing unit 9 provides primary and secondary demonstration video clips or still images and ultrasound image video clips or still images for presentation on the video display 8.
  • the videos and still images are stored in a memory 10 in an appropriate directory structure, so that they can easily be retrieved by the processing unit 9 for presentation on the video display 8.
  • parameter values corresponding to the ultrasound image video clips are stored in the memory and retrieved by the processing unit 9 for adjusting the settings of the ultrasound scanner 2.
  • a suitable memory 10 is a flash memory drive. In the memory's 10 directory structure, each procedure has a different directory containing the required video or image files of all steps in this procedure.
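  • The patent only states that each procedure has its own directory containing the video or image files of all its steps; the sketch below shows one possible way the processing unit 9 could enumerate such a structure on the flash memory 10. All path names and the file-naming convention are hypothetical.

```python
# Hypothetical sketch of enumerating the flash memory (10): one directory per
# procedure, with a group of files per step. Names are illustrative only.
from pathlib import Path

MEMORY_ROOT = Path("/media/assistance")   # assumed mount point of the flash drive


def list_procedures(root: Path = MEMORY_ROOT) -> list[str]:
    """Each procedure (e.g. a regional-anaesthesia block) is one directory."""
    return sorted(p.name for p in root.iterdir() if p.is_dir())


def load_step_files(procedure: str, step: int, root: Path = MEMORY_ROOT) -> dict:
    """Collect the media files for one procedural step.

    Assumed naming convention: step01_primary.mp4, step01_secondary.mp4,
    step01_ultrasound.mp4 (or .png for still images) and step01_params.json.
    """
    step_dir = root / procedure
    prefix = f"step{step:02d}_"
    files = {}
    for path in step_dir.glob(prefix + "*"):
        role = path.stem.removeprefix(prefix)   # 'primary', 'secondary', ...
        files[role] = path
    return files
```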
  • the processing unit 9 generates on the video display 8 a user interface comprising multiple screens, three of which are shown in FIGS. 2 to 4.
  • the screens comprise software-implemented controls and serve to present the live ultrasound image along with video clips, still images, and written information.
  • FIG. 2 shows a first screen 11 of the user interface which comprises the live ultrasound image 7 on the right hand side and various controls 12, 13, 14 on the left hand side.
  • the controls can be operated via appropriate input means, e.g. a computer mouse or touch screen sensors on the video display 8.
  • by means of the controls 12, 13 the user can adjust various settings of the ultrasound imaging system 1, make measurements and annotations, and administer patient records.
  • there is a procedure control 14 in the top left corner which leads the user to the procedure selection screen 15 shown in FIG. 3, which replaces the screen of FIG. 2.
  • the user finds in an area on the left hand side a procedure control 16 comprising a number of virtual tabs 17, 18, 19, 20 and buttons 21 representing different procedures.
  • the uppermost tab 17, for example, produces a list of buttons 21 for procedures regarding the field of regional anaesthesia, while the tab 18 below produces a list of procedure buttons 21 regarding vascular access.
  • There is also a tab 20 which produces a list of “favourite” procedure buttons 21, which list the user can customize to include the procedures that he or she uses most often.
  • the user can choose the corresponding procedure and, on the right hand side 22 of the screen, obtain some general information regarding the procedure, generally in the form of still images. Pressing a cue card button 23 at the bottom of the screen 15 leads the user to a cue card screen 24 corresponding to the selected procedure.
  • the cue card screen 24 replaces the procedure selection screen 15 of FIG. 3.
  • there is a step control 25 which comprises several virtual tabs 26 representing the steps of the procedure, arranged from top to bottom in the order in which they have to be performed. The user can select the step he seeks to perform next by operating the corresponding tab 26 of the step control 25.
  • the primary demonstration video clip 27 or still image, the secondary demonstration video clip 28 or still image and the ultrasound image video clip 29 or still image, all retrieved by the processing unit 9 from the memory 10, are shown in that order from left to right.
  • the live ultrasound image 7 is shown.
  • a primary demonstration video clip 27 that shows the movement of the probe 3 , e.g. from a landmark to a target, in a perspective view based on an anatomic model of a human
  • a secondary demonstration video clip 28 that shows the movement in a cross-sectional view based on the same anatomic model
  • an ultrasound image video clip 29 that shows the ultrasound image that would result from the procedure.
  • the body of the patient in the perspective view is semi-transparent (not shown in the Figure), so that the spinal cord, the major blood vessels and the thyroid gland are visible.
  • a shaded quadrangle indicates the imaging slice that corresponds to the ultrasound image shown in the ultrasound image video clip 29.
  • text boxes 30 provide additional information.
  • the image parameter values corresponding to the ultrasound image video clips 29 are retrieved by the processing unit 9 from the memory 10.
  • the processing unit 9 sends these parameters to the ultrasound scanner 2 to adjust the ultrasound scanner's 2 imaging settings to those suitable for the current procedural step.
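  • A minimal sketch of this retrieval-and-adjustment step follows; the JSON file format and the ScannerControl interface are assumptions introduced for illustration, not part of the patent disclosure.

```python
# Hypothetical sketch: when a step is selected, the processing unit (9) reads
# the stored parameter values for that step's ultrasound image video clip (29)
# and pushes them to the scanner (2). File format and interface are assumed.
import json
from pathlib import Path


class ScannerControl:
    """Stand-in for the interface to the scanner's control unit (6)."""

    def apply_settings(self, depth_mm: float, focal_zones: int, mode: str) -> None:
        print(f"scanner <- depth={depth_mm} mm, focal_zones={focal_zones}, mode={mode}")


def adjust_scanner_for_step(params_file: Path, scanner: ScannerControl) -> None:
    values = json.loads(params_file.read_text())
    scanner.apply_settings(
        depth_mm=values["imaging_depth_mm"],
        focal_zones=values["num_focal_zones"],
        mode=values["imaging_mode"],      # e.g. "B", "M", "PW-Doppler"
    )
```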
  • the user's anatomic perception is guided from three dimensions to two dimensions and from there to the ultrasound image.
  • the user can easily reproduce the demonstrated step of the procedure and interpret the live image he obtains.

Abstract

An ultrasound imaging system (1) comprising an ultrasound scanner (2) for acquiring a live ultrasound image (7) of a portion of the anatomy of a patient being examined with the ultrasound imaging system (1), an assistance means (9) for providing at least one primary demonstration video clip (27), and at least one video display (8). The video display (8) is functionally connected with the ultrasound scanner (2) and the assistance means (9) in order to present the primary demonstration video clip (27) simultaneously with a live ultrasound image (7).

Description

    BACKGROUND OF THE INVENTION
  • The invention relates to an ultrasound imaging system according to the pre-characterizing clause of claim 1, in particular for use in medical examinations. It further relates to a method of providing assistance to the user of an ultrasound imaging system according to the pre-characterizing clause of claim 11, in particular when a medical examination is performed.
  • STATE OF THE ART
  • In order to make proper use of an ultrasound imaging system in the examination of a patient and to take full advantage of its capabilities, a user in general requires a considerable amount of training. From the prior art, systems are known that provide the user with training videos to be viewed before performing an examination. For example, the S-Series devices and the M-Turbo system available from SonoSite, Inc., Bothell, Wash., USA are delivered with a USB thumb drive, named an “Education Key™”, that contains a combination of system operation video tutorials, application-specific video refresher programs with instructions on how to perform certain exams and procedures, and an image reference library for comparison purposes. The tutorials, programs and images can be viewed on the imaging devices when no examination is performed with the device and the device's display is therefore not required for showing the ultrasound image of the examination.
  • U.S. Pat. No. 7,263,710 B1 also describes a medical diagnostic system that provides video training. According to this disclosure, the medical diagnostic system, e.g. an ultrasound system, is connected along with other medical diagnostic systems via a network to a central service facility. At the central service facility, training videos are stored that can be accessed remotely via the network from the medical diagnostic system and displayed on a display monitor of the system. Once the user has finished viewing the video, he can subsequently perform an examination with the medical diagnostic system, which then displays a diagnostic image on the display monitor for that purpose.
  • Moreover, several methods have been proposed to aid the user during the performance of an ultrasound examination. For example, the U.S. Pat. No. 6,488,629 B1 discloses an ultrasound imaging system and a method for helping the user of the system to find the correct cut-plane when acquiring an ultrasound image. The screen of the ultrasound scanner is divided into two parts, one of which shows the live image currently acquired and the other shows a previously recorded reference image for comparison. The reference image is cycled in a loop and synchronised with the live image based on the patient's ECG signal.
  • In the international patent application WO 2007/036880 A1, a user interface is disclosed for creating and organizing ultrasound imaging protocols. The ultrasound imaging protocol guides the sonographer through each view of an examination and specifies the types of images and measurements he or she should take during the examination. The user interface displays graphical representations of ultrasound images characteristic for steps of the protocol and allows for manipulation of these graphical representations in order to change the respective protocol step. The revised protocol can then be saved in the system.
  • From the international patent application WO 2004/058072 A1, a device for real-time location and tracking in ultrasound imaging of anatomical landmarks of the heart is known. According to this disclosure, from the ultrasound signal a set of parameter values is generated that represent the movement of the cardiac structure over time. From these parameters, information about the position of the anatomical landmark is extracted and an indication of the position is overlaid onto the ultrasound image.
  • In the U.S. Pat. No. 6,656,120 B2, a device for knowledge-based adjustment of the imaging parameters of an ultrasound imaging system is disclosed. Based on a comparison between a patient information database and a reference image database, the imaging settings are automatically adjusted.
  • Problem to be Solved by the Invention
  • It is an objective of the present invention to provide an improved ultrasound imaging system, in particular for use in medical examinations. The invention further aims to provide an improved method of assisting the user of an ultrasound imaging system, in particular when a medical examination is performed.
  • Solution According to the Invention
  • According to the invention, the problem is solved by providing an ultrasound imaging system comprising an ultrasound scanner for acquiring a live ultrasound image of a portion of the anatomy of a patient being examined with the ultrasound imaging system, an assistance means for providing at least one primary demonstration video clip, and at least one video display, wherein the video display is functionally connected with the ultrasound scanner and the assistance means in order to present the primary demonstration video clip simultaneously with a live ultrasound image. Also, the problem is solved by a method of assisting the user of an ultrasound imaging system, the method comprising the steps of acquiring a live ultrasound image of a portion of the anatomy of a patient being examined, and presenting on a video display the live ultrasound image simultaneously with a primary demonstration video clip.
  • In the context of the present invention, the expression “examination procedure” does not only encompass purely diagnostic procedures but also therapeutic or surgical procedures that involve ultrasound imaging, e.g. for guidance as may be the case in the performance of a regional anaesthesia. The live ultrasound image is an ultrasound image created from the current ultrasound signal received by the ultrasound scanner during the examination of the patient with the ultrasound imaging system.
  • A demonstration video clip is an animated sequence of images demonstrating the performance of a step in the examination procedure. The present invention advantageously can instruct the user as to how he should perform a certain step of the procedure while at the same time allowing him or her to implement what he is shown. It is an achievable advantage of the invention that the user requires less experience in the use of the ultrasound system and in the performance of a given procedure to be able to perform this procedure and come to a diagnosis. Thus, less training may be required, thereby saving costs.
  • It is another achievable advantage that the ultrasound imaging system according to the invention can itself be used as a training tool both for introductory training and for continued training. E.g., by means of the system, an experienced user may quickly learn a new procedure or a new diagnosis. Using the system, the user may also rehearse procedures or diagnoses he has already learned. With the help of the present invention, training may be achieved with less or even entirely without the involvement of an instructor, thereby significantly reducing training costs. Also, with the present invention the advantage can be achieved that users who have been trained on a different system can easily get acquainted with the system according to the invention.
  • The system according to the invention may be used by less qualified personnel, thereby expanding the market for medical ultrasound imaging. Thus, the technology can be made available to a greater number of patients, increasing the quality of health care provided to these patients.
  • It is a further achievable advantage of the invention that even for an experienced user, handling of the system is more intuitive than in conventional systems. He may thus perform a procedure more quickly and arrive faster and more confidently at a diagnosis, thereby saving costs and improving quality.
  • DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
  • Preferred features of the invention which may be applied alone or in combination are discussed in the dependent claims.
  • A demonstration video clip, such as the primary demonstration video clip and the secondary demonstration video clip discussed further below, preferably shows the position, orientation, and motion of a probe of the ultrasound scanner relative to the patient's anatomy in the respective step of the examination procedure. The demonstration video clip may for example be a recording of an examination that has been performed on a real subject or a phantom. Alternatively, it may be the recording of a simulation using an anatomic model. The demonstration video clip may also be generated in real-time inside or outside the ultrasound imaging system, preferably by the assistance means. If the clip is pre-recorded, it may be stored in a memory, e.g. inside the ultrasound imaging system, on a separate memory device such as SonoSite's Education Key™, or on a remote service facility such as the one disclosed in U.S. Pat. No. 7,263,710 B1. The corresponding disclosure of the latter publication is incorporated into the present application by reference.
  • In a preferred embodiment of the invention, the assistance means can in addition to the at least one primary demonstration video clip provide at least one ultrasound image video clip, which corresponds to one of the at least one primary demonstration video clips. The demonstration video clip corresponds to the ultrasound image video clip in that the ultrasound image video clip represents the animated ultrasound image that results or would result from performing the procedural step as demonstrated in the demonstration video clip. Moreover, preferably the video display is functionally connected with the assistance means in order to present the ultrasound image video clip simultaneously with the primary demonstration video clip and the live ultrasound image. Thus, the ultrasound image video clip which corresponds to the primary demonstration video clip is presented simultaneously with the live ultrasound image and the primary demonstration video clip.
  • The inventors have found that, advantageously, the demonstration of a procedural step simultaneously with presenting the corresponding ultrasound image allows the user to more naturally grasp the relationship between the manipulation of the scanning probe and the ultrasound image. Hence, this embodiment of the invention can help the user to reproduce the procedural steps and interpret the live ultrasound image. It is believed, without prejudice, that this is at least partly due to the fact that the user can on one hand compare the live ultrasound image with the ultrasound image video clip, and on the other hand his own handling of the ultrasound probe (which results in the live ultrasound image) with the demonstration of the procedural step in the primary demonstration video clip (which corresponds to the ultrasound image video clip).
  • Similar to the demonstration video clip, the ultrasound image video clip may be the ultrasound image that has been obtained in the examination of a real subject or a phantom or it may be the result of a simulation based on an anatomic model.
  • In a preferred embodiment of the invention, the assistance means comprises a synchroniser for synchronising the presentation of the ultrasound image video clip with the presentation of the demonstration video clip, so that the primary demonstration video clip runs in synchrony with the ultrasound image video clip. Synchrony between a demonstration video clip and the ultrasound image video clip means that at any moment the ultrasound image video clip shows the ultrasound image that results or would result from the position and orientation of the ultrasound scanning probe in the demonstration video clip at the respective moment. Thus, as the probe moves in the demonstration video clip, the ultrasound image in the ultrasound image video clip changes correspondingly. Moreover, preferably, if in the demonstration video clip the patient is moved or manipulated in a way that would have an effect on the ultrasound image obtained from the patient, the ultrasound image in the ultrasound image video clip changes correspondingly.
  • The synchronised presentation of the performance of the procedural step and the corresponding ultrasound image can make it even easier for the user to understand the relationship between his manipulation of the scanning probe and the resulting ultrasound image. The user can e.g. see what motion helps in a particular procedural step to proceed from a landmark to a target feature or to identify a certain anatomic structure. As to the identification of anatomic structures, it can e.g. be demonstrated that nerves can be distinguished from bones by the fact that the nerve ultrasound image is stronger at some incident angles than others, while the image of bones is the same at all angles. Proper manipulation of the probe and interpretation of the ultrasound image can thus be facilitated.
  • If the demonstration video clip and the ultrasound image video clip are pre-recorded, synchronisation will preferably involve ensuring that the presentation of these video clips starts synchronously on the video display. It may also involve adjusting the speed at which one or both of the video clips are presented.
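  • A minimal sketch of such a synchroniser for two pre-recorded clips is given below: both clips start together and the playback rate of the ultrasound image clip is scaled so that the two loops end at the same time. The VideoPlayer interface is a stand-in, not a real player API.

```python
# Minimal sketch of the synchroniser for pre-recorded clips: start both clips
# together and scale the playback rate of one so that the loops finish at the
# same time. The VideoPlayer class is an assumed stand-in, not a library API.
class VideoPlayer:
    def __init__(self, path: str, duration_s: float):
        self.path, self.duration_s = path, duration_s

    def play(self, rate: float = 1.0) -> None:
        print(f"playing {self.path} at {rate:.2f}x")


def play_in_synchrony(demo: VideoPlayer, ultrasound: VideoPlayer) -> None:
    """Start the demonstration clip and its ultrasound image clip together.

    The ultrasound clip's rate is adjusted so both clips end (and loop)
    simultaneously, keeping probe position and image in step throughout.
    """
    rate = ultrasound.duration_s / demo.duration_s
    demo.play(rate=1.0)
    ultrasound.play(rate=rate)
```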
  • In a preferred embodiment of the invention, in addition to the primary demonstration video clip the assistance means can provide at least one secondary demonstration video clip, which corresponds to one of the at least one primary demonstration video clips. Preferably, the video display is functionally connected with the assistance means in order to present the secondary demonstration video clip simultaneously with the primary demonstration video clip, the live ultrasound image, and, preferably, the ultrasound image video clip. Thus, in addition to the primary demonstration video clip a corresponding secondary demonstration video clip is shown on the video display. Preferably, the primary demonstration video clip and the corresponding secondary demonstration video clip show different views of the same step of the examination. This embodiment of the invention can improve the comprehensibility of the demonstration of the procedural step and can make it easier for the user to understand the relationship between the motion of the probe and the resulting ultrasound image as shown in the ultrasound image video clip.
  • In a preferred embodiment of the invention, one demonstration video clip shows the step of the examination procedure in a perspective view. Preferably one demonstration video clip shows the step of the examination procedure in a cross-sectional view. More preferably, one of the first and the second demonstration video clips shows the perspective view while the other shows the cross-sectional view. The perspective view can be semi-transparent so that the user sees not only the patient's skin but also the underlying tissue, including bones, blood vessels, nerves and other organs. In the cross-sectional view, preferably the cross-sectional plane essentially coincides with the scanning plane of the ultrasound image video clip. Hence, if the scanning plane changes over the course of the ultrasound image video clip, so does the cross-sectional plane of the cross-sectional view. Due to the provision of both a perspective view and a cross-sectional view together with the ultrasound image video clip, the user can more easily move his or her anatomical perception from the 3-dimensional space in which the probe is manipulated to the 2-dimensional imaging slice.
  • Preferably, the synchroniser synchronises the presentation of the secondary demonstration video clip with the presentations of the primary demonstration video clip and, preferably, the ultrasound image video clip. In other words, the secondary demonstration video clip runs in synchrony with the primary demonstration video clip and the ultrasound image video clip. This can further help the user to understand the relationship between the motion of the probe and the resulting ultrasound image as shown in the ultrasound image video clip.
  • In a preferred embodiment of the invention, in the primary and/or the secondary demonstration video clip, the imaging slice from which the ultrasound image of the corresponding ultrasound image video clip results or would result is indicated. This can assist the user in interpreting the ultrasound image video clip. The imaging slice may for example be illustrated in the form of a quadrangular frame or a shaded area adjacent to the representation of the probe.
  • To complement the graphical information provided to the user with written information, in a preferred embodiment of the invention text is displayed on the video display, e.g. in one or more text fields. The text may e.g. explain in more detail how to handle the probe of the ultrasound scanner or how to interpret the ultrasound image and derive a diagnosis. The text field(s) may be adjacent to one or more of the locations on the video display where the demonstration video clips or the ultrasound image video clips are presented.
  • In ultrasonography, the properties of the image in general and its quality in particular are a function of the scanner's imaging settings. Finding the right imaging settings may be essential for a user to perform the procedure properly and arrive at the correct diagnosis. In a preferred embodiment of the invention, the scanner's imaging settings are adjusted by providing the scanner with a set of image parameters. The set of image parameters preferably includes at least one of a group of parameters comprising imaging depth, number of focal zones, and imaging mode. Examples for imaging modes are B-mode, C-mode, M-mode, PW-Doppler-mode and CW-Doppler-mode.
  • Preferably, the assistance means can provide at least one set of image parameter values corresponding to at least one of the demonstration video clips and/or ultrasound image video clips. More preferably, the assistance means is functionally connected with the ultrasound scanner to adjust the ultrasound scanner's imaging settings according to the set of image parameter values that corresponds to the demonstration video clip and/or ultrasound image video clip to be presented on the video display. This embodiment of the invention can help users who are inexperienced or unfamiliar with the image parameters and their effect on the image quality to obtain a good image. Moreover, by reducing the amount of image manipulation which the user has to perform, it allows users to concentrate on performing the procedure and obtaining the diagnosis rather than dealing with the technical details of the image generation. Advantageously, the user may thus save valuable time that he or she would otherwise require to adjust the settings of the ultrasound scanner. Moreover, this embodiment of the invention may help to avoid sub-optimal images that result when, in order to save time, the user always leaves the parameters at some typical values. Preferably, the set of image parameter values that corresponds to the ultrasound image video clip is chosen so that the live ultrasound image obtained is similar to the image shown in the ultrasound image video clip. In other words, the image parameters are essentially those that result or would result in the image shown in the ultrasound image video clip.
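  • One possible representation of such a set of image parameter values is sketched below; the field names and the scanner methods called are assumptions, since the patent only names imaging depth, number of focal zones and imaging mode as example parameters.

```python
# Illustrative sketch of a set of image parameter values and its application
# to the scanner. Field names and the setter calls are assumptions made for
# this sketch; the patent does not define a concrete interface.
from dataclasses import dataclass


@dataclass
class ImageParameterSet:
    imaging_depth_mm: float          # e.g. 40.0
    num_focal_zones: int             # e.g. 2
    imaging_mode: str                # "B", "C", "M", "PW-Doppler" or "CW-Doppler"


def adjust_scanner(scanner, params: ImageParameterSet) -> None:
    """Push the parameter set that corresponds to the clip being shown."""
    scanner.set_depth(params.imaging_depth_mm)
    scanner.set_focal_zones(params.num_focal_zones)
    scanner.set_mode(params.imaging_mode)
```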
  • In a preferred embodiment of the invention, the assistance means can provide at least one procedure demonstration set comprising several primary demonstration video clips, each demonstrating a different step in the examination procedure. Preferably, the assistance means is functionally connected with the video display to provide the primary demonstration video clips for presentation on the video display. Moreover, the procedure demonstration set preferably comprises several ultrasound image video clips, each clip corresponding to a different one of the primary demonstration video clips, and the assistance means is functionally connected with the video display in order to present the ultrasound image video clips simultaneously with the primary demonstration video clips in the same order. Preferably, the synchroniser can synchronise the presentation of each ultrasound image video clip with the presentation of its corresponding primary demonstration video clip.
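  • An illustrative data model for a procedure demonstration set, under the assumption that each step bundles its primary clip with optional secondary and ultrasound image clips (or still images) and parameter values, could look as follows; the class and field names are invented for this sketch.

```python
# Illustrative data model: a procedure demonstration set is an ordered list of
# steps, each holding a primary clip and, optionally, a secondary clip, an
# ultrasound image clip (any of which may be a still image) and parameters.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class DemonstrationStep:
    primary: str                          # path to clip or still image
    secondary: Optional[str] = None       # second view of the same step
    ultrasound: Optional[str] = None      # corresponding ultrasound clip/image
    parameters: Optional[dict] = None     # image parameter values for the step


@dataclass
class ProcedureDemonstrationSet:
    name: str                             # e.g. an assumed "interscalene block"
    steps: list[DemonstrationStep] = field(default_factory=list)


# Steps are presented in order, e.g. positioning the patient and finding a
# landmark, navigating to the target, studying the target, inserting a needle.
```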
  • A preferred procedure demonstration set comprises several secondary demonstration video clips, each corresponding to a different one of the primary demonstration video clips, and the assistance means is functionally connected with the video display in order to present the secondary demonstration video clips simultaneously with the primary demonstration video clips in the same order. Preferably, the synchroniser can synchronise the presentation of each secondary demonstration video clip with the presentation of its corresponding primary demonstration video clip.
  • The present invention also encompasses embodiments of the invention which differ from the above embodiments contemplating procedure demonstration sets of several video clips in that some of the primary demonstration video clips are replaced by demonstration still images. Similarly, some or all of the corresponding ultrasound image video clips and/or the secondary demonstration video clips may be replaced by ultrasound still images or demonstration still images, respectively. These alternative embodiments of the invention recognize the fact that some of the procedural steps may not involve motions relevant to the demonstration of the procedural step and may therefore be represented by still images. These embodiments of the invention may advantageously save storage and computing resources. Preferably, if a primary demonstration video clip is replaced by a still image, the corresponding ultrasound image video clip, if any, is replaced by a corresponding still image and the corresponding secondary demonstration video clip, if any, is equally replaced by a still image.
  • The preferred procedure demonstration set will comprise one primary demonstration video clip or still image for each step of the examination procedure. A procedure generally comprises at least the steps of (1) positioning the patient and finding a first landmark detectable with the ultrasound scanner, and (2) navigating with the scanner's probe from the landmark to the target. Further steps may e.g. include (3) studying the target and (4) inserting a needle. Preferably, the primary demonstration video clip or still image presented on the video screen changes, at an appropriate moment, to another primary demonstration video clip or still image in which the performance of another step in the examination procedure is demonstrated. Correspondingly, the ultrasound image video clip or still image preferably changes to another ultrasound image video clip or still image which corresponds to the other primary demonstration video clip or still image and/or the secondary demonstration video clip or still image changes to another secondary demonstration video clip or still image which corresponds to the other primary demonstration video clip or still image. Preferably, the primary demonstration video clip changes synchronously with the ultrasound image video clip and/or the secondary demonstration video clip.
In general, if the user of the ultrasound imaging system does not intervene, the demonstration video of the current procedural step and the corresponding ultrasound image video are shown continuously in a loop. In a preferred embodiment, the ultrasound imaging system comprises a step control that is functionally connected with the assistance means so that, upon operation of the step control by the user of the ultrasound system, another primary demonstration video clip or still image (and, if any, the corresponding ultrasound image video clip or still image and/or secondary demonstration video clip or still image) is provided for presentation on the video display. The step control may e.g. comprise several virtual buttons on the video display that the user can operate to choose the step he or she wishes to view next. Thus, e.g., the primary demonstration video clip presented on the video screen may change to a subsequent primary demonstration video clip or still image in which the performance of a subsequent, preferably the next, step in the examination procedure is demonstrated. Preferably, upon operation of the step control by the user, the assistance means also provides the ultrasound image video clip or still image and/or the secondary demonstration video clip corresponding to the respective primary demonstration video clip or still image. Hence, advantageously, by means of the step control, the user can go sequentially through all steps of the procedure.
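As an illustrative sketch only, the looping and step-selection behaviour described above could be implemented along the following lines; the `StepControl` class, the `display` object and its `show()` method are assumptions made for this example and are not prescribed by the description.

```python
class StepControl:
    """Hypothetical step control: loops the current step's media and lets
    the user jump to another step of the selected procedure."""

    def __init__(self, demo_set, display):
        self.demo_set = demo_set      # a ProcedureDemonstrationSet as sketched above
        self.display = display        # assumed to offer show(primary, secondary, ultrasound, loop=...)
        self.current = 0

    def show_current_step(self):
        step = self.demo_set.steps[self.current]
        # If the user does not intervene, the display keeps looping these clips.
        self.display.show(step.primary_clip, step.secondary_clip,
                          step.ultrasound_clip, loop=True)

    def on_step_selected(self, index: int):
        """Called when the user operates one of the virtual step buttons or tabs."""
        if 0 <= index < len(self.demo_set.steps):
            self.current = index
            self.show_current_step()

    def next_step(self):
        """Advance to the next step in the examination procedure."""
        self.on_step_selected(self.current + 1)
```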
In a preferred embodiment of the invention, the assistance means can provide several procedure demonstration sets, each set demonstrating steps of a different examination procedure. Preferably, the ultrasound imaging system comprises a procedure control that is functionally connected with the assistance means so that, upon operation of the procedure control by the user of the ultrasound imaging system, the set to be provided for presentation on the video display is selected from the several procedure demonstration sets. Thus, advantageously, the user can select one of several procedures and then go through the steps of the procedure as described above.
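Continuing the illustrative sketch above, a procedure control could simply map each procedure button to one demonstration set and hand the chosen set to a step control; again, all names are hypothetical.

```python
class ProcedureControl:
    """Hypothetical procedure control: lets the user pick one of several
    procedure demonstration sets for presentation."""

    def __init__(self, demo_sets, display):
        # demo_sets: iterable of ProcedureDemonstrationSet objects (see sketch above)
        self.demo_sets = {s.procedure_name: s for s in demo_sets}
        self.display = display

    def on_procedure_selected(self, procedure_name: str) -> "StepControl":
        """Called when the user operates a procedure button; returns a step
        control initialised with the chosen demonstration set."""
        step_control = StepControl(self.demo_sets[procedure_name], self.display)
        step_control.show_current_step()
        return step_control
```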
The components of the ultrasound imaging system according to the invention, in particular the assistance means, the video display, the ultrasound scanner, the memory, the step control, and the procedure control, can be implemented by hardware, by software, or by a combination of both. Accordingly, in the context of the present invention a functional connection between components of the ultrasound imaging system preferably is a communication between these components, involving an exchange of data, either in one direction or in both directions. In the case of software, such communication may simply involve the exchange of one or more parameter values from one software component, e.g. a software component which controls a processing unit to implement the step control, to another software component, e.g. a software component that controls the same processing unit to implement the assistance means. It may, however, also involve the exchange of one or more electric, magnetic, microwave, or optical signals from one hardware component, e.g. the hardware component which is controlled by the software of the assistance means, to another hardware component, e.g. the video display hardware.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is illustrated in greater detail with the aid of schematic drawings:
FIG. 1 shows a block diagram of an ultrasound imaging system according to the present invention;
FIG. 2 shows a screen of a user interface presented on the video display of an ultrasound imaging system according to the invention, the screen comprising several controls and a live ultrasound image;
FIG. 3 shows another screen of the user interface presented on the video display, which screen comprises a procedure control; and
FIG. 4 shows yet another screen of the user interface presented on the video display, which screen comprises a step control and presentations of the primary demonstration video clip, the secondary demonstration video clip, the ultrasound image video clip and the live ultrasound image.
DETAILED DESCRIPTION OF AN EMBODIMENT OF THE INVENTION
An embodiment of the ultrasound imaging system 1 according to the invention is illustrated in FIG. 1 by means of a simplified block diagram. The system comprises an ultrasound scanner 2 with a probe 3, transmit electronics 4, receive electronics 5, and a control unit 6. A user of the ultrasound imaging system 1 can bring the probe 3 into contact with a patient in order to obtain a live ultrasound image 7 of an imaging slice defined by the position and orientation of the probe 3. The probe 3 contains transducers for generating ultrasound signals and receiving reflections of these ultrasound signals from the body of the patient. For example, the transducers for ultrasound generation and reception can be provided by an array of piezo elements. Alternatively, opto-electrical ultrasound sensors may be used for the reception of the reflected ultrasound signals. The probe 3 is connected via a flexible cable to the rest of the ultrasound system 1 so that it can be easily manipulated by the user during the examination.
The transmit electronics 4 comprise multiple channels that are connected directly or via one or more multiplexers to the ultrasound generating elements of the probe 3, for example piezo elements. Electric pulses are generated in each individual channel, and the relative timing of the pulses can be varied accurately enough to perform transmit focusing in the lateral direction at different depths. The transmit electronics are implemented as a mixed signal electronics board or an ASIC, and include high voltage pulsers for generating the electric pulses.
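The description does not specify the delay law used for this transmit focusing; a common geometric formulation, shown here only as a hedged illustration with example values, computes the relative firing delay of each array element from its distance to the desired focal point.

```python
import numpy as np

def transmit_focus_delays(num_elements: int, pitch_m: float,
                          focal_depth_m: float, c_m_s: float = 1540.0) -> np.ndarray:
    """Relative firing delays (seconds) focusing a linear array at a given
    depth on its centre axis. Textbook geometric delay law; the actual delay
    law used in the described system is not disclosed."""
    # Lateral positions of the elements, centred on the array axis
    x = (np.arange(num_elements) - (num_elements - 1) / 2.0) * pitch_m
    # Extra path length from each element to the focal point, relative to the on-axis path
    path_difference = np.sqrt(focal_depth_m**2 + x**2) - focal_depth_m
    delays = path_difference / c_m_s
    # Fire the outer elements first so all wavefronts arrive at the focus together
    return delays.max() - delays

# Example: 64 elements, 0.3 mm pitch, focus at 30 mm depth (values invented)
print(transmit_focus_delays(64, 0.3e-3, 30e-3)[:4])
```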
The electric signals generated by the receiving piezo elements or opto-electrical ultrasound sensors in response to the reflected ultrasound signals are processed by the receive electronics 5. Processing includes the amplification of the analogue signals coming from the receiving transducers and their conversion into digital signals. The receive electronics 5 comprise a mixed signal electronics board or ASIC, which includes amplifiers and analogue-to-digital converters.
Both the transmit electronics 4 and the receive electronics 5 are controlled by the control unit 6 of the ultrasound scanner 2. The tasks of the control unit 6 include beam-forming and the processing of the digital signals received from the receive electronics 5. For this purpose, the control unit 6 comprises a digital signal processor (DSP), e.g. in the form of a field programmable gate array (FPGA).
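The beam-forming performed by the control unit 6 is not detailed in the description; as a hedged illustration only, a minimal delay-and-sum receive beamformer for a single focal point, operating on already-digitised channel data, could look as follows.

```python
import numpy as np

def delay_and_sum(channel_data: np.ndarray, delays_s: np.ndarray,
                  fs_hz: float) -> float:
    """Minimal delay-and-sum receive beamforming for one focal point.
    channel_data: (num_channels, num_samples) digitised echo signals.
    delays_s:     per-channel receive delays for this focal point, in seconds.
    Returns the beamformed sample value (nearest-sample interpolation)."""
    num_channels, num_samples = channel_data.shape
    sample_idx = np.round(delays_s * fs_hz).astype(int)
    sample_idx = np.clip(sample_idx, 0, num_samples - 1)
    # Sum the appropriately delayed sample from every channel
    return float(channel_data[np.arange(num_channels), sample_idx].sum())
```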
The ultrasound imaging system 1 further comprises a video display 8 which is in communication with the control unit 6 of the scanner 2 via a processing unit 9 in order to present the live ultrasound image 7 based on the ultrasound signals received by the probe 3. In the present example, the processing unit 9 comprises a CPU and a GPU and performs its tasks at least partially by means of a software program. It performs scan-conversion and image processing functions to prepare the signals coming from the control unit 6 for presentation on the video display 8. In addition, the assistance means in the present example is software-implemented in the processing unit 9 so that, amongst other things, the processing unit 9 provides primary and secondary demonstration video clips or still images and ultrasound image video clips or still images for presentation on the video display 8. For this purpose, the videos and still images are stored in a memory 10 in an appropriate directory structure, so that they can easily be retrieved by the processing unit 9 for presentation on the video display 8. Moreover, parameter values corresponding to the ultrasound image video clips are stored in the memory 10 and retrieved by the processing unit 9 for adjusting the settings of the ultrasound scanner 2. A suitable memory 10 is a flash memory drive. In the memory's directory structure, each procedure has a different directory containing the required video or image files of all steps in this procedure.
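The exact layout of the memory 10 is not disclosed; by way of illustration only, a per-procedure directory of step media together with a small retrieval helper could be organised as follows (all directory and file names are invented).

```python
from pathlib import Path
import json

# Hypothetical layout of the memory 10 (e.g. a flash memory drive):
#
#   procedures/
#     vascular_access/
#       step1_primary.mp4
#       step1_secondary.mp4
#       step1_ultrasound.mp4
#       step1_parameters.json     # image parameter values for this step
#       step2_primary.mp4
#       ...

def load_step_media(root: Path, procedure: str, step: int) -> dict:
    """Retrieve the media files and image parameter values for one step."""
    step_dir = root / "procedures" / procedure
    media = {
        "primary": step_dir / f"step{step}_primary.mp4",
        "secondary": step_dir / f"step{step}_secondary.mp4",
        "ultrasound": step_dir / f"step{step}_ultrasound.mp4",
    }
    params_file = step_dir / f"step{step}_parameters.json"
    params = json.loads(params_file.read_text()) if params_file.exists() else {}
    return {"media": media, "parameters": params}
```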
The processing unit 9 generates on the video display 8 a user interface comprising multiple screens, three of which are shown in FIGS. 2 to 4. The screens comprise software-implemented controls and serve to present the live ultrasound image along with video clips, still images, and written information. FIG. 2 shows a first screen 11 of the user interface which comprises the live ultrasound image 7 on the right hand side and various controls 12, 13, 14 on the left hand side. The controls can be operated via appropriate input means, e.g. a computer mouse or touch screen sensors on the video display 8. With the controls 12, 13, the user can adjust various settings of the ultrasound imaging system 1, make measurements and annotations, and administer patient records. Moreover, there is a procedure control 14 in the top left corner, which leads the user to the procedure selection screen 15 shown in FIG. 3, which replaces the screen of FIG. 2.
On the procedure selection screen 15, the user finds in an area on the left hand side a procedure control 16 comprising a number of virtual tabs 17, 18, 19, 20 and buttons 21 representing different procedures. The uppermost tab 17, for example, produces a list of buttons 21 for procedures in the field of regional anaesthesia, while the tab 18 below produces a list of procedure buttons 21 regarding vascular access. There is also a tab 20 which produces a list of "favourite" procedure buttons 21, which list the user can customize to include the procedures that he or she uses most often. By operating one of the procedure buttons 21, the user can choose the corresponding procedure and, on the right hand side 22 of the screen, obtain some general information regarding the procedure, generally in the form of still images. Pressing a cue card button 23 at the bottom of the screen 15 leads the user to a cue card screen 24 corresponding to the selected procedure. The cue card screen 24 replaces the procedure selection screen 15 of FIG. 3.
On the cue card screen 24 shown in FIG. 4, a step control 25 is shown on the bottom left hand side, which comprises several virtual tabs 26 representing the steps of the procedure, arranged from top to bottom in the order in which they have to be performed. The user can select the step he or she seeks to perform next by operating the corresponding tab 26 of the step control 25. As a result, at the top of the cue card screen 24, the primary demonstration video clip 27 or still image, the secondary demonstration video clip 28 or still image and the ultrasound image video clip 29 or still image, all retrieved by the processing unit 9 from the memory 10, are shown in that order from left to right. In the bottom right corner of the cue card screen 24, the live ultrasound image 7 is shown. If the three images 27, 28, 29 in the upper row are video clips, they are synchronised by means of a synchroniser software-implemented in the processing unit 9, so that a movement of the probe 3 shown in the primary demonstration video clip 27 corresponds to the movement of the probe 3 in the secondary demonstration video clip 28 and to the representation of the ultrasound image of the corresponding slice in the ultrasound image video clip 29.
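How the synchroniser is implemented is left open in the description; one simple possibility, sketched here purely as an assumption, is to drive all three clips from a single playback clock so that they always show the same frame index.

```python
import time

class ClipSynchroniser:
    """Hypothetical synchroniser: drives the three demonstration clips from a
    single playback clock so that probe movement, cross-section and ultrasound
    image always show the same instant of the demonstrated step."""

    def __init__(self, num_frames: int, fps: float = 25.0):
        self.num_frames = num_frames   # the clips are assumed to have equal length
        self.fps = fps
        self.start = time.monotonic()

    def current_frame(self) -> int:
        """Common frame index, looping when the clips end."""
        elapsed = time.monotonic() - self.start
        return int(elapsed * self.fps) % self.num_frames

# All three video widgets would be asked to display sync.current_frame()
# on every refresh, so none of them can drift relative to the others.
```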
More particularly, in the example of FIG. 4, there is a primary demonstration video clip 27 that shows the movement of the probe 3, e.g. from a landmark to a target, in a perspective view based on an anatomic model of a human, a secondary demonstration video clip 28 that shows the movement in a cross-sectional view based on the same anatomic model, and an ultrasound image video clip 29 that shows the ultrasound image that would result from the procedure. The body of the patient in the perspective view is semi-transparent (not shown in the Figure), so that the spinal cord, the major blood vessels and the thyroid gland are visible. Both in the perspective view of the primary demonstration video clip 27 and in the cross-sectional view of the secondary demonstration video clip 28, a shaded quadrangle indicates the imaging slice that corresponds to the ultrasound image shown in the ultrasound image video clip 29. Below the areas on the screen where the video clips or still images are shown, text boxes 30 provide additional information.
Moreover, the image parameter values corresponding to the ultrasound image video clips 29 are retrieved by the processing unit 9 from the memory 10. The processing unit 9 sends these parameters to the ultrasound scanner 2 to adjust the imaging settings of the ultrasound scanner 2 to those suitable for the current procedural step.
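As an illustrative sketch only, applying such a stored parameter set to the scanner could amount to a simple loop over named settings; the `set_parameter()` method and the example parameter names are assumptions, not part of the disclosure.

```python
def apply_image_parameters(scanner, parameters: dict) -> None:
    """Hypothetical helper: push the image parameter values stored with an
    ultrasound image video clip to the scanner so that the live image is
    acquired with settings matching the demonstrated step.
    `scanner` is assumed to expose a set_parameter(name, value) method."""
    for name, value in parameters.items():
        scanner.set_parameter(name, value)

# Example of a parameter set that might accompany a clip (values invented):
example_parameters = {"depth_cm": 4.0, "gain_db": 55, "frequency_mhz": 10.0}
```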
Using the perspective view, the cross-sectional view and the corresponding ultrasound image of the demonstration, the user's anatomic perception is guided from three dimensions to two dimensions and from there to the ultrasound image. By comparing the demonstration with his or her own handling of the probe 3 and the live ultrasound image 7, the user can easily reproduce the demonstrated step of the procedure and interpret the live image he or she obtains.
The features described in the above description, claims and figures can be relevant to the invention in any combination.

Claims (16)

1.-15. (canceled)
16. An ultrasound imaging system comprising:
an ultrasound scanner for acquiring a live ultrasound image of an object being examined with the ultrasound imaging system;
an assistance means for providing at least one primary demonstration video clip; and
at least one video display; wherein:
the video display is functionally coupled to the ultrasound scanner and to the assistance means, said video display presenting the primary demonstration video clip simultaneously with a live ultrasound image.
17. The ultrasound imaging system according to claim 16, wherein:
the assistance means is adapted to provide at least one ultrasound image video clip corresponding to a primary demonstration video clip; and
the video display is functionally coupled to the assistance means, said video display presenting the ultrasound image video clip simultaneously with the primary demonstration video clip and with the live ultrasound image.
18. The ultrasound imaging system according to claim 17, wherein the assistance means comprises a synchronizer for synchronizing presentation of the primary demonstration video clip with presentation of the ultrasound image video clip.
19. The ultrasound imaging system according to claim 17, wherein:
the assistance means is adapted to provide at least one secondary demonstration video clip corresponding to a primary demonstration video clip; and
the video display is functionally coupled to the assistance means and presents the secondary demonstration video clip simultaneously with the primary demonstration video clip, the ultrasound image video clip, and the live ultrasound image.
20. The ultrasound imaging system according to claim 19, wherein the primary demonstration video clip and the corresponding secondary demonstration video clip show different views of a step of the examination.
21. The ultrasound imaging system according to claim 19, wherein a synchronizer synchronizes presentation of the secondary demonstration video clip with presentations of the primary demonstration video clip and the ultrasound image video clip.
22. The ultrasound imaging system according to claim 17, wherein the assistance means is adapted to provide at least one set of image parameter values corresponding to at least one ultrasound image video clip and is functionally coupled to the ultrasound scanner to adjust the ultrasound scanner's imaging settings according to the set of image parameter values corresponding to the ultrasound image video clip to be presented on the video display.
23. The ultrasound imaging system according to claim 16, wherein:
the assistance means is adapted to provide at least one procedure demonstration set comprising several primary demonstration video clips, each primary demonstration video clip demonstrating a different step in an examination procedure; and
the assistance means is functionally coupled to the video display, whereby the primary demonstration video clips are presented on the video display.
24. The ultrasound imaging system according to claim 16, further comprising a step control functionally coupled to the assistance means whereby, upon operation of the step control by a user of the ultrasound system, a specific primary demonstration video is presented on the video display.
25. The ultrasound imaging system according to claim 23, wherein:
the assistance means is adapted to provide several procedure demonstration sets, each set demonstrating steps in a different examination procedure; and
the ultrasound imaging system comprises a procedure control functionally coupled to the assistance means whereby, upon operation of the procedure control by a user of the ultrasound imaging system, a set from among the several procedure demonstration sets is selected for presentation on the video display.
26. In an ultrasound imaging system, a method for assisting a user of the system, said method comprising the steps of:
acquiring a live ultrasound image of an object being examined; and
presenting on a video display the live ultrasound image simultaneously with a primary demonstration video clip.
27. The method according to claim 26, wherein the presenting step further comprises presenting an ultrasound image video clip, corresponding to the primary demonstration video clip, simultaneously with the live ultrasound image and the primary demonstration video clip.
28. The method according to claim 27, wherein:
when presenting the live ultrasound image simultaneously with the primary demonstration video clip and the ultrasound image video clip, the primary demonstration video clip runs in synchronism with the ultrasound image video clip.
29. The method according to claim 27, wherein the presenting step further comprises presenting a corresponding secondary demonstration video clip on the video display in synchronism with the primary demonstration video clip and the ultrasound image video clip.
30. The method of claim 27, further comprising the step of adjusting imaging settings of the ultrasound scanner according to a set of image parameters corresponding to the ultrasound image video clip presented on the video display.
US12/988,730 2008-04-22 2008-04-22 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system Abandoned US20110196235A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2008/054832 WO2009129845A1 (en) 2008-04-22 2008-04-22 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/054832 A-371-Of-International WO2009129845A1 (en) 2008-04-22 2008-04-22 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/242,803 Continuation US11311269B2 (en) 2008-04-22 2019-01-08 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Publications (1)

Publication Number Publication Date
US20110196235A1 true US20110196235A1 (en) 2011-08-11

Family

ID=39739671

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/988,730 Abandoned US20110196235A1 (en) 2008-04-22 2008-04-22 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system
US16/242,803 Active 2028-10-04 US11311269B2 (en) 2008-04-22 2019-01-08 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/242,803 Active 2028-10-04 US11311269B2 (en) 2008-04-22 2019-01-08 Ultrasound imaging system and method for providing assistance in an ultrasound imaging system

Country Status (4)

Country Link
US (2) US20110196235A1 (en)
EP (1) EP2285287B1 (en)
JP (1) JP5349582B2 (en)
WO (1) WO2009129845A1 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120172722A1 (en) * 2010-01-07 2012-07-05 Timothy Mark Chinowsky Ultrasound apparatus and graphical interface for procedural assistance
US20140153358A1 (en) * 2012-11-30 2014-06-05 General Electric Company Medical imaging system and method for providing imaging assitance
US20140200449A1 (en) * 2013-01-16 2014-07-17 Samsung Medison Co., Ltd. Ultrasound apparatus and method of providing information of the same
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US8858455B2 (en) 2006-10-23 2014-10-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US20150011886A1 (en) * 2011-12-12 2015-01-08 Koninklijke Philips N.V. Automatic imaging plane selection for echocardiography
US9125578B2 (en) 2009-06-12 2015-09-08 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9265443B2 (en) 2006-10-23 2016-02-23 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9339206B2 (en) 2009-06-12 2016-05-17 Bard Access Systems, Inc. Adaptor for endovascular electrocardiography
US9415188B2 (en) 2010-10-29 2016-08-16 C. R. Bard, Inc. Bioimpedance-assisted placement of a medical device
US9445734B2 (en) 2009-06-12 2016-09-20 Bard Access Systems, Inc. Devices and methods for endovascular electrography
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US9526440B2 (en) 2007-11-26 2016-12-27 C.R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US20170000453A1 (en) * 2015-06-30 2017-01-05 Wisconsin Alumni Research Foundation Obstetrical Imaging at the Point of Care for Untrained or Minimally Trained Operators
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
USD785656S1 (en) * 2015-11-24 2017-05-02 Meditech International Inc. Display screen or portion thereof with graphical user interface
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9681823B2 (en) 2007-11-26 2017-06-20 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US9839372B2 (en) 2014-02-06 2017-12-12 C. R. Bard, Inc. Systems and methods for guidance and placement of an intravascular device
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US9907513B2 (en) 2008-10-07 2018-03-06 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
US10046139B2 (en) 2010-08-20 2018-08-14 C. R. Bard, Inc. Reconfirmation of ECG-assisted catheter tip placement
US10349890B2 (en) 2015-06-26 2019-07-16 C. R. Bard, Inc. Connector interface for ECG-based catheter positioning system
EP3549528A1 (en) * 2018-04-05 2019-10-09 Koninklijke Philips N.V. Ultrasound imaging system and method
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
EP3590436A1 (en) 2018-07-06 2020-01-08 Koninklijke Philips N.V. Identifying an optimal image from a number of ultrasound images
US10751509B2 (en) 2007-11-26 2020-08-25 C. R. Bard, Inc. Iconic representations for guidance of an indwelling medical device
US10973584B2 (en) 2015-01-19 2021-04-13 Bard Access Systems, Inc. Device and method for vascular access
US10992079B2 (en) 2018-10-16 2021-04-27 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
US11000207B2 (en) 2016-01-29 2021-05-11 C. R. Bard, Inc. Multiple coil system for tracking a medical device
US11369410B2 (en) 2017-04-27 2022-06-28 Bard Access Systems, Inc. Magnetizing system for needle assemblies including orientation key system for positioning needle tray in magnetizer
US11399803B2 (en) * 2018-08-08 2022-08-02 General Electric Company Ultrasound imaging system and method
US11478218B2 (en) * 2017-08-31 2022-10-25 Bfly Operations, Inc. Methods and apparatus for collection of ultrasound data
US11911140B2 (en) 2020-11-09 2024-02-27 Bard Access Systems, Inc. Medical device magnetizer

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090093719A1 (en) * 2007-10-03 2009-04-09 Laurent Pelissier Handheld ultrasound imaging systems
JP5557088B2 (en) * 2009-11-04 2014-07-23 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic equipment
GB2479406A (en) * 2010-04-09 2011-10-12 Medaphor Ltd Ultrasound Simulation Training System
US20110263980A1 (en) * 2010-04-23 2011-10-27 General Electric Company Method and system for guiding clinicians in real time
US20120065508A1 (en) * 2010-09-09 2012-03-15 General Electric Company Ultrasound imaging system and method for displaying a target image
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
US11043144B2 (en) * 2017-08-04 2021-06-22 Clarius Mobile Health Corp. Systems and methods for providing an interactive demonstration of an ultrasound user interface
JP7284337B2 (en) * 2019-07-12 2023-05-30 ベラソン インコーポレイテッド Representation of a target during aiming of an ultrasonic probe
WO2022020351A1 (en) 2020-07-21 2022-01-27 Bard Access Systems, Inc. System, method and apparatus for magnetic tracking of ultrasound probe and generation of 3d visualization thereof

Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2612074A (en) * 1949-03-30 1952-09-30 Prec Mecanique Paris Soc Interferometer
US4381676A (en) * 1980-05-02 1983-05-03 Krautkramer-Branson, Incorporated Apparatus for sensing ultrasonic waves by optical means
US4762413A (en) * 1984-09-07 1988-08-09 Olympus Optical Co., Ltd. Method and apparatus for measuring immunological reaction with the aid of fluctuation in intensity of scattered light
US5146079A (en) * 1990-11-01 1992-09-08 At&T Bell Laboratories Broadband optical receiver with active bias feedback circuit
US5211165A (en) * 1991-09-03 1993-05-18 General Electric Company Tracking system to follow the position and orientation of a device with radiofrequency field gradients
US5735276A (en) * 1995-03-21 1998-04-07 Lemelson; Jerome Method and apparatus for scanning and evaluating matter
US5791907A (en) * 1996-03-08 1998-08-11 Ramshaw; Bruce J. Interactive medical training system
US5832055A (en) * 1996-08-08 1998-11-03 Agfa-Gevaert Method of correcting a radiation image for defects in the recording member
US20010010003A1 (en) * 1991-08-02 2001-07-26 Lai Shui T. Method and apparatus for surgery of the cornea using short laser pulses having shallow ablation depth
US20010039836A1 (en) * 2000-05-02 2001-11-15 Eiji Ogawa Ultrasonic probe and ultrasonic diagnosis apparatus using the same
US20010042410A1 (en) * 2000-05-02 2001-11-22 Eiji Ogawa Ultrasonic probe, ultrasonic receiver and ultrasonic diagnostic apparatus
US6390978B1 (en) * 1999-10-01 2002-05-21 Karl Storz Gmbh & Co. Kg Imaging method for determining a physical or chemical condition of tissue in human or animal bodies, and system for carrying out the method
US20020087080A1 (en) * 2000-12-28 2002-07-04 Slayton Michael H. Visual imaging system for ultrasonic probe
US6436049B1 (en) * 1999-05-31 2002-08-20 Kabushiki Kaisha Toshiba Three-dimensional ultrasound diagnosis based on contrast echo technique
US20020156375A1 (en) * 1999-10-28 2002-10-24 Paul Kessman Navigation information overlay onto ultrasound imagery
US6488629B1 (en) * 2001-07-31 2002-12-03 Ge Medical Systems Global Technology Company, Llc Ultrasound image acquisition with synchronized reference image
US20040019270A1 (en) * 2002-06-12 2004-01-29 Takashi Takeuchi Ultrasonic diagnostic apparatus, ultrasonic probe and navigation method for acquisition of ultrasonic image
US20050036668A1 (en) * 2003-02-12 2005-02-17 The University Of Iowa Research Foundation Methods and devices useful for analyzing color medical images
US20050068221A1 (en) * 2003-09-30 2005-03-31 Freeman Steven R. Ultrasonic signal acquisition in the digital beamformer
US20050090742A1 (en) * 2003-08-19 2005-04-28 Yoshitaka Mine Ultrasonic diagnostic apparatus
US20050096545A1 (en) * 2003-10-30 2005-05-05 Haider Bruno H. Methods and apparatus for transducer probe
US7041058B2 (en) * 2002-08-29 2006-05-09 Siemens Medical Solutions Usa, Inc. Medical diagnostic imaging system and method for efficient access of data
US20070010743A1 (en) * 2003-05-08 2007-01-11 Osamu Arai Reference image display method for ultrasonography and ultrasonograph
US7263710B1 (en) * 1999-12-31 2007-08-28 General Electric Company Medical diagnostic system with on-line real-time video training
US20070239001A1 (en) * 2005-11-02 2007-10-11 James Mehi High frequency array ultrasound system
US20080249402A1 (en) * 2004-09-29 2008-10-09 Koninklijke Philips Electronics, N.V. System or Synchronised Playback of Video Image Clips
US7806824B2 (en) * 2003-10-22 2010-10-05 Aloka Co., Ltd. Ultrasound diagnosis apparatus

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04135546A (en) * 1990-09-27 1992-05-11 Toshiba Corp Ultrasonic diagnostic device
US6458081B1 (en) * 1999-04-23 2002-10-01 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus
US7286241B2 (en) 1999-06-24 2007-10-23 Lockheed Martin Corporation System and method for high-speed laser detection of ultrasound
JP4619481B2 (en) * 2000-03-29 2011-01-26 株式会社東芝 Ultrasound diagnostic imaging equipment
JP2001321374A (en) 2000-05-15 2001-11-20 Fuji Photo Film Co Ltd Method of composing image data, and ultrasonograph using the same
US6468217B1 (en) * 2001-07-10 2002-10-22 Koninklijke Philips Electronics N.V. Method and apparatus for performing real-time storage of ultrasound video image information
KR100527315B1 (en) 2001-11-16 2005-11-09 주식회사 메디슨 Ultrasound imaging system using knowledge-based image adjusting device
US20040116810A1 (en) 2002-12-17 2004-06-17 Bjorn Olstad Ultrasound location of anatomical landmarks
JP4135546B2 (en) * 2003-03-31 2008-08-20 マツダ株式会社 Variable valve gear for engine
US20050187472A1 (en) 2004-01-30 2005-08-25 Peter Lysyansky Protocol-driven ultrasound examination
WO2006060406A1 (en) * 2004-11-30 2006-06-08 The Regents Of The University Of California Multimodal medical procedure training system
JP2009509615A (en) 2005-09-30 2009-03-12 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ User interface system and method for generating, organizing and configuring an ultrasound imaging protocol
US10531858B2 (en) * 2007-07-20 2020-01-14 Elekta, LTD Methods and systems for guiding the acquisition of ultrasound images

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2612074A (en) * 1949-03-30 1952-09-30 Prec Mecanique Paris Soc Interferometer
US4381676A (en) * 1980-05-02 1983-05-03 Krautkramer-Branson, Incorporated Apparatus for sensing ultrasonic waves by optical means
US4762413A (en) * 1984-09-07 1988-08-09 Olympus Optical Co., Ltd. Method and apparatus for measuring immunological reaction with the aid of fluctuation in intensity of scattered light
US5146079A (en) * 1990-11-01 1992-09-08 At&T Bell Laboratories Broadband optical receiver with active bias feedback circuit
US20010010003A1 (en) * 1991-08-02 2001-07-26 Lai Shui T. Method and apparatus for surgery of the cornea using short laser pulses having shallow ablation depth
US5211165A (en) * 1991-09-03 1993-05-18 General Electric Company Tracking system to follow the position and orientation of a device with radiofrequency field gradients
US5735276A (en) * 1995-03-21 1998-04-07 Lemelson; Jerome Method and apparatus for scanning and evaluating matter
US5791907A (en) * 1996-03-08 1998-08-11 Ramshaw; Bruce J. Interactive medical training system
US5832055A (en) * 1996-08-08 1998-11-03 Agfa-Gevaert Method of correcting a radiation image for defects in the recording member
US6436049B1 (en) * 1999-05-31 2002-08-20 Kabushiki Kaisha Toshiba Three-dimensional ultrasound diagnosis based on contrast echo technique
US6390978B1 (en) * 1999-10-01 2002-05-21 Karl Storz Gmbh & Co. Kg Imaging method for determining a physical or chemical condition of tissue in human or animal bodies, and system for carrying out the method
US20020156375A1 (en) * 1999-10-28 2002-10-24 Paul Kessman Navigation information overlay onto ultrasound imagery
US7263710B1 (en) * 1999-12-31 2007-08-28 General Electric Company Medical diagnostic system with on-line real-time video training
US20010039836A1 (en) * 2000-05-02 2001-11-15 Eiji Ogawa Ultrasonic probe and ultrasonic diagnosis apparatus using the same
US20010042410A1 (en) * 2000-05-02 2001-11-22 Eiji Ogawa Ultrasonic probe, ultrasonic receiver and ultrasonic diagnostic apparatus
US20020087080A1 (en) * 2000-12-28 2002-07-04 Slayton Michael H. Visual imaging system for ultrasonic probe
US6488629B1 (en) * 2001-07-31 2002-12-03 Ge Medical Systems Global Technology Company, Llc Ultrasound image acquisition with synchronized reference image
US20040019270A1 (en) * 2002-06-12 2004-01-29 Takashi Takeuchi Ultrasonic diagnostic apparatus, ultrasonic probe and navigation method for acquisition of ultrasonic image
US7041058B2 (en) * 2002-08-29 2006-05-09 Siemens Medical Solutions Usa, Inc. Medical diagnostic imaging system and method for efficient access of data
US20050036668A1 (en) * 2003-02-12 2005-02-17 The University Of Iowa Research Foundation Methods and devices useful for analyzing color medical images
US20070010743A1 (en) * 2003-05-08 2007-01-11 Osamu Arai Reference image display method for ultrasonography and ultrasonograph
US20050090742A1 (en) * 2003-08-19 2005-04-28 Yoshitaka Mine Ultrasonic diagnostic apparatus
US20050068221A1 (en) * 2003-09-30 2005-03-31 Freeman Steven R. Ultrasonic signal acquisition in the digital beamformer
US7806824B2 (en) * 2003-10-22 2010-10-05 Aloka Co., Ltd. Ultrasound diagnosis apparatus
US20050096545A1 (en) * 2003-10-30 2005-05-05 Haider Bruno H. Methods and apparatus for transducer probe
US20080249402A1 (en) * 2004-09-29 2008-10-09 Koninklijke Philips Electronics, N.V. System or Synchronised Playback of Video Image Clips
US20070239001A1 (en) * 2005-11-02 2007-10-11 James Mehi High frequency array ultrasound system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ehricke et al., "SONOSim3D: a multimedia system for sonography simulation and education with an extensible case database", European Journal of Ultrasound, Vol. 7, 1998, pgs. 225-230. *
Heer et al., "Ultrasound training: the virtual patient", Ultrasound Obstet. Gynecol., 2004, Vol. 24, pgs. 440-444. *

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11207496B2 (en) 2005-08-24 2021-12-28 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US10004875B2 (en) 2005-08-24 2018-06-26 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US9833169B2 (en) 2006-10-23 2017-12-05 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9345422B2 (en) 2006-10-23 2016-05-24 Bard Acess Systems, Inc. Method of locating the tip of a central venous catheter
US8858455B2 (en) 2006-10-23 2014-10-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9265443B2 (en) 2006-10-23 2016-02-23 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US10849695B2 (en) 2007-11-26 2020-12-01 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US11134915B2 (en) 2007-11-26 2021-10-05 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US11123099B2 (en) 2007-11-26 2021-09-21 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US10238418B2 (en) 2007-11-26 2019-03-26 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US9526440B2 (en) 2007-11-26 2016-12-27 C.R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US10966630B2 (en) 2007-11-26 2021-04-06 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US10342575B2 (en) 2007-11-26 2019-07-09 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9549685B2 (en) 2007-11-26 2017-01-24 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US10751509B2 (en) 2007-11-26 2020-08-25 C. R. Bard, Inc. Iconic representations for guidance of an indwelling medical device
US11529070B2 (en) 2007-11-26 2022-12-20 C. R. Bard, Inc. System and methods for guiding a medical instrument
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9681823B2 (en) 2007-11-26 2017-06-20 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US11707205B2 (en) 2007-11-26 2023-07-25 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US10602958B2 (en) 2007-11-26 2020-03-31 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US10231753B2 (en) 2007-11-26 2019-03-19 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US9999371B2 (en) 2007-11-26 2018-06-19 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US11779240B2 (en) 2007-11-26 2023-10-10 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
US10105121B2 (en) 2007-11-26 2018-10-23 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US10165962B2 (en) 2007-11-26 2019-01-01 C. R. Bard, Inc. Integrated systems for intravascular placement of a catheter
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US11027101B2 (en) 2008-08-22 2021-06-08 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US9907513B2 (en) 2008-10-07 2018-03-06 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
US10231643B2 (en) 2009-06-12 2019-03-19 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9339206B2 (en) 2009-06-12 2016-05-17 Bard Access Systems, Inc. Adaptor for endovascular electrocardiography
US11419517B2 (en) 2009-06-12 2022-08-23 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US10271762B2 (en) 2009-06-12 2019-04-30 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US9125578B2 (en) 2009-06-12 2015-09-08 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9445734B2 (en) 2009-06-12 2016-09-20 Bard Access Systems, Inc. Devices and methods for endovascular electrography
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US10912488B2 (en) 2009-06-12 2021-02-09 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US20120172722A1 (en) * 2010-01-07 2012-07-05 Timothy Mark Chinowsky Ultrasound apparatus and graphical interface for procedural assistance
US10046139B2 (en) 2010-08-20 2018-08-14 C. R. Bard, Inc. Reconfirmation of ECG-assisted catheter tip placement
US9415188B2 (en) 2010-10-29 2016-08-16 C. R. Bard, Inc. Bioimpedance-assisted placement of a medical device
US20150011886A1 (en) * 2011-12-12 2015-01-08 Koninklijke Philips N.V. Automatic imaging plane selection for echocardiography
US20140153358A1 (en) * 2012-11-30 2014-06-05 General Electric Company Medical imaging system and method for providing imaging assitance
US20140200449A1 (en) * 2013-01-16 2014-07-17 Samsung Medison Co., Ltd. Ultrasound apparatus and method of providing information of the same
US10863920B2 (en) 2014-02-06 2020-12-15 C. R. Bard, Inc. Systems and methods for guidance and placement of an intravascular device
US9839372B2 (en) 2014-02-06 2017-12-12 C. R. Bard, Inc. Systems and methods for guidance and placement of an intravascular device
US10973584B2 (en) 2015-01-19 2021-04-13 Bard Access Systems, Inc. Device and method for vascular access
US10349890B2 (en) 2015-06-26 2019-07-16 C. R. Bard, Inc. Connector interface for ECG-based catheter positioning system
US11026630B2 (en) 2015-06-26 2021-06-08 C. R. Bard, Inc. Connector interface for ECG-based catheter positioning system
US20170000453A1 (en) * 2015-06-30 2017-01-05 Wisconsin Alumni Research Foundation Obstetrical Imaging at the Point of Care for Untrained or Minimally Trained Operators
US10709416B2 (en) * 2015-06-30 2020-07-14 Wisconsin Alumni Research Foundation Obstetrical imaging at the point of care for untrained or minimally trained operators
USD785656S1 (en) * 2015-11-24 2017-05-02 Meditech International Inc. Display screen or portion thereof with graphical user interface
US11000207B2 (en) 2016-01-29 2021-05-11 C. R. Bard, Inc. Multiple coil system for tracking a medical device
US11369410B2 (en) 2017-04-27 2022-06-28 Bard Access Systems, Inc. Magnetizing system for needle assemblies including orientation key system for positioning needle tray in magnetizer
US11478218B2 (en) * 2017-08-31 2022-10-25 Bfly Operations, Inc. Methods and apparatus for collection of ultrasound data
US20220386990A1 (en) * 2017-08-31 2022-12-08 Bfly Operations, Inc. Methods and apparatus for collection of ultrasound data
WO2019192918A1 (en) * 2018-04-05 2019-10-10 Koninklijke Philips N.V. Ultrasound imaging system and method
EP3549528A1 (en) * 2018-04-05 2019-10-09 Koninklijke Philips N.V. Ultrasound imaging system and method
US11883231B2 (en) 2018-04-05 2024-01-30 Koninklijke Philips N.V. Ultrasound imaging system and method
WO2020008063A1 (en) 2018-07-06 2020-01-09 Koninklijke Philips N.V. Identifying an optimal image from a number of ultrasound images
EP3590436A1 (en) 2018-07-06 2020-01-08 Koninklijke Philips N.V. Identifying an optimal image from a number of ultrasound images
US11399803B2 (en) * 2018-08-08 2022-08-02 General Electric Company Ultrasound imaging system and method
US11621518B2 (en) 2018-10-16 2023-04-04 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
US10992079B2 (en) 2018-10-16 2021-04-27 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
US11911140B2 (en) 2020-11-09 2024-02-27 Bard Access Systems, Inc. Medical device magnetizer

Also Published As

Publication number Publication date
EP2285287A1 (en) 2011-02-23
JP5349582B2 (en) 2013-11-20
WO2009129845A1 (en) 2009-10-29
JP2011518013A (en) 2011-06-23
US20190142362A1 (en) 2019-05-16
US11311269B2 (en) 2022-04-26
EP2285287B1 (en) 2015-04-01

Similar Documents

Publication Publication Date Title
US11311269B2 (en) Ultrasound imaging system and method for providing assistance in an ultrasound imaging system
US11690602B2 (en) Methods and apparatus for tele-medicine
US20190239850A1 (en) Augmented/mixed reality system and method for the guidance of a medical exam
CN106037797B (en) Three-dimensional volume of interest in ultrasound imaging
JP5543469B2 (en) Generation of a standard protocol for studying 3D ultrasound image data
US20180008232A1 (en) Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus
JP6109512B2 (en) Image processing apparatus, X-ray diagnostic apparatus and program
JP2016522074A (en) Ultrasound acquisition feedback guidance for target view
WO2013039192A1 (en) Ultrasound diagnosis device, ultrasound image display device, and ultrasound image display method
KR20080089376A (en) Medical robotic system providing three-dimensional telestration
JP6382050B2 (en) Medical image diagnostic apparatus, image processing apparatus, image processing method, and image processing program
US20140153358A1 (en) Medical imaging system and method for providing imaging assitance
US20200069291A1 (en) Methods and apparatuses for collection of ultrasound data
Handke et al. Transesophageal real-time three-dimensional echocardiography: methods and initial in vitro and human in vivo studies
Nicolaou et al. A Study of saccade transition for attention segregation and task strategy in laparoscopic surgery
US11660158B2 (en) Enhanced haptic feedback system
EP4091174A1 (en) Systems and methods for providing surgical assistance based on operational context
WO2023080170A1 (en) Computer program, trained model generation method, and information processing device
Martins et al. Input system interface for image-guided surgery based on augmented reality
Atkins et al. Eye monitoring applications in medicine
WO2022013022A1 (en) One-dimensional position indicator
WO2021041545A1 (en) Systems and methods for registering imaging data from different imaging modalities based on subsurface image scanning

Legal Events

Date Code Title Description
AS Assignment

Owner name: EZONO AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUNBAR, ALLAN;SHETS, SICCO;MOHAMMED, FATEH;AND OTHERS;SIGNING DATES FROM 20101029 TO 20101103;REEL/FRAME:025394/0407

AS Assignment

Owner name: EZONO AG, GERMANY

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF THE LAST NAME OF INVENTOR SICCO SCHETS PREVIOUSLY RECORDED ON REEL 025394 FRAME 0407. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:DUNBAR, ALLAN;SCHETS, SICCO;MOHAMMED, FATEH;AND OTHERS;SIGNING DATES FROM 20101029 TO 20101103;REEL/FRAME:025570/0686

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION