US20110184291A1 - Ultrasonic diagnostic apparatus, medical image diagnostic apparatus, ultrasonic image processing apparatus, medical image processing apparatus, ultrasonic diagnostic system, and medical image diagnostic system - Google Patents
- Publication number
- US20110184291A1 (application US 13/014,219)
- Authority
- US
- United States
- Prior art keywords
- full scale
- predetermined region
- contour
- planned surgical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/366—Correlation of different images or relation of image positions in respect to the body using projection of images directly onto the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/367—Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/395—Visible markers with marking agent for marking skin or other tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/56—Details of data transmission or power supply
- A61B8/565—Details of data transmission or power supply involving data transmission via a network
- Embodiments described herein relate generally to an ultrasonic diagnostic apparatus, medical image diagnostic apparatus, ultrasonic image processing apparatus, medical image processing apparatus, ultrasonic diagnostic system, and medical image diagnostic system which are used when an operation target region or a treatment target region is to be marked before surgical operation or treatment.
- An ultrasonic diagnostic apparatus can display in real time how the heart beats or the fetus moves, by simply bringing an ultrasonic probe into contact with the body surface.
- This apparatus is highly safe, and hence allows repetitive examination.
- this system is smaller in size than other diagnostic apparatuses such as X-ray, CT, and MRI apparatuses and can be moved to the bedside to be easily and conveniently used for examination.
- the ultrasonic diagnostic apparatus is free from the influences of exposure using X-rays and the like, and hence can be used in obstetric treatment, treatment at home, and the like.
- an ultrasonic diagnostic apparatus, owing to its high real-time performance, is used not only for image diagnosis but also for support before or during surgical operation. For example, it is possible to make a surgical plan, including an incision method, by re-checking a lesion to be excised before surgical operation and checking the positions of surrounding blood vessels and the like using ultrasonic images.
- This apparatus is often used for marking of a planned surgical line especially in breast cancer operation or the like.
- an operator executes marking to determine a place to be incised immediately before surgical operation, by drawing the position and size of a tumor (lesion or the like) or a planned surgical line on the body surface (breast surface) with an inkpen (note that the operator cannot acquire precise depth information).
- the operator marks an incision region, an approach method, and the like on the body surface.
- an operator marks a tumor shape while acquiring and checking an ultrasonic image of the periphery of a lesion several tens of times.
- FIG. 1 is a block diagram showing the arrangement of an ultrasonic diagnostic apparatus according to the first embodiment
- FIG. 2 is a flowchart showing a procedure for processing (planned surgical line marking support processing) based on a planned surgical line marking support function according to this embodiment
- FIG. 3 is a view showing an example of a VR image with a scanning slice position including information (position marker) indicating a position on a volume rendering image;
- FIG. 4 is a view showing an example of how a sheet on which a full-scale planned surgical line is printed is pasted on the body surface;
- FIG. 5 is a view for explaining an output form according to the first modification
- FIG. 6 is a view for explaining an output form according to the second modification
- FIG. 7 is a flowchart showing a procedure for planned surgical line marking support processing according to the third embodiment.
- FIG. 8 is a block diagram for explaining an ultrasonic diagnostic system S according to the fourth embodiment.
- an ultrasonic diagnostic apparatus comprising a data acquisition unit configured to ultrasonically scan a three-dimensional area including a predetermined region of an object and acquire volume data associated with the three-dimensional area, a calculation unit configured to cut the volume data at at least one plane and calculate a full-scale contour of a slice of the predetermined region and a full-scale planned surgical line used for surgical operation of the predetermined region, and an output unit configured to output at least one of the full-scale contour of the slice of the predetermined region and the full-scale planned surgical line.
- a diagnostic target is a breast in each embodiment.
- the embodiments are not limited to this, and each technical idea of the present embodiments is effective for predetermined organs other than the breasts, e.g., the liver and the pancreas.
- FIG. 1 is a block diagram showing the arrangement of an ultrasonic diagnostic apparatus 1 according to this embodiment.
- the ultrasonic diagnostic apparatus 1 includes an apparatus body 11 , an ultrasonic probe 12 , an input device 13 , a monitor 14 , and an output device 32 connected to the apparatus body 11 as needed.
- the ultrasonic probe 12 includes a plurality of piezoelectric transducers which generate ultrasonic waves based on driving signals from the apparatus body 11 and convert reflected waves from an object into electrical signals, a matching layer provided for the piezoelectric transducers, and a backing member which prevents ultrasonic waves from propagating backward from the piezoelectric transducers.
- the ultrasonic probe 12 transmits an ultrasonic wave to an object, the transmitted ultrasonic wave is sequentially reflected by a discontinuity surface of acoustic impedance of internal body tissue, and is received as an echo signal by the ultrasonic probe 12 .
- the amplitude of this echo signal depends on an acoustic impedance difference on the discontinuity surface by which the echo signal is reflected.
- the echo produced when a transmitted ultrasonic pulse is reflected by the surface of a moving blood flow, cardiac wall, or the like is subjected to a frequency shift, due to the Doppler effect, that depends on the velocity component of the moving body in the ultrasonic transmission direction.
- the ultrasonic probe 12 is a swinging probe or two-dimensional array probe which can ultrasonically scan a three-dimensional area.
- a swinging probe can perform ultrasonic scanning while mechanically swinging a plurality of ultrasonic transducers arrayed in a predetermined direction along a direction perpendicular to the array direction.
- a two-dimensional array probe includes a plurality of ultrasonic transducers arrayed in a two-dimensional matrix, and can three-dimensionally control the transmitting and receiving directions of ultrasonic beams.
- the input device 13 is connected to an apparatus body 11 and includes various types of switches, buttons, a trackball, a mouse, and a keyboard which are used to input, to the apparatus body 11 , various types of instructions, conditions, an instruction to set a region of interest (ROI), various types of image quality condition setting instructions, and the like from an operator.
- when the operator operates the end button or FREEZE button of the input device 13 , the transmission/reception of ultrasonic waves is terminated, and the ultrasonic diagnostic apparatus is set in a pause state.
- the monitor 14 displays morphological information and blood flow information in the living body as images based on video signals from an image generating unit 25 .
- the output device 32 includes a printer, projector, and laser output device which output, in predetermined forms, the actual size of a lesion, a planned surgical line, and the like acquired in processing based on a planned surgical line marking support function (to be described later).
- the apparatus body 11 includes an ultrasonic transmission unit 21 , an ultrasonic reception unit 22 , a B-mode processing unit 23 , a Doppler processing unit 24 , an image generating unit 25 , an image memory 26 , an image combining unit 27 , a control processor (CPU) 28 , a storage unit 29 , and an interface unit 30 .
- the ultrasonic transmission unit 21 includes a trigger generating circuit, delay circuit, and pulser circuit (none of which are shown).
- the pulser circuit repetitively generates rate pulses for the formation of transmission ultrasonic waves at a predetermined rate frequency fr Hz (period: 1/fr sec).
- the delay circuit gives each rate pulse a delay time necessary to focus an ultrasonic wave into a beam and determine transmission directivity for each channel.
- the trigger generating circuit applies a driving pulse to the probe 12 at the timing based on this rate pulse.
- the ultrasonic transmission unit 21 has a function of instantly changing a transmission frequency, transmission driving voltage, or the like to execute a predetermined scan sequence in accordance with an instruction from the control processor 28 .
- the function of changing a transmission driving voltage is implemented by a linear-amplifier-type transmission circuit capable of instantly switching its value or by a mechanism of electrically switching a plurality of power supply units.
- the ultrasonic reception unit 22 includes an amplifier circuit, A/D converter, and adder (none of which are shown).
- the amplifier circuit amplifies an echo signal received via the probe 12 for each channel.
- the A/D converter gives the amplified echo signals delay times necessary to determine reception directivities.
- the adder then performs addition processing for the signals. With this addition, a reflection component from a direction corresponding to the reception directivity of the echo signal is enhanced to form a composite beam for ultrasonic transmission/reception in accordance with reception directivity and transmission directivity.
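The reception processing described above — per-channel delays followed by summation — is classic delay-and-sum beamforming. The following is a minimal sketch; the function name `delay_and_sum` and the use of integer sample delays are illustrative assumptions (a real receiver applies fractional delays and apodization weights).

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Sum per-channel echo signals after applying per-channel delays.

    channel_data: (n_channels, n_samples) array of digitized echoes.
    delays_samples: per-channel delay (in samples) that aligns echoes
    arriving from the chosen receive direction.
    """
    n_channels, n_samples = channel_data.shape
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        d = int(delays_samples[ch])
        # shift channel ch later by its delay, then accumulate
        out[d:] += channel_data[ch, :n_samples - d]
    return out
```

Choosing the delays so that echoes from one direction line up makes those echoes add coherently while echoes from other directions largely cancel — this is the reception directivity the adder produces.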
- the B-mode processing unit 23 receives an echo signal from the ultrasonic reception unit 22 , and performs logarithmic amplification, envelope detection processing, and the like for the signal to generate data whose signal intensity is expressed by a luminance level.
- the image generating unit 25 causes the monitor 14 to display, as a B-mode image, a signal from the B-mode processing unit 23 whose reflected wave intensity is expressed by a luminance.
- this apparatus can provide image quality suiting user's taste by applying various image filters for edge enhancement, temporal smoothing, spatial smoothing, and the like to the signal.
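The B-mode chain of envelope detection and logarithmic amplification can be sketched as below. The helper names and the fixed 60 dB dynamic range are assumptions for illustration; the analytic-signal construction via FFT is a standard stand-in for hardware envelope detection.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (the usual Hilbert-transform construction)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def b_mode_line(rf_line, dynamic_range_db=60.0):
    """Convert one RF scan line to B-mode luminance values in [0, 1]."""
    envelope = np.abs(analytic_signal(rf_line))   # envelope detection
    envelope = envelope / envelope.max()          # normalize to strongest echo
    db = 20.0 * np.log10(np.maximum(envelope, 1e-12))  # logarithmic amplification
    # clip to the displayed dynamic range and rescale to [0, 1] luminance
    return np.clip(db + dynamic_range_db, 0.0, dynamic_range_db) / dynamic_range_db
```

Edge enhancement and smoothing filters would then operate on the resulting luminance data before display.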
- the Doppler processing unit 24 frequency-analyzes velocity information from the echo signal received from the ultrasonic reception unit 22 to extract a blood flow, tissue, and contrast medium echo component by the Doppler effect, and obtains blood flow information such as an average velocity, variance, and power at multiple points.
- the obtained blood flow information is sent to the image generating unit 25 , and is displayed in color as an average velocity image, a variance image, a power image, and a combined image of them on the monitor 14 .
- the image generating unit 25 generates an ultrasonic diagnostic image as a display image by converting the scanning line signal string for ultrasonic scanning into a scanning line signal string in a general video format typified by a TV format.
- the image generating unit 25 generates a scanning slice image, MPR image, volume rendering image, and the like in accordance with instructions from the input device 13 .
- the image generating unit 25 cuts an operation target region (corresponding data) at a plurality of parallel C planes in acquired volume data, and generates a plurality of images corresponding to the respective C slices. Note that data before it is input to the image generating unit 25 is sometimes called “raw data”.
- the image memory 26 is a memory to store, for example, ultrasonic images corresponding to a plurality of frames immediately before a freeze. Continuously displaying (cine-displaying) images stored in the image memory 26 can display an ultrasonic moving image.
- the image combining unit 27 combines the image received from the image generating unit 25 with character information of various types of parameters, scale marks, and the like, and outputs the resultant signal as a video signal to the monitor 14 .
- the control processor 28 has the function of an information processing apparatus (computer) and controls the operation of the main body of this ultrasonic diagnostic apparatus.
- the control processor 28 reads out a control program for executing image generation/display, a dedicated program for implementing a planned surgical line marking support function (to be described later), and the like from a storage unit 29 , expands the program in its own memory, and executes computation, control, and the like associated with each type of processing.
- the storage unit 29 stores transmission/reception conditions, control programs for executing image generation and display processing, diagnostic information (patient ID, findings by doctors, and the like), a diagnostic protocol, a body mark generation program, a dedicated program for implementing the planned surgical line marking support function (to be described later), and other data.
- the storage unit 29 is also used to store images in the image memory 26 , as needed. It is possible to transfer data in the storage unit 29 to an external peripheral device via the interface unit 30 .
- the interface unit 30 is an interface associated with the input device 13 , a network, and an external storage device.
- the interface unit 30 can transfer via a network data such as ultrasonic images, analysis results, and the like obtained by this apparatus to another apparatus.
- the planned surgical line marking support function of the ultrasonic diagnostic apparatus 1 will be described next.
- This function supports marking of an operation target region at the time of surgical operation by calculating the full-scale contour of a slice of an operation target region (a lesion, focus, or the like) of an object or a planned surgical line with a predetermined margin being added to the full-scale contour, and outputting at least one of them in actual size.
- FIG. 2 is a flowchart showing a procedure for processing (planned surgical line marking support processing) based on the planned surgical line marking support function according to this embodiment. Planned surgical line marking support processing will be described with reference to FIG. 2 .
- [Input of Patient Information and the Like: Step S 1 ]
- the operator inputs patient information, transmission/reception conditions (a focal depth, transmission voltage, field angle, swinging range, and the like), and the like via the input device 13 .
- the field angle, swinging range, and the like are set to include an operation target region.
- the control processor 28 stores various kinds of information and conditions in the storage unit 29 (step S 1 ).
- the control processor 28 then executes volume scanning on a three-dimensional area including the operation target region by transmitting ultrasonic waves to the respective slices corresponding to a plurality of swinging angles (swinging positions) and receiving the reflected waves while swinging an ultrasonic transducer array in a direction perpendicular to the array direction (step S 2 ).
- the ultrasonic probe 12 is a two-dimensional array probe having ultrasonic transducers arrayed in a two-dimensional matrix
- volume scanning on a three-dimensional area including the operation target region is executed by three-dimensionally scanning ultrasonic beams.
- the echo signal acquired for each slice in step S 2 is sent to the B-mode processing unit 23 via the ultrasonic reception unit 22 .
- the B-mode processing unit 23 performs logarithmic amplification, envelope detection processing, and the like for the signal to generate luminance data whose signal intensity is expressed by a luminance level.
- the image generating unit 25 generates a two-dimensional image (scanning slice image) corresponding to each scanning slice by using the luminance data received from the B-mode processing unit 23 .
- the image generating unit 25 reconstructs volume data by executing coordinate conversion of a plurality of generated scanning slice image data from the actual spatial coordinate system (i.e., the coordinate system in which the plurality of scanning slice image data are defined) to a volume data spatial coordinate system and performing interpolation processing (step S 3 ).
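The coordinate conversion from swept scanning slices to a Cartesian voxel grid can be sketched as follows. This is a simplified assumption-laden version: it assumes a mechanical swing about one axis and uses nearest-neighbour lookup where the apparatus would use proper interpolation processing; all names are illustrative.

```python
import numpy as np

def slices_to_volume(slices, thetas, r_max, ny, nz):
    """Resample swept scan slices into a Cartesian (x, y, z) voxel grid.

    slices: (n_slices, nx, nr) echo data; slice i lies in the plane at
    swing angle thetas[i], with nr samples along depth r in [0, r_max].
    Returns an (nx, ny, nz) volume; voxels outside the swept fan stay 0.
    """
    n_slices, nx, nr = slices.shape
    ys = np.linspace(-r_max, r_max, ny)
    zs = np.linspace(0.0, r_max, nz)
    vol = np.zeros((nx, ny, nz))
    for iy, y in enumerate(ys):
        for iz, z in enumerate(zs):
            r = np.hypot(y, z)
            theta = np.arctan2(y, z)        # swing angle of this voxel
            if r > r_max or theta < thetas[0] or theta > thetas[-1]:
                continue                    # outside the scanned fan
            # nearest-slice and nearest-depth lookup (linear interpolation
            # in angle and depth would sharpen this; nearest keeps it short)
            i = int(np.argmin(np.abs(thetas - theta)))
            k = min(int(round(r / r_max * (nr - 1))), nr - 1)
            vol[:, iy, iz] = slices[i, :, k]
    return vol
```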
- the image generating unit 25 generates a plurality of C-plane images by using the generated volume data (step S 4 ). That is, as shown in FIG. 3 , the image generating unit 25 cuts, for example, the operation target region (corresponding data) in the volume data at a plurality of parallel C planes, and generates a plurality of C-plane images corresponding to the respective C slices (step S 4 ).
- the control processor 28 calculates a full-scale planned surgical line by using the plurality of generated C-plane images (step S 5 ). For example, as shown on the right side in FIG. 3 , the control processor 28 calculates the contour of the operation target region on each generated C-plane image, and calculates the full-scale contour of an operation target region slice by using the largest contour line obtained by combining (OR-ing) the respective calculated contours. In addition, the control processor 28 calculates, as a full-scale planned surgical line, a contour with a margin of a predetermined width being added to the calculated full-scale contour of the operation target region slice.
- the method of calculating the full-scale contour of an operation target region slice is not limited to the above example. Another example is to calculate the area of a slice of an operation target region on each generated C-plane image, determine one of the slices of the operation target region which has the largest area, and calculate the full-scale contour of the operation target region slice by using the determined slice.
- the control processor 28 may also calculate, as a full-scale planned surgical line, a contour with a margin of a predetermined width being added to the calculated full-scale contour of the operation target region slice.
- the user may determine the width of a margin to be added to the full-scale contour of an operation target region slice for each calculation, or it is possible to use a recommended value stored in the apparatus in advance.
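The contour-combination and margin-addition steps above can be sketched with binary region masks. This is an illustrative assumption: the per-slice masks, the union as the "largest contour", and a margin expressed in pixels (with a known pixel pitch so the result stays full scale) are all stand-ins for the processor's internal representation.

```python
import numpy as np

def union_mask(slice_masks):
    """Union of the target-region masks from all C-plane images.

    The boundary of this union is the largest overall outline of the
    region, i.e. the full-scale contour to be marked.
    """
    return np.any(slice_masks, axis=0)

def add_margin(mask, margin_px):
    """Expand a binary mask by margin_px pixels (4-connected dilation).

    The boundary of the expanded mask is the planned surgical line:
    the full-scale contour plus a safety margin of known width.
    """
    out = mask.copy()
    for _ in range(margin_px):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]   # grow downward
        grown[:-1, :] |= out[1:, :]   # grow upward
        grown[:, 1:] |= out[:, :-1]   # grow rightward
        grown[:, :-1] |= out[:, 1:]   # grow leftward
        out = grown
    return out
```

With a pixel pitch of, say, 0.1 mm, a 10 mm margin corresponds to `margin_px=100`, so the printed or projected line keeps its actual size.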
- the cutting planes at which the operation target region (corresponding data) in volume data is cut are not limited to C planes.
- it is also possible to set an arbitrary cutting plane (MPR plane) in the volume data, either in response to an input from the operator or automatically.
- in this case, the contour of the operation target region on the cutting plane and the planned surgical line are calculated at full scale, in the same manner as for a C-plane image.
- the output device 32 then outputs the calculated full-scale planned surgical line in a predetermined form (step S 6 ).
- the output device 32 prints the planned surgical line on a sheet which can be pasted on the body surface of an object.
- the output device 32 also prints a reference marker as a reference indicating at which position on the body surface the sheet which can be pasted is to be pasted. It is possible to use, as this reference marker, the current position at which the ultrasonic probe 12 is placed on the body surface. The operator pastes the output sheet on the body surface of the object as shown in, for example, FIG. 4 so as to match the current position of the ultrasonic probe 12 with the reference marker, thereby simply and quickly marking a lesion, a planned surgical line, and the like.
- the sheet onto which a planned surgical line is to be output is not limited to the one which can be pasted on the body surface of an object.
- the same effect can be obtained by outputting a full-scale planned surgical line onto trace paper, placing it with reference to a reference marker, and copying the full-scale planned surgical line down on the body surface.
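Printing the planned surgical line at actual size reduces to converting its coordinates from millimetres on the body surface to printer dots at the printer's resolution. A minimal sketch, assuming a hypothetical `mm_to_printer_dots` helper and a conventional dots-per-inch setting:

```python
def mm_to_printer_dots(points_mm, dpi=300):
    """Convert contour points in millimetres to printer dot coordinates.

    Printing the contour at `dpi` dots per inch with this conversion keeps
    it at actual size (1 mm on the sheet = 1 mm on the body surface).
    """
    dots_per_mm = dpi / 25.4    # 1 inch = 25.4 mm
    return [(round(x * dots_per_mm), round(y * dots_per_mm))
            for x, y in points_mm]
```

The reference marker would be converted with the same scale factor, so that matching it to the probe position on the body surface places the whole line correctly.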
- output form of a planned surgical line is not limited to the above example, and various kinds of output forms are conceivable. Output form variations will be described below with reference to the following embodiments.
- An output form according to this modification is that a planned surgical line is output (drawn) onto a heat-sensitive sheet (sound-sensitive sheet) placed between the ultrasonic probe 12 and an object.
- FIG. 5 is a view for explaining an output form according to the first modification.
- the operator places a heat-sensitive sheet (sound-sensitive sheet) between the ultrasonic probe 12 and the body surface of the object.
- the control processor 28 determines transmission conditions such as a beam direction or a sound pressure to draw the contour of a planned surgical line, and controls the ultrasonic transmission unit 21 in accordance with the determined transmission conditions.
- the ultrasonic beam transmitted under the control of the control processor 28 then draws the planned surgical line on the heat-sensitive sheet (or sound-sensitive sheet).
- positioning acquired volume data relative to a two-dimensional image acquired at the current position of the ultrasonic probe 12 can determine the direction in which the ultrasonic probe 12 is to be moved.
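The positioning step just described — matching the live two-dimensional image against the stored volume to decide which way to move the probe — can be sketched as a slice-matching search. The sum-of-squared-differences criterion and the function name are assumptions for illustration; a real implementation would use a robust registration method.

```python
import numpy as np

def probe_move_direction(volume, live_image, current_index):
    """Suggest which way to move the probe along the sweep axis.

    Finds the slice of the stored volume that best matches the live 2-D
    image (by sum of squared differences) and compares its index with the
    slice the probe is currently assumed to be over.
    """
    errors = [np.sum((volume[i] - live_image) ** 2)
              for i in range(volume.shape[0])]
    best = int(np.argmin(errors))
    if best > current_index:
        return "forward"
    if best < current_index:
        return "backward"
    return "hold"
```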
- An output form according to this modification is configured to output (project) a planned surgical line on the body surface of an object by using a projector (video projection apparatus).
- FIG. 6 is a view for explaining the output form according to the second modification.
- a sensor 40 placed immediately above the bed on which an object is placed measures the current position of the ultrasonic probe 12 in real time.
- the position of the ultrasonic probe 12 measured by the sensor 40 is sequentially transferred to a projector 42 .
- the projector 42 projects the full-scale planned surgical line acquired from the control processor 28 via the interface unit 30 onto the body surface of the object with reference to the transferred position of the ultrasonic probe 12 .
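Anchoring the projected line to the measured probe position amounts to a simple coordinate mapping. The sketch below assumes a calibrated projector with a constant pixels-per-millimetre scale on the body surface; the function and parameter names are illustrative, not part of the patent.

```python
def project_line(points_mm, probe_px, px_per_mm):
    """Map planned-surgical-line points (mm, relative to the probe) into
    projector pixel coordinates anchored at the measured probe position.

    probe_px: probe location in the projector image, from the sensor 40.
    px_per_mm: projector pixels per millimetre on the body surface
    (assumes the projector geometry has been calibrated beforehand).
    """
    ox, oy = probe_px
    return [(ox + x * px_per_mm, oy + y * px_per_mm) for x, y in points_mm]
```

As the sensor reports new probe positions, re-evaluating this mapping keeps the projected line registered to the body surface in real time.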
- An output form according to this modification is configured to output (project) a planned surgical line onto the body surface of an object by using a laser or the like which does not damage the living body.
- in the third modification, it is possible to draw a planned surgical line calculated from acquired volume data on the body surface at a position corresponding to the position of the planned surgical line by using an ultrasonic probe including a laser function.
- the sensor 40 measures the current position of the ultrasonic probe 12 in real time. The position of the ultrasonic probe 12 measured by the sensor 40 is sequentially transferred to a laser output device.
- the laser output device projects the full-scale planned surgical line acquired from the control processor 28 via the interface unit 30 onto the body surface of the object with reference to the transferred position of the ultrasonic probe 12 .
- This ultrasonic diagnostic apparatus performs volume scanning of a three-dimensional area including an operation target region of an object to acquire volume data.
- This apparatus generates a plurality of C-slice images by using the acquired volume data, and calculates the largest contour or the like of the operation target region.
- the apparatus calculates the full-scale contour of a slice of the operation target region or a planned surgical line determined upon addition of a predetermined margin to the full-scale contour by using the calculated largest contour or the like, and outputs the resultant information in actual size.
- the operator can therefore quickly and easily execute marking of an operation target region shape at the time of surgical operation, and can quickly start the surgical operation by using the marked full-scale contour or planned surgical line. This obviates the necessity to perform marking several tens of times while repeatedly changing the position of the ultrasonic probe and checking an operation target region. It is therefore possible to reduce the operation load in marking of an operation target region shape at the time of surgical operation.
- this apparatus calculates and outputs the full-scale contour of an operation target region and a planned surgical line by using an ultrasonic image. This can implement marking of an operation target region shape with higher accuracy than that in the prior art, and hence can contribute to an improvement in the quality of medical work.
- the apparatus can output the full-scale contour of an operation target region and a planned surgical line in various forms including drawing them on a sheet to be pasted on the body surface of the object, drawing them on a heat-sensitive sheet or sound-sensitive sheet placed between the object and the ultrasonic probe, projecting images of them on the body surface of the object, and projecting them on the body surface of the object using a laser or the like which does not damage the living body. It is therefore possible to select a desired output form in accordance with a surgical operation environment, an object, and the characteristics of an operator and to easily and quickly perform marking of an operation target region shape at the time of surgical operation.
- the second embodiment is applied to a medical image diagnostic apparatus (e.g., an X-ray diagnostic apparatus, X-ray computed tomography apparatus, magnetic resonance imaging apparatus, and nuclear medicine diagnostic apparatus) configured to perform imaging upon placing an object on a bed.
- These apparatuses also acquire volume data of a three-dimensional area including an operation target region and calculate a planned surgical line or the like by almost the same method as that in the first embodiment.
- the planned surgical line or the like obtained by calculation is output in one of the output forms according to the first embodiment and its respective modifications.
- such an apparatus outputs a sheet to be pasted on the body surface of an object or projects a planned surgical line on the body surface using a projector or a laser with reference to a predetermined position on the bed (e.g., the top). That is, it is possible to easily define a scanning range for the object on the bed (i.e., the acquisition range of volume data) as a three-dimensional coordinate system on the top of the bed. Therefore, the apparatus prints a reference marker together with a planned surgical line as a marker for placing a full-scale planned surgical line at a predetermined position in the three-dimensional coordinate system on the top of the bed in, for example, a form of matching the marker with the predetermined reference position on the top of the bed. It is also possible to project a full-scale planned surgical line on the body surface of an object using a projector or a laser based on a position on volume data or a position on the body surface in the three-dimensional coordinate system on the top of the bed.
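Printing a planned surgical line at full scale, as described above, requires converting physical dimensions in the bed's three-dimensional coordinate system into printer pixels. The following is a minimal illustrative sketch, not part of the patented apparatus; the 300 dpi resolution and the sample contour are assumptions:

```python
# Hypothetical full-scale printing helper: physical mm -> printer pixels.
MM_PER_INCH = 25.4

def mm_to_print_pixels(points_mm, dpi=300):
    """Scale (x, y) contour points given in millimetres to printer pixel coordinates."""
    scale = dpi / MM_PER_INCH  # pixels per millimetre
    return [(x * scale, y * scale) for (x, y) in points_mm]

# A 40 mm x 40 mm square contour rendered at 300 dpi stays true to size on paper.
square_mm = [(0, 0), (40, 0), (40, 40), (0, 40)]
square_px = mm_to_print_pixels(square_mm)
```

A reference marker printed alongside the line would be converted with the same scale factor, so that matching the marker to the predetermined position on the bed top preserves the full-scale geometry.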
- the above arrangement can acquire the same effects as those of the first embodiment by using a medical image diagnostic apparatus.
- the first and second embodiments are configured to generate voxel volume data and then extract the contour of an operation target region on each of a plurality of C-plane images obtained by cutting the voxel volume data.
- the planned surgical line marking support function of the third embodiment extracts the contour of an operation target region or the like on voxel volume data and cuts the voxel volume data at an arbitrary slice, thereby calculating and outputting a full-scale planned surgical line on an MPR image corresponding to the arbitrary slice.
- the third embodiment is not limited to this, and can be applied to a medical image diagnostic apparatus which performs imaging after an object is placed on the bed, as in the second embodiment.
- FIG. 7 is a flowchart showing a procedure for planned surgical line marking support processing according to this embodiment. Planned surgical line marking support processing will be described with reference to FIG. 7. Note that steps S11 to S13 are almost the same as steps S1 to S3 in FIG. 2. The contents of processing in each of steps S14 to S17 will therefore be described below.
- An image generating unit 25 executes segmentation processing (area extraction processing) on the generated volume data to extract the contour of the operation target region of the object (step S14). This segmentation processing can be implemented by any method. Typically, it is possible to use, for example, a method of extracting voxels having voxel values larger than a predetermined value by threshold processing.
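The threshold processing mentioned above can be sketched as follows. This is an illustrative aid rather than the patent's actual implementation; the voxel values and the threshold of 100 are hypothetical:

```python
def threshold_segment(volume, threshold):
    """Return a binary mask (1 = operation target region) matching volume's shape."""
    return [[[1 if v > threshold else 0 for v in row]
             for row in plane]
            for plane in volume]

# Toy 2x2x2 volume of voxel values; real volume data would be far larger.
volume = [[[10, 200], [180, 20]],
          [[30, 220], [190, 5]]]
mask = threshold_segment(volume, 100)
```

A production implementation would typically operate on a NumPy array and follow the thresholding with connected-component or contour extraction, but the principle is the same.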
- the image generating unit 25 sets an arbitrary cutting plane in the volume data from which the contour of the operation target region has been extracted, and calculates the full-scale contour of the operation target region slice obtained when the cutting plane is projected onto a C plane, together with a full-scale planned surgical line formed by adding a margin of a predetermined width to that contour (step S15).
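Adding a margin of a predetermined width to the extracted contour can be approximated by a morphological dilation of the region mask, with the margin converted from millimetres to pixels via the image's pixel spacing. The following is a minimal sketch under assumed values; the square neighbourhood is a simplification, not the patented method:

```python
def dilate(mask, n):
    """Binary dilation of a 2D mask with a (2n+1)x(2n+1) square neighbourhood."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if any(mask[yy][xx]
                   for yy in range(max(0, y - n), min(h, y + n + 1))
                   for xx in range(max(0, x - n), min(w, x + n + 1))):
                out[y][x] = 1
    return out

def add_margin(region_mask, margin_mm, mm_per_pixel):
    """Grow the region mask by margin_mm, expressed in pixels via the pixel spacing."""
    return dilate(region_mask, round(margin_mm / mm_per_pixel))

# A single-pixel region grown by a 1 mm margin at 1 mm/pixel spacing:
region = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
grown = add_margin(region, 1, 1.0)
```

The planned surgical line would then be the boundary of the grown mask, drawn at full scale.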
- a cutting plane is not limited to a plane parallel to a C plane, and is set at a predetermined position in the volume data in response to an input from the operator or automatically. When a cutting plane is automatically set, it is preferable to set the cutting plane so as to cut the operation target region with a maximum area.
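Selecting the plane that cuts the operation target region with the maximum area can be sketched, for the simple case where only C planes are considered, as a search for the slice mask with the largest cross-sectional area. The masks below are toy examples; this is illustrative only, not the patented selection logic:

```python
def largest_area_slice(c_plane_masks):
    """Return (index, area) of the C-plane slice with the largest region area."""
    areas = [sum(sum(row) for row in m) for m in c_plane_masks]
    idx = max(range(len(areas)), key=lambda i: areas[i])
    return idx, areas[idx]

# Three toy C-plane masks of a segmented region:
masks = [
    [[0, 1], [0, 0]],  # area 1
    [[1, 1], [1, 0]],  # area 3
    [[0, 1], [1, 0]],  # area 2
]
best = largest_area_slice(masks)
```

Extending the search to arbitrarily oriented cutting planes would require resampling the volume along each candidate orientation before measuring the area.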
- it is possible to set a cutting plane by a method of calculating the center of gravity of the extracted operation target region, finding, among the circles (or ellipses) inscribed in or circumscribed around the operation target region and centered on the center of gravity, the one having the largest diameter (or major axis), and setting the plane containing that circle (or ellipse) as the cutting plane.
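The first step of this method, computing the center of gravity of the extracted region, can be sketched as the mean coordinate of the voxels in a binary mask. The mask below is a toy example; selecting the largest inscribed or circumscribed circle is not shown:

```python
def center_of_gravity(mask):
    """Mean (z, y, x) coordinate of all voxels set to 1 in a nested-list binary mask."""
    n = sz = sy = sx = 0
    for z, plane in enumerate(mask):
        for y, row in enumerate(plane):
            for x, v in enumerate(row):
                if v:
                    n += 1
                    sz += z
                    sy += y
                    sx += x
    return (sz / n, sy / n, sx / n)

# Toy 2x2x2 mask with two voxels set; the centroid lies midway between them.
mask = [[[0, 0], [0, 1]],
        [[0, 0], [0, 1]]]
cog = center_of_gravity(mask)
```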
- the output device 32 outputs the calculated full-scale planned surgical line in a predetermined form (step S16). Output form variations for full-scale planned surgical lines have been described above.
- the apparatus sets an arbitrary cutting plane in volume data, and projects and outputs the contour of an operation target region on the cutting plane and a planned surgical line onto a C plane. Therefore, it is possible to reflect the largest diameter of an operation target region in a contour or planned surgical line to be output. This makes it possible to perform marking with higher accuracy and safety.
- This embodiment implements the planned surgical line marking support function according to any one of the first to third embodiments by using an ultrasonic diagnostic system including an ultrasonic diagnostic apparatus and an ultrasonic image processing apparatus, and a medical image diagnostic system including a medical image diagnostic apparatus and a medical image processing apparatus.
- FIG. 8 is a block diagram for explaining an ultrasonic diagnostic system S including an ultrasonic diagnostic apparatus 1 and an ultrasonic image processing apparatus 5 .
- the ultrasonic image processing apparatus 5 is implemented by, for example, a medical workstation, and includes a storage unit 50 , an image generating unit 51 , a display processing unit 52 , a control processor 53 , a display unit 54 , an interface unit 55 , and an operation unit 56 .
- the storage unit 50 stores ultrasonic images acquired in advance and ultrasonic images transmitted from the ultrasonic diagnostic apparatus 1 via a network.
- the image generating unit 51 executes the planned surgical line marking support processing described above.
- the display processing unit 52 executes various kinds of processes associated with a dynamic range, luminance (brightness), contrast, γ-curve correction, RGB conversion, and the like for various kinds of image data generated/processed by the image generating unit 51.
- the control processor 53 reads out a dedicated program for implementing the planned surgical line marking support function described above, expands the program in its own memory, and executes computation/control and the like associated with various kinds of processes.
- the display unit 54 is a monitor to display an ultrasonic image or the like in a predetermined form.
- the interface unit 55 is an interface for network connection and connection to other external storage devices.
- the operation unit 56 includes switches, buttons, a trackball, a mouse, and a keyboard which are used to input various types of instructions to the apparatus.
- the ultrasonic diagnostic apparatus 1 executes, for example, the processes in steps S1 and S2, and the ultrasonic image processing apparatus 5 executes the processes in steps S3 to S6.
- alternatively, the ultrasonic diagnostic apparatus 1 can execute the processes in steps S1 to S3, and the ultrasonic image processing apparatus 5 can execute the processes in steps S4 to S6.
- the ultrasonic diagnostic apparatus 1 executes the processes in steps S11 and S12, and the ultrasonic image processing apparatus 5 executes the processes in steps S13 to S17.
- alternatively, the ultrasonic diagnostic apparatus 1 can execute the processes in steps S11 to S13, and the ultrasonic image processing apparatus 5 can execute the processes in steps S14 to S17.
- the above arrangement can also acquire the effects described in the first to third embodiments.
- Each function (each function in planned surgical line marking support) associated with each embodiment can also be implemented by installing programs for executing the corresponding processing in a computer such as a workstation and expanding them in a memory.
- the programs which can cause the computer to execute the corresponding techniques can be distributed by being stored in recording media such as magnetic disks (floppy® disks, hard disks, and the like), optical disks (CD-ROMs, DVDs, and the like), and semiconductor memories.
Abstract
Description
- This application is based upon and claims the benefit of priority from Japanese Patent Applications No. 2010-015891, filed Jan. 27, 2010; and No. 2011-011730, filed Jan. 24, 2011; the entire contents of both of which are incorporated herein by reference.
- Embodiments described herein relate generally to an ultrasonic diagnostic apparatus, medical image diagnostic apparatus, ultrasonic image processing apparatus, medical image processing apparatus, ultrasonic diagnostic system, and medical image diagnostic system which are used when an operation target region or a treatment target region is to be marked before surgical operation or treatment.
- An ultrasonic diagnostic apparatus can display in real time how the heart beats or the fetus moves, simply by bringing an ultrasonic probe into contact with the body surface. This apparatus is highly safe, and hence allows repeated examination. Furthermore, it is smaller than other diagnostic apparatuses such as X-ray, CT, and MRI apparatuses and can be moved to the bedside for easy and convenient examination. In addition, the ultrasonic diagnostic apparatus is free from the influence of exposure to X-rays and the like, and hence can be used in obstetric treatment, treatment at home, and the like.
- In addition, such an ultrasonic diagnostic apparatus, owing to its high real-time performance, is used not only for image diagnosis but also for support before or during surgical operation. For example, it is possible to make a surgical plan including an incision method by re-checking a lesion to be excised before surgical operation and checking the positions of surrounding blood vessels and the like using ultrasonic images. This apparatus is often used for marking of a planned surgical line especially in breast cancer operation or the like.
- In this case, an operator executes marking to determine the place to be incised immediately before surgical operation, by drawing the position and size of a tumor (lesion or the like) or a planned surgical line on the body surface (breast surface) with an ink pen (note that the operator cannot acquire precise depth information). In addition, the operator marks an incision region, an approach method, and the like on the body surface. Under present circumstances, an operator marks a tumor shape while acquiring and checking an ultrasonic image of the periphery of a lesion several tens of times.
- Conventionally, however, when marking a lesion and a planned surgical line before surgical operation, the operator needs to acquire an ultrasonic image of the periphery of the lesion several tens of times and check the periphery of the lesion carefully and accurately. For this reason, marking takes much time and labor, and hence leads to a deterioration in operation efficiency at the time of surgical operation.
-
FIG. 1 is a block diagram showing the arrangement of an ultrasonic diagnostic apparatus according to the first embodiment; -
FIG. 2 is a flowchart showing a procedure for processing (planned surgical line marking support processing) based on a planned surgical line marking support function according to this embodiment; -
FIG. 3 is a view showing an example of a VR image with a scanning slice position including information (position marker) indicating a position on a volume rendering image; -
FIG. 4 is a view showing an example of how a sheet on which a full-scale planned surgical line is printed is pasted on the body surface; -
FIG. 5 is a view for explaining an output form according to the first modification; -
FIG. 6 is a view for explaining an output form according to the second modification; -
FIG. 7 is a flowchart showing a procedure for planned surgical line marking support processing according to the third embodiment; and -
FIG. 8 is a block diagram for explaining an ultrasonic diagnostic system S according to the fourth embodiment.
- In general, according to one embodiment, there is provided an ultrasonic diagnostic apparatus comprising a data acquisition unit configured to ultrasonically scan a three-dimensional area including a predetermined region of an object and acquire volume data associated with the three-dimensional area, a calculation unit configured to cut the volume data at at least one plane and calculate a full-scale contour of a slice of the predetermined region and a full-scale planned surgical line used for surgical operation of the predetermined region, and an output unit configured to output at least one of the full-scale contour of the slice of the predetermined region and the full-scale planned surgical line.
- An embodiment will be described below with reference to the views of the accompanying drawing. Note that the same reference numerals in the following description denote constituent elements having almost the same functions and arrangements, and a repetitive description will be made only when required. For the sake of a concrete description, assume that a diagnostic target is a breast in each embodiment. However, the embodiments are not limited to this, and each technical idea of the present embodiments is effective for predetermined organs other than the breasts, e.g., the liver and the pancreas.
-
FIG. 1 is a block diagram showing the arrangement of an ultrasonic diagnostic apparatus 1 according to this embodiment. As shown in FIG. 1, the ultrasonic diagnostic apparatus 1 includes an apparatus body 11, an ultrasonic probe 12, an input device 13, a monitor 14, and an output device 32 connected to the apparatus body 11 as needed. - The
ultrasonic probe 12 includes a plurality of piezoelectric transducers which generate ultrasonic waves based on driving signals from the apparatus body 11 and convert reflected waves from an object into electrical signals, a matching layer provided for the piezoelectric transducers, and a backing member which prevents ultrasonic waves from propagating backward from the piezoelectric transducers. When the ultrasonic probe 12 transmits an ultrasonic wave to an object, the transmitted ultrasonic wave is sequentially reflected by a discontinuity surface of acoustic impedance of internal body tissue, and is received as an echo signal by the ultrasonic probe 12. The amplitude of this echo signal depends on an acoustic impedance difference on the discontinuity surface by which the echo signal is reflected. The echo produced when a transmitted ultrasonic pulse is reflected by the surface of a moving blood flow, cardiac wall, or the like is subjected to a frequency shift depending on the velocity component of the moving body in the ultrasonic transmission direction due to the Doppler effect. - Assume that the
ultrasonic probe 12 is a swinging probe or two-dimensional array probe which can ultrasonically scan a three-dimensional area. A swinging probe can perform ultrasonic scanning while mechanically swinging a plurality of ultrasonic transducers arrayed in a predetermined direction along a direction perpendicular to the array direction. A two-dimensional array probe includes a plurality of ultrasonic transducers arrayed in a two-dimensional matrix, and can three-dimensionally control the transmitting and receiving directions of ultrasonic beams. - The
input device 13 is connected to an apparatus body 11 and includes various types of switches, buttons, a trackball, a mouse, and a keyboard which are used to input, to the apparatus body 11, various types of instructions, conditions, an instruction to set a region of interest (ROI), various types of image quality condition setting instructions, and the like from an operator. When, for example, the operator operates the end button or FREEZE button of the input device 13, the transmission/reception of ultrasonic waves is terminated, and the ultrasonic diagnostic apparatus is set in a pause state. - The
monitor 14 displays morphological information and blood flow information in the living body as images based on video signals from an image generating unit 25. - The
output device 32 includes a printer, projector, and laser output device which output, in predetermined forms, the actual size of a lesion, a planned surgical line, and the like acquired in processing based on a planned surgical line marking support function (to be described later). - The
apparatus body 11 includes an ultrasonic transmission unit 21, an ultrasonic reception unit 22, a B-mode processing unit 23, a Doppler processing unit 24, an image generating unit 25, an image memory 26, an image combining unit 27, a control processor (CPU) 28, a storage unit 29, and an interface unit 30. - The
ultrasonic transmission unit 21 includes a trigger generating circuit, delay circuit, and pulser circuit (none of which are shown). The pulser circuit repetitively generates rate pulses for the formation of transmission ultrasonic waves at a predetermined rate frequency fr Hz (period: 1/fr sec). The delay circuit gives each rate pulse a delay time necessary to focus an ultrasonic wave into a beam and determine transmission directivity for each channel. The trigger generating circuit applies a driving pulse to the probe 12 at the timing based on this rate pulse. - The
ultrasonic transmission unit 21 has a function of instantly changing a transmission frequency, transmission driving voltage, or the like to execute a predetermined scan sequence in accordance with an instruction from the control processor 28. In particular, the function of changing the transmission driving voltage is implemented by a linear-amplifier-type transmission circuit capable of instantly switching its value or by a mechanism of electrically switching a plurality of power supply units. - The
ultrasonic reception unit 22 includes an amplifier circuit, A/D converter, and adder (none of which are shown). The amplifier circuit amplifies an echo signal received via the probe 12 for each channel. The A/D converter gives the amplified echo signals delay times necessary to determine reception directivities. The adder then performs addition processing for the signals. With this addition, a reflection component from a direction corresponding to the reception directivity of the echo signal is enhanced to form a composite beam for ultrasonic transmission/reception in accordance with reception directivity and transmission directivity. - The B-
mode processing unit 23 receives an echo signal from the ultrasonic reception unit 22, and performs logarithmic amplification, envelope detection processing, and the like for the signal to generate data whose signal intensity is expressed by a luminance level. The image generating unit 25 causes the monitor 14 to display, as a B-mode image, a signal from the B-mode processing unit 23 whose reflected wave intensity is expressed by a luminance. At this time, this apparatus can provide image quality suiting the user's taste by applying various image filters for edge enhancement, temporal smoothing, spatial smoothing, and the like to the signal. - The
Doppler processing unit 24 frequency-analyzes velocity information from the echo signal received from the ultrasonic reception unit 22 to extract a blood flow, tissue, and contrast medium echo component by the Doppler effect, and obtains blood flow information such as an average velocity, variance, and power at multiple points. The obtained blood flow information is sent to the image generating unit 25, and is displayed in color as an average velocity image, a variance image, a power image, and a combined image of them on the monitor 14. - The
image generating unit 25 generates an ultrasonic diagnostic image as a display image by converting the scanning line signal string for ultrasonic scanning into a scanning line signal string in a general video format typified by a TV format. The image generating unit 25 generates a scanning slice image, MPR image, volume rendering image, and the like in accordance with instructions from the input device 13. Furthermore, in processing (planned surgical line marking support processing) based on the planned surgical line marking support function (to be described later), the image generating unit 25 cuts an operation target region (corresponding data) at a plurality of parallel C planes in acquired volume data, and generates a plurality of images corresponding to the respective C slices. Note that data before it is input to the image generating unit 25 is sometimes called "raw data". - The
image memory 26 is a memory to store, for example, ultrasonic images corresponding to a plurality of frames immediately before a freeze. Continuously displaying (cine-displaying) images stored in the image memory 26 can display an ultrasonic moving image. - The
image combining unit 27 combines the image received from the image generating unit 25 with character information of various types of parameters, scale marks, and the like, and outputs the resultant signal as a video signal to the monitor 14. - The
control processor 28 has the function of an information processing apparatus (computer) and controls the operation of the main body of this ultrasonic diagnostic apparatus. The control processor 28 reads out a control program for executing image generation/display, a dedicated program for implementing a planned surgical line marking support function (to be described later), and the like from a storage unit 29, expands the program in its own memory, and executes computation, control, and the like associated with each type of processing. - The
storage unit 29 stores transmission/reception conditions, control programs for executing image generation and display processing, diagnostic information (patient ID, findings by doctors, and the like), a diagnostic protocol, a body mark generation program, a dedicated program for implementing the planned surgical line marking support function (to be described later), and other data. The storage unit 29 is also used to store images in the image memory 26, as needed. It is possible to transfer data in the storage unit 29 to an external peripheral device via the interface unit 30. - The
interface unit 30 is an interface associated with the input device 13, a network, and an external storage device. The interface unit 30 can transfer, via the network, data such as ultrasonic images and analysis results obtained by this apparatus to another apparatus. - The planned surgical line marking support function of the ultrasonic
diagnostic apparatus 1 will be described next. This function supports marking of an operation target region at the time of surgical operation by calculating the full-scale contour of a slice of an operation target region (a lesion, focus, or the like) of an object or a planned surgical line with a predetermined margin being added to the full-scale contour, and outputting at least one of them in actual size. -
FIG. 2 is a flowchart showing a procedure for processing (planned surgical line marking support processing) based on the planned surgical line marking support function according to this embodiment. Planned surgical line marking support processing will be described with reference to FIG. 2. - First of all, the operator inputs patient information, transmission/reception conditions (a focal depth, transmission voltage, field angle, swinging range, and the like), and the like via the
input device 13. The field angle, swinging range, and the like are set to include an operation target region. The control processor 28 stores various kinds of information and conditions in the storage unit 29 (step S1). - If, for example, the
ultrasonic probe 12 is a swinging probe, the control processor 28 then executes volume scanning on a three-dimensional area including the operation target region by transmitting ultrasonic waves to the respective slices corresponding to a plurality of swinging angles (swinging positions) and receiving the reflected waves while swinging an ultrasonic transducer array in a direction perpendicular to the array direction (step S2). Alternatively, if the ultrasonic probe 12 is a two-dimensional array probe having ultrasonic transducers arrayed in a two-dimensional matrix, volume scanning on a three-dimensional area including the operation target region is executed by three-dimensionally scanning ultrasonic beams. - The echo signal acquired for each slice in step S2 is sent to the B-
mode processing unit 23 via the ultrasonic reception unit 22. The B-mode processing unit 23 performs logarithmic amplification, envelope detection processing, and the like for the signal to generate luminance data whose signal intensity is expressed by a luminance level. The image generating unit 25 generates a two-dimensional image (scanning slice image) corresponding to each scanning slice by using the luminance data received from the B-mode processing unit 23. - The
image generating unit 25 reconstructs volume data by executing coordinate conversion of a plurality of generated scanning slice image data from the actual spatial coordinate system (i.e., the coordinate system in which the plurality of scanning slice image data are defined) to a volume data spatial coordinate system and performing interpolation processing (step S3). - [Generation of Plurality of C-plane images: Step S4]
- The
image generating unit 25 generates a plurality of C-plane images by using the generated volume data (step S4). That is, as shown in FIG. 3, the image generating unit 25 cuts, for example, the operation target region (corresponding data) in the volume data at a plurality of parallel C planes, and generates a plurality of C-plane images corresponding to the respective C slices (step S4). - The
control processor 28 then calculates a full-scale planned surgical line by using the plurality of generated C-plane images (step S5). For example, as shown on the right side in FIG. 3, the control processor 28 calculates the contour of the operation target region on each generated MPR image, and calculates the full-scale contour of an operation target region slice by using the largest contour line obtained by the AND of the respective calculated contours. In addition, the control processor 28 calculates, as a full-scale planned surgical line, a contour with a margin of a predetermined width being added to the calculated full-scale contour of the operation target region slice. - Note that the method of calculating the full-scale contour of an operation target region slice is not limited to the above example. Another example is to calculate the area of a slice of an operation target region on each generated C-plane image, determine the slice of the operation target region which has the largest area, and calculate the full-scale contour of the operation target region slice by using the determined slice. The
control processor 28 may also calculate, as a full-scale planned surgical line, a contour with a margin of a predetermined width being added to the calculated full-scale contour of the operation target region slice. Furthermore, the user may determine the width of a margin to be added to the full-scale contour of an operation target region slice for each calculation, or it is possible to use a recommended value stored in the apparatus in advance. - The cutting planes at which the operation target region (corresponding data) in volume data is cut are not limited to C planes. For example, it is possible to set an arbitrary cutting plane (MPR plane) in volume data in response to an input from the operator or automatically. When such a cutting plane is set, the contour of the operation target region on the cutting plane and a planned surgical line are calculated as the actual size of a C-plane image.
- The
output device 32 then outputs the calculated full-scale planned surgical line in a predetermined form (step S6). Assume that in this embodiment, the output device 32 prints the planned surgical line on a sheet which can be pasted on the body surface of an object. At the same time, the output device 32 also prints a reference marker indicating the position on the body surface at which the sheet is to be pasted. It is possible to use, as this reference marker, the current position at which the ultrasonic probe 12 is placed on the body surface. The operator pastes the output sheet on the body surface of the object as shown in, for example, FIG. 4 so as to match the current position of the ultrasonic probe 12 with the reference marker, thereby simply and quickly marking a lesion, a planned surgical line, and the like. - Note that the sheet onto which a planned surgical line is to be output is not limited to one which can be pasted on the body surface of an object. For example, the same effect can be obtained by outputting a full-scale planned surgical line onto trace paper, placing it with reference to a reference marker, and copying the full-scale planned surgical line down onto the body surface.
- In addition, it is possible to output not only a planned surgical line but also the full-scale contour of an operation target region slice, as needed. Alternatively, it is possible to output only the full-scale operation target region slice. It is possible to arbitrarily select which information is to be output in accordance with, for example, an instruction from the
input device 13. - Note that the output form of a planned surgical line is not limited to the above example, and various kinds of output forms are conceivable. Output form variations will be described below with reference to the following embodiments.
- An output form according to this modification is that a planned surgical line is output (drawn) onto a heat-sensitive sheet (sound-sensitive sheet) placed between the
ultrasonic probe 12 and an object. -
FIG. 5 is a view for explaining an output form according to the first modification. As shown in FIG. 5, the operator places a heat-sensitive sheet (sound-sensitive sheet) between the ultrasonic probe 12 and the body surface of the object. The control processor 28 determines transmission conditions such as a beam direction or a sound pressure to draw the contour of a planned surgical line, and controls the ultrasonic transmission unit 21 in accordance with the determined transmission conditions. The ultrasonic beam transmitted under the control of the control processor 28 then draws the planned surgical line on the heat-sensitive sheet (or sound-sensitive sheet).
ultrasonic probe 12 along the body surface. In this case, positioning acquired volume data relative to a two-dimensional image acquired at the current position of theultrasonic probe 12 can determine the direction in which theultrasonic probe 12 is to be moved. In addition, it is preferable to support the operator in moving theultrasonic probe 12 by displaying, on themonitor 14 or the like, the determined direction in which theultrasonic probe 12 is to be moved. - An output form according to this embodiment is configured to output (project) a planned surgical line on the body surface of an object by using a projector (video projection apparatus).
-
FIG. 6 is a view for explaining the output form according to the second modification. As shown in FIG. 6, for example, a sensor 40 placed immediately above the bed on which an object is placed measures the current position of the ultrasonic probe 12 in real time. The position of the ultrasonic probe 12 measured by the sensor 40 is sequentially transferred to a projector 42. The projector 42 projects the full-scale planned surgical line acquired from the control processor 28 via the interface unit 30 onto the body surface of the object with reference to the transferred position of the ultrasonic probe 12.
sensor 40 measures the current position of theultrasonic probe 12 in real time. The position of theultrasonic probe 12 measured by thesensor 40 is sequentially transferred to a laser output device. The laser output device projects the full-scale planned surgical line acquired from thecontrol processor 28 via theinterface unit 30 onto the body surface of the object with reference to the transferred position of theultrasonic probe 12. - According to the above arrangements, the following effects can be obtained.
- This ultrasonic diagnostic apparatus performs volume scanning of a three-dimensional area including an operation target region of an object to acquire volume data. The apparatus generates a plurality of C-slice images from the acquired volume data and calculates, for example, the largest contour of the operation target region. Using this largest contour, the apparatus calculates the full-scale contour of a slice of the operation target region, or a planned surgical line determined by adding a predetermined margin to that full-scale contour, and outputs the resultant information at actual size. The operator can therefore quickly and easily execute marking of the operation target region shape at the time of surgical operation, and can quickly start the surgical operation by using the marked full-scale contour or planned surgical line. This obviates the necessity to perform marking several tens of times while repeatedly changing the position of the ultrasonic probe and checking the operation target region. It is therefore possible to reduce the operation load in marking an operation target region shape at the time of surgical operation.
- In addition, this apparatus calculates and outputs the full-scale contour of an operation target region and a planned surgical line by using an ultrasonic image. This can implement marking of an operation target region shape with higher accuracy than that in the prior art, and hence can contribute to an improvement in the quality of medical work.
- Furthermore, the apparatus can output the full-scale contour of an operation target region and a planned surgical line in various forms including drawing them on a sheet to be pasted on the body surface of the object, drawing them on a heat-sensitive sheet or sound-sensitive sheet placed between the object and the ultrasonic probe, projecting images of them on the body surface of the object, and projecting them on the body surface of the object using a laser or the like which does not damage the living body. It is therefore possible to select a desired output form in accordance with a surgical operation environment, an object, and the characteristics of an operator and to easily and quickly perform marking of an operation target region shape at the time of surgical operation.
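The pipeline summarized above (largest contour across the C slices, then a safety margin, then full-scale output) can be sketched as follows. This is a hypothetical illustration, not the apparatus's actual implementation: contours are simplified to polygons whose vertices are (x, y) coordinates in millimetres (i.e. already full scale), and the margin is added by pushing each vertex radially away from the contour centroid.

```python
# Hypothetical sketch: pick the largest per-slice contour of the target
# region and add a safety margin to obtain a planned surgical line.

def contour_area(points):
    """Unsigned shoelace area of a closed polygon given as (x, y) tuples."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def largest_contour(slice_contours):
    """Return the contour enclosing the largest area among all C slices."""
    return max(slice_contours, key=contour_area)

def add_margin(points, margin_mm):
    """Offset each vertex radially away from the centroid by margin_mm."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        d = (dx * dx + dy * dy) ** 0.5 or 1.0  # guard against a zero vector
        out.append((x + dx / d * margin_mm, y + dy / d * margin_mm))
    return out

# Toy example: two square contours on different slices; the 20 mm one wins,
# and a 5 mm margin is added to it to form the planned surgical line.
small = [(0, 0), (10, 0), (10, 10), (0, 10)]
large = [(0, 0), (20, 0), (20, 20), (0, 20)]
line = add_margin(largest_contour([small, large]), 5.0)
```

A real implementation would trace the margin as a true parallel (offset) curve of the contour rather than a radial push; the radial version merely keeps the sketch short.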
- The second embodiment is applied to a medical image diagnostic apparatus (e.g., an X-ray diagnostic apparatus, X-ray computed tomography apparatus, magnetic resonance imaging apparatus, and nuclear medicine diagnostic apparatus) configured to perform imaging upon placing an object on a bed.
- These apparatuses also acquire volume data of a three-dimensional area including an operation target region and calculate a planned surgical line or the like by almost the same method as in the first embodiment. The planned surgical line or the like obtained by this calculation is output in one of the output forms according to the first embodiment and its respective modifications.
- At this time, such an apparatus outputs a sheet to be pasted on the body surface of an object, or projects a planned surgical line onto the body surface using a projector or a laser, with reference to a predetermined position on the bed (e.g., on the top of the bed). That is, the scanning range for the object on the bed (i.e., the acquisition range of the volume data) can easily be defined as a three-dimensional coordinate system on the top of the bed. The apparatus therefore prints, together with the planned surgical line, a reference marker for placing the full-scale planned surgical line at a predetermined position in this three-dimensional coordinate system, for example in a form in which the marker is matched with a predetermined reference position on the top of the bed. It is also possible to project the full-scale planned surgical line onto the body surface of the object using a projector or a laser, based on a position in the volume data or a position on the body surface expressed in the three-dimensional coordinate system on the top of the bed.
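As a minimal sketch of the reference-marker idea above, assume the volume's origin lies at a known offset from the reference position on the top of the bed; mapping a full-scale point from volume coordinates into bed-top coordinates is then a plain translation. The function name and the offset values are hypothetical.

```python
# Hypothetical sketch: translate full-scale (x, y) points (mm) from volume
# coordinates into the three-dimensional coordinate system on the bed top.

def volume_to_bed(point_mm, volume_origin_on_bed):
    """Map a (x, y) point from volume coordinates to bed-top coordinates."""
    ox, oy = volume_origin_on_bed
    return (point_mm[0] + ox, point_mm[1] + oy)

# Assume the scanned volume starts 300 mm along and 150 mm across the bed.
bed_line = [volume_to_bed(p, (300.0, 150.0))
            for p in [(0.0, 0.0), (25.0, 10.0)]]
```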
- The above arrangement can acquire the same effects as those of the first embodiment by using a medical image diagnostic apparatus.
- The first and second embodiments are configured to generate voxel volume data and then extract the contour of an operation target region on each of a plurality of C-plane images obtained by cutting the voxel volume data. In contrast to this, the planned surgical line marking support function of the third embodiment extracts the contour of an operation target region or the like on voxel volume data and cuts the voxel volume data at an arbitrary slice, thereby calculating and outputting a full-scale planned surgical line on an MPR image corresponding to the arbitrary slice.
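The third-embodiment order of operations (extract the region on the voxel volume first, then cut at a slice) can be sketched as below. The threshold-based extraction, the toy volume, and the function names are illustrative assumptions, not the apparatus's actual segmentation:

```python
# Hypothetical sketch: extract the target region on the whole voxel volume,
# then cut the extracted region at a chosen slice index.

def extract_region(volume, threshold):
    """(z, y, x) indices of voxels whose value exceeds the threshold."""
    return {(z, y, x)
            for z, plane in enumerate(volume)
            for y, row in enumerate(plane)
            for x, v in enumerate(row)
            if v > threshold}

def cut_slice(region, z):
    """In-plane (y, x) positions of the extracted region on slice z."""
    return {(y, x) for (zz, y, x) in region if zz == z}

# Toy 2x2x2 volume: three voxels exceed the threshold of 100.
toy = [[[10, 120], [130, 40]],
       [[50, 160], [70, 80]]]
region = extract_region(toy, 100)
slice0 = cut_slice(region, 0)
```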
- For the sake of a concrete description, consider the planned surgical line marking support function according to this embodiment as implemented in the ultrasonic diagnostic apparatus. However, the third embodiment is not limited to this and, as in the second embodiment, can also be applied to a medical image diagnostic apparatus which performs imaging after an object is placed on the bed.
-
FIG. 7 is a flowchart showing a procedure for planned surgical line marking support processing according to this embodiment. Planned surgical line marking support processing will be described with reference to FIG. 7. Note that steps S11 to S13 are almost the same as steps S1 to S3 in FIG. 2. The contents of processing in steps S14 to S17 will therefore be described below.
- An image generating unit 25 executes segmentation processing (area extraction processing) on the generated volume data to extract the contour of the operation target region of the object (step S14). This segmentation processing can be implemented by any method; typically, for example, voxels having voxel values larger than a predetermined value are extracted by threshold processing.
- The image generating unit 25 sets an arbitrary cutting plane in the volume data from which the contour of the operation target region has been extracted, and calculates the full-scale contour of the operation target region slice obtained when the cutting plane is projected onto a C plane, together with a full-scale planned surgical line in which a margin of a predetermined width is added to that contour (step S15). Note that the cutting plane is not limited to a plane parallel to a C plane, and is set at a predetermined position in the volume data either in response to an input from the operator or automatically. When the cutting plane is set automatically, it is preferable to set it so as to cut the operation target region at its maximum area. For example, a cutting plane can be set by calculating the center of gravity of the extracted operation target region, finding, among the circles (or ellipses) inscribed in or circumscribed around the operation target region and centered on that center of gravity, the plane containing the circle (or ellipse) with the largest diameter (or major axis), and setting that plane as the cutting plane.
- The output device 32 outputs the calculated full-scale planned surgical line in a predetermined form (step S16). Output form variations for full-scale planned surgical lines have been described above.
- The arrangement described above can also acquire the same effects as those of the first embodiment. In this embodiment, in particular, the apparatus sets an arbitrary cutting plane in the volume data, and projects the contour of the operation target region on that cutting plane, together with the planned surgical line, onto a C plane for output. The largest diameter of the operation target region can therefore be reflected in the output contour or planned surgical line, which makes it possible to perform marking with higher accuracy and safety.
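The automatic cutting-plane choice described above can be sketched as follows. For brevity, this hypothetical version computes the center of gravity of the extracted voxels and then tries only planes parallel to a C plane, picking the one with the largest cross-section, rather than performing the inscribed/circumscribed-circle search described in the text:

```python
# Hypothetical sketch of the automatic cutting-plane choice: center of
# gravity of the extracted region, plus the candidate plane (here, a z
# index) whose cross-section through the region is largest.

def center_of_gravity(voxels):
    """Mean (z, y, x) of a set of voxel indices."""
    n = len(voxels)
    return tuple(sum(v[i] for v in voxels) / n for i in range(3))

def largest_cross_section_z(voxels):
    """z index of the candidate plane with the most region voxels."""
    counts = {}
    for z, _, _ in voxels:
        counts[z] = counts.get(z, 0) + 1
    return max(counts, key=counts.get)

# Toy region: slice z=1 holds 3 of the 4 voxels, so z=1 is chosen.
region = {(0, 0, 0), (1, 0, 0), (1, 0, 1), (1, 1, 0)}
cog = center_of_gravity(region)
best_z = largest_cross_section_z(region)
```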
- This embodiment implements the planned surgical line marking support function according to any one of the first to third embodiments by using an ultrasonic diagnostic system including an ultrasonic diagnostic apparatus and an ultrasonic image processing apparatus, or a medical image diagnostic system including a medical image diagnostic apparatus and a medical image processing apparatus. For the sake of a concrete description, consider a case in which the embodiment is implemented by an ultrasonic diagnostic system including an ultrasonic diagnostic apparatus and an ultrasonic image processing apparatus.
-
FIG. 8 is a block diagram for explaining an ultrasonic diagnostic system S including an ultrasonic diagnostic apparatus 1 and an ultrasonic image processing apparatus 5. As shown in FIG. 8, the ultrasonic image processing apparatus 5 is implemented by, for example, a medical workstation, and includes a storage unit 50, an image generating unit 51, a display processing unit 52, a control processor 53, a display unit 54, an interface unit 55, and an operation unit 56.
- The storage unit 50 stores ultrasonic images acquired in advance and ultrasonic images transmitted from the ultrasonic diagnostic apparatus 1 via a network. The image generating unit 51 executes the planned surgical line marking support processing described above. The display processing unit 52 executes various kinds of processes associated with a dynamic range, luminance (brightness), contrast, γ-curve correction, RGB conversion, and the like on the various kinds of image data generated or processed by the image generating unit 51. The control processor 53 reads out a dedicated program for implementing the planned surgical line marking support function described above, expands the program in its own memory, and executes the computation, control, and the like associated with the various kinds of processes. The display unit 54 is a monitor which displays an ultrasonic image or the like in a predetermined form. The interface unit 55 is an interface for network connection and for connection to other external storage devices. The operation unit 56 includes switches, buttons, a trackball, a mouse, and a keyboard which are used to input various types of instructions to the apparatus.
- When performing the planned surgical line marking support processing shown in FIG. 2 by using the ultrasonic diagnostic system S, the ultrasonic diagnostic apparatus 1 executes, for example, the processes in steps S1 and S2, and the ultrasonic image processing apparatus 5 executes the processes in steps S3 to S6. Alternatively, the ultrasonic diagnostic apparatus 1 can execute the processes in steps S1 to S3, and the ultrasonic image processing apparatus 5 can execute the processes in steps S4 to S6.
- Likewise, when performing the planned surgical line marking support processing shown in FIG. 7 by using the ultrasonic diagnostic system S, the ultrasonic diagnostic apparatus 1 executes the processes in steps S11 and S12, and the ultrasonic image processing apparatus 5 executes the processes in steps S13 to S17. Alternatively, the ultrasonic diagnostic apparatus 1 can execute the processes in steps S11 to S13, and the ultrasonic image processing apparatus 5 can execute the processes in steps S14 to S17.
- The above arrangement can also acquire the effects described in the first to third embodiments.
- Note that the present embodiments are not limited to those described above, and the constituent elements can be modified and embodied at the execution stage without departing from the spirit and scope of the embodiments.
- The following are concrete modifications.
- (1) Each function (each function in planned surgical line marking support) associated with each embodiment can also be implemented by installing programs for executing the corresponding processing in a computer, such as a workstation, and expanding them in a memory. In this case, the programs which can cause the computer to execute the corresponding techniques can be distributed by being stored in recording media such as magnetic disks (floppy® disks, hard disks, and the like), optical disks (CD-ROMs, DVDs, and the like), and semiconductor memories.
- (2) Each embodiment described above has exemplified the case in which planned surgical line marking is supported. However, the technical idea of the present embodiments is not limited to techniques used for surgical operation; it can also be used, for example, to mark an irradiation range when a radiation treatment apparatus treats a lesion by irradiating it with radiation.
- While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Claims (25)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010015891 | 2010-01-27 | ||
JP2010-015891 | 2010-01-27 | ||
JP2011011730A JP5707148B2 (en) | 2010-01-27 | 2011-01-24 | Medical image diagnostic apparatus and medical image processing apparatus |
JP2011-011730 | 2011-01-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110184291A1 true US20110184291A1 (en) | 2011-07-28 |
Family
ID=44293134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/014,219 Abandoned US20110184291A1 (en) | 2010-01-27 | 2011-01-26 | Ultrasonic diagnostic apparatus, medical image diagnostic apparatus, ultrasonic image processing apparatus, medical image processing apparatus, ultrasonic diagnostic system, and medical image diagnostic system |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110184291A1 (en) |
JP (1) | JP5707148B2 (en) |
CN (1) | CN102133110B (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2754395A1 (en) * | 2013-01-10 | 2014-07-16 | Samsung Electronics Co., Ltd | Lesion diagnosis apparatus and method |
US20150057646A1 (en) * | 2009-03-06 | 2015-02-26 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
CN104970827A (en) * | 2014-04-09 | 2015-10-14 | 柯尼卡美能达株式会社 | Ultrasound imaging apparatus and ultrasound image display method |
US9510853B2 (en) | 2009-03-06 | 2016-12-06 | Procept Biorobotics Corporation | Tissue resection and treatment with shedding pulses |
US20170086940A1 (en) * | 2014-06-25 | 2017-03-30 | Panasonic Intellectual Property Management Co. Ltd. | Projection system |
WO2018109227A1 (en) * | 2016-12-16 | 2018-06-21 | Koninklijke Philips N.V. | System providing images guiding surgery |
US10448966B2 (en) | 2010-02-04 | 2019-10-22 | Procept Biorobotics Corporation | Fluid jet tissue resection and cold coagulation methods |
US10524822B2 (en) | 2009-03-06 | 2020-01-07 | Procept Biorobotics Corporation | Image-guided eye surgery apparatus |
US10588609B2 (en) | 2010-02-04 | 2020-03-17 | Procept Biorobotics Corporation | Gene analysis and generation of stem cell methods and apparatus |
US10776905B2 (en) * | 2018-02-28 | 2020-09-15 | Microsoft Technology Licensing, Llc | Adaptive interface transformation across display screens |
US11033330B2 (en) | 2008-03-06 | 2021-06-15 | Aquabeam, Llc | Tissue ablation and cautery with optical energy carried in fluid stream |
US11137889B2 (en) | 2018-02-28 | 2021-10-05 | Microsoft Technology Licensing, Llc | Adaptive interface transformation across display screens |
US11207058B2 (en) | 2014-09-05 | 2021-12-28 | Procept Biorobotics Corporation | Apparatus for removing intact cells from a surgical site |
US11213313B2 (en) | 2013-09-06 | 2022-01-04 | Procept Biorobotics Corporation | Tissue resection and treatment with shedding pulses |
EP3944832A1 (en) * | 2020-07-30 | 2022-02-02 | Ellicut UG (haftungsbeschränkt) | System and method for creating cutting lines |
US11350964B2 (en) | 2007-01-02 | 2022-06-07 | Aquabeam, Llc | Minimally invasive treatment device for tissue resection |
US11406453B2 (en) | 2009-03-06 | 2022-08-09 | Procept Biorobotics Corporation | Physician controlled tissue resection integrated with treatment mapping of target organ images |
US11571180B2 (en) | 2016-12-16 | 2023-02-07 | Koninklijke Philips N.V. | Systems providing images guiding surgery |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6073563B2 (en) * | 2012-03-21 | 2017-02-01 | 東芝メディカルシステムズ株式会社 | Ultrasonic diagnostic apparatus, image processing apparatus, and image processing program |
KR101634334B1 (en) * | 2014-03-25 | 2016-06-29 | 재단법인 아산사회복지재단 | Method of extracting representing image from medical image |
KR101723791B1 (en) * | 2014-09-25 | 2017-04-07 | 재단법인 아산사회복지재단 | Method for stroke infarction section classification |
WO2016201637A1 (en) * | 2015-06-17 | 2016-12-22 | Covidien Lp | Guided ultrasound breast cancer screening system |
CN105434047A (en) * | 2015-11-19 | 2016-03-30 | 郑州大学 | Ultrasonic locating method and device for intervention catheter |
US10342633B2 (en) * | 2016-06-20 | 2019-07-09 | Toshiba Medical Systems Corporation | Medical image data processing system and method |
WO2018159867A1 (en) * | 2017-02-28 | 2018-09-07 | 메디컬아이피 주식회사 | Three-dimensional medical image control method and device therefor |
KR102216697B1 (en) * | 2020-02-28 | 2021-02-17 | 주식회사 루닛 | Medical image apparatus and method for processing medical image |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5506785A (en) * | 1993-02-11 | 1996-04-09 | Dover Systems Corporation | Method and apparatus for generating hollow and non-hollow solid representations of volumetric data |
US20020065461A1 (en) * | 1991-01-28 | 2002-05-30 | Cosman Eric R. | Surgical positioning system |
US20020077533A1 (en) * | 2000-07-12 | 2002-06-20 | Johannes Bieger | Method and device for visualization of positions and orientation of intracorporeally guided instruments during a surgical intervention |
US20060142657A1 (en) * | 2002-03-06 | 2006-06-29 | Mako Surgical Corporation | Haptic guidance system and method |
US20080064953A1 (en) * | 2006-09-13 | 2008-03-13 | Tony Falco | Incorporating Internal Anatomy In Clinical Radiotherapy Setups |
US8358818B2 (en) * | 2006-11-16 | 2013-01-22 | Vanderbilt University | Apparatus and methods of compensating for organ deformation, registration of internal structures to images, and applications of same |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1176262A (en) * | 1997-09-02 | 1999-03-23 | Ge Yokogawa Medical Syst Ltd | Land mark seal for operation aid and method and apparatus for manufacturing same |
JP4068234B2 (en) * | 1998-10-05 | 2008-03-26 | 株式会社東芝 | Ultrasonic diagnostic equipment |
JP2000237205A (en) * | 1999-02-17 | 2000-09-05 | Toshiba Corp | Ultrasonic therapeutic apparatus |
JP4127640B2 (en) * | 2002-09-19 | 2008-07-30 | 株式会社東芝 | Ultrasonic therapy device |
US20050159759A1 (en) * | 2004-01-20 | 2005-07-21 | Mark Harbaugh | Systems and methods for performing minimally invasive incisions |
JP2008018015A (en) * | 2006-07-12 | 2008-01-31 | Toshiba Corp | Medical display unit and system |
US7889912B2 (en) * | 2006-09-15 | 2011-02-15 | The General Electric Company | Method for real-time tracking of cardiac structures in 3D echocardiography |
JP2009100872A (en) * | 2007-10-22 | 2009-05-14 | Panasonic Corp | Ultrasonic diagnostic apparatus |
JP5395371B2 (en) * | 2008-06-18 | 2014-01-22 | 株式会社東芝 | Ultrasonic diagnostic apparatus, ultrasonic image acquisition method and program |
WO2010007860A1 (en) * | 2008-07-15 | 2010-01-21 | 株式会社 日立メディコ | Ultrasound diagnostic device and method for displaying probe operation guide of the same |
-
2011
- 2011-01-24 JP JP2011011730A patent/JP5707148B2/en not_active Expired - Fee Related
- 2011-01-26 US US13/014,219 patent/US20110184291A1/en not_active Abandoned
- 2011-01-27 CN CN2011100293403A patent/CN102133110B/en not_active Expired - Fee Related
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11350964B2 (en) | 2007-01-02 | 2022-06-07 | Aquabeam, Llc | Minimally invasive treatment device for tissue resection |
US11478269B2 (en) | 2007-01-02 | 2022-10-25 | Aquabeam, Llc | Minimally invasive methods for multi-fluid tissue ablation |
US11759258B2 (en) | 2008-03-06 | 2023-09-19 | Aquabeam, Llc | Controlled ablation with laser energy |
US11172986B2 (en) | 2008-03-06 | 2021-11-16 | Aquabeam Llc | Ablation with energy carried in fluid stream |
US11033330B2 (en) | 2008-03-06 | 2021-06-15 | Aquabeam, Llc | Tissue ablation and cautery with optical energy carried in fluid stream |
US9510852B2 (en) | 2009-03-06 | 2016-12-06 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US10524822B2 (en) | 2009-03-06 | 2020-01-07 | Procept Biorobotics Corporation | Image-guided eye surgery apparatus |
US20150057646A1 (en) * | 2009-03-06 | 2015-02-26 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US9668764B2 (en) | 2009-03-06 | 2017-06-06 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US9848904B2 (en) | 2009-03-06 | 2017-12-26 | Procept Biorobotics Corporation | Tissue resection and treatment with shedding pulses |
US9510853B2 (en) | 2009-03-06 | 2016-12-06 | Procept Biorobotics Corporation | Tissue resection and treatment with shedding pulses |
US11406453B2 (en) | 2009-03-06 | 2022-08-09 | Procept Biorobotics Corporation | Physician controlled tissue resection integrated with treatment mapping of target organ images |
US9364251B2 (en) * | 2009-03-06 | 2016-06-14 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US10588609B2 (en) | 2010-02-04 | 2020-03-17 | Procept Biorobotics Corporation | Gene analysis and generation of stem cell methods and apparatus |
US10448966B2 (en) | 2010-02-04 | 2019-10-22 | Procept Biorobotics Corporation | Fluid jet tissue resection and cold coagulation methods |
US10653438B2 (en) | 2012-02-29 | 2020-05-19 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US11737776B2 (en) | 2012-02-29 | 2023-08-29 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
US11464536B2 (en) | 2012-02-29 | 2022-10-11 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
EP2754395A1 (en) * | 2013-01-10 | 2014-07-16 | Samsung Electronics Co., Ltd | Lesion diagnosis apparatus and method |
US11213313B2 (en) | 2013-09-06 | 2022-01-04 | Procept Biorobotics Corporation | Tissue resection and treatment with shedding pulses |
US20150289839A1 (en) * | 2014-04-09 | 2015-10-15 | Konica Minolta, Inc. | Ultrasound imaging apparatus and ultrasound image display method |
CN104970827A (en) * | 2014-04-09 | 2015-10-14 | 柯尼卡美能达株式会社 | Ultrasound imaging apparatus and ultrasound image display method |
US10426568B2 (en) * | 2014-06-25 | 2019-10-01 | Panasonic Intellectual Property Management Co., Ltd. | Projection system |
US20170086940A1 (en) * | 2014-06-25 | 2017-03-30 | Panasonic Intellectual Property Management Co. Ltd. | Projection system |
US11350963B2 (en) | 2014-06-30 | 2022-06-07 | Procept Biorobotics Corporation | Fluid jet tissue ablation apparatus |
US11903606B2 (en) | 2014-06-30 | 2024-02-20 | Procept Biorobotics Corporation | Tissue treatment with pulsatile shear waves |
US11207058B2 (en) | 2014-09-05 | 2021-12-28 | Procept Biorobotics Corporation | Apparatus for removing intact cells from a surgical site |
US11571180B2 (en) | 2016-12-16 | 2023-02-07 | Koninklijke Philips N.V. | Systems providing images guiding surgery |
WO2018109227A1 (en) * | 2016-12-16 | 2018-06-21 | Koninklijke Philips N.V. | System providing images guiding surgery |
US11137889B2 (en) | 2018-02-28 | 2021-10-05 | Microsoft Technology Licensing, Llc | Adaptive interface transformation across display screens |
US10776905B2 (en) * | 2018-02-28 | 2020-09-15 | Microsoft Technology Licensing, Llc | Adaptive interface transformation across display screens |
EP3944832A1 (en) * | 2020-07-30 | 2022-02-02 | Ellicut UG (haftungsbeschränkt) | System and method for creating cutting lines |
Also Published As
Publication number | Publication date |
---|---|
JP5707148B2 (en) | 2015-04-22 |
CN102133110B (en) | 2013-12-11 |
JP2011172918A (en) | 2011-09-08 |
CN102133110A (en) | 2011-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110184291A1 (en) | Ultrasonic diagnostic apparatus, medical image diagnostic apparatus, ultrasonic image processing apparatus, medical image processing apparatus, ultrasonic diagnostic system, and medical image diagnostic system | |
US10278670B2 (en) | Ultrasound diagnostic apparatus and method of controlling ultrasound diagnostic apparatus | |
US9119558B2 (en) | Ultrasonic diagnostic apparatus and ultrasonic diagnostic method | |
EP3328285B1 (en) | A method and system for correcting fat-induced aberrations | |
CN110192893B (en) | Quantifying region of interest placement for ultrasound imaging | |
JP5868067B2 (en) | Medical image diagnostic apparatus, image processing apparatus and method | |
JP5762076B2 (en) | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and medical image diagnostic apparatus | |
US10524768B2 (en) | Medical image diagnostic apparatus and medical image processing apparatus | |
JP7461530B2 (en) | Ultrasound diagnostic device and puncture support program | |
US11266380B2 (en) | Medical ultrasound image processing device | |
JP5897674B2 (en) | Ultrasonic diagnostic apparatus, image processing apparatus, and image processing program | |
JP7240405B2 (en) | Apparatus and method for obtaining anatomical measurements from ultrasound images | |
US20140378837A1 (en) | Ultrasound diagnostic apparatus | |
JP6125380B2 (en) | Ultrasonic diagnostic apparatus, medical image processing apparatus, and image processing program | |
KR20160140858A (en) | Ultrasound imaging system and method for tracking a specular reflector | |
JP2006314689A (en) | Ultrasonic diagnostic system and its control program | |
JP2010148828A (en) | Ultrasonic diagnostic device and control program of ultrasonic diagnostic device | |
JP6460652B2 (en) | Ultrasonic diagnostic apparatus and medical image processing apparatus | |
JP7258483B2 (en) | Medical information processing system, medical information processing device and ultrasonic diagnostic device | |
JP5134897B2 (en) | Breast examination system | |
US20150105658A1 (en) | Ultrasonic imaging apparatus and control method thereof | |
JP2019195447A (en) | Ultrasound diagnosis apparatus and medical information processing program | |
US11850101B2 (en) | Medical image diagnostic apparatus, medical image processing apparatus, and medical image processing method | |
JP2020044266A (en) | Medical information processing device, x-ray diagnostic device and medical information processing program | |
JP7350490B2 (en) | Ultrasonic probe and ultrasound diagnostic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OKAMURA, YOKO;KAMIYAMA, NAOHISA;REEL/FRAME:025701/0551 Effective date: 20110124 Owner name: TOSHIBA MEDICAL SYSTEMS CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OKAMURA, YOKO;KAMIYAMA, NAOHISA;REEL/FRAME:025701/0551 Effective date: 20110124 |
|
AS | Assignment |
Owner name: TOSHIBA MEDICAL SYSTEMS CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KABUSHIKI KAISHA TOSHIBA;REEL/FRAME:039127/0669 Effective date: 20160608 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |