WO2017137807A1 - Method and system for generating a compound image

Info

Publication number
WO2017137807A1
Authority
WO
WIPO (PCT)
Prior art keywords
main
image frame
transducer array
frame
secondary image
Application number
PCT/IB2016/050746
Other languages
French (fr)
Inventor
Fulvio BIORDI
Original Assignee
Esaote S.P.A.
Application filed by Esaote S.P.A. filed Critical Esaote S.P.A.
Priority to PCT/IB2016/050746 priority Critical patent/WO2017137807A1/en
Publication of WO2017137807A1 publication Critical patent/WO2017137807A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 - Sonar systems specially adapted for specific applications
    • G01S 15/89 - Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 - Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8995 - Combining images from different aspect angles, e.g. spatial compounding
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 - Details of systems according to group G01S15/00
    • G01S 7/52017 - Details of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52085 - Details related to the ultrasound signal acquisition, e.g. scan sequences

Definitions

  • Figure 13 relates to a linear transducer array in which the steering is carried out by generating lines of sight that are not parallel and that cover a trapezoidal field of view or image frame 1301.
  • The virtual prolongations 1303 of the lines of sight 1302 intersect at a virtual apex that is offset with respect to the center of the linear array; in normal conditions, in which the lines of sight are parallel, the virtual apex or virtual source lies at infinity.

Abstract

Methods are provided for generating a compound image with an ultrasound system, the method comprising: acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another; defining main frame boundaries for the main image frame along opposite lateral sides thereof; and defining receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and combining the main and secondary image frames to form a compound image, wherein the lines of view within the primary and/or secondary image frame are oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array considering the direction of transmission of the ultrasound beams. A corresponding system is also disclosed.

Description

METHOD AND SYSTEM FOR GENERATING A COMPOUND IMAGE
BACKGROUND OF THE INVENTION
Ultrasound systems exist today that utilize a variety of techniques for processing ultrasound signals to generate information of interest.
Different techniques have been developed for enhancing the quality of the image information that can be made available for diagnosis.
One of the problems to be solved in diagnostic imaging, and in ultrasound imaging in particular, relates to increasing image resolution, eliminating artifacts and shadows, increasing edge detail and suppressing speckle.
One known technique for achieving these results is so-called compound imaging.
Spatial compounding is an imaging technique in which a number of ultrasound images of a given target, obtained from multiple vantage points or angles, are combined into a single compounded image by combining the data received from each point in the compound image target from each angle. Examples of spatial compounding may be found in U.S. Pat. Nos. 4,649,927; 4,319,489; and 4,159,462. Real time spatial compound imaging is performed by rapidly acquiring a series of partially overlapping component image frames from substantially independent spatial directions, utilizing an array transducer to implement electronic beam steering and/or electronic translation of the component frames. The component frames are combined into a compound image by summation, averaging, peak detection, or other combinational means. The acquisition sequence and formation of compound images are repeated continuously at a rate limited by the acquisition frame rate, that is, the time required to acquire the full complement of scanlines over the selected width and depth of imaging.
The compounded image typically shows lower speckle and better specular reflector delineation than conventional ultrasound images from a single viewpoint.
Speckle is reduced (i.e. speckle signal to noise ratio is improved) by the square root of N in a compound image with N component frames, provided that the component frames used to create the compound image are substantially independent and are averaged.
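This square-root-of-N behaviour can be checked with a toy simulation. The sketch below (NumPy; the frame count and image size are arbitrary illustrative choices, not values from the patent) averages N independent frames of fully developed speckle, whose amplitude is Rayleigh distributed, and compares the speckle signal-to-noise ratio (mean over standard deviation) before and after compounding:

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, shape = 9, (256, 256)   # assumed values for illustration

# Fully developed speckle has Rayleigh-distributed amplitude.
frames = rng.rayleigh(scale=1.0, size=(n_frames, *shape))

def speckle_snr(img):
    """Speckle signal-to-noise ratio: mean divided by standard deviation."""
    return img.mean() / img.std()

single = speckle_snr(frames[0])
compound = speckle_snr(frames.mean(axis=0))

# For substantially independent, averaged frames the improvement
# approaches sqrt(N); here sqrt(9) = 3.
print(f"single: {single:.2f}  compound: {compound:.2f}  "
      f"ratio: {compound / single:.2f}")
```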
Conventional approaches to implementing spatial compounding, such as that shown in U.S. Pat. No. 4,649,927, typically use a large FIFO memory buffer to temporarily store the component image frames that will be compounded (typically by summing and normalization) to form the final compounded image.
Standard ultrasound image compounding is generally performed using images acquired with different steering angles, each image relying on a fixed line of sight (LOS) angle step. The resulting composed image shows discontinuities on both sides due to incomplete overlapping of the areas. To avoid this it is necessary either to reduce the field of view of the output image or to filter it heavily.
SUMMARY
In accordance with embodiments herein, a method is provided for performing compound imaging. The method comprises the operations of:
acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another;
defining main frame boundaries for the main image frame along opposite lateral sides thereof; and
defining receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and
combining the main and secondary image frames to form a compound image, wherein the lines of view within the primary and/or secondary image frame are oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array considering the direction of transmission of the ultrasound beams. The virtual source is located offset with respect to a center of the transducer array.
In the case of a so-called linear array, the center of the transducer array can be defined as a point along an axis or plane which is perpendicular to the planar or linear array of transducers and which is coincident with the axis or plane of symmetry of said planar or linear transducer array.
In the case of a so-called convex probe or convex array, the transducers of the array are placed side by side along an arched surface, or a surface corresponding to a sector of a circle. In this case the center of the transducer array corresponds to the center of said arc or circular sector.
According to traditional techniques, the lines of sight (LOS) of the different images are chosen such that the virtual apex, also defined as the virtual source of the ultrasound beams, is set at the center of the transducer array. By allowing the position of the virtual apex to be offset, the secondary image frames obtained by steering will cover an area having a trapezoidal shape, and the combination of the different image frames can be carried out by considering adjacent boundaries of the primary and secondary image frames in such a way as to avoid the generation of discontinuities in the compound image.
In accordance with embodiments herein, the method further provides that the main and secondary image frames include corresponding main and secondary lines of view that are defined with respect to one or more virtual sources located behind and offset with respect to a surface of the transducer array.
According to this embodiment the virtual apex can be placed in different positions for at least some of the lines of view of the main and secondary image frame.
According to a further embodiment of the method, the virtual source or sources are moved from one line to the other of the first and/or second image frames.
In accordance with still another embodiment of the method, at least one of the main frame boundaries for the main image frame is substantially aligned with one of the lines of view of the secondary image frame.
In accordance with embodiments herein, an ultrasound system is provided that comprises an ultrasound probe having a transducer array;
a memory storing program instructions;
a beam former configured to:
acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another;
define main frame boundaries for the main image frame along opposite lateral sides thereof; and
define receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and define the lines of view within the main and/or secondary image frame with an orientation such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array considering the direction of transmission of the ultrasound beams;
a processor configured to execute the program instructions to:
combine the main and secondary image frames to form a compound image; and
align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
The ultrasound system includes circuitry configured to:
align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
Embodiments herein relate to a compound method for ultrasound signals by means of an ultrasound machine acquiring diagnostic images, which ultrasound machine comprises an array of electroacoustic transducers arranged according to a predetermined arrangement, with predetermined relative positions from each other, which transducers are used, alternately, for generating an excitation ultrasound wave and for receiving the reflection echoes (target) from the tissues under examination. Said reflection echoes generate electric signals corresponding to the received acoustic wave, which electric signals are processed in each processing channel to obtain a signal that corresponds to the combination of the contributions of the reflection signal of each transducer deriving from a certain reflection target or point,
which method comprises the following operations: acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another;
defining main frame boundaries for the main image frame along opposite lateral sides thereof; and
defining receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and
combining the main and secondary image frames to form a compound image, wherein the lines of view within the primary and/or secondary image frame are oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array.
Embodiments herein provide improvements to the method, allowing the process to be simplified while keeping the focusing accuracy high and reducing the computational burden, without the need for a specific hardware structure.
A further aim of at least some embodiments herein is to allow correction coefficients to be put in a table on the basis of general geometrical characteristics of the ultrasound system, and particularly of the transducer array.
Still another aim, in accordance with at least some embodiments, is to provide a beamforming processor that allows the method according to the embodiments herein to be carried out.
A further aim is to provide an ultrasound system for carrying out the method according to the embodiments herein.
Further characteristics and improvements of the embodiments herein are the subject matter of the subclaims.
BRIEF DESCRIPTION OF THE DRAWINGS
Further improvements and characteristics of the embodiments herein will be clear from the following description of some non-limiting embodiments schematically shown in the annexed figures wherein:
Fig. 1 illustrates a block diagram of an ultrasound system.
Fig. 2 illustrates a more detailed block diagram of the ultrasound system of Fig. 1.
Fig. 3 schematically illustrates three images to be compounded according to the prior art.
Fig. 4 schematically illustrates the resulting compounded image.
Fig. 5 schematically illustrates three images to be compounded in connection with embodiments herein.
Fig. 6 schematically illustrates the resulting compounded image.
Fig. 7 illustrates a block diagram of an ultrasound system formed in accordance with an alternative embodiment.
Fig. 8 illustrates a block diagram of a portion of the digital front-end boards.
Fig. 9 illustrates a block diagram of the digital processing board.
Fig. 10 illustrates a block diagram of a compound imaging module formed in accordance with embodiments herein.
Fig. 11 illustrates a block diagram of another compound imaging module formed in accordance with embodiments herein.
Fig. 12 illustrates a block diagram of another compound imaging module formed in accordance with embodiments herein.
Figure 13 schematically shows a trapezoidal secondary image frame obtained by offsetting the virtual apex of a linear array transducer.
Figure 14 schematically shows a field of view of a secondary image frame obtained by a convex transducer array and by offsetting the virtual apex from the center of the array.
Figure 15 schematically shows a way to generate a field of view similar to that of a convex transducer array by means of a linear transducer array, by steering the ultrasound beams laterally according to a technique disclosed in patent EP1681020B1.
While multiple embodiments are described, still other embodiments of the described subject matter will become apparent to those skilled in the art from the following detailed description and drawings, which show and describe illustrative embodiments of disclosed inventive subject matter. As will be realized, the inventive subject matter is capable of modifications in various aspects, all without departing from the spirit and scope of the described subject matter. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
Fig. 1 illustrates a high-level block diagram of an ultrasound system implemented in accordance with embodiments herein. Portions of the system (as defined by various functional blocks) may be implemented with dedicated hardware, analog and/or digital circuitry, and/or one or more processors operating program instructions stored in memory. Additionally or alternatively, all or portions of the system may be implemented utilizing digital components, digital signal processors (DSPs) and/or field programmable gate arrays (FPGAs) and the like. The blocks/modules illustrated in Fig. 1 can be implemented with dedicated hardware (DSPs, FPGAs, memories) and/or in software with one or more processors.
The ultrasound machine for acquiring diagnostic images comprises a probe 101 provided with an array of electroacoustic transducers intended to transform electric signals into acoustic signals and, vice versa, the received acoustic signals into corresponding electric signals.
A transmit section 152 and a receive section 153 are alternately connected with the probe, to provide each individual transducer with an excitation signal for the corresponding ultrasound pulse and to receive the electric signal corresponding to an acoustic pulse that has hit the transducer.
The transmit signals to the transducers are each sent in an independent manner, through a dedicated channel or by a multiplexer, to a digital-to-analog converter 125 that generates signals at a predetermined sampling rate and provides analog excitation signals to each transducer/channel.
Digital transmit signals are subjected to processing by a so-called beamforming processor 103 that applies a proper delay to the transmission signals of each channel, in order to selectively concentrate ultrasound energy in a narrow line, a zone or possibly the whole body region to be investigated, depending on the adopted image formation scheme.
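For a single transmit focus, the delay law such a processor applies can be pictured with a minimal sketch (NumPy; the linear-array pitch, element count and speed of sound are illustrative assumptions, not the actual processor 103):

```python
import numpy as np

C = 1540.0  # assumed speed of sound in tissue, m/s

def tx_delays(elem_x, focus_x, focus_z):
    """Per-element transmit delays (seconds) for a focus at (focus_x, focus_z).

    Elements farther from the focus fire earlier, so each element's delay
    is its travel-time deficit relative to the farthest element; all pulses
    then arrive at the focal point simultaneously.
    """
    dist = np.hypot(np.asarray(elem_x) - focus_x, focus_z)
    return (dist.max() - dist) / C

# Hypothetical 128-element array with 0.3 mm pitch, focused 30 mm deep.
x = (np.arange(128) - 63.5) * 0.3e-3
delays = tx_delays(x, focus_x=0.0, focus_z=30e-3)
```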
The receive signals of the transducers are each sent in an independent manner, through a dedicated channel or by a multiplexer, to an analog-to-digital converter 124 that samples said signals at a predetermined sampling rate and provides digitized receive signals for each transducer/channel.
The digitized signals are then subjected to processing by the so-called beamforming processor 103, which delays the receive signal of each channel in correspondence with the travel time of the signal reflected by a predetermined reflection point, from said reflection point to the corresponding transducer.
Since the individual transducers of the array provided on the probe occupy positions different from each other, they necessarily have different distances from the reflection point, and therefore the echo signal deriving from such a point reaches each individual transducer at a different moment.
The focusing process performs the time re-alignment of the contributions of the receive signal of each transducer deriving from the same reflection point, so that such contributions can be summed together in a coherent manner.
Depending on the transmission scheme adopted, the focusing process may concern a narrow line, a zone or the whole investigated body region.
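The receive-side focusing just described amounts to delay-and-sum beamforming. A minimal sketch follows (NumPy; the array geometry, sampling rate, Hann apodization and nearest-sample alignment are simplifying assumptions, not the patent's implementation):

```python
import numpy as np

C = 1540.0    # assumed speed of sound, m/s
FS = 50e6     # assumed sampling rate, Hz

def delay_and_sum(rf, elem_x, focus):
    """Coherently sum per-channel RF data for one reflection point.

    rf     : (n_channels, n_samples) receive signals
    elem_x : (n_channels,) lateral element positions, m
    focus  : (x, z) reflection point, m
    """
    n_ch, n_samp = rf.shape
    fx, fz = focus
    # Return travel time from the reflection point to each element.
    delays = np.hypot(elem_x - fx, fz) / C
    # Integer sample shifts relative to the earliest-arriving channel.
    shifts = np.round((delays - delays.min()) * FS).astype(int)
    weights = np.hanning(n_ch)               # apodization
    aligned = np.zeros_like(rf)
    for i, s in enumerate(shifts):
        aligned[i, :n_samp - s] = rf[i, s:]  # re-align channel i in time
    # Contributions from the focus now add coherently across channels.
    return (weights[:, None] * aligned).sum(axis=0)
```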
One or more processors 106 perform various processing operations as described herein before the signal is ready to be scan converted (157) and displayed (109).
Fig. 2 shows more details of the ultrasound system of Fig. 1. The probe 101 may include various transducer array configurations, such as a one dimensional array, a two dimensional array, a linear array, a convex array and the like. The transducers of the array may be managed to operate as a 1D array, 1.25D array, 1.5D array, 1.75D array, 2D array, 3D array, 4D array, etc. The probe 101 is coupled, over a wired or wireless link, to a beamformer 103. The beamformer 103 includes a transmit (TX) beamformer and a receive (RX) beamformer that are jointly represented by TX/RX beamformer 103. The TX and RX portions of the beamformer may be implemented together or separately. The beamformer 103 supplies transmit signals to the probe 101 and performs beamforming of "echo" receive signals that are received by the probe 101.
A TX waveform generator 102 is coupled to the beamformer 103 and generates the transmit signals that are supplied from the beamformer 103 to the probe 101. The transmit signals may represent various types of ultrasound TX signals such as used in connection with B- mode imaging, Doppler imaging, color Doppler imaging, pulse-inversion transmit techniques, contrast-based imaging, M-mode imaging and the like. Additionally or alternatively, the transmit signals may include single or multi-line transmit, narrow beams transmit, zone transmit, broad beams transmit, plane-waves transmit, shear waves transmit and the like.
The beamformer 103 performs beamforming upon received echo signals to form beamformed echo signals in connection to pixel locations distributed across the region of interest. For example, in accordance with certain embodiments, the transducer elements generate raw analog receive signals that are supplied to the beamformer. The beamformer adjusts the delays to focus the receive signal along one or more select receive beams and at one or more select depths within the region of interest, and weights the receive signals to obtain a desired apodization and profile. The beamformer applies weights and delays to the receive signals from individual corresponding transducers of the probe. The delayed, weighted receive signals are then summed to form a coherent receive signal.
The beamformer 103 includes (or is coupled to) an A/D converter 124 that digitizes the receive signals at a select sampling rate. The digitization process may be performed before or after the summing operation that produces the coherent receive signals. The beamformer also includes (or is coupled to) a demodulator 122 that demodulates the receive signals to remove the carrier waveform. Once the receive signals are demodulated and digitized, complex receive signals are generated that include I,Q components (also referred to as I,Q data pairs). The I,Q data pairs are saved as image pixels in memory, the I,Q data pairs defining the image pixels for corresponding individual locations along corresponding lines of sight (LOS) or view lines. A collection of image pixels (e.g., I,Q data pairs) is collected over time and saved as 2D image frames and/or 3D volumes of image data. The image pixels correspond to tissue and other anatomy within the ROI.
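A compact way to picture the demodulator's job is complex baseband mixing followed by low-pass filtering. The sketch below (NumPy/SciPy; the carrier frequency, sampling rate and filter length are assumed values, not those of demodulator 122) turns one RF line into the complex I,Q samples stored as image pixels:

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS = 50e6   # assumed sampling rate, Hz
F0 = 5e6    # assumed carrier (transmit center) frequency, Hz

def demodulate_iq(rf_line):
    """Remove the carrier from one RF line, returning complex I,Q data."""
    n = np.arange(rf_line.size)
    # Mix down to baseband: multiply by a complex exponential at -F0.
    mixed = rf_line * np.exp(-2j * np.pi * F0 * n / FS)
    # Low-pass filter rejects the image component shifted to -2*F0.
    taps = firwin(64, cutoff=F0, fs=FS)
    iq = lfilter(taps, 1.0, mixed)
    return iq          # iq.real is I, iq.imag is Q
```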
Optionally, a dedicated sequencer/timing controller 110 may be programmed to manage acquisition timing, which can be generalized as a sequence of firings aimed at select reflection points/targets in the ROI. The sequence controller 110 manages operation of the TX/RX beamformer 103 in connection with transmitting ultrasound beams and the lines of sight. The sequence controller 110 also manages collection of receive signals.
In accordance with embodiments herein the beamformer may be configured to acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another.
One or more processors 106 and/or CPU 112 perform various processing operations as described herein.
For example, the processor 106 executes a B/W module to generate B-mode images. The processor 106 and/or CPU 112 executes a Doppler module to generate Doppler images. The processor executes a Color flow module (CFM) to generate color flow images. The processor 106 and/or CPU 112 may implement additional ultrasound imaging and measurement operations. Optionally, the processor 106 and/or CPU 112 may filter the first and second displacements to eliminate movement-related artifacts.
An image scan converter 107 performs scan conversion on the image pixels to convert the format of the image pixels from the coordinate system of the ultrasound acquisition signal path (e.g., the beamformer, etc.) to the coordinate system of the display. For example, the scan converter 107 may convert the image pixels from polar coordinates to Cartesian coordinates for image frames.
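As an illustration of that conversion, the sketch below maps each Cartesian display pixel back onto the polar acquisition grid (NumPy; the nearest-neighbour lookup and simple sector geometry are simplifying assumptions, since real scan converters interpolate):

```python
import numpy as np

def scan_convert(polar, r_max, sector_rad, out_shape=(512, 512)):
    """Convert a (n_r, n_theta) polar frame to Cartesian display pixels.

    The sector apex sits at the top centre of the output; theta = 0
    points straight down and the sector spans +/- sector_rad / 2.
    """
    n_r, n_th = polar.shape
    h, w = out_shape
    z = np.linspace(0.0, r_max, h)[:, None]        # depth of each pixel
    half = r_max * np.sin(sector_rad / 2)
    x = np.linspace(-half, half, w)[None, :]       # lateral position
    r = np.hypot(x, z)
    theta = np.arctan2(x, z)
    # Nearest acquisition sample for each display pixel.
    ri = np.clip(np.rint(r / r_max * (n_r - 1)).astype(int), 0, n_r - 1)
    ti = np.clip(np.rint((theta / sector_rad + 0.5) * (n_th - 1)).astype(int),
                 0, n_th - 1)
    out = polar[ri, ti]
    out[(r > r_max) | (np.abs(theta) > sector_rad / 2)] = 0.0  # outside FOV
    return out
```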
A cine memory 108 stores a collection of image frames over time. The image frames may be stored formatted in polar coordinates, Cartesian coordinates or another coordinate system. A display 109 presents information, such as the image frames and information measured in accordance with embodiments herein. The display 109 displays the ultrasound image with the region of interest shown.
A control CPU module 112 is configured to perform various tasks such as implementing the user interface and overall system configuration/control. In case of a fully software implementation of the ultrasound signal path, the processing node usually also hosts the functions of the control CPU.
A power supply circuit 111 is provided to supply power to the various circuitry, modules, processors, memory components, and the like. The power supply 111 may be an A.C. power source and/or a battery power source (e.g., in connection with portable operation).
The processor 106 and/or CPU 112 may be configured to execute a compound module to generate compound images.
Spatial compounding is an imaging technique in which a number of ultrasound images of a given target, obtained from multiple vantage points or angles, are combined into a single compounded image by combining the data received from each point in the compound image target from each angle. Examples of spatial compounding may be found in U.S. Pat. Nos. 4,649,927; 4,319,489; and 4,159,462. Real time spatial compound imaging is performed by rapidly acquiring a series of partially overlapping component image frames from substantially independent spatial directions, utilizing an array transducer to implement electronic beam steering and/or electronic translation of the component frames. The component frames are combined into a compound image by summation, averaging, peak detection, or other combinational means. The acquisition sequence and formation of compound images are repeated continuously at a rate limited by the acquisition frame rate, that is, the time required to acquire the full complement of scanlines over the selected width and depth of imaging.
The compounded image typically shows lower speckle and better specular reflector delineation than conventional ultrasound images from a single viewpoint. Speckle is reduced (i.e. speckle signal to noise ratio is improved) by the square root of N in a compound image with N component frames, provided that the component frames used to create the compound image are substantially independent and are averaged.
Conventional approaches to implementing spatial compounding such as that shown in U.S. Pat. No. 4,649,927 typically use a large FIFO memory buffer to temporarily store the component image frames that will be compounded (typically by summing and normalization) to form the final compounded image.
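A toy version of such a FIFO-based compounder might look as follows (Python sketch; the class and method names are hypothetical, and this illustrates the conventional approach being described, not the embodiments' method):

```python
from collections import deque
import numpy as np

class FifoCompounder:
    """Buffer the latest n component frames and compound them by
    summing and normalizing each time a new frame arrives."""

    def __init__(self, n_frames):
        self.buf = deque(maxlen=n_frames)  # FIFO of component frames

    def push(self, frame):
        self.buf.append(np.asarray(frame, dtype=float))
        # Sum and normalize over however many frames are buffered.
        return sum(self.buf) / len(self.buf)
```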
Standard ultrasound image compounding is generally performed using images acquired with different steering angles, each image relying on a fixed line of sight (LOS) angle step. The resulting composed image shows discontinuities on both sides due to incomplete overlapping of the areas. To avoid this it is necessary either to reduce the field of view of the output image or to filter it heavily.
In accordance with embodiments herein, the processor 106 is configured to execute the program instructions to: combine the main and secondary image frames to form a compound image; and align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
Fig. 3 illustrates three different steered acquired images (steer left, centered, steer right) normally used to obtain a compounded image according to the state of the art. Fig. 4 shows the resulting overlapped image with the highlighted discontinuity side artifact.
To obtain a wider overlapped area when composing images with different steering angles, and thus reduce the discontinuities, embodiments provide for virtually moving the origin center of the LOS, also called the "apex", laterally, and for applying a variable angular step between the LOS.
As a result the compounded image can be produced with the same field of view as the center one without heavy filtering.
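The geometry behind this can be sketched for a linear array: place the virtual apex behind the array, offset it laterally, and let each line of view pass through its element position and the apex; the angular step between neighbouring lines then varies along the array. (NumPy sketch; the element count, pitch and apex positions are illustrative assumptions.)

```python
import numpy as np

def los_angles(elem_x, apex_x, apex_z):
    """Steering angle of each line of view, measured from the array normal.

    Each line of view starts at element position elem_x[i] on the array
    surface (z = 0) and its virtual prolongation passes through the
    virtual apex at (apex_x, -apex_z) behind the array.
    """
    return np.arctan2(elem_x - apex_x, apex_z)

pitch = 0.3e-3                        # assumed element pitch, m
x = (np.arange(128) - 63.5) * pitch   # element centres of a 128-element array

centered = los_angles(x, apex_x=0.0, apex_z=30e-3)    # apex behind the centre
offset = los_angles(x, apex_x=-15e-3, apex_z=30e-3)   # apex moved laterally

# With the offset apex the lines fan out asymmetrically over a trapezoidal
# field of view, and the step between neighbouring angles is no longer
# constant: the variable angular step described above.
print(np.degrees(np.diff(offset))[[0, -1]])
```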
Fig. 5 illustrates a series of image frames of the ultrasound data that are acquired by an ultrasound probe having a transducer array with a desired configuration. While the transducer array may have a linear, convex or alternative shape, in the example of Fig. 5 the transducer array has a linear configuration. A main image frame 304 is illustrated as corresponding to a central portion of a compounded image. Left and right secondary image frames 306 and 308 are illustrated as corresponding to left and right portions of a compounded image. Optionally, the compound image may be formed from as few as two image frames, or from more than three image frames. As explained herein, the main and secondary image frames 304 - 308 are combined to form a compound image (e.g. as illustrated in Fig. 6).
As explained herein, the receive beamformer defines main frame boundaries 310, 312 for the main image frame 304 that are located along opposite lateral sides of a main field of view 314. The main field of view 314 includes a profile defined by lateral main frame boundaries 310, 312, a proximal edge 330 and a distal depth 331. The receive beamformer also defines secondary frame boundaries 316, 318 for the left secondary image frame 306, and secondary frame boundaries 320, 322 for the right secondary image frame 308. The secondary frame boundaries 316, 318 are located along opposite lateral sides of the secondary field of view 324, while the secondary frame boundaries 320, 322 are located along opposite lateral sides of the secondary field of view 326. The right and left secondary fields of view 326, 324 include corresponding profiles that are defined by the lateral secondary frame boundaries 320, 322 and 316, 318, proximal edges 334, 332 and distal depths 335, 333, respectively. The profiles for the right and left secondary fields of view 326, 324 and secondary image frames 308, 306 correspond to trapezoids with a virtual apex remotely located from the surface of the transducer array in the example of Fig. 5. Optionally, the profile may correspond to alternative shapes.
In the present example, the boundaries 310, 312, 318 and 322 extend from the transducer array and overlap such that the boundaries 310, 312 of the main image frame align with the boundaries 318, 322. Optionally, the aligned boundaries 318, 310 and 322, 312 of the main and secondary frames 314, 324 and 326 may be oriented at non-perpendicular angles with respect to the surface of the transducer array, provided that boundaries 318 and 310 align, boundaries 322 and 312 align, and they are oriented at common corresponding angles.
The receive beamformer defines lines of view within the main and secondary image frames that extend from the transducer array and project into the region of interest. In the example of Fig. 5, a linear surface of the transducer array may correspond to the proximal edges 330, 332, 334 of the main and secondary fields of view 314, 324, 326. A portion of the lines of view 340 in the main field of view 314 are illustrated to extend at an angle 342 into the region of interest from the surface of the transducer array at the proximal edge 330. In the example of Fig. 5, the lines of view 340 in the main field of view 314 extend at a common reception steering angle 342 from the surface of the transducer array. Optionally, the lines of view 340 may extend at different angles from the surface of the transducer array.
The receive beamformer defines the lines of view in the secondary image frames 306, 308 to extend from the surface of the transducer array into the region of interest at different angles from one another relative to the surface of the transducer array. By way of illustration, with reference to the secondary image frame 306, one line of view is defined to extend at a first reception steering angle 346 from the surface of the transducer array (corresponding to the proximal edge 332), while a neighboring line of view 348 is defined to extend at a different, second reception steering angle 350 from the surface of the transducer array. Continuing the foregoing example, lines of view 352, 354 and 356 are each defined to have a corresponding reception steering angle, relative to the surface of the transducer array, that differs from the others. At least a portion of the reception steering angles are oriented at a non-perpendicular angle with respect to the surface of the transducer array, for example the reception steering angles associated with the peripheral outermost lines of view in the secondary image frame (e.g., proximate to the secondary frame boundary that does not overlap the main image frame).
The reception angles associated with individual lines of view may be defined in various manners as explained herein. By way of example, the reception steering angles of adjacent/neighboring lines of view may differ from one another by a predetermined amount, or may be varied as a function of the position along the transducer array as well as a function of the profile of the field of view.
A combiner module (e.g. a dedicated circuit, firmware and/or a processor executing program instructions) combines the main image frame 304 with one or more desired secondary image frames 306, 308 to form a compound image (as illustrated in Fig. 6) with main image frame 304, left secondary image frame 306 and right secondary image frame 308. The boundaries of the corresponding image frames 304 - 308 are defined in connection with the acquisition operation and aligned during the combining operation such that one or more of the main frame boundaries 310, 312 substantially correspond to and align with an associated one of the boundaries of the secondary image frames 308, 306 (as well as the line of sight in the secondary image frame corresponding to the associated boundary).
For example, the main frame boundary 312 may be aligned with the secondary frame boundary 318 of the left secondary image frame 306. The main frame boundary 310 may be aligned with the secondary frame boundary 322 of the right secondary image frame 308. As the frame boundaries 318 and 322 correspond to lines of view within the corresponding secondary image frames 306, 308, the main frame boundaries 310, 312 are also aligned with corresponding lines of view within the secondary image frames 308, 306.
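The alignment of a main frame boundary with a secondary line of view can be checked numerically. The sketch below is illustrative only, under the simplifying assumption that each boundary and line of view is described by an origin on the array surface and a direction vector.

```python
import numpy as np

def find_aligned_line(boundary_origin, boundary_dir, line_origins, line_dirs,
                      tol=1e-6):
    """Return the index of the secondary line of view that coincides with a
    main frame boundary: same origin on the array surface and a parallel
    direction (anti-parallel would also pass this 2-D cross-product test)."""
    same_origin = np.linalg.norm(line_origins - boundary_origin, axis=1) < 1e-4
    cross = (line_dirs[:, 0] * boundary_dir[1]
             - line_dirs[:, 1] * boundary_dir[0])      # 2-D cross product
    parallel = np.abs(cross) < tol
    matches = np.nonzero(same_origin & parallel)[0]
    return int(matches[0]) if matches.size else -1     # -1: no aligned line

line_o = np.array([[0.0, 0.0], [0.3, 0.0]])            # line origins (mm)
line_d = np.array([[0.0, 1.0], [0.1, 1.0]])            # line directions
print(find_aligned_line(np.array([0.0, 0.0]), np.array([0.0, 1.0]),
                        line_o, line_d))               # -> 0
```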
In a preferred embodiment, scan conversion is performed after the compounding process by a scan converter 107. The compound images may be stored in a Cine memory 108 in either estimate or display pixel form. If stored in estimate form, the images may be scan converted when replayed from the Cine memory for display. The scan converter and Cine memory may also be used to render three dimensional presentations of the spatially compounded images, as described in U.S. Pat. Nos. 5,485,842 and 5,860,924. Following scan conversion, the spatially compounded images are processed for display by a video processor and displayed on an image display 109.
As an alternative embodiment, the boundary of the secondary image frame may be perpendicular to the surface of the transducer array while the main image frame boundary extends at a non-perpendicular angle from the surface of the transducer array.
As a further alternative embodiment, the boundaries of the main and secondary frames may both be non-perpendicular to the surface of the transducer array but oriented at a common angle.
Fig. 7 illustrates a block diagram of an ultrasound system formed in accordance with an alternative embodiment. The system of Fig. 7 implements the operations described herein in connection with various embodiments. By way of example, one or more circuits/processors within the system implement the operations of any processes illustrated in connection with the figures and/or described herein. The system includes a probe interconnect board 702 that includes one or more probe connection ports 704. The connection ports 704 may support various numbers of signal channels (e.g., 128, 192, 256, etc.). The connection ports 704 may be configured to be used with different types of probe arrays (e.g., phased array, linear array, curved array, 1D, 1.25D, 1.5D, 1.75D, 2D array, etc.). The probes may be configured for different types of applications, such as abdominal, cardiac, maternity, gynecological, urological and cerebrovascular examination, breast examination and the like. One or more of the connection ports 704 may support acquisition of 2D image data and/or one or more of the connection ports 704 may support 3D image data. By way of example only, the 3D image data may be acquired through physical movement (e.g., mechanical sweeping or physician movement) of the probe and/or by a probe that electrically or mechanically steers the transducer array.
The probe interconnect board (PIB) 702 includes a switching circuit 706 to select between the connection ports 704. The switching circuit 706 may be manually managed based on user inputs. For example, a user may designate a connection port 704 by selecting a button, switch or other input on the system. Optionally, the user may select a connection port 704 by entering a selection through a user interface on the system.
Optionally, the switching circuit 706 may automatically switch to one of the connection ports 704 in response to detecting a presence of a mating connection of a probe. For example, the switching circuit 706 may receive a "connect" signal indicating that a probe has been connected to a select one of the connection ports 704. The connect signal may be generated by the probe when power is initially supplied to the probe upon coupling to the connection port 704.
Additionally or alternatively, each connection port 704 may include a sensor 705 that detects when a mating connection on a cable of a probe has been interconnected with the corresponding connection port 704. The sensor 705 provides a connect signal to the switching circuit 706, and in response thereto, the switching circuit 706 couples the corresponding connection port 704 to its output 708. Optionally, the sensor 705 may be constructed as a circuit with contacts provided at the connection ports 704. The circuit remains open when no mating connector is joined to the corresponding connection port 704. The circuit is closed when the mating connector of a probe is joined to the connection port 704.
A control line 724 conveys control signals between the probe interconnection board 702 and a digital processing board 726. A power supply line 736 provides power from a power supply 740 to the various components of the system, including but not limited to, the probe interconnection board (PIB) 702, digital front end boards (DFB) 710, digital processing board (DPB) 726, the master processing board (MPB) 744, and a user interface control board (UICB) 746. A temporary control bus 738 interconnects, and provides temporary control signals between, the power supply 740 and the boards 702, 710, 726, 744 and 746. The power supply 740 includes a cable to be coupled to an external AC power supply. Optionally, the power supply 740 may include one or more power storage devices (e.g., batteries) that provide power when the AC power supply is interrupted or disconnected. The power supply 740 includes a controller 742 that manages operation of the power supply 740 including operation of the storage devices.
Additionally or alternatively, the power supply 740 may include alternative power sources, such as solar panels and the like. One or more fans 743 are coupled to the controller 742 and are turned on and off based on operating parameters (e.g., temperature) of the various circuit boards and electronic components within the overall system (e.g., to prevent overheating of the various electronics).
The digital front-end boards 710 provide the analog interface to and from probes connected to the probe interconnection board 702. The DFB 710 also provides pulse control and drive signals, manages analog gains, includes analog-to-digital converters in connection with each receive channel, and provides transmit beamforming management and receive beamforming management and vector composition (associated with focusing during receive operations).
The digital front end boards 710 include transmit driver circuits 712 that generate transmit signals that are passed over corresponding channels to the corresponding transducers in connection with ultrasound transmit firing operations. The transmit driver circuits 712 provide pulse control for each drive signal and transmit beamforming management to steer firing operations to points of interest within the region of interest. By way of example, a separate transmit driver circuit 712 may be provided in connection with each individual channel, or a common transmit driver circuit 712 may be utilized to drive multiple channels. The transmit driver circuits 712 cooperate to focus transmit beams at one or more select points within the region of interest. The transmit driver circuits 712 may implement single line transmit, encoded firing sequences, multiline ultrasound beams as well as other forms of ultrasound transmission techniques.
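As a hedged illustration of transmit beamforming management, the following sketch computes per-element firing delays that focus a transmit beam at a point, assuming a linear array and a uniform speed of sound. It is not the circuit 712 itself, only the classic delay rule that such circuits apply; all names and values are assumptions of this example.

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue assumption

def transmit_delays(element_x, focus):
    """Per-element firing delays so that all wavefronts reach the focal
    point at the same instant (classic transmit focusing delay rule).
    element_x: element positions (m) on a linear array at z = 0.
    focus: (x, z) focal point in metres."""
    fx, fz = focus
    dist = np.hypot(element_x - fx, fz)          # element-to-focus distances
    return (dist.max() - dist) / SPEED_OF_SOUND  # farthest element fires first

elements = (np.arange(128) - 63.5) * 0.3e-3      # 128 elements, 0.3 mm pitch
print(transmit_delays(elements, focus=(5e-3, 30e-3))[:4])  # delays in seconds
```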
The digital front end boards 710 include receive beamformer circuits 714 that receive echo/receive signals and perform various analog and digital processing thereon, as well as phase shifting, time delaying and other operations in connection with beamforming. The beamformer circuits 714 may implement various types of beamforming, such as single-line acquisition, multiline acquisition as well as other ultrasound beamforming techniques.
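A simple delay-and-sum sketch of the receive beamforming principle implemented by circuits such as 714. The plane-wave transmit-time reference and all names are assumptions of this example, not the board's actual implementation.

```python
import numpy as np

def delay_and_sum(rf, element_x, point, fs, c=1540.0):
    """Beamform one image sample at `point` from per-channel RF traces.
    rf: (num_elements, num_samples) echoes sampled at fs (Hz).
    element_x: element positions (m); point: (x, z) in metres."""
    px, pz = point
    t_return = np.hypot(element_x - px, pz) / c   # point -> element times
    t_transmit = pz / c                           # plane-wave reference (assumed)
    idx = np.round((t_transmit + t_return) * fs).astype(int)
    idx = np.clip(idx, 0, rf.shape[1] - 1)        # stay inside the traces
    return rf[np.arange(rf.shape[0]), idx].sum()  # delay, then sum

rf = np.random.randn(128, 2048)                   # stand-in channel data
elements = (np.arange(128) - 63.5) * 0.3e-3
print(delay_and_sum(rf, elements, point=(0.0, 20e-3), fs=40e6))
```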
The digital front end boards 710 include continuous wave Doppler processing circuits 716 configured to perform continuous wave Doppler processing upon received echo signals. Optionally, the continuous wave Doppler circuits 716 may also generate continuous wave Doppler transmit signals.
The digital front-end boards 710 are coupled to the digital processing board 726 through various buses and control lines, such as control lines 722, synchronization lines 720 and one or more data buses 718.
The control lines 722 and synchronization lines 720 provide control information and data, as well as synchronization signals, to the transmit drive circuits 712, receive beamforming circuits 714 and continuous wave Doppler circuits 716. The data bus 718 conveys RF ultrasound data from the digital front-end boards 710 to the digital processing board 726. Optionally, the digital front end boards 710 may convert the RF ultrasound data to I,Q data pairs that are then conveyed to the digital processing board 726.
The digital processing board 726 includes an RF and imaging module 728, a color flow processing module 730, an RF processing and Doppler module 732 and a PCI link module 734. The digital processing board 726 performs RF filtering and processing, processing of black and white image information, processing in connection with color flow, and Doppler mode processing (e.g., in connection with pulsed wave and continuous wave Doppler). The digital processing board 726 also provides image filtering (e.g., speckle reduction) and scanner timing control. The digital processing board 726 may include other modules based upon the ultrasound image processing functionality afforded by the system.
The modules 728-734 comprise one or more processors, DSPs, and/or FPGAs, and memory storing program instructions to direct the processors, DSPs, and/or FPGAs to perform various ultrasound image processing operations. The RF and imaging module 728 performs various ultrasound related imaging, such as B mode related image processing of the RF data. The RF processing and Doppler module 732 converts incoming RF data to I,Q data pairs and performs Doppler related processing on the I,Q data pairs. Optionally, the imaging module 728 may perform B mode related image processing upon the I,Q data pairs. The CFM processing module 730 performs color flow related image processing upon the ultrasound RF data and/or the I,Q data pairs. The PCI link 734 manages transfer of ultrasound data and control information between the digital processing board 726 and the master processing board 744.
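The RF-to-I,Q conversion attributed to module 732 can be pictured with a toy demodulator: mix the RF down by the centre frequency, then low-pass. The moving-average filter and all parameter values here are deliberate simplifications assumed for illustration.

```python
import numpy as np

def rf_to_iq(rf, fs, f0):
    """Demodulate real RF samples to baseband I,Q pairs: mix down by the
    transmit centre frequency f0, then low-pass with a moving average."""
    n = np.arange(rf.shape[-1])
    mixed = rf * np.exp(-2j * np.pi * f0 * n / fs)  # complex mix to baseband
    kernel = np.ones(8) / 8.0                       # crude low-pass (assumed)
    iq = np.convolve(mixed, kernel, mode="same")
    return iq.real, iq.imag                         # I and Q components

fs, f0 = 40e6, 5e6
rf = np.cos(2 * np.pi * f0 * np.arange(1024) / fs)  # toy 5 MHz echo
i, q = rf_to_iq(rf, fs, f0)
print(i[:3], q[:3])
```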
The master processing board 744 includes memory 750 (e.g., serial ATA solid-state devices, serial ATA hard disk drives, etc.), a VGA board 752 that includes one or more graphics processing units (GPUs), one or more transceivers 760, one or more CPUs 752 and memory 754. The master processing board (also referred to as a PC board) provides user interface management, scan conversion and cine loop management. The master processing board 744 may be connected to one or more external devices, such as a DVD player 756, and one or more displays 758. The master processing board includes communications interfaces, such as one or more USB ports 762 and one or more ports 764 configured to be coupled to peripheral devices. The master processing board 744 is configured to maintain communication with various types of network devices 766 and various network servers 768, such as over wireless links through the transceiver 760 and/or through a network connection (e.g., via USB connector 762 and/or peripheral connector 764).
The network devices 766 may represent portable or desktop devices, such as smart phones, personal digital assistants, tablet devices, laptop computers, desktop computers, smart watches, ECG monitors, patient monitors, and the like. The master processing board 744 conveys ultrasound images, ultrasound data, patient data and other information and content to the network devices for presentation to the user. The master processing board 744 may also receive, from the network devices, user inputs such as control selections, data entry and the like.
The network server 768 may represent part of a medical network, such as a hospital, a healthcare network, a third-party healthcare service provider, a medical equipment maintenance service, a medical equipment manufacturer, a government healthcare service and the like. The communications link to the network server 768 may be over the Internet, a private intranet, a local area network, a wide-area network, and the like.
The master processing board 744 is connected, via a communications link 770, with a user interface control board 746. The communications link 770 conveys data and information between the user interface and the master processing board 744. The user interface control board 746 includes one or more processors 772 and one or more audio/video components 774 (e.g., speakers, a display, etc.). The user interface control board 746 is coupled to one or more user interface input/output devices, such as an LCD touch panel 776, a trackball 778, a keyboard 780 and the like. The processor 772 manages operation of the LCD touch panel 776, as well as collecting user inputs via the touch panel 776, trackball 778 and keyboard 780, where such user inputs are conveyed to the master processing board 744 in connection with implementing embodiments herein.
Fig. 8 illustrates a block diagram of a portion of the digital front-end boards 710 formed in accordance with embodiments herein. A group of diplexers 802 receive the ultrasound signals for the individual channels and direct each signal along a standard processing circuit 805 or to a continuous wave processing circuit 812, based upon the type of probe utilized. When processed by the standard processing circuit 805, a preamplifier and variable gain amplifier 804 process the incoming ultrasound receive signals, which are then provided to an anti-aliasing filter 806 that performs anti-aliasing filtering. The output thereof is provided to an A/D converter 808 that digitizes the incoming analog ultrasound receive signals. When a continuous wave (CW) probe is utilized, the signals therefrom are provided to a continuous wave phase shifter, demodulator and summer 810 which converts the analog RF receive signals to I,Q data pairs. The CW I,Q data pairs are summed, filtered and digitized by a continuous wave processing circuit 812. Outputs from the standard or continuous wave processing circuits 805, 812 are then passed to beamforming circuits 820 which utilize one or more FPGAs to perform filtering, delaying and summing of the incoming digitized receive signals before passing the RF data to the digital processing board 726 (Fig. 7). The FPGAs receive focalization data from memories 828. The focalization data is utilized to manage the filters, delays and summing operations performed by the FPGAs in connection with beamforming. The beamformed RF data is passed from the beamforming circuits 820 ultimately to the digital processing board 726.
The digital front-end boards 710 also include transmit modules 822 that provide transmit drive signals to corresponding transducers of the ultrasound probe. The transmit modules 822 receive transmit waveforms over line 824 from the beamforming circuits 820.
Fig. 9 illustrates a block diagram of the digital processing board 726 implemented in accordance with embodiments herein. The digital processing board 726 includes various processors 952-959 that perform different operations under the control of program instructions saved within corresponding memories 962-969. A master controller 950 manages operation of the digital processing board 726 and the processors 952-959. By way of example, one processor 952 may perform filtering, compounding, demodulation, compression and other operations, while another processor 953 performs color flow processing. The master controller provides probe control signals, timing control signals, communications control and the like. The master controller 950 provides real-time configuration information and synchronization signals in connection with each channel to the digital front-end boards 710.
Figs. 10-12 illustrate methods for performing compound imaging in accordance with embodiments herein. The operations of Figures 10-12 may be carried out by one or more processors of an ultrasound system in response to execution of program instructions stored in the memory of the ultrasound system. The operations of Figures 10-12 may be carried out by one or more digital signal processors (DSPs), field programmable gate arrays (FPGAs) and/or other hardware or firmware components. Optionally, the operations of Figures 10-12 may be carried out by processors within one or more servers on a network, in response to execution of program instructions stored at the server, and/or other applications stored at the server.
At 1002 a main image frame and a secondary image frame of ultrasound data representative of an ultrasound image are acquired at an ultrasound probe having a transducer array. The main and secondary image frames at least partially overlap one another. For example, one or more processors, beamformers and other hardware and software manage transmission and reception of ultrasound signals to acquire ultrasound echo signals representative of at least a portion of a patient (e.g., human or animal).
At 1004 a processor or hardware based beamformer module defines main frame boundaries for the main image frame along opposite lateral sides thereof.
For example, the image frames are acquired by acting on the steering of the ultrasound beam within a region of interest (ROI).
The region of interest includes lateral side boundaries along opposite sides of the ROI. The side boundaries project from the surface of the transducers of the ultrasound probe. The ROI also includes top and bottom boundaries that extend from side to side in directions generally common with the surface of the transducers of the ultrasound probe. As non-limiting examples, the top and bottom boundaries may extend parallel to one another or along common concentric arcs. At 1006 the processor or hardware based beamformer module defines receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array.
The lines of view within the secondary image frame are, for example, oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array, typically located offset with respect to a center of the transducer array.
At 1008 the processor or hardware based compounding module combines the main and secondary image frames to form a compound image, wherein the lines of view within the primary and/or secondary image frame are oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array considering the direction of transmission of the ultrasound beams.
Figures 13 and 14 show the position of the virtual apex or virtual source; in the secondary image frames, the virtual apex or virtual source location is offset with respect to the center of the transducer array.
Figure 13 relates to a linear transducer array where the steering is carried out by generating lines of sight which are not parallel and cover a trapezoidal field of view or image frame 1301. The virtual prolongations 1303 of the lines of sight 1302 intersect at the virtual apex, which is offset with respect to the center of the linear array; in normal conditions, in which the lines of sight are parallel, the virtual apex or virtual source lies at infinity.
Figure 14 shows the conditions in an embodiment comprising a convex transducer array. A convex transducer array has a center coinciding with the center of the curved convex surface of the transducer array. The virtual apex normally falls at the center or on an axis of symmetry passing through the said center. By displacing the virtual apex VA in relation to the center C of the transducer array as shown in Fig. 14, a steered asymmetric secondary image frame 1401 is generated in which the lines of sight 1402 have different orientations and their prolongations 1403 behind the transducer array intersect at the virtual apex VA. This point does not coincide with the center C.
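For the convex case of Figure 14, a corresponding sketch, assuming the curvature centre C sits at the origin and the virtual apex VA is shifted along the array's lateral axis; all parameter values are illustrative assumptions.

```python
import numpy as np

def convex_steered_lines(num_lines, radius_mm, arc_deg, apex_shift_mm):
    """Line-of-view directions for a convex array whose virtual apex VA is
    displaced laterally from the curvature centre C (here C = origin)."""
    angles = np.deg2rad(np.linspace(-arc_deg / 2, arc_deg / 2, num_lines))
    # Element/line origins on the convex surface, radius_mm from C.
    origins = radius_mm * np.stack([np.sin(angles), np.cos(angles)], axis=1)
    apex = np.array([apex_shift_mm, 0.0])   # VA shifted from C along x
    dirs = origins - apex                   # prolongations meet at VA, not C
    return dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

print(convex_steered_lines(5, radius_mm=40.0, arc_deg=60.0, apex_shift_mm=6.0))
```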
With reference to Fig. 11, a further embodiment is hereby disclosed. Operations 1102, 1104, 1106 are the same as seen above correspondingly referenced as 1002, 1004 and 1006.
At 1108 the processor or hardware based compounding module combines the main and secondary image frames to form a compound image, wherein at least one of the main frame boundaries for the main image frame is substantially aligned with one of the lines of view of the secondary image frame, and particularly is aligned with a secondary frame boundary of the secondary image frame. This is particularly advantageous. A typical compounding situation is one in which the main image frame is combined with two secondary image frames positioned on opposite left and right sides of the main image frame. This situation is illustrated with reference to the block diagram of Fig. 12 and Figures 5-6.
At 1202 a main image frame 314 and left and right secondary image frames 324, 326 of ultrasound data representative of an ultrasound image are acquired at an ultrasound probe having a transducer array. The left and right secondary image frames are positioned on opposite left and right sides of the main image frame as shown in Fig. 5. For example, the "left" and "right" sides of the main image frame correspond to the left and right sides from the perspective of a user viewing the compound image on a display.
At 1204 a processor or hardware based beamformer module defines main frame boundaries for the main image frame along opposite lateral sides thereof.
At 1206 the processor or hardware based beamformer module defines secondary frame boundaries for the left and right secondary image frames along opposite lateral sides thereof. The left secondary image frame has a secondary frame boundary that aligns with one of the main frame boundaries, while the right secondary image frame has a secondary frame boundary that aligns with another of the main frame boundaries. Optionally, one or more of the secondary image frames may not entirely overlap the main image frame. For example, one of the secondary image frames may only partially overlap the main image frame, while the other entirely overlaps the main image frame.
At 1208 the processor or hardware based beamformer module defines receive lines of view within the left and right secondary image frames that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array. Also in this embodiment the lines of view within the secondary image frames may be oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array, typically located offset with respect to a center of the transducer array.
At 1210 the processor or hardware based compounding module combines the main and left and right secondary image frames to form a compound image, wherein the main frame boundaries for the main image frame are substantially aligned with corresponding lines of view of the left and right secondary image frames. The result of such a combining operation is shown in Fig. 6.
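A toy version of this three-frame combining step, assuming the frames are already spatially aligned on a common sample grid and that compounding is a per-sample average over the frames covering each sample (one common choice; the embodiment does not prescribe a particular blend rule).

```python
import numpy as np

def compound_three(main, left, right, left_mask, right_mask):
    """Combine a main frame with aligned left/right secondary frames by
    averaging, sample by sample, every frame that covers each sample."""
    acc = main.astype(np.float64).copy()
    n = np.ones_like(acc)                       # main frame covers every sample
    for frame, mask in ((left, left_mask), (right, right_mask)):
        acc[mask] += frame[mask]                # add secondary data in its field
        n[mask] += 1.0
    return acc / n

main = np.full((4, 6), 100.0)
left = np.full((4, 6), 80.0);   lmask = np.zeros((4, 6), bool); lmask[:, :3] = True
right = np.full((4, 6), 120.0); rmask = np.zeros((4, 6), bool); rmask[:, 3:] = True
print(compound_three(main, left, right, lmask, rmask))  # 90s left, 110s right
```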
An embodiment provides that the main and secondary image frames include corresponding main and secondary lines of view that are defined with respect to different main and secondary virtual sources that are located behind and offset with respect to a surface of the transducer array.
In another embodiment the secondary image frame includes a lateral portion extending laterally beyond the main frame boundaries of the main image frame, while the secondary image frame entirely overlaps the main image frame such that no frame boundaries of the secondary image frame are located within the main image frame.
In a further embodiment the main image frame is substantially void of secondary frame boundaries associated with the secondary image frame.
The acquiring operation common to various embodiments may utilize a transducer array with a linear surface or a convex array with a curved surface.
When a linear array is used, the line of view of the secondary image frame that aligns with the main frame boundary typically extends perpendicularly from the linear surface of the transducer array.
When a convex array is used, the line of view of the secondary image frame that aligns with the main frame boundary extends perpendicularly from a tangent line at the point of intersection between that line of view and the convex surface of the transducer array.
According to a further embodiment, the present method can be combined with a method that allows generating, using only a linear transducer array, image frames corresponding alternatively to the ones generated with a linear transducer array and to the ones generated by a convex transducer array. This technique is disclosed in EP 1681020B1.
The method comprises the following steps:
providing a linear transducer array and a beamformer;
steering the ultrasound emitted beams by the linear array transducer so that the linear array of transducers generates a trapezoidal scanning slice or plane diverging in the direction of propagation of the beam; and
providing a reflected beam signals focussing rule which generates a trapezoidal image corresponding to the steered ultrasound beams.
The steering of the ultrasound emitted beams can be obtained, as known to the skilled person, by providing transducer driving signals having delays in driving each transducer of the array so that the emitted ultrasound beam is laterally steered outwards of the lateral limits of the imaged area defined by the projection of the linear array of transducers in a direction perpendicular to the longitudinal extension of the said linear array.
Applying certain delay rules, it is possible to focus the emitted ultrasound beam on a line which diverges laterally outside the slice or surface defined by the projection, in the direction perpendicular to it, of the longitudinal extension of the array of transducers, thus covering with the emitted ultrasound beams also two triangular zones outside the typical rectangular image zone of a linear array of transducers.
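A sketch of such a delay rule, assuming plane-wave-style lateral steering of a linear array; a steeper delay ramp steers the beam further outside the rectangular zone. Names and values are illustrative, and real delay rules may also include a focusing term.

```python
import numpy as np

def lateral_steer_delays(element_x, steer_deg, c=1540.0):
    """Linear delay ramp across a linear array that tilts the transmitted
    beam by steer_deg, pushing it laterally beyond the rectangular frame."""
    theta = np.deg2rad(steer_deg)
    tau = element_x * np.sin(theta) / c   # plane-wave steering delays
    return tau - tau.min()                # shift so all delays are non-negative

elements = (np.arange(128) - 63.5) * 0.3e-3   # 128 elements, 0.3 mm pitch
print(lateral_steer_delays(elements, steer_deg=15.0)[:4])  # delays in seconds
```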
Using a linear array of transducers it is possible to virtually generate a trapezoidal image similarly to a convex array of transducers, avoiding in this case the drawbacks relating to the oscillation of a convex array of transducers and the drawbacks of acoustic coupling problems of the said convex array of transducers. Fig. 15 shows a linear transducer array 1502. Using normal pulse excitation without lateral steering, the image frame will be almost rectangular as indicated by 150 . By applying lateral steering, the two lateral boundary lines will diverge and a trapezoidal or sector shape of the frame will be generated as indicated by 150 . A virtual apex VA will be defined by the lines of sight intersecting in it.
A system for carrying out the above method steps may have the same configuration as the system disclosed in the description with reference to figures 1, 2 and 6, by providing the beamformer and the processor with software that allows carrying out lateral steering on the transmitted waves and on the received waves.
In combination the lateral steering can be carried out in such a way as to generate secondary image frames according to the present invention and the embodiments disclosed herein.
It should be clearly understood that the various arrangements and processes broadly described and illustrated with respect to the Figures, and/or one or more individual components or elements of such arrangements and/or one or more process operations associated with such processes, can be employed independently from or together with one or more other components, elements and/or process operations described and illustrated herein. Accordingly, while various arrangements and processes are broadly contemplated, described and illustrated herein, it should be understood that they are presented by way of example only and not in a restrictive fashion, and furthermore can be regarded as but mere examples of possible working environments in which one or more arrangements or processes may function or operate.
Aspects are described herein with reference to the Figures, which illustrate example methods, devices and program products according to various example embodiments. These program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing device or information handling device to produce a machine, such that the instructions, which execute via a processor of the device, implement the functions/acts specified. The program instructions may also be stored in a device readable medium that can direct a device to function in a particular manner, such that the instructions stored in the device readable medium produce an article of manufacture including instructions which implement the function/act specified. The program instructions may also be loaded onto a device to cause a series of operational steps to be performed on the device to produce a device implemented process such that the instructions which execute on the device provide processes for implementing the functions/acts specified.
One or more of the operations described above in connection with the methods may be performed using one or more processors. The different devices in the systems described herein may represent one or more processors, and two or more of these devices may include at least one of the same processors. The operations described herein may represent actions performed when one or more processors (e.g., of the devices described herein) execute program instructions stored in memory (for example, software stored on a tangible and non-transitory computer readable storage medium, such as a computer hard drive, ROM, RAM, or the like).
The processor(s) may execute a set of instructions that are stored in one or more storage elements, in order to process data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within the controllers and the controller device. The set of instructions may include various commands that instruct the controllers and the controller device to perform specific operations such as the methods and processes of the various embodiments of the subject matter described herein. The set of instructions may be in the form of a software program. The software may be in various forms such as system software or application software. Further, the software may be in the form of a collection of separate programs or modules, a program module within a larger program or a portion of a program module. The software also may include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine. The controllers may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), logic circuitry, and any other circuit or processor capable of executing the functions described herein. When processor-based, the controller executes program instructions stored in memory to perform the corresponding operations. Additionally or alternatively, the controllers and the controller device may represent circuitry that may be implemented as hardware. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term "controller."
Optionally, aspects of the processes described herein may be performed over one or more networks or on a network server. The network may support communications using any of a variety of commercially-available protocols, such as Transmission Control Protocol/Internet Protocol ("TCP/IP"), User Datagram Protocol ("UDP"), protocols operating in various layers of the Open System Interconnection ("OSI") model, File Transfer Protocol ("FTP"), Universal Plug and Play ("UPnP"), Network File System ("NFS"), Common Internet File System ("CIFS") and AppleTalk. The network can be, for example, a local area network, a wide-area network, a virtual private network, the Internet, an intranet, an extranet, a public switched telephone network, an infrared network, a wireless network, a satellite network and any combination thereof. In embodiments utilizing a network server, the server can run any of a variety of server or mid-tier applications, including Hypertext Transfer Protocol ("HTTP") servers, FTP servers, Common Gateway Interface ("CGI") servers, data servers, Java servers, Apache servers and business application servers. The server(s) also may be capable of executing programs or scripts in response to requests from user devices, such as by executing one or more web applications that may be implemented as one or more scripts or programs written in any programming language, such as Java®, C, C# or C++, or any scripting language, such as Ruby, PHP, Perl, Python or TCL, as well as combinations thereof. The server(s) may also include database servers, including without limitation those commercially available from Oracle®, Microsoft®, Sybase® and IBM® as well as open-source servers such as MySQL, Postgres, SQLite, MongoDB, and any other server capable of storing, retrieving and accessing structured or unstructured data. Database servers may include table-based servers, document-based servers, unstructured servers, relational servers, non-relational servers or combinations of these and/or other database servers.
The embodiments described herein may include a variety of data stores and other memory and storage media as discussed above. These can reside in a variety of locations, such as on a storage medium local to (and/or resident in) one or more of the computers or remote from any or all of the computers across the network. In a particular set of embodiments, the information may reside in a storage-area network ("SAN") familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers, servers or other network devices may be stored locally and/or remotely, as appropriate. Where a system includes computerized devices, each such device can include hardware elements that may be electrically coupled via a bus, the elements including, for example, at least one central processing unit ("CPU" or "processor"), at least one input device (e.g., a mouse, keyboard, controller, touch screen or keypad) and at least one output device (e.g., a display device, printer or speaker). Such a system may also include one or more storage devices, such as disk drives, optical storage devices and solid-state storage devices such as random access memory ("RAM") or read-only memory ("ROM"), as well as removable media devices, memory cards, flash cards, etc.
Such devices also can include a computer-readable storage media reader, a communications device (e.g., a modem, a network card (wireless or wired), an infrared communication device, etc.) and working memory as described above. The computer-readable storage media reader can be connected with, or configured to receive, a computer-readable storage medium, representing remote, local, fixed and/or removable storage devices as well as storage media for temporarily and/or more permanently containing, storing, transmitting and retrieving computer-readable information. The system and various devices also typically will include a number of software applications, modules, services or other elements located within at least one working memory device, including an operating system and application programs, such as a client application or web browser. It should be appreciated that alternate embodiments may have numerous variations from that described above. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets) or both. Further, connection to other computing devices such as network input/output devices may be employed.
Various embodiments may further include receiving, sending, or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-readable medium. Storage media and computer readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as, but not limited to, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer readable instructions, data structures, program modules or other data, including RAM, ROM, Electrically Erasable Programmable Read-Only Memory ("EEPROM"), flash memory or other memory technology, Compact Disc Read-Only Memory ("CD-ROM"), digital versatile disk (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices or any other medium which can be used to store the desired information and which can be accessed by the system device. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments.
The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereunto without departing from the broader spirit and scope of the invention as set forth in the claims.
Other variations are within the spirit of the present disclosure. Thus, while the disclosed techniques are susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific form or forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions and equivalents falling within the spirit and scope of the invention, as defined in the appended claims .
The use of the terms "a" and "an" and "the" and similar referents in the context of describing the disclosed embodiments (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms "comprising," "having," "including" and "containing" are to be construed as open-ended terms (i.e., meaning "including, but not limited to,") unless otherwise noted. The term "connected," when unmodified and referring to physical connections, is to be construed as partly or wholly contained within, attached to or joined together, even if there is something intervening. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. The use of the term "set" (e.g., "a set of items") or "subset," unless otherwise noted or contradicted by context, is to be construed as a nonempty collection comprising one or more members. Further, unless otherwise noted or contradicted by context, the term "subset" of a corresponding set does not necessarily denote a proper subset of the corresponding set, but the subset and the corresponding set may be equal.
Operations of processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. Processes described herein (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs or one or more applications) executing collectively on one or more processors, by hardware or combinations thereof. The code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors. The computer-readable storage medium may be non-transitory.
Preferred embodiments of this disclosure are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate and the inventors intend for embodiments of the present disclosure to be practiced otherwise than as specifically described herein. Accordingly, the scope of the present disclosure includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the scope of the present disclosure unless otherwise indicated herein or otherwise clearly contradicted by context.
All references, including publications, patent applications and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

Claims

1. A method for generating a compound image with an ultrasound system, the method comprising:
acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another;
defining main frame boundaries for the main image frame along opposite lateral sides thereof; and
defining receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and
combining the main and secondary image frames to form a compound image, wherein the lines of view within the primary and/or secondary image frame are oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array.
2. The method of claim 1, wherein the virtual source is located offset with respect to a center of the transducer array.
3. The method of claim 1 or 2, wherein the main and secondary image frames include corresponding main and secondary lines of view that are defined with respect to different main and secondary virtual sources that are located behind and offset with respect to a surface of the transducer array.
4. The method of any preceding claim, wherein the virtual source or sources is moved from one line to the other of the first and/or second image frames.
5. The method of any preceding claim, wherein at least one of the main frame boundaries for the main image frame is substantially aligned with one of the lines of view of the secondary image frame.
6. The method of claim 5, wherein the at least one of the main frame boundaries of the main image frame aligns with a secondary frame boundary of the secondary image frame .
7. The method of any preceding claim, wherein the secondary image frame includes left and right secondary image frames positioned on opposite left and right sides of the main image frame, the left secondary image frame having a secondary frame boundary that aligns with one of the main frame boundaries, the right secondary image frame having a secondary frame boundary that aligns with another of the main frame boundaries, the left and right secondary image frames entirely overlapping the main image frame .
8. The method of any preceding claim, wherein the secondary image frame includes a lateral portion extending laterally beyond the main frame boundaries of the main image frame, while the secondary image frame entirely overlaps the main image frame such that no frame boundaries of the secondary image frame are located within the main image frame.
9. The method of any preceding claim, wherein the main image frame is substantially void of secondary frame boundaries associated with the secondary image frame.
10. The method of any preceding claim, wherein the acquiring operation utilizes a transducer array with a linear surface and wherein the one of the lines of view of the secondary image frame, that aligns with the main frame boundary, extends perpendicularly from the linear surface of the transducer array.
11. The method of any preceding claim, wherein the acquiring operation utilizes a transducer array with a convex surface and wherein the one of the lines of view of the secondary image frame, that aligns with the main frame boundary, extends perpendicularly from a tangent line at a point of intersection between the one of the lines of view and the convex surface of the transducer array.
12. An ultrasound system, comprising:
an ultrasound probe having a transducer array;
a memory storing program instructions;
a beam former configured to:
acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another ;
define main frame boundaries for the main image frame along opposite lateral sides thereof; and
define receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array;
define the lines of view within the primary and/or secondary image frame with an orientation such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array considering the direction of transmission of the ultrasound beams; and
a processor configured to execute the program instructions to:
combine the main and secondary image frames to form a compound image .
13. The system according to claim 12, wherein the beam former is configured to align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
14. The system of claim 12 or 13, wherein the beam former aligns at least one of the main frame boundaries of the main image frame with a secondary frame boundary of the secondary image frame.
15. The system of one or more of the preceding claims 12 to 14, wherein the secondary image frame includes left and right secondary image frames positioned on opposite left and right sides of the main image frame, the beam former configured to define the left secondary image frame to have a secondary frame boundary that aligns with one of the main frame boundaries, the beam former configured to define the right secondary image frame to have a secondary frame boundary that aligns with another of the main frame boundaries, the left and right secondary image frames entirely overlapping the main image frame.
16. The system of any preceding claim 12 to 15, wherein the beam former is configured to orient the lines of view within the secondary image frame such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array.
17. The system of claim 16, wherein the beam former is configured to offset the virtual source with respect to a center of the transducer array.
18. The system of any preceding claim 12 to 17, wherein the beam former is configured to define different first and second virtual sources that are located behind and offset with respect to a surface of the transducer array, wherein the main and secondary image frames include corresponding main and secondary lines of view that are defined with respect to the different main and secondary virtual sources.
19. The system of any preceding claim 12 to 18, wherein the secondary image frame includes a lateral portion extending laterally beyond the main frame boundaries of the main image frame, while the secondary image frame entirely overlaps the main image frame such that no frame boundaries of the secondary image frame are located within the main image frame.
20. The system of any preceding claim 12 to 19, wherein the main image frame is substantially void of secondary frame boundaries associated with the secondary image frame .
21. The system of any preceding claim 12 to 20, wherein the transducer array includes a linear surface and wherein the one of the lines of view of the secondary image frame, that aligns with the main frame boundary, extends perpendicularly from the linear surface of the transducer array.
22. The system of any preceding claim 12 to 21, wherein the transducer array includes a convex surface and wherein the one of the lines of view of the secondary image frame, that aligns with the main frame boundary, extends perpendicularly from a tangent line at a point of intersection between the one of the lines of view and the convex surface of the transducer array.
23. The system of any preceding claim 12 to 22, wherein the beam former is implemented by at least one of a circuit or the processor when executing the program instructions .
24. A method for generating a compound image with an ultrasound system, the method comprising:
acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another;
defining main frame boundaries for the main image frame along opposite lateral sides thereof; and
defining receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and
combining the main and secondary image frames to form a compound image, wherein at least one of the main frame boundaries for the main image frame is substantially aligned with one of the lines of view of the secondary image frame.
25. The method according to claim 24 comprising the features of one or more of the preceding claims 1 to 11.
26. An ultrasound system, comprising:
an ultrasound probe having a transducer array;
a memory storing program instructions;
a beam former configured to:
acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another;
define main frame boundaries for the main image frame along opposite lateral sides thereof; and
define receive lines of view within the secondary image frame that extend from the transducer array, wherein at least a portion of neighboring lines of view extend, into a region of interest, at different reception angles from one another relative to a surface of the transducer array; and
a processor configured to execute the program instructions to:
combine the main and secondary image frames to form a compound image; and
align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
27. The system according to claim 26, comprising the features of one or more of the preceding claims 12 to 23.
PCT/IB2016/050746 2016-02-12 2016-02-12 Method and system for generating a compound image WO2017137807A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2016/050746 WO2017137807A1 (en) 2016-02-12 2016-02-12 Method and system for generating a compound image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2016/050746 WO2017137807A1 (en) 2016-02-12 2016-02-12 Method and system for generating a compound image

Publications (1)

Publication Number Publication Date
WO2017137807A1 true WO2017137807A1 (en) 2017-08-17

Family

ID=55453230

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/050746 WO2017137807A1 (en) 2016-02-12 2016-02-12 Method and system for generating a compound image

Country Status (1)

Country Link
WO (1) WO2017137807A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4159462A (en) 1977-08-18 1979-06-26 General Electric Company Ultrasonic multi-sector scanner
US4319489A (en) 1980-03-28 1982-03-16 Yokogawa Electric Works, Ltd. Ultrasonic diagnostic method and apparatus
US4649927A (en) 1984-09-25 1987-03-17 Kontron Holding Ag Real time display of an ultrasonic compound image
US5485842A (en) 1994-11-30 1996-01-23 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic scan conversion for three dimensional display processing
US5860924A (en) 1996-11-26 1999-01-19 Advanced Technology Laboratories, Inc. Three dimensional ultrasonic diagnostic image rendering from tissue and flow images
US20040054284A1 (en) * 2002-09-13 2004-03-18 Acuson Corporation Overlapped scanning for multi-directional compounding of ultrasound images
US20040193047A1 (en) * 2003-02-19 2004-09-30 Ultrasonix Medical Corporation Compound ultrasound imaging method
EP1757954A2 (en) * 2005-08-22 2007-02-28 Medison Co., Ltd. System and method of forming an ultrasound spatial compound image
EP1681020B1 (en) 2005-01-18 2008-06-04 Esaote S.p.A. Ultrasonic imaging method and probe for 3D gynecologic inspection
EP2444821A2 (en) * 2010-10-19 2012-04-25 Samsung Medison Co., Ltd. Providing an ultrasound spatial compound image based on center lines of ultrasound images in an ultrasound system
US20120209107A1 (en) * 2010-12-27 2012-08-16 General Electric Company Method and apparatus for enhancing needle visualization in ultrasound imaging

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112401932A (en) * 2020-12-08 2021-02-26 深圳开立生物医疗科技股份有限公司 Ultrasonic extended space compound imaging method and related device
CN112401932B (en) * 2020-12-08 2023-07-07 深圳开立生物医疗科技股份有限公司 Ultrasonic expansion space compound imaging method and related device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16707966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.11.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16707966

Country of ref document: EP

Kind code of ref document: A1