US20060197780A1 - User control of 3d volume plane crop - Google Patents

User control of 3d volume plane crop

Info

Publication number
US20060197780A1
US20060197780A1
Authority
US
United States
Prior art keywords
image
volume
plane
user
cropping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/559,212
Other versions
US7656418B2 (en)
Inventor
Stephen Watkins
Steven Araiza
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to US10/559,212 priority Critical patent/US7656418B2/en
Assigned to KONINLIJKE PHILIPS ELECTRONICS N.V. reassignment KONINLIJKE PHILIPS ELECTRONICS N.V. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WATKINS, STEPHEN, ARAZIA, STEVEN
Publication of US20060197780A1 publication Critical patent/US20060197780A1/en
Application granted granted Critical
Publication of US7656418B2 publication Critical patent/US7656418B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8993Three dimensional imaging systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/22Cropping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/41Medical
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/008Cut plane or projection plane definition
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S128/00Surgery
    • Y10S128/916Ultrasound 3-D imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)
  • Image Generation (AREA)

Abstract

A method for interactive adjustment of a 3D ultrasound image of an object includes acquiring a loop of 3D image data of the object, and providing a 3D image of the object on a display for user viewing. The method also includes activating a crop mode via a user interface in response to a user input to generate a cropping plane. The cropping plane may be oriented by the user in relation to the orientation of the image of the object in image space. The cropping plane is displayed along with the 3D image. The user may manipulate the user interface to control the orientation of the cropping plane in the image space with respect to the orientation of the image.

Description

  • The present invention relates to three-dimensional (3D) ultrasound imaging, and more particularly to the implementation of a graphic plane to crop 3D volumetric images in real time.
  • Surface rendering is an imaging technique where volumetric data are converted into geometric primitives by a process such as isosurfacing, isocontouring, surface extraction or border following. The primitives, such as polygon meshes or contours, are then rendered for display using conventional geometric rendering techniques.
  • Volume rendering is an imaging technique for visualizing 3D arrays of sampled data. 3D data arrays are widely used for representing image information. Put another way, volume rendering is a technique or method of direct display of data sampled in 3D. For example, medical imaging technologies such as ultrasound can produce 3D arrays of sampled data containing detailed representations of internal organs. The basic steps of volume rendering consist of assigning a color and an opacity value to each sample in a 3D input array, projecting the samples onto an image plane and blending the samples. The foundation for such visualization is a physically based model for the propagation of light in a colored, semi-transparent material.
  • Each of the elements of the 3D array (volume) is called a voxel, which represents a small cube in space, or a point sample of a continuous scalar function. The volume rendering process comprises an approximate simulation of the propagation of light through a participating medium. The medium may be thought of as a block of colored, semi-transparent gel in which color and opacity are functions of the scalar values of the input array. Light may be absorbed, scattered or transmitted by the volume, as known to those skilled in the art. FIG. 1 is a simple physical model for volume rendering. The figure shows a light ray 156 propagating from source 150 through a cube 154 of semitransparent gel, and scattering onto a single image plane 152. Each voxel in the volume of cube 154 emits light and absorbs a fraction of the light passing through it. The value of each pixel comprising the image is computed by sampling the voxel values along the viewing ray from a point x on the image plane to a point Xb on the opposite boundary of the volume, and then numerically evaluating what is known to those skilled in the art as the volume rendering integral, as sketched below.
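  • As an illustration only (not the patent's implementation), the volume rendering integral is commonly approximated by front-to-back alpha compositing of discrete samples taken along the viewing ray. The following minimal Python/NumPy sketch assumes the per-sample colors and opacities have already been obtained from a transfer function; the early-termination threshold is likewise an illustrative assumption.

        import numpy as np

        def composite_ray(sample_rgb, sample_alpha):
            """Front-to-back compositing of samples taken along one viewing ray.

            sample_rgb   -- (N, 3) array, emitted color per sample
            sample_alpha -- (N,) array, opacity per sample, in [0, 1]
            """
            out_rgb = np.zeros(3)
            out_alpha = 0.0
            for rgb, a in zip(sample_rgb, sample_alpha):
                # Each sample emits light and absorbs a fraction of the
                # light arriving from the samples behind it.
                out_rgb += (1.0 - out_alpha) * a * rgb
                out_alpha += (1.0 - out_alpha) * a
                if out_alpha > 0.99:  # early ray termination (optimization)
                    break
            return out_rgb, out_alpha
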
  • A key advantage of volume rendering is that volume data need not be thresholded, in contrast to surface rendering techniques. Surface rendering techniques for volume data operate by fitting polygons to an isosurface in the volume (as known to those skilled in the art), and then rendering the polygonal model with traditional polygonal rendering techniques.
  • Also known to those skilled in the art, volume rendering requires that the physical volume of interest is first acquired in the form of a 3D array, including opacity values. The acquisition process may include preparation steps such as resampling the volume to a regular grid, interpolating missing voxel values, and applying image-processing operators to improve contrast. Typically, opacity values are assigned values from 0 to 255 in a classification step, as sketched below. The array is typically stored as a number z of x-y planes, where each z plane is a 2D slice in memory. An alternative method would be to partition the volume into specific structures using a segmentation algorithm, and then assign opacity values to the segmented structures.
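  • A hedged sketch of the classification step just described: scalar voxel values are mapped through lookup tables (the names and the 8-bit 0-255 range are assumptions) to per-voxel color and opacity, with the volume held as a stack of x-y slices along z.

        import numpy as np

        def classify(volume_u8, color_lut, opacity_lut):
            """volume_u8: (Z, Y, X) uint8 scalars; LUTs: 256-entry tables."""
            rgb = color_lut[volume_u8]      # (Z, Y, X, 3), color per voxel
            alpha = opacity_lut[volume_u8]  # (Z, Y, X), opacity 0..255
            return rgb, alpha

        # Example lookup tables: grayscale color, linear opacity ramp.
        color_lut = np.stack([np.arange(256, dtype=np.uint8)] * 3, axis=-1)
        opacity_lut = np.arange(256, dtype=np.uint8)
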
  • After classifying the data, a shading function is typically employed to specify the illumination model and a rule for determining the color of each voxel. Careful use of visual cues, such as specular highlights, depth cueing and shadows, greatly improves the effectiveness of a visualization. The particular rendering system then chooses the viewing parameters such as viewpoint, type of projection (parallel or perspective), clipping (i.e., cropping) planes, etc. Prior art volume cropping techniques typically are implemented pre-rendering.
  • Many acceleration techniques are known for volume rendering, the most successful of which use spatial data structures. Philippe G. Lacroute, FAST VOLUME RENDERING USING SHEAR-WARP FACTORIZATION OF THE VIEWING TRANSFORMATION, Computer Systems Laboratory, Depts. of Electrical Engineering and Computer Science, Stanford University (1995). There are four major classes of volume rendering algorithms: ray casting, splatting, cell projection and multi-pass resampling. Ray casting algorithms produce an image by casting a ray through the volume for each image pixel and integrating the color and opacity along the ray. Ray casters are sometimes called backward projection algorithms since they calculate the mapping of voxels to image pixels by projecting the image pixels along viewing rays into the volume. Light rays flow forward from the volume to the image, whereas viewing rays flow backward from the image into the volume.
  • In contrast to ray casting algorithms, splatting algorithms operate by iterating over the voxels. More particularly, the splatting algorithm computes the contribution of a voxel to the image by convolving the voxel with a filter that distributes the voxel's value to a neighborhood of pixels. Splatting algorithms may be described as forward projection, since the voxels are projected directly into the image in the same direction as the light rays; a minimal sketch follows.
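  • The splatting sketch below assumes a parallel projection straight down the z axis and a small Gaussian footprint; it accumulates purely emissive contributions (opacity handling and edge treatment are simplified, with np.roll wrapping at the borders).

        import numpy as np

        def splat(volume, sigma=0.8, radius=1):
            """volume: (Z, Y, X); orthographic forward projection along z."""
            image = np.zeros(volume.shape[1:])
            for z_slice in volume:                 # iterate over the voxels
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        # Gaussian footprint weight for this pixel offset
                        w = np.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))
                        image += w * np.roll(np.roll(z_slice, dy, axis=0),
                                             dx, axis=1)
            return image
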
  • Cell projection techniques are often used for volumes sampled on non-regular grids, and use polygon scan conversion to perform the projection. Multi-pass resampling algorithms operate by resampling the entire volume to the image coordinate system so that the re-sampled voxels line up behind each other on the viewing axis in image space. The voxels can then be composited together along the viewing axis as in a ray caster, except that in the re-sampled volume the viewing rays are always axis aligned. The viewing transformation is factored into a sequence of simple shears and scales, which are then applied to the volume in separate passes. Each shear or scale of the volume may be implemented with a scanline-order algorithm and a 1D resampling filter. In this manner, for example, affine transformations may be implemented using three passes, as sketched below. The first pass re-samples scanlines in the x direction of the volume. The new volume then becomes the input to the second pass, which re-samples the scanlines in the y direction. The result then feeds into the third pass, which re-samples scanlines in the z direction.
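  • The first of the three passes might look as follows; this is a sketch under an assumed shear coefficient, using linear interpolation as the 1D resampling filter. Passes two and three re-sample scanlines in y and z in exactly the same manner.

        import numpy as np

        def resample_1d(line, shift, scale=1.0):
            """Re-sample one scanline with a 1D linear interpolation filter."""
            n = line.size
            coords = (np.arange(n) - shift) / scale  # source coordinate per output sample
            i0 = np.clip(np.floor(coords).astype(int), 0, n - 1)
            i1 = np.clip(i0 + 1, 0, n - 1)
            t = np.clip(coords - i0, 0.0, 1.0)
            return (1.0 - t) * line[i0] + t * line[i1]

        def pass_x(volume, shear_x_per_y):
            """First pass: shear in x, with the shift growing linearly in y."""
            out = np.empty_like(volume, dtype=float)
            for z in range(volume.shape[0]):
                for y in range(volume.shape[1]):
                    out[z, y] = resample_1d(volume[z, y].astype(float),
                                            shear_x_per_y * y)
            return out
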
  • The arbitrary nature of the mapping from object space to image space complicates efficient, high-quality filtering and projection in object-order volume rendering algorithms; this is solved by transforming the volume to an intermediate coordinate system. Such a coordinate system is referred to in the art as sheared object space, where all viewing rays are parallel to a third coordinate axis. FIGS. 2A, 2B together show a perspective transformation. Horizontal lines 164 in the figures represent slices of the volume data viewed in cross section. Rays 166, in FIG. 2A, are shown emanating from source 162 through imaging plane 160 in volumetric object space (sampled data). After transformation, shown in FIG. 2B, the volume has been sheared parallel to the set of slices that is most perpendicular to the viewing direction, and scaled as well as translated, so that the viewing rays 166′ are perpendicular to the slices. The reason for first computing a distorted intermediate image is that the properties of the factorization result in a very efficient implementation of the resampling and compositing loop. That is, scanlines of voxels in each voxel slice are parallel to scanlines of pixels in the intermediate image.
  • Conventional 3D scan conversion algorithms can use data structures similar to what are known as mixed-data-set rendering algorithms, wherein edge tables and active edge tables keep track of which cropping planes intersect the current voxel scanlines. The slopes of the cropping planes are used to calculate the intersection points incrementally as the algorithm iterates through voxel scanlines, as in the sketch below. The intersection points determine the bounds for the loop that iterates over voxels in a voxel scanline.
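  • The following sketch illustrates the incremental bookkeeping (the variable names and half-space convention are assumptions): for a cropping plane nx*x + ny*y + nz*z = d with nx > 0, the x coordinate at which the plane crosses each (y, z) scanline is updated by constant steps derived from the plane's slopes, with no per-scanline division.

        def scanline_bounds(nx, ny, nz, d, x_dim, y_dim, z_dim):
            """Yield (z, y, x_start, x_end) bounds keeping the half-space
            nx*x + ny*y + nz*z <= d; assumes nx > 0."""
            step_y = -ny / nx            # change in x-crossing per y step
            step_z = -nz / nx            # change in x-crossing per z step
            x_cross_z = d / nx           # crossing at y = 0 of slice z = 0
            for z in range(z_dim):
                x_cross = x_cross_z
                for y in range(y_dim):
                    x_end = min(x_dim, max(0, int(x_cross) + 1))
                    yield z, y, 0, x_end   # render voxels 0 .. x_end - 1
                    x_cross += step_y      # incremental update
                x_cross_z += step_z
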
  • In volumetric analysis, the object under study is often a simple, more-or-less convex shape like a brain, heart, or engine block. A “Plane Crop” removes the entire portion of the volume that lies on a specified side of a plane in three-dimensional space (see the sketch below). Cropping planes are commonly used in volume visualization applications to provide cutaway views of interior structures in a data set, or to remove extraneous information while focusing on a particular part of a data set. Cropping planes with arbitrary orientations can be added to, for example, shear-warp volume rendering algorithms by using 3D scan conversion algorithms to convert the cropping planes into bounds for rendering loops.
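  • A minimal sketch of a plane crop over a voxel array; the axis convention and the choice to zero (rather than skip) the removed voxels are illustrative assumptions.

        import numpy as np

        def plane_crop(volume, n, d):
            """Remove every voxel p with n . p > d; volume indexed (z, y, x)."""
            z, y, x = np.indices(volume.shape)
            signed = n[0] * x + n[1] * y + n[2] * z - d
            cropped = volume.copy()
            cropped[signed > 0] = 0      # the specified side of the plane
            return cropped
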
  • One known method for visualizing a volume using a cropping or slicing plane technique is shown in U.S. Pat. No. 5,454,371 (“the '371 patent”). The '371 patent provides an interactive user interface, which allows a user to manipulate a displayed image of a slice selected from a volume comprising a plurality of slices. The '371 patent technique, which is clearly distinguished from volume rendering, allows a user to rotate slices of a 3D image about an arbitrary axis. That is, a surface portion of a displayed image may be translated to provide different cross-sectional views of the image, and a selected surface of the displayed image may be rotated about an arbitrary axis, all using a graphical user interface.
  • The '371 patent performs a volume image reconstruction and stores the reconstructed volumetric image, that is, a volumetric image array, in an external file storage memory and/or displays it for manipulation using Fenster's graphical user input device, i.e., a mouse. A routine implemented by the '371 patent assigns the volumetric image display a model in the form of a convex polyhedron having a plurality of planar faces defined in the same space as the 3D image to be displayed, typically a right parallelepiped, substantially enclosing all of the pixels in the volumetric image array. The model is then projected onto the screen of the monitor within a rectangular sub-region of the full screen display. Only visible faces of the model are displayed on the screen. Hidden-surface elimination is performed so the displayed model has an opaque appearance.
  • The display of each visible model face is accomplished in view of the fact that each screen pixel within the polygonal area of the displayed face has an associated 2D Cartesian coordinate pair, extendable to 3D. The 3D coordinates are then converted to voxel coordinates to select voxels in the volumetric image array. The extracted voxels are processed according to what is known in the art as texture mapping. The correspondence between display coordinates and volumetric image coordinates is determined by what Fenster defines as a viewing transformation. The Fenster patent further defines that the particulars of the viewing transformation are recomputed each time the user, using the graphical user interface, changes parameters such as angle of view, display scale, etc.
  • After the 3D image and model are displayed on the screen of the monitor, the initial view is saved in memory. While viewing, all user manipulation is accomplished using three actions defined as “point”, “drag” and “click.” The user may rotate the entire model and 3D image about an arbitrary axis, translate a selected plane of the model, and rotate a selected plane of the model about an arbitrary axis. The fixed point of rotation for the 3D image is constrained to be the geometrical center of the initial model.
  • The '371 patent describes the manipulations of the model with respect to the mathematical description of a convex polyhedron. A convex polyhedron is characterized by a set of half-spaces defined by at least four planes, referred to by the Fenster patent as bounding planes. It describes each face of the polyhedron as a convex polygon embedded in a corresponding bounding plane, where any change in the shape of the model polyhedron is implemented by changing the parameters of the bounding planes.
  • The '371 patent teaches two primary manipulations of the bounding plane coefficients. The user may change D, which is the perpendicular distance from the plane to the coordinate origin, and rotate. Rotation requires a change in the 3D coefficients, which collectively specify the orientation of the plane relative to the coordinate axes. The '371 patent distinguishes between original bounding planes assigned to the model of the volumetric image array, and planes added in response to user input. Model faces corresponding to original bounding planes have their perimeter lines displayed as white lines, while faces corresponding to user added planes are indicated in another color. Only user added planes may be translated, rotated or deleted. As a user added plane is translated by a user via the graphical user interface, various cross sections of the image may be viewed as the translated plane slices through the volumetric image array.
  • While the '371 patent's graphical input device is moved to effect changes in the displayed view and the display is updated showing intermediate positions and orientations of the affected plane or planes, the patent's display module must re-sample the volumetric image display and complete the texture mapping process. The 3D rendering described in the '371 patent may be characterized as a crude simulation of volume rendering where exterior voxels are “painted” onto a rough polygonal approximation of the volume. The technique must be distinguished from conventional volume rendering, which shows the combined effect of all of the voxels in the volume from a particular point of view. In contrast with Fenster's '371 patent, the present invention includes a graphic plane, referred to interchangeably herein as a cropping plane, to crop out unwanted parts of a 3D image in real time, as well as an elegant user interface for controlling the cropping planes.
  • The present invention is directed to cropping a 3D physical volume, e.g., heart, kidney, etc., derived from acquired ultrasound data, and rendering the cropped volume for visualization in real time. Cropping is more powerful than slicing because cropping allows internal 3D structure to be viewed. The invention further provides for effective user control of the 3D volumetric cropping. To do so, the invention defines a cropping plane as a tangent to a sphere centered at the center of gravity of the object being imaged. That is, the center of gravity of the object to be imaged is assumed to be (0,0,0) in Cartesian coordinates. The boundaries of the object may be assumed to represent a rectangle where the sphere with a radius, R, circumscribes the rectangle (object). In such a representation, the diameter of the sphere is defined as the longest diagonal of the rectangle, passing through the origin (0,0,0). The radius of the sphere is one half the diagonal, as computed in the sketch below. It follows that the sphere fully encompasses the object's volume and is thus referred to herein as the “bounding sphere”.
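  • A sketch of the bounding-sphere construction just described, where the extents are the full side lengths of the object's bounding rectangle (box) centered at the origin:

        import numpy as np

        def bounding_sphere_radius(extent_x, extent_y, extent_z):
            """R is half the longest diagonal, so the sphere circumscribes
            the box and fully encompasses the object's volume."""
            diagonal = np.sqrt(extent_x**2 + extent_y**2 + extent_z**2)
            return diagonal / 2.0        # default cropping-plane distance
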
  • The cropping plane is defined in space with respect to a unit normal vector emanating from (0,0,0); the cropping plane is perpendicular to the unit normal vector. By default, at initiation of the cropping feature, the plane's distance along the unit normal vector is equal to the radius described above, such that the cropping plane is tangent to the sphere. As such, no cropping occurs without user input. By changing the distance along the unit normal vector, i.e., the spherical radius, the cropping plane is moved towards or away from the center of the object being imaged.
  • The present invention provides that the graphic plane or image appears different on each side of the cropping plane. For example, the colors green and purple may be used to distinguish voxels defining the graphic plane on each side of the cropping plane. Those skilled in the art should realize that other colors, patterns, intensities, etc., also may be used to distinguish between the front and back of the plane. In all of the inventive embodiments, the relationship of the plane to the image is locked and determines the portion of the image that is cropped. With respect to that portion of space where the distance along the unit normal vector is less than the radius of the bounded object, the voxels on the outside of the plane are zeroed out, as in the sketch below.
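  • The following hedged sketch ties these paragraphs together (the names and index-centering step are assumptions): the plane sits at distance d along the unit normal, with d = R by default so that nothing is cropped, and reducing d below R zeroes out the voxels beyond the plane.

        import numpy as np

        def apply_crop(volume, normal, d, R):
            """normal: unit vector from the volume center; d: plane distance."""
            z, y, x = np.indices(volume.shape).astype(float)
            cz, cy, cx = (s / 2.0 for s in volume.shape)  # center grid at (0,0,0)
            signed = (normal[0] * (x - cx) + normal[1] * (y - cy)
                      + normal[2] * (z - cz)) - d
            out = volume.copy()
            if d < R:                    # plane has moved inside the sphere
                out[signed > 0] = 0      # zero voxels outside the plane
            return out
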
  • The user interface is such that the plane may be easily and readily manipulated with just the trackball, and user-initiated view changes, image reconstructions and display of the changed image may be implemented and viewed in real time. Quite simply, the user merely changes the radius of the sphere, controlling/defining the depth at which the plane cuts into the image. Such an arrangement is quite convenient when viewing normally spherical shaped organs, such as the human heart.
  • An understanding of the present invention can be gained from the following detailed description of the invention, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 is a prior art model for 3D volume rendering;
  • FIG. 2 is a prior art depiction of a transform of a volume into sheared object space to show a perspective projection;
  • FIG. 3 is a simplified block diagram of an ultrasound system, which may be used to implement the present invention;
  • FIG. 4 is a screen shot of a 3D image of an object;
  • FIG. 5 is a screen shot of the 3D image of the object shown in FIG. 4, where the crop mode feature of the present invention has been initiated by a user, and the cropping plane may be seen in the foreground of the 3D image space;
  • FIG. 6 is a screen shot of the 3D image where the cropping plane has been adjusted by the user to crop out 40% of the 3D image of the object;
  • FIG. 7 is a screen shot of the 3D image where the cropping plane has been adjusted by the user to crop out 60% of the 3D image of the object;
  • FIG. 8 is a screen shot of the 3D image where the cropping plane has been rotated upward to change an orientation of the image;
  • FIG. 9 is a screen shot of the 3D image with a plane lock button pressed, the trackball rotated to the right with no crop change; and
  • FIG. 10 is a screen shot of the 3D image with the save crop button pressed, where the volume is shown with the crop applied.
  • The detailed description of preferred apparatus and methods that follows is presented in terms of routines and symbolic representations of operations of data bits within a memory, associated processors, and possibly networks, and network devices.
  • These descriptions and representations are the means used by those skilled in the art to effectively convey the substance of their work to others skilled in the art. A routine is here, and generally, conceived to be a self-consistent sequence of steps or actions leading to a desired result. Thus, the term “routine” is generally used to refer to a series of operations performed by a processor, be it a central processing unit of an ultrasound system or a secondary processing unit of such an ultrasound system, and as such encompasses such terms of art as “program,” “objects,” “functions,” “subroutines,” and “procedures.”
  • In general, the sequence of steps in the routines requires physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. Those of ordinary skill in the art conveniently refer to these signals as “bits”, “values”, “elements”, “symbols”, “characters”, “images”, “terms”, “numbers”, or the like. It should be recognized that these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities.
  • In the present case, the routines and operations are machine operations to be performed in conjunction with human operators. Useful machines for performing the operations of the present invention include the SONOS line of ultrasound systems, commonly owned by the owner of this invention, and other similar devices. In general, the present invention relates to method steps, software, and associated hardware including computer readable medium, configured to store and/or process electrical or other physical signals using the routines described herein to generate other desired physical signals.
  • The apparatus set forth in the present application is preferably specifically constructed for the required purpose, i.e., ultrasound imaging, but the methods described herein may be embodied on a general purpose computer or other network device selectively activated or reconfigured by a routine stored in the computer and interfaced with the necessary ultrasound imaging equipment. The procedures presented herein are not inherently related to any particular ultrasonic system, computer or other apparatus. In particular, various machines may be used with routines in accordance with the teachings herein, or it may prove more convenient to construct more specialized apparatus to perform the required method steps. In certain circumstances, when it is desirable that a piece of hardware possess certain characteristics, these characteristics are described more fully in the following text. The required structures for a variety of these machines may appear in the description given below. Machines which may perform the functions of the present invention include those manufactured by such companies as PHILIPS MEDICAL SYSTEMS INTERNATIONAL, GE MEDICAL SYSTEMS, and SIEMENS MEDICAL SYSTEMS, as well as other manufacturers of ultrasound equipment.
  • With respect to the software described herein, those of ordinary skill in the art will recognize that there exist a variety of platforms and languages for creating software for performing the procedures outlined herein. Those of ordinary skill in the art also recognize that the choice of the exact platform and language is often dictated by the specifics of the actual system constructed, such that what may work for one type of system may not be efficient on another system.
  • FIG. 3 is a simplified block diagram of an ultrasound imaging system 100 in accordance with a preferred embodiment of the present invention. It will be appreciated by those of ordinary skill in the relevant arts that the ultrasound imaging system 100, and the operation thereof as described hereinafter, is intended to be generally representative of medical imaging systems. Any particular system may differ significantly from that shown in FIG. 3, particularly in the details of construction and operation of such system. As such, the ultrasound imaging system 100 is to be regarded as illustrative and exemplary and not limiting as regards the invention described herein or the claims attached hereto.
  • The FIG. 3 ultrasound imaging system 100 shows a transmit beamformer 110 coupled through a transmit/receive (T/R) switch 112 to a transducer array 114. Transducer array 114 includes an array of transducer elements, typically two-dimensional (2D) for use in 3D imaging. The T/R switch 112 normally includes one switch element for each transducer element. The transmit beamformer 110 receives pulse sequences from a pulse generator 116. The transducer array 114, energized by the transmit beamformer 110, transmits ultrasound energy into a region of interest (ROI) in a patient's body and receives reflected ultrasound energy, or echoes, from various structures and organs within the patient's body. As is known in the art, by appropriately delaying the waveforms applied to each transducer element by the transmit beamformer 110, a focused ultrasound beam is transmitted.
  • The transducer array 114 is also coupled, through the T/R switch 112, to a receive beamformer 118. Ultrasound energy from a given point within the patient's body is received by the transducer elements at different times. The transducer elements convert the received ultrasound energy to transducer signals, which may be amplified, individually delayed and then summed by the receive beamformer 118, as sketched below. Such operation provides a beamformer signal that represents the received ultrasound level along a desired receive line. The receive beamformer 118 may be a digital beamformer including an analog-to-digital converter for converting the transducer signals to digital values. As known in the art, the delays applied to the transducer signals may be varied during reception of ultrasound energy to effect dynamic focusing. The process is repeated for multiple scan lines to provide signals for generating an image of the region of interest in the patient's body and, therefore, implement the 3D imaging.
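  • A hedged sketch of the delay-and-sum operation just described (the sample-rate handling, geometry and names are illustrative assumptions): each element's signal is individually delayed so that echoes from the desired point arrive aligned, then the signals are summed into one beamformed sample stream.

        import numpy as np

        def delay_and_sum(element_signals, delays_in_samples):
            """element_signals: (n_elements, n_samples) digitized signals;
            delays_in_samples: one whole-sample delay per element."""
            n_elements, n_samples = element_signals.shape
            out = np.zeros(n_samples)
            for signal, delay in zip(element_signals, delays_in_samples):
                out += np.roll(signal, -int(delay))   # align, then sum
            return out / n_elements
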
  • The beamformer signals are applied to a signal processor 124, which processes the beamformer signals for improved image quality. The signal processor may include processes such as harmonic processing. The receive beamformer 118 and the signal processor 124 constitute an ultrasound receiver 126. The signal processor is where actual 3D reconstruction takes place within the system. The output of the signal processor 124 is supplied to a scan converter 128, which converts sector scan or other scan pattern signals to conventional raster scan display signals. The output of the scan converter 128 is buffered for eventual display.
  • A system controller 130 provides overall control of the system. The system controller 130 performs timing and control functions and typically includes a microprocessor operating under the control of control routines 132, stored in a memory 134. As will be discussed in detail below, the control routines 132, in addition to known control routines, include a variety of routines to create, store, index, and synchronize digitized audio information. The system controller 130 also utilizes the memory 134 to store intermediate values, including system variables describing the operation of the ultrasound imaging system 100, and to buffer various outputs, including the output of the scan converter 128.
  • An Input/Output unit 136 (hereinafter referred to as “user interface”) controls a variety of input and output operations via, for example, a conventional trackball or mouse (not shown in the figure). The user interface of this invention provides several interactive controls, preferably implemented by a trackball.
  • After a loop is acquired, there are no crops applied to the volume. The user indicates the desire to manipulate cropping by turning the crop mode selector to ON. At this point the cropping plane appears at its default location, typically rotated to the side to prevent obscuring the volume. The default location is always defined by a radius, R, a fixed distance from the center of the volume at (0,0,0) which defines a sphere completely encompassing the entire volume. The default plane is tangential to the sphere defined by the radius, and crops out nothing.
  • FIG. 4 shows the 3D volume of an object being imaged, where FIG. 5 shows the same object imaged in FIG. 4 with the cropping mode enabled. In the embodiment shown in FIG. 5, the plane appears in purple and parallel to the plane of the screen. A “crop mode” selector provided by the inventive user interface turns control of the cropping plane on and off. Once the user initiates the crop mode, the trackball is used to control the orientation of the volume/plane. No cropping has yet been implemented because the radius of the sphere, and therefore the position of the plane with respect to the center (0,0,0), still merely bounds rather than crops the object.
  • The user can move the crop cursor around the volume by moving the trackball, which will move the plane around on the surface of the bounding sphere. The crop adjust knob will move the plane toward or away from the center of the volume, decreasing or increasing R. As the plane moves, the volume will be cropped interactively. As an aid to visualization, the portion of the volume where the cropping plane intersects the volume will be shown as grayscale, while the remainder of the volume (which is behind the plane) will be tinted the same color as the plane.
  • By rotating the crop adjustment knob of the user interface, the percentage of the cropping is adjusted: rotating counterclockwise moves the cropping plane towards the origin (0,0,0), decreasing R, while rotating clockwise moves the cropping plane away from the origin (0,0,0), increasing R (see the sketch below). FIG. 6 shows the 3D volume with approximately 40% of the image on the user side of the cropping plane cropped away. FIG. 7 shows the image where the cropping is further adjusted to 60%. A desired feature is seen as the darkened area in the upper portion of the object volume.
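  • An illustrative mapping (the gains and sign conventions are assumptions) from the trackball and crop adjust knob to the plane parameters: the trackball steers the unit normal around on the bounding sphere, while knob clicks change the plane distance d between 0 and R.

        import numpy as np

        def update_plane(normal, d, track_dx, track_dy, knob_clicks, R,
                         angle_gain=0.01, depth_gain=0.02):
            """normal: length-3 unit array; d: current plane distance."""
            # Trackball: small rotations about the screen axes move the
            # normal around on the surface of the bounding sphere.
            yaw, pitch = track_dx * angle_gain, track_dy * angle_gain
            cy, sy = np.cos(yaw), np.sin(yaw)
            cp, sp = np.cos(pitch), np.sin(pitch)
            rot_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
            rot_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
            normal = rot_pitch @ rot_yaw @ normal
            normal /= np.linalg.norm(normal)
            # Knob: counterclockwise (negative clicks) decreases d, cropping
            # deeper; clockwise increases d back toward the tangent position.
            d = float(np.clip(d + knob_clicks * depth_gain * R, 0.0, R))
            return normal, d
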
  • Another feature of the user interface of this invention includes the ability to rotate the cropping plane to change the orientation of the view of the feature. FIG. 8 shows the image of the object where cropping has been rotated by moving the track ball upward.
  • A “Lock Plane to Volume” toggle controls whether the cropping plane rotates independently of, or in conjunction with, the volume. Such operation is highlighted in the screen shot shown in FIG. 9. When the “Lock Plane To Volume” button is ON (plane lock button), the crop cursor is locked to the volume and will therefore rotate as the volume is rotated in order to maintain a fixed relationship to the volume. If the “cursor locked” button is OFF, then the cursor will remain fixed in place and the volume will rotate, thus changing the position of the crop with respect to the volume. The crop will be undone at the current location and redone at the new location, as in the sketch below.
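  • A sketch of the lock behavior, assuming the volume's orientation is carried as a 3x3 rotation matrix: when locked, the plane normal rotates together with the volume so the crop keeps a fixed relationship to the anatomy; when unlocked, the normal stays put and the crop is re-applied at its new position within the volume.

        import numpy as np

        def rotate_volume(volume_rot, plane_normal, delta_rot, plane_locked):
            """delta_rot: incremental 3x3 rotation from the trackball."""
            volume_rot = delta_rot @ volume_rot
            if plane_locked:
                plane_normal = delta_rot @ plane_normal  # plane follows volume
            # else: plane_normal is unchanged, so the crop is undone at its
            # old location in the volume and redone at the new one.
            return volume_rot, plane_normal
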
  • The user is then free to change the cropping amount and location as desired until satisfied with the current state of the crop, at which point the user can hit the “store” button. This will add the current crop to the crop list associated with the volume and then begin a new crop (see the sketch below). The current crop can be abandoned by dialing the crop position back to 0 (OFF).
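  • A minimal sketch of the crop-list bookkeeping (the data structure is an assumption, not the patent's implementation):

        crops = []                  # crops already committed to the volume

        def store_crop(normal, d):
            """The 'store' button: commit the current crop, begin a new one."""
            crops.append((tuple(normal), float(d)))

        def abandon_crop(R):
            """Dialing the crop position back to 0 (OFF) abandons the crop:
            d returns to R, the tangent position, which crops out nothing."""
            return R
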
  • Another feature of the user interface of the present invention is the “save crop” feature, which allows the user to bind the current crop setting to the volume. FIG. 10 is a screen shot which highlights this feature of the invention, allowing the user to save a particular crop. The volume shown in the figure is displayed with the crop applied.
  • Although a few examples of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes might be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (6)

1. A method for interactive adjustment of a 3D ultrasound image of an object, including the following steps:
acquiring a loop of 3D image data of the object, and providing a 3D image of the object on a display for user viewing;
activating a crop mode via a user interface in response to a user input, wherein a cropping plane is generated and oriented in relation to the orientation of the image of the object in image space, and the cropping plane is displayed along with the 3D image, and wherein the user may manipulate the user interface to control the orientation of the cropping plane in the image space with respect to the orientation of the image.
2. The method set forth in claim 1, wherein the step of activating includes using a trackball.
3. The method set forth in claim 1, wherein the step of activating includes implementation of a save crop setting, wherein a current crop, defined by a current user defined cropping plane orientation, is bound to the object volume.
4. The method set forth in claim 1, wherein the step of activating includes implementation of a feature, which determines whether the cropping plane is locked to the volume of the object being imaged, where instead of rotating independently, rotation of the cropping plane rotates the 3D image.
5. The method set forth in claim 1, wherein the step of activating includes turning the crop mode on and off.
6. A computer readable medium for containing a set of computer instructions for implementing a method for interactive adjustment of a 3D ultrasound image of an object, the method including the following steps:
acquiring a loop of 3D image data of the object, and providing a 3D image of the object on a display for user viewing;
activating a crop mode via a user interface in response to a user input, wherein a cropping plane is generated and oriented in relation to the orientation of the image of the object in image space, and the cropping plane is displayed along with the 3D image, and wherein the user may manipulate the user interface to control the orientation of the cropping plane in the image space with respect to the orientation of the image.
US10/559,212 2003-06-11 2004-06-07 User control of 3d volume plane crop Active 2024-11-15 US7656418B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/559,212 US7656418B2 (en) 2003-06-11 2004-06-07 User control of 3d volume plane crop

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US47754303P 2003-06-11 2003-06-11
PCT/IB2004/050858 WO2004109603A1 (en) 2003-06-11 2004-06-07 User control of 3d volume plane crop
US10/559,212 US7656418B2 (en) 2003-06-11 2004-06-07 User control of 3d volume plane crop

Publications (2)

Publication Number Publication Date
US20060197780A1 true US20060197780A1 (en) 2006-09-07
US7656418B2 US7656418B2 (en) 2010-02-02

Family

ID=33511853

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/559,212 Active 2024-11-15 US7656418B2 (en) 2003-06-11 2004-06-07 User control of 3d volume plane crop

Country Status (5)

Country Link
US (1) US7656418B2 (en)
EP (1) EP1636761A1 (en)
JP (1) JP4510817B2 (en)
CN (1) CN100385467C (en)
WO (1) WO2004109603A1 (en)

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040218796A1 (en) * 2003-02-05 2004-11-04 Ryner Lawrence N Magnetic resonance spectroscopy using a conformal voxel
US20050240104A1 (en) * 2004-04-01 2005-10-27 Medison Co., Ltd. Apparatus and method for forming 3D ultrasound image
US20060239523A1 (en) * 2005-04-05 2006-10-26 Bradley University Radiographic imaging display apparatus and method
US20090003665A1 (en) * 2007-06-30 2009-01-01 General Electric Company Method and system for multiple view volume rendering
US20090160985A1 (en) * 2007-12-10 2009-06-25 The University Of Connecticut Method and system for recognition of a target in a three dimensional scene
US7567248B1 (en) * 2004-04-28 2009-07-28 Mark William R System and method for computing intersections between rays and surfaces
US7679625B1 (en) * 2005-01-07 2010-03-16 Apple, Inc. Straightening digital images
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US8189002B1 (en) * 2004-10-29 2012-05-29 PME IP Australia Pty, Ltd. Method and apparatus for visualizing three-dimensional and higher-dimensional image data sets
US20120206448A1 (en) * 2011-02-11 2012-08-16 Embrey Cattle Co. System and method for modeling a biopsy specimen
US20130106913A1 (en) * 2011-10-28 2013-05-02 Microsoft Corporation Image layout for a display
US8475375B2 (en) 2006-12-15 2013-07-02 General Electric Company System and method for actively cooling an ultrasound probe
US20130222383A1 (en) * 2010-11-12 2013-08-29 Hitachi Medical Corporation Medical image display device and medical image display method
TWI411299B (en) * 2010-11-30 2013-10-01 Innovision Labs Co Ltd Method of generating multiple different orientation images according to single image and apparatus thereof
US20130257870A1 (en) * 2012-04-02 2013-10-03 Yoshiyuki Kokojima Image processing apparatus, stereoscopic image display apparatus, image processing method and computer program product
US20130328874A1 (en) * 2012-06-06 2013-12-12 Siemens Medical Solutions Usa, Inc. Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging
US8775510B2 (en) 2007-08-27 2014-07-08 Pme Ip Australia Pty Ltd Fast file server methods and system
US20150062177A1 (en) * 2013-09-02 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus for fitting a template based on subject information
US8976190B1 (en) 2013-03-15 2015-03-10 Pme Ip Australia Pty Ltd Method and system for rule based display of sets of images
US9019287B2 (en) 2007-11-23 2015-04-28 Pme Ip Australia Pty Ltd Client-server visualization system with hybrid data processing
US9355616B2 (en) 2007-11-23 2016-05-31 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US20160225180A1 (en) * 2015-01-29 2016-08-04 Siemens Medical Solutions Usa, Inc. Measurement tools with plane projection in rendered ultrasound volume imaging
US9454813B2 (en) 2007-11-23 2016-09-27 PME IP Pty Ltd Image segmentation assignment of a volume by comparing and correlating slice histograms with an anatomic atlas of average histograms
US9509802B1 (en) 2013-03-15 2016-11-29 PME IP Pty Ltd Method and system FPOR transferring data to improve responsiveness when sending large data sets
US9892566B2 (en) 2011-02-07 2018-02-13 Fujifilm Corporation Image processing apparatus, method and program
US9904969B1 (en) 2007-11-23 2018-02-27 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9984478B2 (en) 2015-07-28 2018-05-29 PME IP Pty Ltd Apparatus and method for visualizing digital breast tomosynthesis and other volumetric images
US10070839B2 (en) 2013-03-15 2018-09-11 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US10311541B2 (en) 2007-11-23 2019-06-04 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US10540803B2 (en) 2013-03-15 2020-01-21 PME IP Pty Ltd Method and system for rule-based display of sets of images
US10909679B2 (en) 2017-09-24 2021-02-02 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US11183292B2 (en) 2013-03-15 2021-11-23 PME IP Pty Ltd Method and system for rule-based anonymized display and data export
US11244495B2 (en) 2013-03-15 2022-02-08 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US11386606B2 (en) * 2018-04-11 2022-07-12 Koninklijke Philips N.V. Systems and methods for generating enhanced diagnostic images from 3D medical image
US11599672B2 (en) 2015-07-31 2023-03-07 PME IP Pty Ltd Method and apparatus for anonymized display and data export
US11972024B2 (en) 2023-02-14 2024-04-30 PME IP Pty Ltd Method and apparatus for anonymized display and data export

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2524303T3 (en) 2006-05-08 2014-12-05 C.R. Bard, Inc. User interface and methods for an ultrasound presentation device
EP1975754B1 (en) 2007-03-30 2017-12-27 ABB Research Ltd. A computer implemented method to display technical data for monitoring an industrial installation
CN100568289C (en) * 2007-07-13 2009-12-09 威盛电子股份有限公司 Computer drawing element describing method and device
EP2239652A1 (en) * 2009-04-07 2010-10-13 Keywords.de GmbH Providing an interactive visual representation on a display
KR101117913B1 (en) * 2009-05-11 2012-02-24 삼성메디슨 주식회사 Ultrasound system and method for rendering volume data
US9892546B2 (en) 2010-06-30 2018-02-13 Primal Space Systems, Inc. Pursuit path camera model method and system
CN107093203A (en) 2010-06-30 2017-08-25 巴里·林恩·詹金斯 The control method and system that prefetching transmission or reception based on navigation of graphical information
US9916763B2 (en) 2010-06-30 2018-03-13 Primal Space Systems, Inc. Visibility event navigation method and system
EP3452992B1 (en) 2016-05-03 2021-06-23 Affera, Inc. Anatomical model displaying
WO2017197114A1 (en) 2016-05-11 2017-11-16 Affera, Inc. Anatomical model generation
US11728026B2 (en) 2016-05-12 2023-08-15 Affera, Inc. Three-dimensional cardiac representation
US10649615B2 (en) 2016-10-20 2020-05-12 Microsoft Technology Licensing, Llc Control interface for a three-dimensional graphical object
EP4107964A1 (en) * 2020-02-20 2022-12-28 Align Technology, Inc. Medical imaging data compression and extraction on client side

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4734690A (en) * 1984-07-20 1988-03-29 Tektronix, Inc. Method and apparatus for spherical panning

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3013593B2 (en) * 1991-05-09 2000-02-28 株式会社日立製作所 Image display method
US5298919A (en) * 1991-08-02 1994-03-29 Multipoint Technology Corporation Multi-dimensional input device
JPH05189541A (en) * 1992-01-13 1993-07-30 Hitachi Ltd Image operating method
US5313230A (en) * 1992-07-24 1994-05-17 Apple Computer, Inc. Three degree of freedom graphic object controller
US5454371A (en) * 1993-11-29 1995-10-03 London Health Association Method and system for constructing and displaying three-dimensional images
EP0795832A3 (en) * 1996-02-16 1998-07-22 Axsys Corporation Method and apparatus for maximizing the number of radiological images displayed on a display screen or printed on a sheet of film
DE69737720T2 (en) 1996-11-29 2008-01-10 London Health Sciences Centre, London IMPROVED IMAGE PROCESSING METHOD FOR A THREE-DIMENSIONAL IMAGE GENERATION SYSTEM
JP2001109906A (en) * 1999-10-01 2001-04-20 Mitsubishi Electric Inf Technol Center America Inc Rendering device/rendering method for volume data set
DE10004898C2 (en) * 2000-02-04 2003-06-26 Siemens Ag display means
DE10157268A1 (en) * 2001-11-22 2003-06-12 Philips Intellectual Property Method and device for the simultaneous display of arbitrarily selectable complementary sectional images

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4734690A (en) * 1984-07-20 1988-03-29 Tektronix, Inc. Method and apparatus for spherical panning

Cited By (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040218796A1 (en) * 2003-02-05 2004-11-04 Ryner Lawrence N Magnetic resonance spectroscopy using a conformal voxel
US7319784B2 (en) * 2003-02-05 2008-01-15 National Research Council Of Canada Magnetic resonance spectroscopy using a conformal voxel
US7507204B2 (en) * 2004-04-01 2009-03-24 Medison Co., Ltd. Apparatus and method for forming 3D ultrasound image
US20050240104A1 (en) * 2004-04-01 2005-10-27 Medison Co., Ltd. Apparatus and method for forming 3D ultrasound image
US7567248B1 (en) * 2004-04-28 2009-07-28 Mark William R System and method for computing intersections between rays and surfaces
US8189002B1 (en) * 2004-10-29 2012-05-29 PME IP Australia Pty, Ltd. Method and apparatus for visualizing three-dimensional and higher-dimensional image data sets
US7679625B1 (en) * 2005-01-07 2010-03-16 Apple, Inc. Straightening digital images
US20060239523A1 (en) * 2005-04-05 2006-10-26 Bradley University Radiographic imaging display apparatus and method
US8041087B2 (en) * 2005-04-05 2011-10-18 Bradley University Radiographic imaging display apparatus and method
US8475375B2 (en) 2006-12-15 2013-07-02 General Electric Company System and method for actively cooling an ultrasound probe
US20090003665A1 (en) * 2007-06-30 2009-01-01 General Electric Company Method and system for multiple view volume rendering
US7894663B2 (en) 2007-06-30 2011-02-22 General Electric Company Method and system for multiple view volume rendering
US9860300B2 (en) 2007-08-27 2018-01-02 PME IP Pty Ltd Fast file server methods and systems
US11902357B2 (en) 2007-08-27 2024-02-13 PME IP Pty Ltd Fast file server methods and systems
US10038739B2 (en) 2007-08-27 2018-07-31 PME IP Pty Ltd Fast file server methods and systems
US9167027B2 (en) 2007-08-27 2015-10-20 PME IP Pty Ltd Fast file server methods and systems
US10686868B2 (en) 2007-08-27 2020-06-16 PME IP Pty Ltd Fast file server methods and systems
US9531789B2 (en) 2007-08-27 2016-12-27 PME IP Pty Ltd Fast file server methods and systems
US11075978B2 (en) 2007-08-27 2021-07-27 PME IP Pty Ltd Fast file server methods and systems
US8775510B2 (en) 2007-08-27 2014-07-08 Pme Ip Australia Pty Ltd Fast file server methods and system
US11516282B2 (en) 2007-08-27 2022-11-29 PME IP Pty Ltd Fast file server methods and systems
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US10217282B2 (en) * 2007-11-02 2019-02-26 Koninklijke Philips N.V. Automatic movie fly-path calculation
US9728165B1 (en) 2007-11-23 2017-08-08 PME IP Pty Ltd Multi-user/multi-GPU render server apparatus and methods
US9904969B1 (en) 2007-11-23 2018-02-27 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US11514572B2 (en) 2007-11-23 2022-11-29 PME IP Pty Ltd Automatic image segmentation methods and analysis
US11640809B2 (en) 2007-11-23 2023-05-02 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US11328381B2 (en) 2007-11-23 2022-05-10 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US11315210B2 (en) 2007-11-23 2022-04-26 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9454813B2 (en) 2007-11-23 2016-09-27 PME IP Pty Ltd Image segmentation assignment of a volume by comparing and correlating slice histograms with an anatomic atlas of average histograms
US11244650B2 (en) 2007-11-23 2022-02-08 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US11900501B2 (en) 2007-11-23 2024-02-13 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9355616B2 (en) 2007-11-23 2016-05-31 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9595242B1 (en) 2007-11-23 2017-03-14 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US10825126B2 (en) 2007-11-23 2020-11-03 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US10762872B2 (en) 2007-11-23 2020-09-01 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US10380970B2 (en) 2007-11-23 2019-08-13 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US10706538B2 (en) 2007-11-23 2020-07-07 PME IP Pty Ltd Automatic image segmentation methods and analysis
US10043482B2 (en) 2007-11-23 2018-08-07 PME IP Pty Ltd Client-server visualization system with hybrid data processing
US11900608B2 (en) 2007-11-23 2024-02-13 PME IP Pty Ltd Automatic image segmentation methods and analysis
US10311541B2 (en) 2007-11-23 2019-06-04 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US10614543B2 (en) 2007-11-23 2020-04-07 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9019287B2 (en) 2007-11-23 2015-04-28 Pme Ip Australia Pty Ltd Client-server visualization system with hybrid data processing
US10430914B2 (en) 2007-11-23 2019-10-01 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
US9984460B2 (en) 2007-11-23 2018-05-29 PME IP Pty Ltd Automatic image segmentation methods and analysis
US20090160985A1 (en) * 2007-12-10 2009-06-25 The University Of Connecticut Method and system for recognition of a target in a three dimensional scene
US20130222383A1 (en) * 2010-11-12 2013-08-29 Hitachi Medical Corporation Medical image display device and medical image display method
TWI411299B (en) * 2010-11-30 2013-10-01 Innovision Labs Co Ltd Method of generating multiple different orientation images according to single image and apparatus thereof
US9892566B2 (en) 2011-02-07 2018-02-13 Fujifilm Corporation Image processing apparatus, method and program
US9053563B2 (en) * 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US9672655B2 (en) 2011-02-11 2017-06-06 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US20120206448A1 (en) * 2011-02-11 2012-08-16 Embrey Cattle Co. System and method for modeling a biopsy specimen
US10223825B2 (en) 2011-02-11 2019-03-05 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US20130106913A1 (en) * 2011-10-28 2013-05-02 Microsoft Corporation Image layout for a display
US9269323B2 (en) * 2011-10-28 2016-02-23 Microsoft Technology Licensing, Llc Image layout for a display
US20130257870A1 (en) * 2012-04-02 2013-10-03 Yoshiyuki Kokojima Image processing apparatus, stereoscopic image display apparatus, image processing method and computer program product
US20130328874A1 (en) * 2012-06-06 2013-12-12 Siemens Medical Solutions Usa, Inc. Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging
US11296989B2 (en) 2013-03-15 2022-04-05 PME IP Pty Ltd Method and system for transferring data to improve responsiveness when sending large data sets
US11666298B2 (en) 2013-03-15 2023-06-06 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US10320684B2 (en) 2013-03-15 2019-06-11 PME IP Pty Ltd Method and system for transferring data to improve responsiveness when sending large data sets
US9898855B2 (en) 2013-03-15 2018-02-20 PME IP Pty Ltd Method and system for rule based display of sets of images
US10540803B2 (en) 2013-03-15 2020-01-21 PME IP Pty Ltd Method and system for rule-based display of sets of images
US10764190B2 (en) 2013-03-15 2020-09-01 PME IP Pty Ltd Method and system for transferring data to improve responsiveness when sending large data sets
US10762687B2 (en) 2013-03-15 2020-09-01 PME IP Pty Ltd Method and system for rule based display of sets of images
US10820877B2 (en) 2013-03-15 2020-11-03 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US10373368B2 (en) 2013-03-15 2019-08-06 PME IP Pty Ltd Method and system for rule-based display of sets of images
US10832467B2 (en) 2013-03-15 2020-11-10 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US10070839B2 (en) 2013-03-15 2018-09-11 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US11810660B2 (en) 2013-03-15 2023-11-07 PME IP Pty Ltd Method and system for rule-based anonymized display and data export
US9524577B1 (en) 2013-03-15 2016-12-20 PME IP Pty Ltd Method and system for rule based display of sets of images
US11129583B2 (en) 2013-03-15 2021-09-28 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US11129578B2 (en) 2013-03-15 2021-09-28 PME IP Pty Ltd Method and system for rule based display of sets of images
US11183292B2 (en) 2013-03-15 2021-11-23 PME IP Pty Ltd Method and system for rule-based anonymized display and data export
US9509802B1 (en) 2013-03-15 2016-11-29 PME IP Pty Ltd Method and system FPOR transferring data to improve responsiveness when sending large data sets
US11244495B2 (en) 2013-03-15 2022-02-08 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US9749245B2 (en) 2013-03-15 2017-08-29 PME IP Pty Ltd Method and system for transferring data to improve responsiveness when sending large data sets
US11916794B2 (en) 2013-03-15 2024-02-27 PME IP Pty Ltd Method and system fpor transferring data to improve responsiveness when sending large data sets
US11763516B2 (en) 2013-03-15 2023-09-19 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US11701064B2 (en) 2013-03-15 2023-07-18 PME IP Pty Ltd Method and system for rule based display of sets of images
US8976190B1 (en) 2013-03-15 2015-03-10 Pme Ip Australia Pty Ltd Method and system for rule based display of sets of images
US10631812B2 (en) 2013-03-15 2020-04-28 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US20150062177A1 (en) * 2013-09-02 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus for fitting a template based on subject information
US20160225180A1 (en) * 2015-01-29 2016-08-04 Siemens Medical Solutions Usa, Inc. Measurement tools with plane projection in rendered ultrasound volume imaging
CN107209924A (en) * 2015-01-29 2017-09-26 美国西门子医疗解决公司 Utilize the survey tool of the plane projection in rendered volume imagery
WO2016120831A3 (en) * 2015-01-29 2017-04-20 Siemens Medical Solutions Usa, Inc. Measurement tools with plane projection in rendered volume imaging
US11017568B2 (en) 2015-07-28 2021-05-25 PME IP Pty Ltd Apparatus and method for visualizing digital breast tomosynthesis and other volumetric images
US9984478B2 (en) 2015-07-28 2018-05-29 PME IP Pty Ltd Apparatus and method for visualizing digital breast tomosynthesis and other volumetric images
US10395398B2 (en) 2015-07-28 2019-08-27 PME IP Pty Ltd Appartus and method for visualizing digital breast tomosynthesis and other volumetric images
US11620773B2 (en) 2015-07-28 2023-04-04 PME IP Pty Ltd Apparatus and method for visualizing digital breast tomosynthesis and other volumetric images
US11599672B2 (en) 2015-07-31 2023-03-07 PME IP Pty Ltd Method and apparatus for anonymized display and data export
US11669969B2 (en) 2017-09-24 2023-06-06 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US10909679B2 (en) 2017-09-24 2021-02-02 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US11386606B2 (en) * 2018-04-11 2022-07-12 Koninklijke Philips N.V. Systems and methods for generating enhanced diagnostic images from 3D medical image
US11972024B2 (en) 2023-02-14 2024-04-30 PME IP Pty Ltd Method and apparatus for anonymized display and data export

Also Published As

Publication number Publication date
EP1636761A1 (en) 2006-03-22
CN1806260A (en) 2006-07-19
US7656418B2 (en) 2010-02-02
CN100385467C (en) 2008-04-30
JP4510817B2 (en) 2010-07-28
JP2006527056A (en) 2006-11-30
WO2004109603A1 (en) 2004-12-16

Similar Documents

Publication Publication Date Title
US7656418B2 (en) User control of 3d volume plane crop
US6334847B1 (en) Enhanced image processing for a three-dimensional imaging system
US8018454B2 (en) Volume rendering processing distribution in a graphics processing unit
JP3483929B2 (en) 3D image generation method
US6461298B1 (en) Three-dimensional imaging system
US6181348B1 (en) Method for selective volume visualization via texture mapping
Guennebaud et al. Real-time Soft Shadow Mapping by Backprojection.
US5454371A (en) Method and system for constructing and displaying three-dimensional images
Carr Surface reconstruction in 3D medical imaging
EP1046929B1 (en) Method and apparatus for three-dimensional ultrasound imaging using surface-enhanced volume rendering
Kaufman et al. Intermixing surface and volume rendering
WO1998024058A9 (en) Enhanced image processing for a three-dimensional imaging system
CN110087553B (en) Ultrasonic device and three-dimensional ultrasonic image display method thereof
US9196092B2 (en) Multiple volume renderings in three-dimensional medical imaging
US20130328874A1 (en) Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging
US7852335B2 (en) Volume rendering processing distribution in a graphics processing unit
Englmeier et al. Hybrid rendering of multidimensional image data
Jaffey et al. Digital reconstruction methods for three-dimensional image visualization
Dietrich et al. Real-time interactive visualization and manipulation of the volumetric data using GPU-based methods
Ohbuchi et al. Incremental volume rendering algorithm for interactive 3D ultrasound imaging
i Bartrolí et al. Visualization techniques for virtual endoscopy
Demiris et al. 3-D visualization in medicine: an overview
JPH0239385A (en) Three-dimensional image processor
Lenz et al. Interactive display of 3D-images in PICAP II
Basset et al. Three-dimensional reconstruction of the prostate from transverse or sagittal ultrasonic images

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINLIJKE PHILIPS ELECTRONICS N.V.,NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WATKINS, STEPHEN;ARAZIA, STEVEN;SIGNING DATES FROM 20040902 TO 20041104;REEL/FRAME:017376/0654

Owner name: KONINLIJKE PHILIPS ELECTRONICS N.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WATKINS, STEPHEN;ARAZIA, STEVEN;REEL/FRAME:017376/0654;SIGNING DATES FROM 20040902 TO 20041104

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12