CA2346256A1 - Manipulating a digital dentition model to form models of individual dentition components - Google Patents
- Publication number
- CA2346256A1
- Authority
- CA
- Canada
- Prior art keywords
- cross
- computer
- dentition
- program
- plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C7/00—Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C7/00—Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
- A61C7/002—Orthodontic computer assisted systems
- A61C2007/004—Automatic construction of a set of axes for a tooth or a plurality of teeth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C7/00—Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
- A61C7/002—Orthodontic computer assisted systems
Abstract
A programmed computer is used to create a digital model of an individual component of a patient's dentition. The computer obtains a 3D digital model of the patient's dentition, identifies points (766) in the dentition model that lie on an inter-proximal margin between adjacent teeth in the patient's dentition, and uses the identified points to create a cutting surface (782) that separates portions of the dentition model representing the adjacent teeth.
Description
MANIPULATING A DIGITAL DENTITION MODEL TO FORM
MODELS OF INDIVIDUAL DENTITION COMPONENTS
RELATED APPLICATIONS
This application is a continuation-in-part of U.S. patent application 09/264,547, filed on March 8, 1999, and entitled "Segmenting a Digital Dentition Model", which is a continuation-in-part of U.S. patent application 09/169,276, filed on October 8, 1998, and entitled "Computer Automated Development of an Orthodontic Treatment Plan and Appliance," which claims priority from PCT application PCT/US98/12681, filed on June 19, 1998, and entitled "Method and System for Incrementally Moving Teeth", which claims priority from U.S. patent application 08/947,080, filed on October 8, 1997, which claims priority from U.S. provisional application 60/050,342, filed on June 20, 1997, all of which are incorporated by reference into this application.
TECHNOLOGICAL FIELD
The invention relates to the fields of computer-assisted dentistry and orthodontics.
BACKGROUND
Two-dimensional (2D) and three-dimensional (3D) digital image technology has recently been tapped as a tool to assist in dental and orthodontic treatment. Many treatment providers use some form of digital image technology to study the dentitions of patients. U.S. patent application 09/169,276 describes the use of 2D and 3D image data in forming a digital model of a patient's dentition, including models of individual dentition components. Such models are useful, among other things, in developing an orthodontic treatment plan for the patient, as well as in creating one or more orthodontic appliances to implement the treatment plan.
SUMMARY
The inventors have developed several computer-automated techniques for subdividing, or segmenting, a digital dentition model into models of individual dentition components. These dentition components include, but are not limited to, tooth crowns, tooth roots, and gingival regions. The segmentation techniques include both human-assisted and fully-automated techniques. Some of the human-assisted techniques allow a human user to provide "algorithmic hints" by identifying certain features in the digital dentition model. The identified features then serve as a basis for automated segmentation. Some techniques act on a volumetric 3D image model, or "voxel representation," of the dentition, and other techniques act on a geometric 3D model, or "geometric representation."
In one aspect, a computer implementing the invention receives a data set that forms a three-dimensional (3D) representation of the patient's dentition, applies a test to the data set to identify data elements that represent portions of the individual component, and creates a digital model of the individual component based upon the identified data elements. Some implementations require the computer to identify data elements that form one or more 2D
cross-sections of the dentition in one or more 2D planes intersecting the dentition.
In many of these embodiments, these 2D planes are roughly parallel to the dentition's occlusal plane. The computer analyzes the features of the 2D cross-sections to identify data elements that correspond to the individual component to be modeled. For example, one technique requires the computer to identify cusps in the 2D cross-sectional surface of the dentition, where the cusps represent the locations of an interproximal margin between teeth in the dentition. One variation of this technique allows the computer to confine its search for cusps in one 2D plane to areas in the vicinity of cusps already identified on another 2D plane. Another variation allows the computer to link cusps on adjacent 2D planes to form a solid surface representing the interproximal margin. Some embodiments allow the computer to receive input from a human user identifying the cusp locations in one or more of the 2D
cross sections.
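As a rough sketch of the cusp-detection idea, the following function scans a closed 2D cross-sectional contour for sharp concave turns, one plausible way to flag interproximal cusps. The optional `near` and `window` parameters mimic confining the search to the vicinity of cusps already found on an adjacent plane. The function names, the turning-angle threshold, and the window size are illustrative assumptions, not details taken from the application.

```python
import math

def turn_angle(p_prev, p, p_next):
    """Signed turning angle (radians) at vertex p of a closed 2D contour."""
    v1 = (p[0] - p_prev[0], p[1] - p_prev[1])
    v2 = (p_next[0] - p[0], p_next[1] - p[1])
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.atan2(cross, dot)

def find_cusps(contour, threshold=math.radians(60), near=None, window=5.0):
    """Return indices of sharp concave vertices (cusp candidates).

    contour -- (x, y) vertices of a counter-clockwise closed contour
    near    -- optional cusp points from an adjacent slice; if given, the
               search is confined to vertices within `window` of one of them
    """
    cusps = []
    n = len(contour)
    for i, p in enumerate(contour):
        if near is not None and all(math.dist(p, q) > window for q in near):
            continue
        # For a CCW contour, a strongly negative turn marks a concave notch.
        if turn_angle(contour[i - 1], p, contour[(i + 1) % n]) < -threshold:
            cusps.append(i)
    return cusps
```

Linking the cusps detected on successive planes into a surface, as the variation above describes, would then amount to connecting corresponding cusp points across adjacent slices.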
Other embodiments require the computer to identify data elements that represent a structural core, or skeleton, of each individual component to be modeled. The computer creates the model by linking other data elements representing the individual component to the structural core.
In another aspect, a computer implementing the invention receives a three-dimensional (3D) data set representing the patient's dentition, applies a test to identify data elements that represent an interproximal margin between two teeth in the dentition, and applies another computer-implemented test to select data elements that lie on one side of the interproximal margin for inclusion in the digital model. Some implementations require the computer to identify data elements that form one or more 2D cross-sections of the dentition in one or more 2D planes intersecting the dentition roughly parallel to the dentition's occlusal plane.
In another aspect, a computer implementing the invention receives a 3D
data set representing at least a portion of the patient's dentition, including at least a portion of a tooth and gum tissue surrounding the tooth; applies a test to identify data elements lying on a gingival boundary that occurs where the tooth and the gum tissue meet; and applies a test to the data elements lying on the boundary to identify other data elements representing portions of the tooth.
In one aspect, the invention involves obtaining a three-dimensional (3D) digital model of a patient's dentition and analyzing the model to determine the orientation of at least one axis of the model automatically. In some implementations, the model's z-axis is found by creating an Oriented Bounding Box (OBB) around the model and identifying the direction in which the OBB
has minimum thickness. The z-axis extends in this direction, from the model's bottom surface to its top surface. Moreover, in a dentition model having only one mandible, one of the model surfaces is substantially flat and an opposite surface is textured. The direction of the positive z-axis can be identified in this type of model by identifying which of the surfaces is flat or textured. One technique for doing so involves creating one or more planes that are roughly normal to the z-axis and then creating line segments that extend between the planes and the top and bottom surfaces of the dentition model. The surface for which all of the line segments are of one length is identified as being the flat surface, and the surface for which the line segments have varying lengths is identified as being the textured surface.
In other implementations, the x- and y-axes are found by selecting a two-dimensional (2D) plane that contains the axes and an arch-shaped cross section of the dentition model and identifying the orientations of the axes in this plane. In general, the arch-shaped cross section is roughly symmetrical about the y-axis. One technique for identifying the y-axis involves identifying a point at each end of the arch-shaped cross section, creating a line segment that extends between the identified points, and identifying the orientation of the y-axis as being roughly perpendicular to the line segment. The point at each end of the arch can be identified by selecting a point that lies within an area surrounded by the arch-shaped cross section, creating a line segment that extends between the selected point and an edge of the 2D plane, sweeping the line segment in a circular manner around the selected point, and identifying points at the ends of the arch-shaped cross section at which the sweeping line segment begins intersecting the cross section of the dentition model and stops intersecting the cross section of the dentition model. In general, the x-axis is perpendicular to the y-axis.
In another aspect, the invention involves using a programmed computer to create a digital model of an individual component of a patient's dentition by obtaining a 3D digital model of the patient's dentition, identifying points in the dentition model that lie on an inter-proximal margin between adjacent teeth in the patient's dentition, and using the identified points to create a cutting surface for use in separating portions of the dentition model representing the adjacent teeth.
In some implementations, 2D cross sections of the dentition model are displayed to a human operator, and the operator provides input identifying approximate points at which the interproximal margin between the adjacent teeth meets gingival tissue. In some cases, the dentition model includes a 3D
volumetric model of the dentition, and the input provided by the operator identifies two voxels in the volumetric model. The computer then defines a neighborhood of voxels around each of the two voxels identified by the human operator, where each neighborhood includes voxels representing the dentition model and voxels representing a background image. The computer selects the pair of voxels, one in each neighborhood, representing the background image that lie closest together.
In some of these implementations, the computer also identifies voxels on another 2D cross section that represent the inteiproximal margin. One technique for doing so is by defining a neighborhood of voxels around each of ~ s the selected voxels, where each neighborhood includes voxels representing the dentition model and voxels representing a background image, projecting the neighborhoods onto the other 2D cross section, and selecting two voxels in the projected neighborhoods that represent the inter-proximal margin.
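A minimal sketch of the neighborhood search on one 2D slice might look like the following, with the slice modeled as a nested list in which 1 marks dentition voxels and 0 marks the background. The function name, the neighborhood radius, and the grid encoding are assumptions made for illustration; the real system operates on full volumetric data.

```python
import math

def closest_background_pair(slice_, seed_a, seed_b, radius=3):
    """Given a 2D slice (1 = dentition, 0 = background) and two operator-
    picked voxels near the interproximal margin, return the closest pair of
    background voxels, one drawn from a neighborhood around each seed."""
    def neighborhood_bg(seed):
        r0, c0 = seed
        return [(r, c)
                for r in range(r0 - radius, r0 + radius + 1)
                for c in range(c0 - radius, c0 + radius + 1)
                if 0 <= r < len(slice_) and 0 <= c < len(slice_[0])
                and slice_[r][c] == 0]
    best = None
    for a in neighborhood_bg(seed_a):
        for b in neighborhood_bg(seed_b):
            d = math.dist(a, b)
            if best is None or d < best[0]:
                best = (d, a, b)
    return best[1], best[2]
```

Projecting the neighborhoods onto an adjacent cross section, as described above, would then repeat the same background-pair selection using the voxels found here as the new seeds.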
In another aspect, the invention involves displaying an image of a dentition model, receiving input from a human operator identifying points in the image representing a gingival line at which a tooth in the dentition model meets gingival tissue, and using the identified points to create a cutting surface for use in separating the tooth from the gingival tissue in the dentition model.
The cutting surface often extends roughly perpendicular to the dentition's occlusal plane.
In some implementations, the cutting surface is created by projecting at least a portion of the gingival line onto a plane that is roughly parallel to the occlusal plane and then creating a surface that connects the gingival line to the projection. One way of establishing the plane is by fitting the plane among the points on the gingival line and then shifting the plane away from the tooth in a direction that is roughly normal to the plane. For example, the plane can be shifted along a line segment that includes a point near the center of the tooth and that is roughly perpendicular to the plane. The length of the line segment usually approximates the length of a tooth root.
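The project-and-connect construction can be sketched as follows under a simplifying assumption that the occlusal plane is the xy-plane, so the fitted plane reduces to a horizontal plane through the mean height of the gingival points, shifted down by roughly one root length. The function name and the default root length are illustrative, not taken from the application.

```python
def gingival_cutting_surface(gingival_line, root_length=12.0):
    """Build a cutting surface separating a tooth from gum tissue.

    gingival_line -- ordered (x, y, z) points where the tooth meets the gum,
                     forming a closed loop around the tooth.
    Assumes the occlusal plane is roughly the xy-plane, so the fitted plane
    is horizontal: it passes through the mean z of the gingival points and
    is shifted away from the crown by about one root length.
    """
    mean_z = sum(p[2] for p in gingival_line) / len(gingival_line)
    plane_z = mean_z - root_length          # shift along the plane normal
    projected = [(x, y, plane_z) for x, y, _ in gingival_line]
    # Connect each gingival edge to its projection with two triangles,
    # forming a ribbon roughly perpendicular to the occlusal plane.
    triangles = []
    n = len(gingival_line)
    for i in range(n):
        j = (i + 1) % n
        a, b = gingival_line[i], gingival_line[j]
        a2, b2 = projected[i], projected[j]
        triangles.append((a, b, a2))
        triangles.append((b, b2, a2))
    return triangles
```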
In other embodiments, the cutting surface extends roughly parallel to the dentition's occlusal plane in the dentition model. In some of these embodiments, the input received from the human operator identifies points that form two 3D curves representing gingival lines at which teeth in the dentition model meet gum tissue on both the buccal and lingual sides of the dentition model. The cutting surface is created by fitting a surface among the points lying on the two curves. For each tooth, a point lying between the two curves is identified and surface triangles are created between the identified point and points on the two curves. One technique for identifying the point involves averaging, for each tooth, x, y and z coordinate values of the points on portions of the two curves adjacent to the tooth.
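A sketch of the per-tooth surface fit described above: average the curve points bordering one tooth to get a center point, then create triangles between that point and the points on the two curves. The curve encoding and the function name are assumptions for illustration.

```python
def tooth_cap_surface(buccal_pts, lingual_pts):
    """Fit a cutting surface between the buccal and lingual gingival curves
    bordering one tooth: average the x, y, z values of the curve points to
    get a center point, then fan triangles from it to both curves."""
    pts = buccal_pts + lingual_pts
    center = tuple(sum(p[k] for p in pts) / len(pts) for k in range(3))
    triangles = []
    for curve in (buccal_pts, lingual_pts):
        for a, b in zip(curve, curve[1:]):
            triangles.append((a, b, center))
    return center, triangles
```

Repeating this for each tooth, using the portions of the two curves adjacent to that tooth, yields the full cutting surface lying roughly parallel to the occlusal plane.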
Other embodiments involve creating, for each tooth, a surface that represents the tooth's roots. One technique for doing so involves projecting points onto a plane that is roughly parallel to the occlusal plane and connecting points on the two curves to the projected points. The surface can be used to separate portions of the dentition model representing the tooth roots from portions representing gingival tissue. The model of the tooth roots is then connected to the tooth model.
Other embodiments and advantages are apparent from the detailed description and the claims below.
DESCRIPTION OF THE DRAWINGS
FIGS. 1A, 1B, and 2 are partial views of a dentition model as displayed on a computer monitor and segmented with a human-operated saw tool.
FIG. 3 is a partial view of a dentition model as displayed on a computer monitor and segmented with a human-operated eraser tool.
FIG. 4 is a view of a dentition model for which a feature skeleton has been identified.
FIGS. 5 and 6 are flowcharts for a feature skeleton analysis technique used in segmenting a dentition model.
FIG. 7A is a horizontal 2D cross-sectional view of a dentition model.
FIG. 7B is a side view of a dentition model intersected by several 2D
planes.
FIG. 8 is a flowchart for a 2D slice analysis technique used in segmenting a dentition model.
FIGS. 9 and 10A through 10C each shows a group of voxels in a 2D
slice of a dentition model.
FIG. 11 is a flowchart for an automatic cusp detection technique used in segmenting a dentition model.
FIG. 12 is a horizontal 2D cross section of a dentition model illustrating a neighborhood filtered automatic cusp detection technique used in segmenting the dentition model.
FIG. 13 shows two groups of voxels in a 2D slice of a dentition model illustrating the neighborhood filtered automatic cusp detection technique.
FIG. 14 is a flowchart for the neighborhood filtered automatic cusp detection technique.
FIG. 15 is a horizontal 2D cross section of a dentition model illustrating an arch curve fitting technique used in segmenting the dentition model.
FIG. 16 is a flowchart for the arch curve fitting technique.
FIG. 17 is a horizontal 2D cross section of a dentition model illustrating a curve creation technique for use with the arch curve fitting technique.
FIG. 18 is a flowchart for the curve creation technique.
FIGS. 19A and 19B are a perspective view and a vertical 2D cross-sectional view of a dentition model illustrating another technique for use in segmenting the dentition model.
FIGS. 20 and 21 are flowcharts of the technique illustrated in FIGS. 19A
and 19B.
FIG. 22 is a vertical 2D cross-sectional view of a dentition model illustrating the gingival margin detection technique for use in segmenting the dentition model.
FIG. 23 shows a group of voxels in a 2D slice of a dentition model illustrating a gingival margin detection technique.
FIG. 24 is a flowchart for the gingival margin detection technique.
FIG. 25 shows a digital dentition model inside an Oriented Bounding Box (OBB).
FIG. 26 illustrates a technique for properly orienting a digital dentition model along a z-axis.
FIGS. 27A, 27B, and 27C illustrate a technique for properly orienting a digital dentition model along x- and y-axes.
FIGS. 28, 29, 30 and 31 are flowcharts for the techniques of FIGS. 25, 26, and 27A-C.
FIGS. 32 and 33 illustrate a human-assisted technique for identifying interproximal margins between teeth.
FIG. 34 is a flowchart for the technique of FIGS. 32 and 33.
FIGS. 35A through 35F illustrate a technique for segmenting a digital dentition model into models of individual teeth and gum tissue.
FIG. 36 is a flowchart for the technique of FIGS. 35A through 35F.
FIGS. 37A, 37B, and 37C illustrate another technique for segmenting a digital dentition model into models of individual teeth.
FIGS. 38 and 39 are flowcharts for the technique of FIGS. 37A, 37B, and 37C.
DETAILED DESCRIPTION
U.S. patent application 09/169,276 describes techniques for generating a 3D digital data set that contains a model of a patient's dentition, including the crowns and roots of the patient's teeth as well as the surrounding gum tissue.
One such technique involves creating a physical model of the dentition from a material such as plaster and then digitally imaging the model with a laser scanner or a destructive scanning system. These techniques are used to produce a digital volumetric 3D model ("volume element representation" or "voxel representation") of the dentition model, and/or a digital geometric 3D
surface model ("geometric model") of the dentition. The computer-implemented techniques described below act on one or both of these types of 3D dentition models.
In creating a voxel representation, the physical model is usually embedded in a potting material that contrasts sharply with the color of the physical model to enhance detection of the dentition features. A white dentition model embedded in a black potting material provides the sharpest contrast. A wide variety of information can be used to enhance the 3D model, including data taken from photographic images, 2D and 3D x-ray scans, computed tomography (CT) scans, and magnetic resonance imaging (MRI) scans of the patient's dentition.
The 3D data set is loaded into a computer which, under control of a program implementing one or more of the techniques described below, either with or without human assistance, segments the digital dentition model into digital models of individual dentition components, such as teeth and gingival tissue.
In one implementation, the computer produces a digital model of each individual tooth in the patient's dentition, as well as a digital model of the gingival tissue surrounding the teeth.
To segment the digital dentition model accurately, the computer often must know the exact orientation of the dentition model. One technique for establishing the orientation of the digital dentition model in the 3D data set involves holding the physical dentition model at a prescribed orientation during the digital imaging process discussed above. Embedding the physical model at a particular orientation in a solid potting material is one way of holding the physical model. In some systems, however, even this technique introduces small errors in the orientation of the dentition model.
Orienting the Digital Dentition Model
FIGS. 25, 26, 27A-C and 28 illustrate several techniques used by the computer to orient the digital dentition model 500 properly. The computer first obtains a digital model of the dentition using one of the techniques described above (step 700). The computer then locates the model's z-axis 502, which in the depicted example extends from the base of the model toward the roof of the patient's mouth and is normal to the dentition's occlusal plane (step 702).
The computer then locates the model's y-axis 504, which in the depicted example extends from an area lying within the dental arch toward the patient's front teeth (step 704). Using the right-hand rule, the computer then defines the model's x-axis 506 to extend from an area lying within the dental arch toward the teeth on the right side of the patient's mouth (step 706). The occlusal plane is a plane that is pierced by all of the cusps of the patient's teeth when the patient's mandibles interdigitate. Techniques for identifying the occlusal plane include receiving user input identifying the location of the plane and conducting a fully-automated analysis of the dentition model.
FIGS. 25, 26, and 29 show one technique for identifying the z-axis 502.
The computer first identifies the dentition model 500 in the 3D data set (step 710). For 3D geometric data, identifying the dentition model is simply a matter of locating the geometric surfaces. For 3D volumetric data, identifying the dentition model involves distinguishing the lighter voxels, which represent the dentition model, from the darker voxels, which represent the background. The computer then fits an Oriented Bounding Box ("OBB") 510 around the dentition model 500 using a conventional OBB fitting technique (step 712).
The dimension in which the OBB 510 has its smallest thickness TMIN is the dimension in which the z-axis 502 extends (step 714).
After determining the dimension in which the z-axis 502 extends, the computer determines whether the dentition model is facing upward or downward, i.e., in which direction the positive z-axis extends. FIGS. 26 and 30 illustrate a technique for determining the direction of the positive z-axis. This technique relies on an observation that the bottom surface 512 of the dentition model is flat and the upper surface 514 follows the jagged contours of the patient's teeth. This technique also relies on an assumption that the model at this point includes only one of the patient's mandibles.
The computer first creates one or more planes 516, 518 that are normal to the z-axis 502 (step 720). The computer then creates line segments 515A, 515B between the planes 516, 518 and the surfaces 512, 514 of the model (step 722). The line segments 515A that touch the flat bottom surface 512 are all of approximately the same length (step 724). The line segments 515B that touch the jagged top surface 514 have varying lengths (step 726). The computer identifies the positive z-axis as extending from the bottom surface 512 toward the top surface 514 and orients the digital dentition model 500 accordingly (step 728).
FIGS. 27A, 27B, 27C, and 31 illustrate a technique for identifying the y-axis 504 and the x-axis 506 of the dentition model 500. The computer begins by selecting a 2D slice 520 of data that is normal to the z-axis and that contains a cross section 522 of the dentition model (step 730). This technique relies on an observation that the cross section 522 of the dentition model is arch shaped.
The computer identifies a point 524 at or near the center of the 2D slice 520
The direction in which the line segment extends is arbitrary, so the line segment may or may not intersect the dental cross section. The depicted example shows two line segments 526, 530, one of which intersects the dental cross section 522, the other of which does not.
The computer then begins rotating, or sweeping, one of the line segments 526, 530 about the center point 524 (step 736). In general, the computer sweeps the line segment in small, discrete steps, usually on the order of five degrees of rotation. As it is swept, a line segment 526 that initially intersects the dental cross section 522 will eventually stop intersecting the cross section 522, and the computer marks the point 534 at which this occurs. As sweeping continues, the line segment 526 will eventually resume intersecting the cross section 522, and the computer marks the point 536 at which this occurs. Likewise, a line segment 530 that initially does not intersect the cross section 522 eventually will begin intersecting the cross section 522, and the computer marks the point 536 at which this occurs. The computer also marks the point 534 at which this line segment 530 stops intersecting the cross section 522 (step 738). The computer stops sweeping the line segments 526, 530 after
The computer then creates a line segment 538 that extends between the two marked points 534, 536 (step 742). The y-axis 504 of the dentition model extends roughly normal to this line segment 538 through the front 540 of the dental arch (step 744). The x-axis 506 extends roughly parallel to this line 2s segment 538 through the right side 542 of the dental arch (step 746). The computer uses this Iine segment 538 to orient the dentition model correctly along the x- and y-axes (step 748).
Segmenting the Digital Dentition Model Into Individual Component Models
Some computer-implemented techniques for segmenting a 3D dentition model into models of individual dentition components require a substantial amount of human interaction with the computer. One such technique, which is shown in FIGS. 1A, 1B, and 2, provides a graphical user interface with a feature that imitates a conventional saw, allowing the user to identify components to be cut away from the dentition model 100. The graphical user interface provides a rendered 3D image 100 of the dentition model, either at one or more static views from predetermined positions, as shown in FIGS. 1A
and 1B, or in a "full 3D" mode that allows the user to alter the viewing angle, as shown in FIG. 2. The saw tool is implemented as a set of mathematical control points 102, represented graphically on the rendered image 100, which define a 3D cutting surface 104 that intersects the volumetric or geometric dentition model. The computer subdivides the data elements in the dentition model by performing a surface intersection operation between the 3D cutting surface 104 and the dentition model. The user sets the locations of the mathematical control points, and thus the geometry and position of the 3D
cutting surface, by manipulating the control points in the graphical display with an input device, such as a mouse. The computer provides a visual representation 104 of the cutting surface on the display to assist the user in fitting the surface around the individual component to be separated. Once the intersection operation is complete, the computer creates a model of the individual component using the newly segmented data elements.
Another technique requiring substantial human interaction, shown in FIG. 3, is a graphical user interface with a tool that imitates a conventional eraser. The eraser tool allows the user to isolate an individual dentition component by removing portions of the dentition model that surround the individual component. The eraser tool is implemented as a 3D solid 110, typically having the shape of a rectangular prism, or a curved surface that matches the shape of a side surface of a tooth. The solid is made as small as possible, usually only a single voxel thick, to minimize degradation of the data set. As with the saw technique above, the graphical user interface presents the user with a rendered 3D image 112 of the dentition model at one or more predetermined static views or in a full 3D mode. The user identifies portions of the dentition model for removal by manipulating a graphical representation 110 of the 3D solid with an input device. In alternative embodiments, the computer either removes the identified portions of the dentition model as the user moves the eraser 112, or the computer waits until the user stops moving the eraser and provides an instruction to remove the identified portions. The computer updates the display in real time to show the path 114 of the eraser through the dentition model.
Other computer-implemented segmentation techniques require little or no human interaction during the segmentation process. One such technique, which is illustrated in FIG. 4, involves the application of conventional "feature skeleton" analysis to a volumetric representation of the dentition model. This technique is particularly useful in identifying and modeling individual teeth.
In general, a computer applying this technique identifies a core of voxels that forms a skeleton 122 for the dentition 120. The skeleton 122 roughly resembles the network of biological nerves within a patient's teeth. The computer then divides the skeleton 122 into branches 124, each containing voxels that lie entirely within one tooth. One technique for identifying the branches is by defining a plane 126 that cuts through the skeleton 122 roughly parallel to the occlusal plane of the patient's dentition ("horizontal plane").
Each branch 124 intersects the horizontal plane 126 at one or more points, or clusters, that are relatively distant from the clusters associated with the other branches. The computer forms the individual tooth models by linking other voxels to the appropriate branches 124 of the skeleton.
FIG. 5 describes a particular technique for forming a skeleton in the dentition model. The computer first identifies the voxels in the dentition model that represent the tooth surfaces (step 130). For a voxel representation that is created from a physical model embedded in a sharply contrasting material, identifying the tooth surfaces is as simple as identifying the voxels at which sharp changes in image value occur, as described in U.S. patent application 09/169,276. The computer then calculates, for each voxel in the model, a distance measure indicating the physical distance between the voxel and the nearest tooth surface (step 132). The computer identifies the voxels with the largest distance measures and labels each of these voxels as forming a portion of the skeleton (step 134). Feature skeleton analysis techniques are described in more detail in the following publications: (1) Gagvani and Silver, "Parameter Controlled Skeletons for 3D Visualization," Proceedings of the IEEE Visualization Conference (1997); (2) Bertrand, "A Parallel Thinning Algorithm for Medial Surfaces," Pattern Recognition Letters, v. 16, pp. 979-986 (1995); (3) Mukherjee, Chatterji, and Das, "Thinning of 3-D Images Using the Safe Point Thinning Algorithm (SPTA)," Pattern Recognition Letters, v. 10, pp. 167-173 (1989); (4) Niblack, Gibbons, and Capson, "Generating Skeletons and Centerlines from the Distance Transform," CVGIP: Graphical Models and Image Processing, v. 54, n. 5, pp. 420-437 (1992).
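Steps 130-134 can be sketched as a distance transform followed by a threshold. The brute-force distance computation below is only suitable for tiny illustrative grids, and the `keep_fraction` threshold rule is an assumption; the text says only that the voxels with the largest distance measures form the skeleton:

```python
import numpy as np

def distance_to_surface(mask):
    """Step 132: for each dentition voxel (True in `mask`), the distance
    to the nearest non-dentition voxel. Brute force; a real
    implementation would use a fast distance transform."""
    inside, outside = np.argwhere(mask), np.argwhere(~mask)
    dist = np.zeros(mask.shape)
    for v in inside:
        dist[tuple(v)] = np.sqrt(((outside - v) ** 2).sum(axis=1)).min()
    return dist

def skeleton_voxels(mask, keep_fraction=0.9):
    """Step 134: label as skeleton the voxels whose distance measure is
    within `keep_fraction` of the maximum (assumed selection rule)."""
    dist = distance_to_surface(mask)
    return dist >= keep_fraction * dist.max()
```

For a solid cube of dentition voxels, only the center voxel survives the threshold, matching the intuition that the skeleton lies deepest inside the tooth.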
Once a skeleton has been formed, the computer uses the skeleton to divide the dentition model into 3D models of the individual teeth. FIG. 6 shows one technique for doing so. The computer first identifies those portions of the skeleton that are associated with each individual tooth. To do so, the computer defines a plane that is roughly parallel to the dentition's occlusal surface and that intersects the skeleton near its base (step 136). The computer then identifies points at which the plane and the skeleton intersect by identifying each voxel that lies on both the skeleton and the plane (step 138). In general, a single tooth includes all of the voxels that lie in a particular branch of the skeleton; and because the plane intersects the skeleton near its base, voxels that lie together in a branch of the skeleton usually cluster together on the intersecting plane. The computer is able to locate the branches by identifying voxels on the skeleton that lie within a particular distance of each other on the intersecting plane (step 140). The computer then identifies and labels all voxels on the skeleton that belong to each branch (step 142).
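The distance-based grouping of steps 138-140 can be sketched as a simple agglomerative clustering over the intersection voxels. The `max_dist` parameter is an assumed stand-in for the patent's "particular distance":

```python
import math

def cluster_branches(points, max_dist=2.0):
    """Group skeleton/plane intersection voxels into branch clusters.

    A point joins a cluster when it lies within `max_dist` of any
    existing member; a point that bridges two clusters merges them.
    """
    clusters = []
    for p in points:
        home = None
        for c in clusters:
            if any(math.dist(p, q) <= max_dist for q in c):
                if home is None:
                    c.append(p)
                    home = c
                else:                 # p bridges two clusters: merge them
                    home.extend(c)
                    c.clear()
        if home is None:
            clusters.append([p])
    return [c for c in clusters if c]
```

Two well-separated groups of intersection voxels come back as two branch clusters.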
Once the branches are identified, the computer links other voxels in the model to the branches. The computer begins by identifying a reference voxel in each branch of the skeleton (step 144). For each reference voxel, the computer selects an adjacent voxel that does not lie on the skeleton (step 146). The computer then processes the selected voxel, determining whether the voxel lies outside of the dentition, i.e., whether the associated image value is above or below a particular threshold value (step 148); determining whether the voxel already is labeled as belonging to another tooth (step 150); and determining whether the voxel's distance measure is greater than the distance measure of the reference voxel (step 152). If none of these conditions is true, the computer labels the selected voxel as belonging to the same tooth as the reference voxel (step 154). The computer then repeats this test for all other voxels adjacent to the reference voxel (step 156). Upon testing all adjacent voxels, the computer selects one of the adjacent voxels as a new reference point, provided that the adjacent voxel is labeled as belonging to the same tooth, and then repeats the test above for each untested voxel that is adjacent to the new reference point. This process continues until all voxels in the dentition have been tested.
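The linking procedure of steps 144-156 is essentially a breadth-first region growing from each branch's reference voxel. A minimal sketch, with all argument interfaces assumed (`labels` and `dist` as dicts keyed by voxel, `is_outside` and `neighbors` as caller-supplied functions):

```python
from collections import deque

def grow_tooth(labels, seed, tooth_id, dist, is_outside, neighbors):
    """Flood-fill voxels onto one tooth. A voxel joins unless it
    (a) lies outside the dentition, (b) is already labeled, or
    (c) has a larger distance measure than the voxel it was reached
    from (the three tests of steps 148-152)."""
    labels[seed] = tooth_id
    queue = deque([seed])
    while queue:
        ref = queue.popleft()
        for v in neighbors(ref):
            if is_outside(v) or v in labels or dist[v] > dist[ref]:
                continue
            labels[v] = tooth_id
            queue.append(v)
    return labels
```

On a toy 1D "dentition," growth from the seed stops where the distance measure rises again, which is what keeps one tooth's label from spilling into the next.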
FIGS. 7A and 7B illustrate another technique for identifying and segmenting individual teeth in the dentition model. This technique, called "2D slice analysis," involves dividing the voxel representation of the dentition model into a series of parallel 2D planes 160, or slices, that are each one voxel thick and that are roughly parallel to the dentition's occlusal plane, which is roughly normal to the model's z-axis. Each of the 2D slices 160 includes a 2D cross section 162 of the dentition, the surface 164 of which represents the lingual and buccal surfaces of the patient's teeth and/or gums. The computer inspects the cross section 162 in each 2D slice 160 to identify voxels that approximate the locations of the interproximal margins 166 between the teeth. These voxels lie at the tips of cusps 165 in the 2D cross-sectional surface 164. The computer then uses the identified voxels to create 3D surfaces 168 intersecting the dentition model at these locations. The computer segments the dentition model along these intersecting surfaces 168 to create individual tooth models.
FIG. 8 describes a particular implementation of the 2D slice analysis technique. The computer begins by identifying the voxels that form each of the 2D slices (step 170). The computer then identifies, for each 2D slice, the voxels that represent the buccal and lingual surfaces of the patient's teeth and gums (step 172) and defines a curve that includes all of these voxels (step 174). This curve represents the surface 164 of the 2D cross section 162.
The computer then calculates the rate of curvature (i.e., the derivative of the radius of curvature) at each voxel on the 2D cross-sectional surface 164 (step 176) and identifies all of the voxels at which local maxima in the rate of curvature occur (step 178). Each voxel at which a local maximum occurs represents a "cusp" in the 2D cross-sectional surface 164 and roughly coincides with an interproximal margin between teeth. In each 2D slice, the computer identifies pairs of these cusp voxels that correspond to the same interproximal margin (step 180), and the computer labels each pair to identify the interproximal margin with which it is associated (step 182). The computer then identifies the voxel pairs on all of the 2D slices that represent the same interproximal margins (step 184). For each interproximal margin, the computer fits a 3D surface 168 approximating the geometry of the interproximal margin among the associated voxel pairs (step 186).
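A discrete stand-in for the curvature test of steps 176-178 is the turning angle at each point of the cross-sectional outline: sharp local maxima mark the cusps. Using turning angles rather than the true derivative of the radius of curvature is a simplification for this sketch:

```python
import math

def turning_angles(outline):
    """Absolute exterior angle at each interior point of a polyline
    outline of (x, y) points; endpoints get 0. A crude curvature proxy."""
    angles = [0.0]
    for a, b, c in zip(outline, outline[1:], outline[2:]):
        ang = (math.atan2(c[1] - b[1], c[0] - b[0])
               - math.atan2(b[1] - a[1], b[0] - a[0]))
        # wrap into [-pi, pi] before taking the magnitude
        angles.append(abs(math.atan2(math.sin(ang), math.cos(ang))))
    angles.append(0.0)
    return angles

def cusp_indices(outline):
    """Indices where the turning angle is a local maximum (step 178)."""
    a = turning_angles(outline)
    return [i for i in range(1, len(a) - 1)
            if a[i] > a[i - 1] and a[i] >= a[i + 1]]
```

A right-angle bend in an otherwise straight outline is reported as the single cusp.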
FIG. 9 illustrates one technique for creating the 3D surfaces that approximate the interproximal margins. For each pair of cusp voxels 190a-b in a 2D slice that are associated with a particular interproximal region, the computer creates a line segment 192 bounded by these cusp voxels 190a-b. The computer changes the colors of the voxels in the line segment, including the cusp voxels 190a-b that bound the segment, to contrast with the other voxels in the 2D slice. The computer creates line segments in this manner in each successive 2D slice, forming 3D surfaces that represent the interproximal regions. All of the voxels that lie between adjacent ones of these 3D surfaces represent an individual tooth.
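Marking the voxels of a segment between two cusp voxels can be sketched with Bresenham's line algorithm; the patent only says the computer colors the voxels in the segment, so the rasterization method is an assumed implementation choice:

```python
def line_voxels(p, q):
    """Integer voxel coordinates on the segment from cusp voxel p to
    cusp voxel q in a 2D slice (Bresenham's algorithm)."""
    (x0, y0), (x1, y1) = p, q
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    out = []
    while True:
        out.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy
    return out
```

Stacking these per-slice segments produces the 3D interproximal surfaces described above.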
FIGS. 10A through 10C illustrate a refinement of the technique shown in FIG. 9. The refined technique involves the projection of a line segment 200 from one slice onto a line segment 206 on the next successive slice to form, for the associated interproximal margin, a 2D area bounded by the cusp voxels 202a-b, 204a-b of the line segments 200, 206. If the line segments 200, 206 are oriented such that any voxel on one segment 200 is not adjacent to a voxel on the other segment 206, as shown in FIG. 10A, then the resulting 3D surface is discontinuous, leaving unwanted "islands" of white voxels 208, 210. The computer eliminates these discontinuities by creating two new line segments 212, 214, each of which is bounded by one cusp voxel 202a-b, 204a-b from each original line segment 200, 206, as shown in FIG. 10B. The computer then eliminates the islands between the new line segments 212, 214 by changing the colors of all voxels between the new line segments 212, 214, as shown in FIG. 10C.
Automated segmentation is enhanced through a technique known as "seed cusp detection." The term "seed cusp" refers to a location at which an interproximal margin between adjacent teeth meets the patient's gum tissue. In a volumetric representation of the patient's dentition, a seed cusp for a particular interproximal margin is found at the cusp voxel that lies closest to the gum line. By applying the seed cusp detection technique to the 2D slice analysis, the computer is able to identify all of the seed cusp voxels in the model automatically.
FIG. 11 shows a particular implementation of the seed cusp detection technique, in which the computer detects the seed cusps by identifying each slice in which the rate of curvature of a cusp first falls below a predetermined threshold value. The computer begins by selecting a 2D slice that intersects all of the teeth in the arch (step 220). The computer attempts to select a slice that is near the gingival regions but that does not include any voxels representing gingival tissue. The computer then identifies all of the cusp voxels in the slice (step 222). If the rate of curvature of the 2D cross section at any of the cusp voxels is less than a predetermined threshold value, the computer labels that voxel as a seed cusp (step 224). The computer then selects the next 2D slice, which is one voxel layer closer to the gingival region (step 226), and identifies all of the cusp voxels that are not associated with a cusp for which the computer has already identified a seed cusp (step 228). If the rate of curvature of the 2D cross section is less than the predetermined threshold value at any of these cusp voxels, the computer labels the voxel as a seed cusp (step 230) and proceeds to the next 2D slice. The computer continues in this manner until a seed cusp voxel has been identified for each cusp associated with an interproximal margin (step 232).
FIGS. 32, 33, and 34 illustrate a human-assisted technique, known as "neighborhood-filtered seed cusp detection," for detecting seed cusps in the digital dentition model. This technique allows a human operator to scroll through 2D image slices on a video display and identify the locations of the seed cusps for each of the interproximal margins. The computer displays the 2D slices (step 750), and the operator searches the 2D slices to determine, for each adjacent pair of teeth, which slice 550 most likely contains the seed cusps for the corresponding interproximal margin. Using an input device such as a mouse or an electronic pen, the user marks the points 552, 554 in the slice that appear to represent the seed cusps (step 752). With this human guidance, the computer automatically identifies two voxels in the slice as the seed cusps.
The points 552, 554 identified by the human operator may or may not be the actual seed cusps 560, 562, but these points 552, 554 lie very close to the actual seed cusps 560, 562. As a result, the computer confines its search for the actual seed cusps 560, 562 to the voxel neighborhoods 556, 558 immediately surrounding the points 552, 554 selected by the human operator. The computer defines each of the neighborhoods 556, 558 to contain a particular number of voxels, e.g., twenty-five arranged in a 5 x 5 square, as shown here (step 754). The computer then tests the image values for all of the voxels in the neighborhoods 556, 558 to identify those associated with the background image and those associated with the dentition (step 756). In this example, voxels in the background are black and voxels in the dentition are white. The computer identifies the actual seed cusps 560, 562 by locating the pair of black voxels, one from each of the neighborhoods 556, 558, that lie closest together (step 758). In the depicted example, each of the actual seed cusps 560, 562 lies next to one of the points 552, 554 selected by the human operator.
FIGS. 12, 13, and 14 illustrate a technique, known as "neighborhood-filtered cusp detection," by which the computer focuses its search for cusps on one 2D slice to neighborhoods 244, 246 of voxels defined by a pair of previously detected cusp voxels 240, 242 on another 2D slice. This technique is similar to the neighborhood-filtered seed cusp detection technique described above.

Upon detecting a pair of cusp voxels 240, 242 in a 2D slice at level N (step 250), the computer defines one or more neighborhoods 244, 246 that include a predetermined number of voxels surrounding the pair (step 252). The computer then projects the neighborhoods onto the next 2D slice at level N+1 by identifying the voxels on the next slice that are immediately adjacent to the voxels in the neighborhoods on the original slice (step 254). The neighborhoods are made large enough to ensure that they include the cusp voxels on the N+1 slice. In the example of FIG. 13, each cusp voxel 240, 242 lies at the center of a neighborhood 244, 246 of twenty-five voxels arranged in a 5 x 5 square.

In searching for the cusp voxels on the N+1 slice, the computer tests the image values for all voxels in the projected neighborhoods to identify those associated with the background image and those associated with the dentition (step 256). In the illustrated example, voxels in the background are black and voxels in the dentition are white. The computer identifies the cusp voxels on the N+1 slice by locating the pair of black voxels in the two neighborhoods that lie closest together (step 258). The computer then repeats this process for all remaining slices (step 259).
FIGS. 15 and 16 illustrate another technique, known as "arch curve fitting," for identifying interproximal margins between teeth in the dentition. The arch curve fitting technique, which also applies to 2D cross-sectional slices of the dentition, involves the creation of a curve 260 that fits among the voxels on the 2D cross-sectional surface 262 of the dentition arch 264. A series of closely spaced line segments 268, each bounded by the cross-sectional surface 262, are formed along the curve 260, roughly perpendicular to the curve 260, throughout the 2D cross section 264. In general, the shortest of these line segments 268 lie on or near the interproximal margins; thus the computer identifies the cusps that define the interproximal margins by determining the relative lengths of the line segments 268.

When applying the arch curve fitting technique, the computer begins by selecting a 2D slice (step 270) and identifying the voxels associated with the surface 262 of the cross-sectional arch 264 (step 272). The computer then defines a curve 260 that fits among the voxels on the surface 262 of the arch (step 274). The computer creates the curve using any of a variety of techniques, a few of which are discussed below. The computer then creates a series of line segments that are roughly perpendicular to the curve and are bounded by the cross-sectional surface 262 (step 276). The line segments are approximately evenly spaced with a spacing distance that depends upon the required resolution and the acceptable computing time. Greater resolution leads to more line segments and thus greater computing time. In general, a spacing on the order of 0.4 mm is sufficient in the initial pass of the arch curve fitting technique.
The computer calculates the length of each line segment (step 278) and then identifies those line segments that form local minima in length (step 280). These line segments roughly approximate the locations of the interproximal boundaries, and the computer labels the voxels that bound these segments as cusp voxels (step 282). The computer repeats this process for each of the 2D slices (step 284) and then uses the cusp voxels to define 3D cutting surfaces that approximate the interproximal margins.
In some implementations, the computer refines the arch cusp determination by creating several additional sets of line segments, each centered around the arch cusps identified on the first pass. The line segments are spaced more narrowly on this pass to provide greater resolution in identifying the actual positions of the arch cusps.
The computer uses any of a variety of curve fitting techniques to create the curve through the arch. One technique involves the creation of a catenary curve with endpoints lying at the two ends 265, 267 (FIG. 15) of the arch. The catenary curve is defined by the equation y = a + b cosh(cx), and the computer fits the curve to the arch by selecting appropriate values for the constants a, b, and c. Another technique involves the creation of two curves, one fitted among voxels lying on the front surface 271 of the arch, and the other fitted among voxels on the rear surface 273. A third curve, which guides the placement of the line segments above, passes through the middle of the area lying between the first two curves.
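Selecting the constants a, b, and c can be sketched as a least-squares choice among candidate parameter triples. The toy grid search below stands in for a real nonlinear least-squares solver, and the `candidates` list is an assumed input:

```python
import math

def catenary(x, a, b, c):
    """The arch curve model y = a + b*cosh(c*x) from the text."""
    return a + b * math.cosh(c * x)

def fit_catenary(points, candidates):
    """Pick the (a, b, c) triple minimizing the sum of squared errors
    over (x, y) samples of the arch surface."""
    def sse(params):
        return sum((catenary(x, *params) - y) ** 2 for x, y in points)
    return min(candidates, key=sse)
```

Given samples generated by a known triple, the search recovers that triple exactly, since its error is zero.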
FIGS. 17 and 18 illustrate another technique for constructing a curve through the arch. This technique involves the creation of a series of initial line segments through the arch 264 and the subsequent formation of a curve 290 fitted among the midpoints of these line segments. This curve 290 serves as the arch curve in the arch curve fitting technique described above.
In applying this technique, the computer first locates an end 265 of the arch (step 300) and creates a line segment 291 that passes through the arch near this end 265 (step 301). The line segment 291 is bounded by voxels 292a-b lying on the surface of the arch. The computer then determines the midpoint 293 of the line segment 291 (step 302), selects a voxel 294 located a particular distance from the midpoint 293 (step 304), and creates a second line segment 295 that is parallel to the initial line segment 291 and that includes the selected voxel 294 (step 306). The computer then calculates the midpoint of the second segment 295 (step 308) and rotates the second segment 295 to the orientation 295' that gives the segment its minimum possible length (step 309). In some cases, the computer limits the second segment 295 to a predetermined amount of rotation (e.g., ±10°).
2o The computer then selects a voxel 297 located a particular distance from the midpoint 296 of the second segment 295 (step 310) and creates a third line segment 298 that is parallel to the second line segment 295 and that includes the selected voxel 297 (step 312). The computer calculates the midpoint 299 of the third segment 298 (step 314) and rotates the segment 298 to the 2s orientation 298' that gives the segment its shortest possible length (step 316).
The computer continues adding line segments in this manner until the other end of the cross-sectional arch is reached (step 318). The computer then creates a curve that fits among the midpoints of the line segments (step 320) and uses this curve in applying the arch fitting technique described above.
FIGS. 19A, 19B, and 20 illustrate an alternative technique for creating 3D surfaces that approximate the geometries and locations of the interproximal margins in the patient's dentition. This technique involves the creation of 2D planes that intersect the 3D dentition model at locations that approximate the interproximal margins. In general, the computer defines a series of planes, beginning with an initial plane 330 at one end 331 of the arch 332, that are roughly perpendicular to the occlusal plane of the dentition model ("vertical" planes). Each plane intersects the dentition model to form a 2D cross section 334. If the planes are spaced sufficiently close to each other, the planes with the smallest cross-sectional areas approximate the locations of the interproximal margins in the dentition. The computer locates the interproximal regions more precisely by rotating each plane about two orthogonal axes 336, 338 until the plane reaches the orientation that yields the smallest possible cross-sectional area.
In one implementation of this technique, the computer first identifies one end of the arch in the dentition model (step 340). The computer then creates a vertical plane 330 through the arch near this end (step 342) and identifies the center point 331 of the plane 330 (step 344). The computer then selects a voxel located a predetermined distance from the center point (step 345) and creates a second plane 333 that is parallel to the initial plane and that includes the selected voxel (step 346). The computer calculates the midpoint of the second plane (step 348) and rotates the second plane about two orthogonal axes that intersect at the midpoint (step 350). The computer stops rotating the plane upon finding the orientation that yields the minimum cross-sectional area. In some cases, the computer limits the plane to a predetermined amount of rotation (e.g., ±10° about each axis). The computer then selects a voxel located a particular distance from the midpoint of the second plane (step 352) and creates a third plane that is parallel to the second plane and that includes the selected voxel (step 354). The computer calculates the midpoint of the third plane (step 356) and rotates the plane to the orientation that yields the smallest possible cross-sectional area (step 357). The computer continues adding and rotating planes in this manner until the other end of the arch is reached (step 358). The computer identifies the planes at which local minima in cross-sectional area occur and labels these planes as "interproximal planes," which approximate the locations of the interproximal margins (step 360).
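Step 360 amounts to finding local minima in the sequence of minimized cross-sectional areas recorded along the arch. A minimal sketch (the tie-breaking convention on the right neighbor is an assumption):

```python
def interproximal_plane_indices(areas):
    """Indices of planes whose cross-sectional area is a local minimum;
    these planes approximate the interproximal margins."""
    return [i for i in range(1, len(areas) - 1)
            if areas[i] < areas[i - 1] and areas[i] <= areas[i + 1]]
```

In a sequence of areas that dips twice, both dips are reported as candidate interproximal planes.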
One variation of this technique, described in FIG. 21, allows the computer to refine its identification of interproximal planes by creating additional, more closely positioned planes in areas around the planes labeled as interproximal. The computer first creates a curve that fits among the midpoints of the planes labeled as interproximal planes (step 372) and then creates a set of additional planes along this curve (step 374). The additional planes are not evenly spaced along the curve, but rather are concentrated around the interproximal margins. The planes in each interproximal area are spaced very closely (e.g., 0.05 mm from each other). The computer rotates each of the newly constructed planes about two orthogonal axes until the plane reaches its minimum cross-sectional area (step 376). The computer then selects the plane in each cluster with the smallest cross-sectional area as the plane that most closely approximates the interproximal margin (step 378).
FIGS. 22, 23, and 24 illustrate a technique for identifying the gingival margin that defines the boundary between tooth and gum in the patient's dentition. This technique involves the creation of a series of vertical 2D planes 380, or slices, that intersect the dentition model roughly perpendicular to the occlusal plane (see FIG. 19A). The cross-sectional surface 382 of the dentition model in each of these planes 380 includes cusps 384, 386 that represent the gingival margin. The computer identifies the gingival margin by applying one or more of the cusp detection techniques described above.

One technique is very similar to the neighborhood-filtered cusp detection technique described above, in that voxel neighborhoods 388, 390 are defined on one of the 2D planes to focus the computer's search for cusps on an adjacent 2D plane. Upon detecting a pair of cusps 384, 386 on one 2D plane (step 400), the computer defines one or more neighborhoods 388, 390 to include a predetermined number of voxels surrounding the pair (step 402). The computer projects the neighborhoods onto an adjacent 2D plane by identifying the voxels on the adjacent plane that correspond to the voxels in the neighborhoods 388, 390 on the original plane (step 404). The computer then identifies the pair of black voxels that lie closest together in the two neighborhoods on the adjacent plane, labeling these voxels as lying in the cusp (step 406). The computer repeats this process for all remaining planes (step 408).
Many of these automated segmentation techniques are even more useful and efficient when used in conjunction with human-assisted techniques. For example, techniques that rely on the identification of the interproximal or gingival margins function more quickly and effectively when a human user first highlights the interproximal or gingival cusps in an image of the dentition model. One technique for receiving this type of information from the user is by displaying a 2D or 3D representation and allowing the user to highlight individual voxels in the display. Another technique allows the user to scroll through a series of 2D cross-sectional slices, identifying those voxels that represent key features such as interproximal or gingival cusps, as in the neighborhood-filtered seed cusp detection technique described above (FIGS. 32, 33, and 34). Some of these techniques rely on user interface tools such as cursors and bounding-box markers.
FIGS. 35A-35F illustrate another technique for separating teeth from gingival tissue in the dentition model. This technique is a human-assisted technique in which the computer displays an image of the dentition model (step 760) and allows a human operator to identify, for each tooth, the gingival margin, or gum line 600, encircling the tooth crown 602 (step 762). Some applications of this technique involve displaying a 3D volumetric image of the dentition model and allowing the user to select, with an input device such as a mouse, the voxels that define the gingival line 600 around each tooth crown 602. The computer then uses the identified gingival line to model the tooth roots and to create a cutting surface that separates the tooth, including the root model, from the gingival tissue 604.
Once the human operator has identified the gingival line 600, the computer selects a point 606 that lies at or near the center of the tooth crown 602 (step 764). One way of choosing this point is by selecting a 2D image slice that is parallel to the dentition's occlusal plane and that intersects the tooth crown 602, and then averaging the x- and y-coordinate values of all voxels in this 2D slice that lie on the surface 608 of the tooth crown 602. After selecting the center point 606, the computer defines several points 605 on the gingival line 600 (step 766) and fits a plane 610 among these points 605 (step 768). The computer then creates a line segment 612 that is normal to the plane 610 and that extends a predetermined distance from the selected center point 606 (step 770). The expected size of a typical tooth or the actual size of the patient's tooth determines the length of the line segment 612. A length on the order of two centimeters is sufficient to model most tooth roots. The computer defines a sphere 614, or a partial sphere, centered at the selected center point 606 (step 772). The radius of the sphere 614 is determined by the length of the line segment 612.
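Fitting the plane 610 among points on the gingival line (step 768) can be sketched as a least-squares plane fit via singular value decomposition; the solver choice is an assumption, since the patent does not specify one:

```python
import numpy as np

def fit_gingival_plane(points):
    """Return the centroid and unit normal of the best-fit plane through
    a set of 3D gingival-line points. The normal direction defines the
    root-axis line segment of step 770."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # the right-singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```

For points that are exactly coplanar, the recovered normal is perpendicular to that plane (up to sign).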
The computer then shifts the plane 610 along the line segment 612 so that the plane 610 is tangential to the sphere 614 (step 774). In some applications, the computer allows the human operator to slide the plane 610 along the surface of the sphere 614 to adjust the orientation of the plane 610 (step 776). This is useful, for example, when the tooth crown 602 is tilted, which suggests that the tooth roots also are tilted. The computer then creates a projection 616 of the gingival line 600 on the shifted plane 610 (step 778). The tooth roots are modeled by creating a surface 618 that connects the gingival line 600 to the projection 616 (step 780). The computer uses this surface as a cutting surface to separate the tooth from the gingival tissue. The cutting surface extends in a direction that is roughly perpendicular to the occlusal surface of the tooth crown 602.
In general, the surface 618 that connects the gingival line 600 to the projection is formed by straight line segments that extend between the gingival line and the projection. However, some implementations allow curvature along these line segments. In some applications, the computer scales the projection 616 to be larger or smaller than the gingival line 600, which gives the surface 618 a tapered shape (step 782). Many of these applications allow the computer, with or without human assistance, to change the profile of the tapered surface so that the rate of tapering changes along the length of the surface 618 (step 784). For example, some surfaces taper more rapidly as distance from the tooth crown increases.
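The scaled projection of steps 778-782 combines a shift along the root axis with a uniform scale about the center point. The interface below is an illustrative assumption, not the patent's own:

```python
def project_and_scale(gingival_pts, center, axis_offset, scale):
    """Project each 3D gingival-line point along the root axis by
    `axis_offset` (a 3D vector) and scale it about `center`, producing
    the tapered projection of the gingival line on the shifted plane."""
    return [tuple(c + scale * (p - c) + o
                  for p, c, o in zip(pt, center, axis_offset))
            for pt in gingival_pts]
```

A scale factor below 1 shrinks the projected outline, which is what gives the connecting root surface its taper.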
FIGS. 37A-C and 38 illustrate another human-assisted technique for separating teeth from gingival tissue in the dentition model. This technique involves displaying an image of the dentition model to a human operator (step 790) and allowing the operator to trace the gingival lines 620, 622 on the 2o buccal and lingual sides of the dental arch (step 792). This produces two curves 624, 626 representing the gingival lines 620, 622 on the buccal and lingual surfaces. The computer uses these curves 624, 626 to create a 3D
cutting surface 628 that separates the tooth crowns 630, 632 from the gingival tissue 634 in the dentition model (step 794). The cutting surface 628 is roughly parallel to the occlusal surface of the tooth crowns 630, 632.
FIGS. 37C and 39 illustrate one technique for defining the cutting surface 628. In general, the computer creates the cutting surface 628 by defining points 636, 638 along each of the 3D curves 624, 626 and defining the cutting surface 628 to fit among the points 636, 638. The computer first defines the points 636, 638 on the 3D curves 624, 626 (step 800) and then defines a point 640 at or near the center of each tooth crown 630 (step 802).
One way of defining the center point 640 is by averaging the x-, y-, and z-coordinate values for all of the points 636, 638 lying on the portions of the gingival curves 624, 626 associated with that tooth. The computer then creates a triangular surface mesh 642 using the center point 640 and the points 636, 638 on the gingival curves as vertices (step 804). The computer uses this surface mesh 642 to cut the tooth crowns away from the gingival tissue (step 806). In some implementations, a tooth root model is created for each crown, e.g., by projecting the gingival curves onto a distant plane, as described above (step 808). The computer connects the roots to the crowns to complete the individual tooth models (step 810).
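The averaging and triangle-mesh construction of steps 802-804 can be sketched in a few lines. The function names and the convention of walking one curve backward to close the loop are illustrative assumptions, not details from the patent.

```python
def crown_center(buccal_pts, lingual_pts):
    """Step 802: average the x-, y-, and z-coordinates of the points on
    the portions of the two gingival curves associated with one tooth."""
    pts = buccal_pts + lingual_pts
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def triangle_fan(buccal_pts, lingual_pts):
    """Step 804: a triangular surface mesh built as a fan of triangles
    from the center point to consecutive points on the ring formed by
    the two curves."""
    center = crown_center(buccal_pts, lingual_pts)
    # Walk the buccal curve forward and the lingual curve backward so
    # the combined points form a closed loop around the crown.
    ring = buccal_pts + lingual_pts[::-1]
    return [(center, ring[i], ring[(i + 1) % len(ring)])
            for i in range(len(ring))]
```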
All of the segmentation techniques described above are useful in creating digital models of individual teeth, as well as a model of gingival tissue surrounding the teeth. In some cases, the computer identifies and segments the teeth using one of these techniques to form the individual tooth models, and then uses all remaining data to create the gingival model.
Other Implementations

In many instances, the computer creates proposals for segmenting the dentition model and then allows the user to select the best alternative. For example, one version of the arch curve fitting technique described above requires the computer to create a candidate catenary or spline curve, which the user is allowed to modify by manipulating the mathematical control parameters. Other techniques involve displaying several surfaces that are candidate cutting surfaces and allowing the user to select the appropriate surfaces.
Some implementations of the invention are realized in digital electronic circuitry, such as an application specific integrated circuit (ASIC); others are realized in computer hardware, firmware, and software, or in combinations of digital circuitry and computer components. The invention is usually embodied, at least in part, as a computer program tangibly stored in a machine-readable storage device for execution by a computer processor. In these situations, methods embodying the invention are performed when the processor executes instructions organized into program modules, operating on input data and generating output. Suitable processors include general and special purpose microprocessors, which generally receive instructions and data from read-only memory and/or random access memory devices. Storage devices that are suitable for tangibly embodying computer program instructions include all forms of nonvolatile memory, including semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM.
The invention has been described in terms of particular embodiments.
Other embodiments are within the scope of the following claims.
Other embodiments require the computer to identify data elements that represent a structural core, or skeleton, of each individual component to be modeled. The computer creates the model by linking other data elements representing the individual component to the structural core.
In another aspect, a computer implementing the invention receives a three-dimensional (3D) data set representing the patient's dentition, applies a test to identify data elements that represent an interproximal margin between two teeth in the dentition, and applies another computer-implemented test to select data elements that lie on one side of the interproximal margin for inclusion in the digital model. Some implementations require the computer to identify data elements that form one or more 2D cross-sections of the dentition in one or more 2D planes intersecting the dentition roughly parallel to the dentition's occlusal plane.
In another aspect, a computer implementing the invention receives a 3D
data set representing at least a portion of the patient's dentition, including at least a portion of a tooth and gum tissue surrounding the tooth; applies a test to identify data elements lying on a gingival boundary that occurs where the tooth and the gum tissue meet; and applies a test to the data elements lying on the boundary to identify other data elements representing portions of the tooth.
In one aspect, the invention involves obtaining a three-dimensional (3D) digital model of a patient's dentition and analyzing the model to determine the orientation of at least one axis of the model automatically. In some implementations, the model's z-axis is found by creating an Oriented Bounding Box (OBB) around the model and identifying the direction in which the OBB
has minimum thickness. The z-axis extends in this direction, from the model's bottom surface to its top surface. Moreover, in a dentition model having only one mandible, one of the model surfaces is substantially flat and an opposite surface is textured. The direction of the positive z-axis can be identified in this type of model by identifying which of the surfaces is flat or textured. One technique for doing so involves creating one or more planes that are roughly normal to the z-axis and then creating line segments that extend between the planes and the top and bottom surfaces of the dentition model. The surface for which all of the line segments are of one length is identified as being the flat surface, and the surface for which the line segments have varying lengths is identified as being the textured surface.
In other implementations, the x- and y-axes are found by selecting a two-dimensional (2D) plane that contains the axes and an arch-shaped cross section of the dentition model and identifying the orientations of the axes in this plane. In general, the arch-shaped cross section is roughly symmetrical about the y-axis. One technique for identifying the y-axis involves identifying a point at each end of the arch-shaped cross section, creating a line segment that extends between the identified points, and identifying the orientation of the y-axis as being roughly perpendicular to the line segment. The point at each end of the arch can be identified by selecting a point that lies within an area surrounded by the arch-shaped cross section, creating a line segment that extends between the selected point and an edge of the 2D plane, sweeping the line segment in a circular manner around the selected point, and identifying points at the ends of the arch-shaped cross section at which the sweeping line segment begins intersecting the cross section of the dentition model and stops intersecting the cross section of the dentition model. In general, the x-axis is perpendicular to the y-axis.
In another aspect, the invention involves using a programmed computer to create a digital model of an individual component of a patient's dentition by obtaining a 3D digital model of the patient's dentition, identifying points in the dentition model that lie on an interproximal margin between adjacent teeth in the patient's dentition, and using the identified points to create a cutting surface for use in separating portions of the dentition model representing the adjacent teeth.
In some implementations, 2D cross sections of the dentition model are displayed to a human operator, and the operator provides input identifying approximate points at which the interproximal margin between the adjacent teeth meets gingival tissue. In some cases, the dentition model includes a 3D
volumetric model of the dentition, and the input provided by the operator identifies two voxels in the volumetric model. The computer then defines a neighborhood of voxels around each of the two voxels identified by the human operator, where each neighborhood includes voxels representing the dentition model and voxels representing a background image. The computer selects the pair of voxels, one in each neighborhood, representing the background image that lie closest together.
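The neighborhood search described above reduces to a closest-pair query over the background voxels of two small neighborhoods. A minimal sketch, in which voxels are coordinate tuples and `is_background` is an assumed caller-supplied predicate:

```python
def closest_background_pair(nbhd_a, nbhd_b, is_background):
    """Among the background voxels in two neighborhoods, return the
    pair (one voxel from each) that lie closest together."""
    bg_a = [v for v in nbhd_a if is_background(v)]
    bg_b = [v for v in nbhd_b if is_background(v)]

    def dist2(p, q):
        # Squared Euclidean distance; order is preserved without sqrt.
        return sum((a - b) ** 2 for a, b in zip(p, q))

    return min(((a, b) for a in bg_a for b in bg_b),
               key=lambda pair: dist2(*pair))
```

Neighborhoods are small (a few voxels around each operator-identified point), so the brute-force search is acceptable.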
In some of these implementations, the computer also identifies voxels on another 2D cross section that represent the interproximal margin. One technique for doing so is by defining a neighborhood of voxels around each of the selected voxels, where each neighborhood includes voxels representing the dentition model and voxels representing a background image, projecting the neighborhoods onto the other 2D cross section, and selecting two voxels in the projected neighborhoods that represent the interproximal margin.
In another aspect, the invention involves displaying an image of a dentition model, receiving input from a human operator identifying points in the image representing a gingival line at which a tooth in the dentition model meets gingival tissue, and using the identified points to create a cutting surface for use in separating the tooth from the gingival tissue in the dentition model.
The cutting surface often extends roughly perpendicular to the dentition's occlusal plane.
In some implementations, the cutting surface is created by projecting at least a portion of the gingival line onto a plane that is roughly parallel to the occlusal plane and then creating a surface that connects the gingival line to the projection. One way of establishing the plane is by fitting the plane among the points on the gingival line and then shifting the plane away from the tooth in a direction that is roughly normal to the plane. For example, the plane can be shifted along a line segment that includes a point near the center of the tooth and that is roughly perpendicular to the plane. The length of the line segment usually approximates the length of a tooth root.
In other embodiments, the cutting surface extends roughly parallel to the dentition's occlusal plane in the dentition model. In some of these embodiments, the input received from the human operator identifies points that form two 3D curves representing gingival lines at which teeth in the dentition model meet gum tissue on both the buccal and lingual sides of the dentition model. The cutting surface is created by fitting a surface among the points lying on the two curves. For each tooth, a point lying between the two curves is identified and surface triangles are created between the identified point and points on the two curves. One technique for identifying the point involves averaging, for each tooth, x, y and z coordinate values of the points on portions of the two curves adjacent to the tooth.
Other embodiments involve creating, for each tooth, a surface that represents the tooth's roots. One technique for doing so involves projecting points onto a plane that is roughly parallel to the occlusal plane and connecting points on the two curves to the projected points. The surface can be used to separate portions of the dentition model representing the tooth roots from portions representing gingival tissue. The model of the tooth roots is then connected to the tooth model.
Other embodiments and advantages are apparent from the detailed description and the claims below.
DESCRIPTION OF THE DRAWINGS
FIGS. 1A, 1B, and 2 are partial views of a dentition model as displayed on a computer monitor and segmented with a human-operated saw tool.
FIG. 3 is a partial view of a dentition model as displayed on a computer monitor and segmented with a human-operated eraser tool.
FIG. 4 is a view of a dentition model for which a feature skeleton has been identified.
FIGS. 5 and 6 are flowcharts for a feature skeleton analysis technique used in segmenting a dentition model.
FIG. 7A is a horizontal 2D cross-sectional view of a dentition model.
FIG. 7B is a side view of a dentition model intersected by several 2D
planes.
FIG. 8 is a flowchart for a 2D slice analysis technique used in segmenting a dentition model.
FIGS. 9 and 10A through 10C each shows a group of voxels in a 2D
slice of a dentition model.
FIG. 11 is a flowchart for an automatic cusp detection technique used in ~ s segmenting a dentition model.
FIG. 12 is a horizontal 2D cross section of a dentition model illustrating a neighborhood filtered automatic cusp detection technique used in segmenting the dentition model.
FIG. 13 shows two groups of voxels in a 2D slice of a dentition model illustrating the neighborhood filtered automatic cusp detection technique.
FIG. 14 is a flowchart for the neighborhood filtered automatic cusp detection technique.
FIG. 15 is a horizontal 2D cross section of a dentition model illustrating an arch curve fitting technique used in segmenting the dentition model.
FIG. 16 is a flowchart for the arch curve fitting technique.
FIG. 17 is a horizontal 2D cross section of a dentition model illustrating a curve creation technique for use with the arch curve fitting technique.
FIG. 18 is a flowchart for the curve creation technique.
FIGS. 19A and 19B are a perspective view and a vertical 2D cross-sectional view of a dentition model illustrating another technique for use in segmenting the dentition model.
FIGS. 20 and 21 are flowcharts of the technique illustrated in FIGS. 19A
and 19B.
FIG. 22 is a vertical 2D cross-sectional view of a dentition model illustrating the gingival margin detection technique for use in segmenting the dentition model.
FIG. 23 shows a group of voxels in a 2D slice of a dentition model ~o illustrating a gingival margin detection technique.
FIG. 24 is a flowchart for the gingival margin detection technique.
FIG. 25 shows a digital dentition model inside an Oriented Bounding Box (OBB).
FIG. 26 illustrates a technique for properly orienting a digital dentition ~s model along a z-axis.
FIGS. 27A, 27B, and 27C illustrate a technique for properly orienting a digital dentition model along x- and y-axes.
FIGS. 28, 29, 30 and 31 are flowcharts for the techniques of FIGS. 25, 26, and 27A-C.
FIGS. 32 and 33 illustrate a human-assisted technique for identifying interproximal margins between teeth.
FIG. 34 is a flowchart for the technique of FIGS. 32 and 33.
FIGS. 35A through 35F illustrate a technique for segmenting a digital dentition model into models of individual teeth and gum tissue.
FIG. 36 is a flowchart for the technique of FIGS. 35A through 35F.
FIGS. 37A, 37B, and 37C illustrate another technique for segmenting a digital dentition model into models of individual teeth.
FIGS. 38 and 39 are flowcharts for the technique of FIGS. 37A, 37B, and 37C.
DETAILED DESCRIPTION
U.S. patent application 09/169,276 describes techniques for generating a 3D digital data set that contains a model of a patient's dentition, including the crowns and roots of the patient's teeth as well as the surrounding gum tissue.
One such technique involves creating a physical model of the dentition from a material such as plaster and then digitally imaging the model with a laser scanner or a destructive scanning system. These techniques are used to produce a digital volumetric 3D model ("volume element representation" or "voxel representation") of the dentition model, and/or a digital geometric 3D
surface model ("geometric model") of the dentition. The computer-implemented techniques described below act on one or both of these types of 3D dentition models.
In creating a voxel representation, the physical model is usually embedded in a potting material that contrasts sharply with the color of the physical model to enhance detection of the dentition features. A white dentition model embedded in a black potting material provides the sharpest contrast. A wide variety of information can be used to enhance the 3D model, including data taken from photographic images, 2D and 3D x-ray scans, computed tomography (CT) scans, and magnetic resonance imaging (MRI) scans of the patient's dentition.
The 3D data set is loaded into a computer which, under control of a program implementing one or more techniques of the invention, either with or without human assistance, segments the digital dentition model into digital models of individual dentition components, such as teeth and gingival tissue.
In one implementation, the computer produces a digital model of each individual tooth in the patient's dentition, as well as a digital model of the gingival tissue surrounding the teeth.
To segment the digital dentition model accurately, the computer often must know the exact orientation of the dentition model. One technique for establishing the orientation of the digital dentition model in the 3D data set involves holding the physical dentition model at a prescribed orientation during the digital imaging process discussed above. Embedding the physical model at a particular orientation in a solid potting material is one way of holding the physical model. In some systems, however, even this technique introduces small errors in the orientation of the dentition model.
Orienting the Digital Dentition Model

FIGS. 25, 26, 27A-C and 28 illustrate several techniques used by the computer to orient the digital dentition model 500 properly. The computer first obtains a digital model of the dentition using one of the techniques described above (step 700). The computer then locates the model's z-axis 502, which in the depicted example extends from the base of the model toward the roof of the patient's mouth and is normal to the dentition's occlusal plane (step 702).
The computer then locates the model's y-axis 504, which in the depicted example extends from an area lying within the dental arch toward the patient's front teeth (step 704). Using the right-hand rule, the computer then defines the model's x-axis 506 to extend from an area lying within the dental arch toward the teeth on the right side of the patient's mouth (step 706). The occlusal plane is a plane that is pierced by all of the cusps of the patient's teeth when the patient's mandibles interdigitate. Techniques for identifying the occlusal plane include receiving user input identifying the location of the plane and conducting a fully-automated analysis of the dentition model.
FIGS. 25, 26, and 29 show one technique for identifying the z-axis 502.
The computer first identifies the dentition model 500 in the 3D data set (step 710). For 3D geometric data, identifying the dentition model is simply a matter of locating the geometric surfaces. For 3D volumetric data, identifying the dentition model involves distinguishing the lighter voxels, which represent the dentition model, from the darker voxels, which represent the background. The computer then fits an Oriented Bounding Box ("OBB") 510 around the dentition model 500 using a conventional OBB fitting technique (step 712).
s The dimension in which the OBB 510 has its smallest thickness TM,N is the dimension in which the z-axis 502 extends (step 714).
After determining the dimension in which the z-axis extends 502, the computer determines whether the dentition model is facing upward or downward, i.e., in which direction the positive z-axis extends. FIGS. 26 and ~o illustrate a technique for determining the direction of the positive z-axis. This technique relies on an observation that the bottom surface 512 of the dentition model is flat and the upper surface 514 follows the jagged contours of the patient's teeth. This technique also relies on an assumption that the model at this point includes only one of the patient's mandibles.
~ 5 The computer first creates one or more planes 516, 518 that are normal to the z-axis 502 (step 720). The computer then creates line segments S 15A, S 158 between the planes 516, 518 and the surfaces 512, 514 of the model (step 722). The line segments S 15A that touch the flat bottom surface 512 are all of approximately the same length (step 724). The line segments S 15B that touch 2o the jagged top surface 514 have varying lengths (step 726). The computer identifies the positive z-axis as extending from the bottom surface 512 toward the top surface 514 and orients the digital dentition model 500 accordingly (step 72$).
FIGS. 27A, 27B, 27C, and 31 illustrate a technique for identifying the y-axis 504 and the x-axis 506 of the dentition model 500. The computer begins by selecting a 2D slice 520 of data that is normal to the z-axis and that contains a cross section 522 of the dentition model (step 730). This technique relies on an observation that the cross section 522 of the dentition model is arch shaped.
The computer identifies a point 524 at or near the center of the 2D slice 520 (step 732). The computer then creates a line segment 526 (or 530) that extends from the selected point 524 to an edge 528 (or 532) of the slice 520 (step 734).
The direction in which the line segment extends is arbitrary, so the line segment may or may not intersect the dental cross section. The depicted example shows two line segments 526, 530, one of which intersects the dental cross section 522, the other of which does not.
The computer then begins rotating, or sweeping, one of the line segments 526, 530 about the center point 524 (step 736). In general, the computer sweeps the line segment in small, discrete steps, usually on the order of five degrees of rotation. As it is swept, a line segment 526 that initially intersects the dental cross section 522 will eventually stop intersecting the cross section 522, and the computer marks the point 534 at which this occurs. As sweeping continues, the line segment 526 will eventually resume intersecting the cross section 522, and the computer marks the point 536 at which this occurs. Likewise, a line segment 530 that initially does not intersect the cross section 522 eventually will begin intersecting the cross section 522, and the computer marks the point 536 at which this occurs. The computer also marks the point 534 at which this line segment 530 stops intersecting the cross section 522 (step 738). The computer stops sweeping the line segments 526, 530 after marking both of the points 534, 536 (step 740).
The computer then creates a line segment 538 that extends between the two marked points 534, 536 (step 742). The y-axis 504 of the dentition model extends roughly normal to this line segment 538 through the front 540 of the dental arch (step 744). The x-axis 506 extends roughly parallel to this line segment 538 through the right side 542 of the dental arch (step 746). The computer uses this line segment 538 to orient the dentition model correctly along the x- and y-axes (step 748).
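The discrete sweep of steps 736-740 can be sketched by recording the angles at which the rotating segment starts or stops intersecting the arch cross section. The `intersects(angle_deg)` callback, which tests one ray against the cross section, is an assumed interface.

```python
def sweep_for_arch_ends(intersects, step_deg=5):
    """Sweep a ray around the center point in discrete steps (the patent
    suggests about five degrees) and return the first two angles at
    which the intersection state changes; these bracket the open end
    of the arch-shaped cross section."""
    transitions = []
    prev = intersects(0)
    for deg in range(step_deg, 360 + step_deg, step_deg):
        cur = intersects(deg % 360)
        if cur != prev:
            transitions.append(deg % 360)
            prev = cur
    return transitions[:2]
```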
Segmenting the Digital Dentition Model Into Individual Component Models

Some computer-implemented techniques for segmenting a 3D dentition model into models of individual dentition components require a substantial amount of human interaction with the computer. One such technique, which is shown in FIGS. 1A, 1B, and 2, provides a graphical user interface with a feature that imitates a conventional saw, allowing the user to identify components to be cut away from the dentition model 100. The graphical user interface provides a rendered 3D image 100 of the dentition model, either at one or more static views from predetermined positions, as shown in FIGS. 1A
and 1B, or in a "full 3D" mode that allows the user to alter the viewing angle, as shown in FIG. 2. The saw tool is implemented as a set of mathematical control points 102, represented graphically on the rendered image 100, which define a 3D cutting surface 104 that intersects the volumetric or geometric dentition model. The computer subdivides the data elements in the dentition model by performing a surface intersection operation between the 3D cutting surface 104 and the dentition model. The user sets the locations of the mathematical control points, and thus the geometry and position of the 3D
cutting surface, by manipulating the control points in the graphical display with an input device, such as a mouse. The computer provides a visual representation 104 of the cutting surface on the display to assist the user in fitting the surface around the individual component to be separated. Once the intersection operation is complete, the computer creates a model of the individual component using the newly segmented data elements.
Another technique requiring substantial human interaction, shown in FIG. 3, is a graphical user interface with a tool that imitates a conventional eraser. The eraser tool allows the user to isolate an individual dentition component by removing portions of the dentition model that surround the individual component. The eraser tool is implemented as a 3D solid 110, typically having the shape of a rectangular prism, or a curved surface that matches the shape of a side surface of a tooth. The solid is made as small as possible, usually only a single voxel thick, to minimize degradation of the data set. As with the saw technique above, the graphical user interface presents the user with a rendered 3D image 112 of the dentition model at one or more predetermined static views or in a full 3D mode. The user identifies portions of the dentition model for removal by manipulating a graphical representation 110 of the 3D solid with an input device. In alternative embodiments, the computer either removes the identified portions of the dentition model as the user moves the eraser 112, or the computer waits until the user stops moving the eraser and provides an instruction to remove the identified portions. The computer updates the display in real time to show the path 114 of the eraser through the dentition model.
Other computer-implemented segmentation techniques require little or no human interaction during the segmentation process. One such technique, which is illustrated in FIG. 4, involves the application of conventional "feature skeleton" analysis to a volumetric representation of the dentition model. This technique is particularly useful in identifying and modeling individual teeth.
In general, a computer applying this technique identifies a core of voxels that forms a skeleton 122 for the dentition 120. The skeleton 122 roughly resembles the network of biological nerves within the patient's teeth. The computer then divides the skeleton 122 into branches 124, each containing voxels that lie entirely within one tooth. One technique for identifying the branches is by defining a plane 126 that cuts through the skeleton 122 roughly parallel to the occlusal plane of the patient's dentition ("horizontal plane").
Each branch 124 intersects the horizontal plane 126 at one or more points, or clusters, that are relatively distant from the clusters associated with the other branches. The computer forms the individual tooth models by linking other voxels to the appropriate branches 124 of the skeleton.
FIG. 5 describes a particular technique for forming a skeleton in the dentition model. The computer first identifies the voxels in the dentition model that represent the tooth surfaces (step 130). For a voxel representation that is created from a physical model embedded in a sharply contrasting material, identifying the tooth surfaces is as simple as identifying the voxels at which sharp changes in image value occur, as described in U.S. patent application 09/169,276. The computer then calculates, for each voxel in the model, a distance measure indicating the physical distance between the voxel and the nearest tooth surface (step 132). The computer identifies the voxels with the largest distance measures and labels each of these voxels as forming a portion of the skeleton (step 134). Feature skeleton analysis techniques are described in more detail in the following publications: (1) Gagvani and Silver, "Parameter Controlled Skeletons for 3D Visualization," Proceedings of the IEEE Visualization Conference (1997); (2) Bertrand, "A Parallel Thinning Algorithm for Medial Surfaces," Pattern Recognition Letters, v. 16, pp.
979-986 (1995); (3) Mukherjee, Chatterji, and Das, "Thinning of 3-D Images Using the Safe Point Thinning Algorithm (SPTA)," Pattern Recognition Letters, v. 10, pp. 167-173 (1989); (4) Niblack, Gibbons, and Capson, "Generating Skeletons and Centerlines from the Distance Transform," CVGIP:
Graphical Models and Image Processing, v. 54, n. 5, pp. 420-437 (1992).
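The distance-measure skeletonization of steps 130-134 can be sketched as follows. This brute-force version is only illustrative: a real system would use a distance transform, and the `keep_fraction` knob (which fraction of the largest-distance voxels to keep) is an assumption, since the patent does not state how "largest" is thresholded.

```python
def skeleton_voxels(tooth_voxels, surface_voxels, keep_fraction=0.2):
    """For each voxel, measure the distance to the nearest surface voxel
    (step 132), then keep the voxels with the largest distance measures
    as the skeleton (step 134)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    # Step 132: distance from each voxel to the nearest tooth surface.
    measure = {v: min(dist(v, s) for s in surface_voxels)
               for v in tooth_voxels}
    # Step 134: keep roughly the top keep_fraction of distance measures.
    ranked = sorted(measure.values())
    cutoff = ranked[int((1 - keep_fraction) * (len(ranked) - 1))]
    return {v for v, m in measure.items() if m >= cutoff}
```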
Once a skeleton has been formed, the computer uses the skeleton to divide the dentition model into 3D models of the individual teeth. FIG. 6 shows one technique for doing so. The computer first identifies those portions of the skeleton that are associated with each individual tooth. To do so, the computer defines a plane that is roughly parallel to the dentition's occlusal surface and that intersects the skeleton near its base (step 136). The computer then identifies points at which the plane and the skeleton intersect by identifying each voxel that lies on both the skeleton and the plane (step 138).
In general, a single tooth includes all of the voxels that lie in a particular branch
Once the branches are identified, the computer links other voxels in the model to the branches. The computer begins by identifying a reference voxel in each branch of the skeleton (step 144). For each reference voxel, the o computer selects an adjacent voxel that does not lie on the skeleton (step 146).
The computer then processes the selected voxel, determining whether the voxel lies outside of the dentition, i.e., whether the associated image value is above or below a particular threshold value (step 148); determining whether the voxel already is labeled as belonging to another tooth (step 150); and determining whether the voxel's distance measure is greater than the distance measure of the reference voxel (step 152). If none of these conditions is true, the computer labels the selected voxel as belonging to the same tooth as the reference voxel (step 154). The computer then repeats this test for all other voxels adjacent to the reference voxel (step 156). Upon testing all adjacent voxels, the computer selects one of the adjacent voxels as a new reference point, provided that the adjacent voxel is labeled as belonging to the same tooth, and then repeats the test above for each untested voxel that is adjacent to the new reference point.
This process continues until all voxels in the dentition have been tested.
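The voxel-linking loop of steps 144-156 is essentially a region-growing traversal from each skeleton branch. A breadth-first sketch; the `inside` set, `measure` map, and `labels` dict are assumed data structures standing in for the voxel tests the patent describes.

```python
from collections import deque

def grow_tooth(seed, inside, measure, labels, tooth_id):
    """Grow a tooth label outward from a skeleton branch voxel. A
    neighbor joins the tooth if it lies inside the dentition, is not
    yet claimed by another tooth, and its distance measure does not
    exceed that of the voxel it was reached from (steps 148-152)."""
    labels[seed] = tooth_id
    queue = deque([seed])
    while queue:
        ref = queue.popleft()
        x, y, z = ref
        # Test the six face-adjacent voxels (step 156).
        for nb in [(x + 1, y, z), (x - 1, y, z), (x, y + 1, z),
                   (x, y - 1, z), (x, y, z + 1), (x, y, z - 1)]:
            if (nb in inside and nb not in labels
                    and measure.get(nb, 0) <= measure[ref]):
                labels[nb] = tooth_id   # step 154
                queue.append(nb)        # becomes a new reference point
    return labels
```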
FIGS. 7A and 7B illustrate another technique for identifying and segmenting individual teeth in the dentition model. This technique, called "2D
slice analysis," involves dividing the voxel representation of the dentition model into a series of parallel 2D planes 160, or slices, that are each one voxel thick and that are roughly parallel to the dentition's occlusal plane, which is roughly normal to the model's z-axis. Each of the 2D slices 160 includes a 2D
SUBSTITUTE SHEET (RULE 16) cross section 162 of the dentition, the surface 164 of which represents the lingual and buccal surfaces of the patient's teeth and/or gums. The computer inspects the cross section 162 in each 2I? slice I60 to identify voxels that approximate the locations of the interproximal margins 166 between the teeth.
s These voxels lie at the tips of cusps 165 in the 2D cross-sectional surface 164.
The computer then uses the identified voxels to create 3D surfaces 168 intersecting the dentition model at these locations. The computer segments the dentition model along these intersecting surfaces 168 to create individual tooth models.
FIG. 8 describes a particular implementation of the 2D slice analysis technique. The computer begins by identifying the voxels that form each of the 2D slices (step 170). The computer then identifies, for each 2D slice, the voxels that represent the buccal and lingual surfaces of the patient's teeth and gums (step 172) and defines a curve that includes all of these voxels (step 174). This curve represents the surface 164 of the 2D cross section 162.
The computer then calculates the rate of curvature (i.e., the derivative of the radius of curvature) at each voxel on the 2D cross-sectional surface 164 (step 176) and identifies all of the voxels at which local maxima in the rate of curvature occur (step 178). Each voxel at which a local maximum occurs represents a "cusp" in the 2D cross-sectional surface 164 and roughly coincides with an interproximal margin between teeth. In each 2D slice, the computer identifies pairs of these cusp voxels that correspond to the same interproximal margin (step 180), and the computer labels each pair to identify the interproximal margin with which it is associated (step 182). The computer then identifies the voxel pairs on all of the 2D slices that represent the same interproximal margins (step 184). For each interproximal margin, the computer fits a 3D surface 168 approximating the geometry of the interproximal margin among the associated voxel pairs (step 186).
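In the discrete setting, steps 176 and 178 reduce to finding local maxima of a per-vertex curvature estimate along the closed cross-sectional contour. The sketch below uses the turning angle at each vertex as a simple curvature proxy rather than the patent's derivative of the radius of curvature, and assumes the contour is an ordered list of 2D points; both are illustrative choices.

```python
import numpy as np

def cusp_indices(contour):
    """Return indices of strict local maxima in the turning angle along
    a closed 2D contour; these approximate the cusp voxels of step 178."""
    pts = np.asarray(contour, dtype=float)
    n = len(pts)
    prev = pts[np.arange(n) - 1]
    nxt = pts[(np.arange(n) + 1) % n]
    v1 = pts - prev                      # incoming edge at each vertex
    v2 = nxt - pts                       # outgoing edge at each vertex
    ang1 = np.arctan2(v1[:, 1], v1[:, 0])
    ang2 = np.arctan2(v2[:, 1], v2[:, 0])
    # absolute exterior turning angle, wrapped into [0, pi]
    turn = np.abs((ang2 - ang1 + np.pi) % (2 * np.pi) - np.pi)
    return [i for i in range(n)
            if turn[i] > turn[i - 1] and turn[i] > turn[(i + 1) % n]]
```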
FIG. 9 illustrates one technique for creating the 3D surfaces that approximate the interproximal margins. For each pair of cusp voxels 190a-b in a 2D slice that are associated with a particular interproximal region, the computer creates a line segment 192 bounded by these cusp voxels 190a-b. The computer changes the colors of the voxels in the line segment, including the cusp voxels 190a-b that bound the segment, to contrast with the other voxels in the 2D slice. The computer creates line segments in this manner in each successive 2D slice, forming 3D surfaces that represent the interproximal regions. All of the voxels that lie between adjacent ones of these 3D surfaces represent an individual tooth.
FIGS. 10A through 10C illustrate a refinement of the technique shown in FIG. 9. The refined technique involves the projection of a line segment 200 from one slice onto a line segment 206 on the next successive slice to form, for the associated interproximal margin, a 2D area bounded by the cusp voxels 202a-b, 204a-b of the line segments 200, 206. If the line segments 200, 206 are oriented such that any voxel on one segment 200 is not adjacent to a voxel on the other segment 206, as shown in FIG. 10A, then the resulting 3D surface is discontinuous, leaving unwanted "islands" of white voxels 208, 210.
The computer eliminates these discontinuities by creating two new line segments 212, 214, each of which is bounded by one cusp voxel 202a-b, 204a-b from each original line segment 200, 206, as shown in FIG. 10B. The computer then eliminates the islands between the new line segments 212, 214 by changing the colors of all voxels between the new line segments 212, 214, as shown in FIG. 10C.
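The island-removal step of FIGS. 10B and 10C can be viewed as filling the bilinear patch spanned by the two new line segments 212, 214. The helper below returns the integer voxels covering that patch; the sampling density and the set-of-tuples output are assumptions for illustration, not the patent's rasterization.

```python
import numpy as np

def fill_between_segments(p0, p1, q0, q1, steps=32):
    """Cover the bilinear patch spanned by segments p0-p1 and q0-q1
    (the two new segments) with integer voxels, closing any 'islands'
    left between successive slices."""
    p0, p1, q0, q1 = (np.asarray(v, dtype=float) for v in (p0, p1, q0, q1))
    filled = set()
    for s in np.linspace(0.0, 1.0, steps):
        a = p0 + s * (p1 - p0)      # point along the first new segment
        b = q0 + s * (q1 - q0)      # corresponding point along the second
        for t in np.linspace(0.0, 1.0, steps):
            v = a + t * (b - a)     # sweep the connecting line
            filled.add(tuple(np.round(v).astype(int)))
    return filled
```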
Automated segmentation is enhanced through a technique known as "seed cusp detection." The term "seed cusp" refers to a location at which an interproximal margin between adjacent teeth meets the patient's gum tissue. In a volumetric representation of the patient's dentition, a seed cusp for a particular interproximal margin is found at the cusp voxel that lies closest to the gum line. By applying the seed cusp detection technique to the 2D slice analysis, the computer is able to identify all of the seed cusp voxels in the model automatically.
FIG. 11 shows a particular implementation of the seed cusp detection technique, in which the computer detects the seed cusps by identifying each slice in which the rate of curvature of a cusp first falls below a predetermined threshold value. The computer begins by selecting a 2D slice that intersects all of the teeth in the arch (step 220). The computer attempts to select a slice that is near the gingival regions but that does not include any voxels representing gingival tissue. The computer then identifies all of the cusp voxels in the slice (step 222). If the rate of curvature of the 2D cross section at any of the cusp voxels is less than a predetermined threshold value, the computer labels that voxel as a seed cusp (step 224). The computer then selects the next 2D slice, which is one voxel layer closer to the gingival region (step 226), and identifies all of the cusp voxels that are not associated with a cusp for which the computer has already identified a seed cusp (step 228). If the rate of curvature of the 2D cross section is less than the predetermined threshold value at any of these cusp voxels, the computer labels the voxel as a seed cusp (step 230) and proceeds to the next 2D slice. The computer continues in this manner until a seed cusp voxel has been identified for each cusp associated with an interproximal margin (step 232).
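The scan of FIG. 11 can be sketched as a single pass over the ordered slices, recording the first slice at which each cusp's curvature drops below the threshold. The `slices` layout below (an ordered list of per-cusp `(voxel, curvature)` mappings) is an assumed data structure for illustration.

```python
def detect_seed_cusps(slices, curvature_threshold):
    """Scan 2D slices toward the gingival region; the first slice in
    which a cusp's rate of curvature falls below the threshold yields
    that cusp's seed cusp voxel (steps 220-232)."""
    seeds = {}
    for layer in slices:                        # step 226: one voxel layer at a time
        for cusp_id, (voxel, curv) in layer.items():
            if cusp_id in seeds:                # step 228: skip resolved cusps
                continue
            if curv < curvature_threshold:      # steps 224/230
                seeds[cusp_id] = voxel
    return seeds
```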
FIGS. 32, 33, and 34 illustrate a human-assisted technique, known as "neighborhood-filtered seed cusp detection," for detecting seed cusps in the digital dentition model. This technique allows a human operator to scroll through 2D image slices on a video display and identify the locations of the seed cusps for each of the interproximal margins. The computer displays the 2D slices (step 750), and the operator searches the 2D slices to determine, for each adjacent pair of teeth, which slice 550 most likely contains the seed cusps for the corresponding interproximal margin. Using an input device such as a mouse or an electronic pen, the user marks the points 552, 554 in the slice that appear to represent the seed cusps (step 752). With this human guidance, the computer automatically identifies two voxels in the slice as the seed cusps.
The points 552, 554 identified by the human operator may or may not be the actual seed cusps 560, 562, but these points 552, 554 lie very close to the actual seed cusps 560, 562. As a result, the computer confines its search for the actual seed cusps 560, 562 to the voxel neighborhoods 556, 558 immediately surrounding the points 552, 554 selected by the human operator. The computer defines each of the neighborhoods 556, 558 to contain a particular number of voxels, e.g., twenty-five arranged in a 5 x 5 square, as shown here (step 754).
The computer then tests the image values for all of the voxels in the neighborhoods 556, 558 to identify those associated with the background image and those associated with the dentition (step 756). In this example, voxels in the background are black and voxels in the dentition are white. The computer identifies the actual seed cusps 560, 562 by locating the pair of black voxels, one from each of the neighborhoods 556, 558, that lie closest together (step 758). In the depicted example, each of the actual seed cusps 560, 562 lies next to one of the points 552, 554 selected by the human operator.
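Step 758 is a closest-pair search restricted to the two operator-selected neighborhoods. A minimal sketch, assuming voxels are coordinate tuples and `is_black` tests for background ("black") voxels as in the patent's example; the squared-distance metric and function names are assumptions.

```python
def closest_pair_across(neigh_a, neigh_b, is_black):
    """Find the pair of background ('black') voxels, one from each
    neighborhood, that lie closest together (step 758)."""
    best, best_d2 = None, float("inf")
    for a in neigh_a:
        if not is_black(a):
            continue
        for b in neigh_b:
            if not is_black(b):
                continue
            d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
            if d2 < best_d2:
                best, best_d2 = (a, b), d2
    return best
```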
FIGS. 12, 13, and 14 illustrate a technique, known as "neighborhood-filtered cusp detection," by which the computer focuses its search for cusps on one 2D slice to neighborhoods 244, 246 of voxels defined by a pair of previously detected cusp voxels 240, 242 on another 2D slice. This technique is similar to the neighborhood-filtered seed cusp detection technique described above.
Upon detecting a pair of cusp voxels 240, 242 in a 2D slice at level N (step 250), the computer defines one or more neighborhoods 244, 246 that include a predetermined number of voxels surrounding the pair (step 252). The computer then projects the neighborhoods onto the next 2D slice at level N+1 by identifying the voxels on the next slice that are immediately adjacent the voxels in the neighborhoods on the original slice (step 254). The neighborhoods are made large enough to ensure that they include the cusp voxels on the N+1 slice. In the example of FIG. 13, each cusp voxel 240, 242 lies at the center of a neighborhood 244, 246 of twenty-five voxels arranged in a 5 x 5 square.
In searching for the cusp voxels on the N+1 slice, the computer tests the image values for all voxels in the projected neighborhoods to identify those associated with the background image and those associated with the dentition (step 256). In the illustrated example, voxels in the background are black and voxels in the dentition are white. The computer identifies the cusp voxels on the N+1 slice by locating the pair of black voxels in the two neighborhoods that lie closest together (step 258). The computer then repeats this process for all remaining slices (step 259).
FIGS. 15 and 16 illustrate another technique, known as "arch curve fitting," for identifying interproximal margins between teeth in the dentition.
The arch curve fitting technique, which also applies to 2D cross-sectional slices of the dentition, involves the creation of a curve 260 that fits among the voxels on the 2D cross-sectional surface 262 of the dentition arch 264. A series of closely spaced line segments 268, each bounded by the cross-sectional surface 262, are formed along the curve 260, roughly perpendicular to the curve 260, throughout the 2D cross section 264. In general, the shortest of these line segments 268 lie on or near the interproximal margins; thus the computer identifies the cusps that define the interproximal margins by determining the relative lengths of the line segments 268.
When applying the arch curve fitting technique, the computer begins by selecting a 2D slice (step 270) and identifying the voxels associated with the surface 262 of the cross-sectional arch 264 (step 272). The computer then defines a curve 260 that fits among the voxels on the surface 262 of the arch (step 274). The computer creates the curve using any of a variety of techniques, a few of which are discussed below. The computer then creates a series of line segments that are roughly perpendicular to the curve and are bounded by the cross-sectional surface 262 (step 276). The line segments are approximately evenly spaced with a spacing distance that depends upon the required resolution and the acceptable computing time. Greater resolution leads to more line segments and thus greater computing time. In general, a spacing on the order of 0.4 mm is sufficient in the initial pass of the arch curve fitting technique.
The computer calculates the length of each line segment (step 278) and then identifies those line segments that form local minima in length (step 280). These line segments roughly approximate the locations of the interproximal boundaries, and the computer labels the voxels that bound these segments as cusp voxels (step 282). The computer repeats this process for each of the 2D slices (step 284) and then uses the cusp voxels to define 3D cutting surfaces that approximate the interproximal margins.
In some implementations, the computer refines the arch cusp determination by creating several additional sets of line segments, each centered around the arch cusps identified on the first pass. The line segments are spaced more narrowly on this pass to provide greater resolution in identifying the actual positions of the arch cusps.
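Steps 276 through 280 can be sketched as measuring, at sample points along the arch curve, the chord perpendicular to the curve and bounded by the cross section, then taking local minima of the chord lengths. The `inside` membership test and the fixed marching step stand in for the voxel data and are assumptions for illustration.

```python
import numpy as np

def perpendicular_lengths(curve, inside, step=0.1, max_len=20.0):
    """For each point on the arch curve, march outward along the local
    normal in both directions while `inside` holds, returning the
    perpendicular chord length (steps 276-278)."""
    curve = np.asarray(curve, dtype=float)
    lengths = []
    for i, p in enumerate(curve):
        j = min(i + 1, len(curve) - 1)
        k = max(i - 1, 0)
        tangent = curve[j] - curve[k]
        normal = np.array([-tangent[1], tangent[0]])
        normal /= np.linalg.norm(normal)
        length = 0.0
        for sign in (+1.0, -1.0):        # march out both ways from the curve
            t = 0.0
            while t < max_len and inside(p + sign * (t + step) * normal):
                t += step
            length += t
        lengths.append(length)
    return lengths

def local_minima(values):
    """Indices of strict local minima in segment length (step 280):
    these roughly mark the interproximal margins."""
    return [i for i in range(1, len(values) - 1)
            if values[i] < values[i - 1] and values[i] < values[i + 1]]
```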
The computer uses any of a variety of curve fitting techniques to create the curve through the arch. One technique involves the creation of a catenary curve with endpoints lying at the two ends 265, 267 (FIG. 15) of the arch. The catenary curve is defined by the equation y = a + b cosh(cx), and the computer fits the curve to the arch by selecting appropriate values for the constants a, b, and c. Another technique involves the creation of two curves, one fitted among voxels lying on the front surface 271 of the arch, and the other fitted among voxels on the rear surface 273. A third curve, which guides the placement of the line segments above, passes through the middle of the area lying between the first two curves.
WO 00/19935 PCT/US99/23532
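The catenary technique fits y = a + b cosh(cx) to the arch voxels. Because a and b enter linearly, one simple fitting strategy is a grid search over c with linear least squares for a and b; the patent does not prescribe a fitting method, so the grid bounds below are assumptions.

```python
import numpy as np

def fit_catenary(x, y, c_grid=np.linspace(0.1, 3.0, 300)):
    """Fit y = a + b*cosh(c*x): for each candidate c, solve for a and b
    by linear least squares and keep the c with the smallest residual."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = None
    for c in c_grid:
        A = np.column_stack([np.ones_like(x), np.cosh(c * x)])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        err = np.sum((A @ coeffs - y) ** 2)
        if best is None or err < best[0]:
            best = (err, coeffs[0], coeffs[1], c)
    return best[1:]   # (a, b, c)
```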
FIGS. 17 and 18 illustrate another technique for constructing a curve through the arch. This technique involves the creation of a series of initial line segments through the arch 264 and the subsequent formation of a curve 290 fitted among the midpoints of these line segments. This curve 290 serves as the arch curve in the arch curve fitting technique described above.
In applying this technique, the computer first locates an end 265 of the arch (step 300) and creates a line segment 291 that passes through the arch near this end 265 (step 301). The line segment 291 is bounded by voxels 292a-b lying on the surface of the arch. The computer then determines the midpoint 293 of the line segment 291 (step 302), selects a voxel 294 located a particular distance from the midpoint 293 (step 304), and creates a second line segment 295 that is parallel to the initial line segment 291 and that includes the selected voxel 294 (step 306). The computer then calculates the midpoint of the second segment 295 (step 308) and rotates the second segment 295 to the orientation 295' that gives the segment its minimum possible length (step 309). In some cases, the computer limits the second segment 295 to a predetermined amount of rotation (e.g., ±10°).
The computer then selects a voxel 297 located a particular distance from the midpoint 296 of the second segment 295 (step 310) and creates a third line segment 298 that is parallel to the second line segment 295 and that includes the selected voxel 297 (step 312). The computer calculates the midpoint 299 of the third segment 298 (step 314) and rotates the segment 298 to the orientation 298' that gives the segment its shortest possible length (step 316).
The computer continues adding line segments in this manner until the other end of the cross-sectional arch is reached (step 318). The computer then creates a curve that fits among the midpoints of the line segments (step 320) and uses this curve in applying the arch fitting technique described above.
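FIG. 18's marching construction can be sketched as follows, assuming a helper `chord(point, angle)` that returns the segment through `point` at `angle` bounded by the arch surface (the patent works directly on voxel data, so this helper is an assumption); the ±10° rotation limit of steps 309 and 316 appears as a small angle grid.

```python
import numpy as np

def march_arch_segments(start, direction, chord, n_steps, advance=1.0,
                        angles=np.radians(np.arange(-10, 11, 2))):
    """From one end of the arch, repeatedly: drop a segment through the
    current point, rotate it (within the angle grid) to its minimum
    length, record its midpoint, and advance perpendicular to it."""
    midpoints = []
    theta = direction
    p = np.asarray(start, dtype=float)
    for _ in range(n_steps):
        # rotate to the orientation minimizing segment length (steps 309/316)
        best = min((chord(p, theta + da) for da in angles),
                   key=lambda seg: np.linalg.norm(seg[1] - seg[0]))
        mid = (best[0] + best[1]) / 2.0
        midpoints.append(mid)
        # advance a fixed distance perpendicular to the segment (steps 304/310)
        seg_dir = best[1] - best[0]
        seg_dir /= np.linalg.norm(seg_dir)
        normal = np.array([-seg_dir[1], seg_dir[0]])
        p = mid + advance * normal
        theta = np.arctan2(seg_dir[1], seg_dir[0])
    return midpoints
```

A curve fitted among the returned midpoints then serves as the arch curve of the preceding technique.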
FIGS. 19A, 19B and 20 illustrate an alternative technique for creating 3D surfaces that approximate the geometries and locations of the interproximal margins in the patient's dentition. This technique involves the creation of 2D planes that intersect the 3D dentition model at locations that approximate the interproximal margins. In general, the computer defines a series of planes, beginning with an initial plane 330 at one end 331 of the arch 332, that are roughly perpendicular to the occlusal plane of the dentition model ("vertical" planes). Each plane intersects the dentition model to form a 2D cross section 334. If the planes are spaced sufficiently close to each other, the planes with the smallest cross-sectional areas approximate the locations of the interproximal margins in the dentition. The computer locates the interproximal regions more precisely by rotating each plane about two orthogonal axes 336, 338 until the plane reaches the orientation that yields the smallest possible cross-sectional area.
In one implementation of this technique, the computer first identifies one end of the arch in the dentition model (step 340). The computer then creates a vertical plane 330 through the arch near this end (step 342) and identifies the center point 331 of the plane 330 (step 344). The computer then selects a voxel located a predetermined distance from the center point (step 345) and creates a second plane 333 that is parallel to the initial plane and that includes the selected voxel (step 346). The computer calculates the midpoint of the second plane (step 348) and rotates the second plane about two orthogonal axes that intersect at the midpoint (step 350). The computer stops rotating the plane upon finding the orientation that yields the minimum cross-sectional area. In some cases, the computer limits the plane to a predetermined amount of rotation (e.g., ±10° about each axis). The computer then selects a voxel located a particular distance from the midpoint of the second plane (step 352) and creates a third plane that is parallel to the second plane and that includes the selected voxel (step 354). The computer calculates the midpoint of the third plane (step 356) and rotates the plane to the orientation that yields the smallest possible cross-sectional area (step 357). The computer continues adding and rotating planes in this manner until the other end of the arch is reached (step 358). The computer identifies the planes at which local minima in cross-sectional area occur and labels these planes as "interproximal planes,"
which approximate the locations of the interproximal margins (step 360).
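The rotation of steps 348 through 357 amounts to a bounded two-axis search for the plane orientation minimizing cross-sectional area. A coarse grid-search sketch, with `area(normal)` standing in for counting the dentition voxels the plane intersects; the callback and grid resolution are assumptions.

```python
import numpy as np

def min_area_orientation(area, n0, max_tilt=np.radians(10), steps=5):
    """Tilt a plane's normal about two orthogonal in-plane axes (within
    +/- max_tilt each) and keep the orientation whose dentition cross
    section has the smallest area."""
    n0 = np.asarray(n0, dtype=float)
    n0 /= np.linalg.norm(n0)
    # build two axes orthogonal to the initial normal
    helper = np.array([1.0, 0, 0]) if abs(n0[0]) < 0.9 else np.array([0, 1.0, 0])
    u = np.cross(n0, helper)
    u /= np.linalg.norm(u)
    v = np.cross(n0, u)
    best_n, best_a = n0, area(n0)
    for a1 in np.linspace(-max_tilt, max_tilt, steps):
        for a2 in np.linspace(-max_tilt, max_tilt, steps):
            n = n0 + np.tan(a1) * u + np.tan(a2) * v
            n /= np.linalg.norm(n)
            candidate = area(n)
            if candidate < best_a:
                best_n, best_a = n, candidate
    return best_n, best_a
```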
One variation of this technique, described in FIG. 21, allows the computer to refine its identification of interproximal planes by creating additional, more closely positioned planes in areas around the planes labeled as interproximal. The computer first creates a curve that fits among the midpoints of the planes labeled as interproximal planes (step 372) and then creates a set of additional planes along this curve (step 374). The additional planes are not evenly spaced along the curve, but rather are concentrated around the interproximal margins. The planes in each interproximal area are spaced very closely (e.g., 0.05 mm from each other). The computer rotates each of the newly constructed planes about two orthogonal axes until the plane reaches its minimum cross-sectional area (step 376). The computer then selects the plane in each cluster with the smallest cross-sectional area as the plane that most closely approximates the interproximal margin (step 378).
FIGS. 22, 23, and 24 illustrate a technique for identifying the gingival margin that defines the boundary between tooth and gum in the patient's dentition. This technique involves the creation of a series of vertical 2D planes 380, or slices, that intersect the dentition model roughly perpendicular to the occlusal plane (see FIG. 19A). The cross-sectional surface 382 of the dentition model in each of these planes 380 includes cusps 384, 386 that represent the gingival margin. The computer identifies the gingival margin by applying one or more of the cusp detection techniques described above.
One technique is very similar to the neighborhood-filtered cusp detection technique described above, in that voxel neighborhoods 388, 390 are defined on one of the 2D planes to focus the computer's search for cusps on an adjacent 2D plane. Upon detecting a pair of cusps 384, 386 on one 2D plane (step 400), the computer defines one or more neighborhoods 388, 390 to include a predetermined number of voxels surrounding the pair (step 402). The computer projects the neighborhoods onto an adjacent 2D plane by identifying the voxels on the adjacent plane that correspond to the voxels in the neighborhoods 388, 390 on the original plane (step 404). The computer then identifies the pair of black voxels that lie closest together in the two neighborhoods on the adjacent plane, labeling these voxels as lying in the cusp (step 406). The computer repeats this process for all remaining planes (step 408).
Many of these automated segmentation techniques are even more useful and efficient when used in conjunction with human-assisted techniques. For example, techniques that rely on the identification of the interproximal or gingival margins function more quickly and effectively when a human user first highlights the interproximal or gingival cusps in an image of the dentition model. One technique for receiving this type of information from the user is by displaying a 2D or 3D representation and allowing the user to highlight individual voxels in the display. Another technique allows the user to scroll through a series of 2D cross-sectional slices, identifying those voxels that represent key features such as interproximal or gingival cusps, as in the neighborhood-filtered seed cusp detection technique described above (FIGS. 32, 33, and 34). Some of these techniques rely on user interface tools such as cursors and bounding-box markers.
FIGS. 35A-35F illustrate another technique for separating teeth from gingival tissue in the dentition model. This technique is a human-assisted technique in which the computer displays an image of the dentition model (step 760) and allows a human operator to identify, for each tooth, the gingival margin, or gum line 600, encircling the tooth crown 602 (step 762). Some applications of this technique involve displaying a 3D volumetric image of the dentition model and allowing the user to select, with an input device such as a mouse, the voxels that define the gingival line 600 around each tooth crown 602. The computer then uses the identified gingival line to model the tooth roots and to create a cutting surface that separates the tooth, including the root model, from the gingival tissue 604.
Once the human operator has identified the gingival line 600, the computer selects a point 606 that lies at or near the center of the tooth crown 602 (step 764). One way of choosing this point is by selecting a 2D image slice that is parallel to the dentition's occlusal plane and that intersects the tooth crown 602, and then averaging the x- and y-coordinate values of all voxels in this 2D slice that lie on the surface 608 of the tooth crown 602. After selecting the center point 606, the computer defines several points 605 on the gingival line 600 (step 766) and fits a plane 610 among these points 605 (step 768). The computer then creates a line segment 612 that is normal to the plane 610 and that extends a predetermined distance from the selected center point 606 (step 770). The expected size of a typical tooth or the actual size of the patient's tooth determines the length of the line segment 612. A length on the order of two centimeters is sufficient to model most tooth roots. The computer defines a sphere 614, or a partial sphere, centered at the selected center point 606 (step 772). The radius of the sphere 614 is determined by the length of the line segment 612.
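Step 768 fits a plane among the sampled gingival-line points 605. The patent does not specify the fitting method; a standard least-squares plane via SVD is one reasonable option, sketched below.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3D points: returns
    (centroid, unit normal). The normal is the direction of least
    variance of the centered points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]   # last right-singular vector = plane normal
```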
The computer then shifts the plane 610 along the line segment 612 so that the plane 610 is tangential to the sphere 614 (step 774). In some applications, the computer allows the human operator to slide the plane 610 along the surface of the sphere 614 to adjust the orientation of the plane 610 (step 776). This is useful, for example, when the tooth crown 602 is tilted, which suggests that the tooth roots also are tilted. The computer then creates a projection 616 of the gingival line 600 on the shifted plane 610 (step 778). The tooth roots are modeled by creating a surface 618 that connects the gingival line 600 to the projection 616 (step 780). The computer uses this surface as a cutting surface to separate the tooth from the gingival tissue. The cutting surface extends in a direction that is roughly perpendicular to the occlusal surface of the tooth crown 602.
In general, the surface 618 that connects the gingival line 600 to the projection is formed by straight line segments that extend between the gingival line and the projection. However, some implementations allow curvature along these line segments. In some applications, the computer scales the projection 616 to be larger or smaller than the gingival line 600, which gives the surface 618 a tapered shape (step 782). Many of these applications allow the computer, with or without human assistance, to change the profile of the tapered surface so that the rate of tapering changes along the length of the surface 618 (step 784). For example, some surfaces taper more rapidly as distance from the tooth crown increases.
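Steps 778 through 782 project the gingival line onto the shifted plane, optionally scale the projection about its centroid, and connect the two loops with straight segments. A sketch returning a grid of surface points; the taper factor, row count, and array layout are assumptions.

```python
import numpy as np

def tapered_cutting_surface(gingival_line, plane_point, plane_normal,
                            scale=0.8, rows=10):
    """Build the root-modeling surface: orthogonally project the
    gingival line onto the plane, scale the projection (taper), and
    rule straight segments between the loops. Returns an array of
    shape (rows, n_points, 3)."""
    g = np.asarray(gingival_line, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    # orthogonal projection of each gingival point onto the plane (step 778)
    proj = g - np.outer((g - plane_point) @ n, n)
    # scale the projection about its centroid to taper the root (step 782)
    proj = proj.mean(axis=0) + scale * (proj - proj.mean(axis=0))
    # straight ruled segments from gingival line to scaled projection (step 780)
    ts = np.linspace(0.0, 1.0, rows)[:, None, None]
    return (1 - ts) * g[None] + ts * proj[None]
```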
FIGS. 37A-C and 38 illustrate another human-assisted technique for separating teeth from gingival tissue in the dentition model. This technique involves displaying an image of the dentition model to a human operator (step 790) and allowing the operator to trace the gingival lines 620, 622 on the buccal and lingual sides of the dental arch (step 792). This produces two curves 624, 626 representing the gingival lines 620, 622 on the buccal and lingual surfaces. The computer uses these curves 624, 626 to create a 3D cutting surface 628 that separates the tooth crowns 630, 632 from the gingival tissue 634 in the dentition model (step 794). The cutting surface 628 is roughly parallel to the occlusal surface of the tooth crowns 630, 632.
FIGS. 37C and 39 illustrate one technique for defining the cutting surface 628. In general, the computer creates the cutting surface 628 by defining points 636, 638 along each of the 3D curves 624, 626 and defining the cutting surface 628 to fit among the points 636, 638. The computer first defines the points 636, 638 on the 3D curves 624, 626 (step 800) and then defines a point 640 at or near the center of each tooth crown 630 (step 802). One way of defining the center point 640 is by averaging the x-, y-, and z-coordinate values for all of the points 636, 638 lying on the portions of the gingival curves 624, 626 associated with that tooth. The computer then creates a triangular surface mesh 642 using the center point 640 and the points 636, 638 on the gingival curves as vertices (step 804). The computer uses this surface mesh 642 to cut the tooth crowns away from the gingival tissue (step 806). In some implementations, a tooth root model is created for each crown, e.g., by projecting the gingival curves onto a distant plane, as described above (step 808). The computer connects the roots to the crowns to complete the individual tooth models (step 810).
All of the segmentation techniques described above are useful in creating digital models of individual teeth, as well as a model of gingival tissue surrounding the teeth. In some cases, the computer identifies and segments the teeth using one of these techniques to form the individual tooth models, and then uses all remaining data to create the gingival model.
Other Implementations
In many instances, the computer creates proposals for segmenting the dentition model and then allows the user to select the best alternative. For example, one version of the arch curve fitting technique described above requires the computer to create a candidate catenary or spline curve, which the user is allowed to modify by manipulating the mathematical control parameters. Other techniques involve displaying several surfaces that are candidate cutting surfaces and allowing the user to select the appropriate surfaces.
Some implementations of the invention are realized in digital electronic circuitry, such as an application specific integrated circuit (ASIC); others are realized in computer hardware, firmware, and software, or in combinations of digital circuitry and computer components. The invention is usually embodied, at least in part, as a computer program tangibly stored in a machine-readable storage device for execution by a computer processor. In these situations, methods embodying the invention are performed when the processor executes instructions organized into program modules, operating on input data and generating output. Suitable processors include general and special purpose microprocessors, which generally receive instructions and data from read-only memory and/or random access memory devices. Storage devices that are suitable for tangibly embodying computer program instructions include all forms of nonvolatile memory, including semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM.
The invention has been described in terms of particular embodiments.
Other embodiments are within the scope of the following claims.
Claims (274)
1. A computer-implemented method for use in creating a digital model of an individual component of a patient's dentition, the method comprising:
(a) receiving a data set that forms a three-dimensional (3D) representation of the patient's dentition;
(b) applying a computer-implemented test to the data set to identify data elements that represent portions of an individual component of the patient's dentition; and (c) creating a digital model of the individual component based upon the identified data elements.
2. The method of claim 1, wherein the data set includes data taken from at least one of the following sources: two-dimensional (2D) x-ray data and three-dimensional (3D) x-ray data.
3. The method of claim 1, wherein the data set includes data taken from at least one of the following sources: computed tomography (CT) scan data and magnetic resonance imaging (MRI) scan data.
4. The method of claim 1, wherein the data set includes data taken from a photographic image of the patient's dentition.
5. The method of claim 1, wherein some of the data is obtained by imaging a physical model of the patient's teeth.
6. The method of claim 1, wherein some of the data is obtained by imaging the patient's teeth directly.
7. The method of claim 1, wherein the data set forms a 3D
volumetric representation of the patient's dentition.
8. The method of claim 1, wherein the data set includes geometric surface data that forms a 3D geometric surface model of the patient's dentition.
9. The method of claim 1, wherein the individual component is an individual tooth in the patient's dentition.
10. The method of claim 1, wherein the individual component includes gum tissue found in the patient's dentition.
11. The method of claim 1, wherein applying the computer-implemented test includes receiving information input by a human user to identify a boundary of the individual component to be modeled.
12. The method of claim 11, wherein receiving information includes receiving position data from a computer-implemented tool through which the human user identifies the boundary in a graphical representation of the patient's dentition.
13. The method of claim 12, wherein the computer-implemented tool is a saw tool that allows the user to identify the boundary by defining a curve in the graphical representation that separates the data elements associated with the individual component from other elements of the data set.
14. The method of claim 12, wherein the computer-implemented tool is an eraser tool that allows the user to identify the boundary by erasing a portion of the graphical representation representing the boundary.
15. The method of claim 1, wherein receiving the data, applying the computer-implemented test, and creating the digital model are all carried out by a computer without human intervention.
16. The method of claim 1, wherein applying the computer-implemented test includes automatically applying a rule to identify a boundary of the individual component to be modeled.
17. The method of claim 16, wherein the boundary includes a surface of a tooth.
18. The method of claim 16, wherein the boundary includes a gingival margin.
19. The method of claim 1, wherein applying the computer-implemented test includes identifying elements of the data set that represent a structural core of the individual component to be modeled and labeling those data elements as belonging to the individual component.
20. The method of claim 19, wherein the individual component to be modeled includes an individual tooth and the structural core approximately coincides with neurological roots of the tooth.
21. The method of claim 19, wherein applying the computer-implemented test includes applying a test to link other data elements to those representing the structural core and labeling the linked data elements as belonging to the individual component.
22. The method of claim 21, wherein applying the test to link other data elements to those representing the structural core includes assigning a distance measure to each element of the data set, where the distance measure indicates a measured distance between a reference point in the dentition and the portion of the dentition represented by the data element to which the distance measure is assigned.
23. The method of claim 22, wherein applying the test to link other data elements includes linking a data element to the structural core if the assigned distance measure is less than the distance measure assigned to a data element representing a portion of the structural core.
24. The method of claim 22, wherein the reference point lies on a tooth surface.
25. The method of claim 21, wherein applying the test to link other data elements to the structural core includes applying a test to determine whether a data element lies outside of the dentition and, if so, labeling the data element as a background element.
26. The method of claim 25, wherein applying the test to determine whether the data element lies outside of the dentition includes comparing an image value associated with the data element to a threshold value.
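Claims 19-26 describe labeling voxels by comparing a per-element distance measure against that of the structural core, with an image-value threshold separating background. The following is a minimal sketch of that kind of test; the function name, threshold value, and flat element lists are hypothetical illustrations, not the patent's own implementation.

```python
# Sketch of the labeling test of claims 19-26: elements are linked to a
# tooth's structural core by comparing distance measures (claim 23), and
# elements whose image value falls below a threshold are labeled as
# background (claims 25-26). All names and values here are illustrative.

BACKGROUND_THRESHOLD = 10  # assumed image-intensity cutoff for "outside the dentition"

def label_elements(values, distances, core_index):
    """Label each data element as 'component', 'background', or 'unassigned'.

    values     -- image value per element (low values = empty space)
    distances  -- distance measure per element, relative to a reference point
    core_index -- index of an element known to lie on the structural core
    """
    core_distance = distances[core_index]
    labels = []
    for value, dist in zip(values, distances):
        if value < BACKGROUND_THRESHOLD:
            labels.append("background")   # claims 25-26: lies outside the dentition
        elif dist <= core_distance:
            labels.append("component")    # claim 23: linked to the structural core
        else:
            labels.append("unassigned")
    return labels
```

A production segmentation would run this per-voxel over a 3D grid; the one-dimensional lists keep the comparison logic visible.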
27. The method of claim 19, further comprising applying another computer-implemented test to identify elements of the data set that represent a structural core of another individual component of the dentition and labeling those data elements as belonging to the other individual component.
28. The method of claim 27, wherein applying the computer-implemented tests includes applying tests to link other elements of the data set to those representing the structural cores of the individual components and labeling the linked elements as belonging to the individual components to which they are linked.
29. The method of claim 28, wherein applying the tests to link other data elements to the structural cores of the individual components includes determining whether a data element already is labeled as belonging to one of the individual components.
30. The method of claim 1, wherein applying the computer-implemented test includes identifying an initial 2D cross-section of the individual component having continuous latitudinal width, a relative minimum value of which occurs at an end of the initial cross-section.
31. The method of claim 30, wherein applying the computer-implemented test includes isolating portions of the data corresponding to the initial 2D cross-section of the individual component to be modeled.
32. The method of claim 31, wherein the received data includes 3D image data obtained by imaging the individual component volumetrically, and wherein isolating portions of the data corresponding to the initial 2D cross-section includes isolating elements of the 3D image data representing the initial 2D cross-section.
33. The method of claim 30, wherein applying the computer-implemented test includes applying a test to identify the end of the initial cross-section at which the relative minimum value of the latitudinal width occurs.
34. The method of claim 33, wherein applying the test to identify the end of the initial cross-section includes:
(a) establishing line segments within the initial cross-section, each of which is bounded at each end by an endpoint lying on a surface of the individual component, and each of which is roughly perpendicular to a latitudinal axis of the individual component;
(b) calculating a length for each line segment; and
(c) identifying elements of the data set that correspond to the endpoints of the line segment with the shortest length.
35. The method of claim 34, wherein applying the computer-implemented test also includes:
(a) isolating portions of the data set corresponding to other 2D cross-sections of the individual component, all lying in planes parallel to the initial 2D cross-section;
(b) for each of the other cross-sections, identifying data elements that correspond to endpoints of a line segment representing an end of the cross-section; and
(c) defining a solid surface that contains all of the identified data elements.
36. The method of claim 35, further comprising labeling the solid surface as representing a surface of the individual component to be modeled.
37. The method of claim 35, further comprising using the data elements identified in the initial cross-section as guides for identifying the data elements in the other cross-sections.
38. The method of claim 34, wherein applying the test to identify the end of the initial cross-section includes first creating an initial curve that is roughly perpendicular to the latitudinal axis of the individual component and that is fitted between the surfaces of the 2D cross-section on which the endpoints of the line segments will lie.
39. The method of claim 38, wherein establishing the line segments includes first establishing a set of initial line segments that are roughly perpendicular to the curve and to the latitudinal axis and that have endpoints lying on the surfaces of the individual component.
40. The method of claim 39, wherein establishing the line segments also includes pivoting each initial line segment about a point at which the initial line segment intersects the curve until the initial line segment has its shortest possible length.
41. The method of claim 40, wherein establishing the line segments also includes:
(a) locating a midpoint for each of the initial line segments after pivoting; and
(b) creating a refined curve that passes through all of the midpoints.
42. The method of claim 41, wherein establishing the line segments also includes creating the line segments to be perpendicular to the refined curve.
43. The method of claim 38, wherein the individual component is a tooth and the curve is a portion of a larger curve fitted among the lingual and buccal surfaces of all teeth in a 2D cross-section of a tooth arch in which the tooth lies.
44. The method of claim 43, wherein the larger curve is a catenary.
45. The method of claim 43, wherein the larger curve is created by manipulating mathematical control points to fit the curve to the shape of the cross-section of the tooth arch.
46. The method of claim 34, wherein establishing the line segments includes first establishing an initial line segment by creating a line that intersects the initial 2D cross-section, such that the initial line segment has endpoints that lie on surfaces of the individual component.
47. The method of claim 46, wherein establishing the line segments also includes establishing at least one additional line segment parallel to and spaced a predetermined distance from a previously established line segment.
48. The method of claim 47, wherein establishing the line segments also includes, for each additional line segment, locating a midpoint of the additional line segment and pivoting the additional line segment about the midpoint until the additional line segment has its shortest possible length.
49. The method of claim 48, wherein establishing the line segments also includes limiting the rotation of each additional line segment to no more than a predetermined amount.
50. The method of claim 49, wherein the rotation of each additional line segment is limited to no more than approximately +/- 10°.
51. The method of claim 48, wherein establishing the line segments also includes establishing a curve that is fitted among the midpoints of the additional line segments.
52. The method of claim 51, wherein establishing the line segments includes establishing the line segments to be perpendicular to the curve.
53. The method of claim 52, wherein establishing the line segments includes locating midpoints for each of the line segments and pivoting each line segment about its midpoint until the line segment has its shortest possible length.
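Claims 48-50 recite pivoting a line segment about its midpoint, with rotation limited to roughly +/- 10 degrees, until it reaches its shortest length between the two surfaces of a 2D cross-section. The sketch below illustrates that step under simplifying assumptions: the cross-section is modeled as a toy ellipse, the surfaces are probed by marching along a ray, and all function names are hypothetical.

```python
import math

# Illustrative sketch of claims 48-50: a line segment through a midpoint is
# pivoted, within +/- 10 degrees, to its shortest length between the two
# surfaces of a 2D cross-section. The ellipse and step-marching are stand-ins
# for the patent's voxel data, not its actual method.

def inside_ellipse(x, y, a=4.0, b=1.0):
    """Toy cross-section: an axis-aligned ellipse with semi-axes a and b."""
    return (x / a) ** 2 + (y / b) ** 2 <= 1.0

def chord_length(mx, my, angle, inside, step=1e-3):
    """Length of the chord through (mx, my) at `angle`, found by marching outward."""
    dx, dy = math.cos(angle), math.sin(angle)
    length = 0.0
    for sign in (+1, -1):          # march out from the midpoint in both directions
        t = 0.0
        while inside(mx + sign * t * dx, my + sign * t * dy):
            t += step
        length += t
    return length

def pivot_to_shortest(mx, my, start_angle, inside, limit_deg=10):
    """Claims 49-50: test angles within +/- limit_deg of start_angle, keep the shortest."""
    best = None
    for d in range(-limit_deg, limit_deg + 1):
        angle = start_angle + math.radians(d)
        cand = chord_length(mx, my, angle, inside)
        if best is None or cand < best[1]:
            best = (angle, cand)
    return best
```

For a segment pivoted at the ellipse's center starting near vertical, the shortest chord is the vertical one of length 2b, so the search settles on the starting orientation.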
54. The method of claim 30, wherein the individual component is a tooth and the relative minimum value of the initial 2D cross-section lies on an interproximal surface of the tooth.
55. The method of claim 54, wherein identifying the initial 2D cross-section includes isolating elements of the data set that correspond to 2D cross-sections of the tooth lying in parallel planes between the roots and the occlusal surface of the tooth.
56. The method of claim 55, wherein identifying the initial 2D cross-section also includes identifying adjacent ones of the 2D cross-sections in which the interproximal surface of the tooth is obscured by gum tissue in one of the adjacent cross-sections and is not obscured by gum tissue in the other adjacent cross-section.
57. The method of claim 56, wherein identifying the initial 2D cross-section also includes selecting as the initial 2D cross-section the adjacent cross-section in which the interproximal surface of the tooth is not obscured by gum tissue.
58. The method of claim 55, wherein identifying the initial 2D cross-section also includes, for each of the isolated cross-sections, establishing a contour line that outlines the shape of the dentition in that cross-section.
59. The method of claim 58, wherein identifying the initial 2D cross-section also includes applying a test to each of the isolated cross-sections to identify those cross-sections in which the interproximal surface of the tooth is not obscured by gum tissue.
60. The method of claim 59, wherein applying the test includes calculating the rate of curvature of the contour line.
61. The method of claim 59, wherein identifying the initial 2D cross-section includes selecting as the initial 2D cross-section the isolated cross-section that lies closest to the roots of the tooth and in which the interproximal surface of the tooth is not obscured by gum tissue.
62. The method of claim 30, wherein applying the computer-implemented test also includes identifying two elements of the data set that define endpoints of a line segment spanning the relative minimum width of the initial 2D cross-section.
63. The method of claim 62, wherein applying the computer-implemented test also includes defining, for each endpoint, a neighborhood containing a predetermined number of elements of the data set near the endpoint in the initial 2D cross-section.
64. The method of claim 63, wherein applying the computer-implemented test also includes identifying an additional 2D cross-section of the individual component in a plane parallel and adjacent to the initial 2D cross-section, where the additional 2D cross-section also has a continuous latitudinal width with a relative minimum value occurring at one end of the cross-section.
65. The method of claim 64, wherein applying the computer-implemented test also includes identifying two elements of the data set that define endpoints of a line segment spanning the relative minimum width of the additional 2D cross-section by:
(a) defining two neighborhoods of data elements, each containing elements of the data set that are adjacent to the data elements contained in the neighborhoods defined for the initial 2D cross-section; and
(b) identifying one data element in each neighborhood that corresponds to one of the endpoints of the line segment spanning the relative minimum width of the additional 2D cross-section.
66. The method of claim 65, further comprising establishing a solid surface that is fitted among line segments spanning the relative minimum widths of the parallel 2D cross-sections.
67. The method of claim 66, wherein the individual component to be modeled is a tooth and the solid surface represents an interproximal surface of the tooth.
68. The method of claim 30, further comprising receiving information provided by a human user that identifies elements of the data set that correspond to the relative minimum width of the initial 2D cross-section.
69. The method of claim 68, further comprising displaying a graphical representation of the patient's dentition in which the user identifies portions corresponding to the relative minimum width of the cross-section.
70. The method of claim 69, wherein the graphical representation is three dimensional.
71. The method of claim 69, wherein the graphical representation includes a 2D representation of the initial 2D cross-section.
72. The method of claim 71, further comprising receiving the information from an input device used by the human user to identify the relative minimum width of the initial 2D cross-section in the graphical representation.
73. The method of claim 71, wherein the initial 2D cross-section is one of many 2D cross-sections displayed to the human user.
74. The method of claim 71, further comprising receiving information from the human user identifying which of the displayed 2D cross-sections is the initial 2D cross-section.
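Claims 62-66 describe seeding the search in each successive parallel cross-section with a small neighborhood around the minimum-width endpoints found in the previous one, so the interproximal minimum is tracked slice by slice rather than re-searched globally. A minimal sketch of that confinement, with hypothetical width profiles standing in for the dentition data:

```python
# Sketch of the confined slice-to-slice search of claims 62-66: the minimum
# found in the initial cross-section seeds a small neighborhood in the next
# cross-section (claim 65). Width profiles and the neighborhood size are
# illustrative assumptions.

NEIGHBORHOOD = 2  # assumed number of positions searched to either side

def track_minimum(width_profiles):
    """Return, per cross-section, the index of the relative-minimum width.

    width_profiles -- list of per-slice lists; width_profiles[s][i] is the
    latitudinal width at position i of cross-section s.
    """
    first = width_profiles[0]
    idx = min(range(len(first)), key=first.__getitem__)    # global search, initial slice
    path = [idx]
    for profile in width_profiles[1:]:
        lo = max(0, idx - NEIGHBORHOOD)
        hi = min(len(profile), idx + NEIGHBORHOOD + 1)
        idx = min(range(lo, hi), key=profile.__getitem__)  # claim 65: confined search
        path.append(idx)
    return path
```

Confining each search keeps the tracked minimum on the same interproximal feature even when another, unrelated narrow spot appears elsewhere in a later slice.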
75. A computer-implemented method for use in creating a digital model of a tooth in a patient's dentition, the method comprising:
(a) receiving a three-dimensional (3D) data set representing the patient's dentition;
(b) applying a computer-implemented test to identify data elements that represent an interproximal margin between two teeth in the dentition;
(c) applying another computer-implemented test to select data elements that lie on one side of the interproximal margin for inclusion in the digital model.
76. The method of claim 75, further comprising creating a set of 2D planes that intersect the dentition roughly perpendicular to an occlusal plane of the dentition, each 2D plane including data elements that form a 2D cross-section of the dentition.
77. The method of claim 76, further comprising identifying the 2D plane with the smallest cross-sectional area.
78. The method of claim 77, further comprising rotating the 2D plane with the smallest cross-sectional area to at least one other orientation to form at least one other 2D cross-section of the dentition.
79. The method of claim 78, further comprising selecting the orientation that gives the rotated plane its smallest possible cross-sectional area.
80. The method of claim 79, further comprising identifying data elements that represent the selected orientation of the rotated plane as lying on an interproximal margin.
81. The method of claim 78, wherein the plane is rotated about two orthogonal lines passing through its center point.
82. The method of claim 77, further comprising creating a set of additional 2D planes in the vicinity of the 2D plane with the smallest cross-sectional area.
83. The method of claim 82, further comprising identifying the plane in the set of additional planes that has the smallest cross-sectional area.
84. The method of claim 83, further comprising rotating the plane with the smallest cross-sectional area to at least one other orientation to form at least one other 2D cross-section of the dentition.
85. The method of claim 84, further comprising selecting the orientation that produces the 2D cross-section with the smallest possible area.
86. The method of claim 76, wherein creating a set of 2D planes includes creating an initial plane near one end of the dentition.
87. The method of claim 86, further comprising selecting a point in the dentition that is a predetermined distance from the initial plane and creating a second plane.
88. The method of claim 87, wherein the second plane is roughly parallel to the initial plane.
89. The method of claim 87, further comprising rotating the second plane to at least one additional orientation to form at least one additional cross-section of the dentition.
90. The method of claim 89, further comprising selecting the orientation that produces the 2D cross-section with the smallest cross-sectional area.
91. The method of claim 89, further comprising selecting a point that is a predetermined distance from the second plane and creating a third plane that includes the selected point.
92. The method of claim 91, further comprising rotating the third plane to at least one other orientation to create at least one additional 2D cross-section of the dentition.
93. The method of claim 91, further comprising creating additional planes, each including a point that is a predetermined distance from a preceding plane, until the other end of the dentition is reached.
94. The method of claim 93, further comprising identifying at least one plane having a local minimum in cross-sectional area.
95. The method of claim 93, further comprising identifying a centerpoint of the cross-section in each of the planes and creating a curve that fits among the identified centerpoints.
96. The method of claim 95, further comprising creating a set of additional 2D planes along the curve, where the curve is roughly normal to each of the additional planes, and where each of the additional planes is roughly perpendicular to the occlusal plane.
97. The method of claim 96, further comprising identifying at least one of the additional planes that has a local minimum in cross-sectional area.
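Claims 76-97 sweep a series of cutting planes across the dentition, measure the cross-sectional area in each, and treat planes with a local minimum in area as candidate interproximal margins. The sketch below shows the core of that idea under strong simplifications: an axis-aligned voxel volume and planes at constant x, with all names hypothetical.

```python
# Minimal sketch of claims 76-77 and 94: parallel planes are swept across the
# dentition, cross-sectional area is measured in each, and planes with a local
# minimum in area mark candidate interproximal margins. The boolean voxel
# volume and axis-aligned planes are simplifying assumptions; the patent also
# rotates planes to refine each minimum (claims 78-85).

def cross_section_areas(volume):
    """volume[x][y][z] is truthy where dentition is present; planes are x = const."""
    return [sum(1 for row in plane for v in row if v) for plane in volume]

def local_minima(areas):
    """Indices of planes whose area is a strict local minimum (claims 94, 97)."""
    return [i for i in range(1, len(areas) - 1)
            if areas[i] < areas[i - 1] and areas[i] < areas[i + 1]]
```

On a volume shaped like two teeth joined by a narrow neck, the single narrow plane shows up as the one local minimum.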
98. A computer-implemented method for use in creating a digital model of a tooth in a patient's dentition, the method comprising:
(a) receiving a 3D data set representing at least a portion of the patient's dentition, including at least a portion of a tooth and gum tissue surrounding the tooth;
(b) applying a test to identify data elements lying on a gingival boundary that occurs where the tooth and the gum tissue meet; and
(c) applying a test to the data elements lying on the boundary to identify other data elements representing portions of the tooth.
99. The method of claim 98, wherein applying the test to identify data elements on the gingival boundary includes creating an initial 2D plane that intersects the dentition roughly perpendicular to an occlusal plane of the dentition and that includes data elements representing an initial cross-sectional surface of the dentition.
100. The method of claim 99, wherein applying the test includes locating a cusp in the initial cross-sectional surface.
101. The method of claim 100, wherein locating the cusp includes calculating the rate of curvature of the initial cross-sectional surface at selected points on the cross-sectional surface.
102. The method of claim 101, wherein locating the cusp includes identifying the point at which the rate of curvature is greatest.
103. The method of claim 100, wherein applying the test includes creating a second 2D plane that is roughly parallel to the initial 2D plane and that includes data elements representing a second cross-sectional surface of the dentition.
104. The method of claim 103, wherein applying the test includes locating a cusp in the second cross-sectional surface.
105. The method of claim 104, wherein locating the cusp in the second cross-sectional surface includes defining a neighborhood of data elements around the cusp in the initial cross-sectional surface and projecting the neighborhood onto the second cross-sectional surface.
106. The method of claim 105, wherein locating the cusp in the second cross-sectional surface includes searching for the cusp only within the neighborhood projected onto the second cross-sectional surface.
107. The method of claim 99, wherein applying the test includes locating two cusps in the initial cross-sectional surface.
108. The method of claim 107, wherein applying the test includes creating a second 2D plane that is roughly parallel to the initial 2D plane and that includes data elements representing a second cross-sectional surface of the dentition.
109. The method of claim 108, wherein applying the test includes locating two cusps in the second cross-sectional surface.
110. The method of claim 109, wherein locating the cusps in the second cross-sectional surface includes defining two neighborhoods of data elements around the two cusps in the initial cross-sectional surface and projecting the neighborhoods onto the second cross-sectional surface.
111. The method of claim 110, wherein each neighborhood projected onto the second cross-sectional surface includes data elements representing portions of the tooth and data elements representing the gum tissue surrounding the tooth.
112. The method of claim 111, wherein the data elements representing the tooth include voxels of one color and the data elements representing the gum tissue include voxels of another color.
113. The method of claim 111, wherein locating the cusps in the second cross-sectional surface includes locating the pair of data elements representing gum tissue that lie closest together, where each of the two neighborhoods projected onto the second cross-sectional surface includes one of the data elements in the pair.
114. The method of claim 98, wherein applying the test to identify data elements on the gingival boundary includes creating a series of roughly parallel 2D planes, each intersecting the dentition roughly perpendicular to an occlusal plane of the dentition, and each including data elements that represent a cross-sectional surface of the dentition.
115. The method of claim 114, wherein the cross-sectional surface in each 2D plane includes two cusps that roughly identify the locations of the gingival boundary.
116. The method of claim 115, wherein applying the test includes identifying the cusps in each cross-sectional surface.
117. The method of claim 116, wherein identifying the cusps includes locating the cusps in one of the planes and then confining the search for cusps in an adjacent plane to a predetermined area in the vicinity of the identified cusps.
118. The method of claim 115, further comprising allowing a human user to select data elements that roughly identify the locations of the cusps in a selected one of the cross-sectional areas.
119. The method of claim 118, further comprising searching for cusps in the selected cross-sectional area and confining the search to data elements lying within a predetermined area in the vicinity of the data elements selected by the human user.
120. The method of claim 118, further comprising searching for cusps in an adjacent cross-sectional area and confining the search to data elements lying within a predetermined area in the vicinity of the data elements selected by the human user.
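Claims 100-102 locate a cusp on a cross-sectional contour by estimating the rate of curvature at sample points and picking the point where it is greatest. A minimal sketch of one such estimate; the turning-angle proxy for curvature and the toy contour are illustrative assumptions, not the patent's own formulation.

```python
import math

# Sketch of claims 100-102: a cusp is found by estimating curvature at each
# contour point and taking the point where it is greatest (claim 102). The
# turning angle between successive contour edges serves as a simple discrete
# proxy for rate of curvature; a real pipeline would also confine the search
# to a projected neighborhood (claims 105-106).

def turning_angle(prev, pt, nxt):
    """Unsigned turning angle at pt between the incoming and outgoing edges."""
    a1 = math.atan2(pt[1] - prev[1], pt[0] - prev[0])
    a2 = math.atan2(nxt[1] - pt[1], nxt[0] - pt[0])
    d = abs(a2 - a1)
    return min(d, 2 * math.pi - d)   # fold into [0, pi]

def sharpest_point(contour):
    """Index of the interior contour point with the greatest turning angle."""
    best_i, best_angle = None, -1.0
    for i in range(1, len(contour) - 1):
        ang = turning_angle(contour[i - 1], contour[i], contour[i + 1])
        if ang > best_angle:
            best_i, best_angle = i, ang
    return best_i
```

On a flat contour with one spike, the spike's apex turns through the largest angle and is returned as the cusp.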
121. A computer program, stored on a tangible storage medium, for use in creating a digital model of an individual component of a patient's dentition, the program including executable instructions that, when executed by a computer, cause the computer to:
(a) receive a data set that forms a three-dimensional (3D) representation of the patient's dentition;
(b) apply a test to the data set to identify data elements that represent portions of an individual component of the patient's dentition; and
(c) create a digital model of the individual component based upon the identified data elements.
122. The program of claim 121, wherein the computer receives the data, applies the test, and creates the digital model without human intervention.
123. The program of claim 121, wherein the computer, in applying the test, applies a rule to identify a boundary of the individual component to be modeled.
124. The program of claim 121, wherein the computer, in applying the test, identifies elements of the data set that represent a structural core of the individual component to be modeled and labels those data elements as belonging to the individual component.
125. The program of claim 124, wherein the computer, in applying the test, links other data elements to those representing the structural core and labels the linked data elements as belonging to the individual component.
126. The program of claim 125, wherein the computer, in applying the test:
(a) assigns a distance measure to each element of the data set, where the distance measure indicates a measured distance between a reference point in the dentition and the portion of the dentition represented by the data element to which the distance measure is assigned; and
(b) links a data element to the structural core if the assigned distance measure is less than the distance measure assigned to a data element representing a portion of the structural core.
127. The program of claim 121, wherein the computer, in applying the test, identifies an initial 2D cross-section of the individual component having continuous latitudinal width, a relative minimum value of which occurs at an end of the initial cross-section.
128. The program of claim 127, wherein the computer, in applying the test, identifies the end of the initial cross-section at which the relative minimum value of the latitudinal width occurs by:
(a) establishing line segments within the initial cross-section, each of which is bounded at each end by an endpoint lying on a surface of the individual component, and each of which is roughly perpendicular to a latitudinal axis of the individual component;
(b) calculating a length for each line segment; and
(c) identifying elements of the data set that correspond to the endpoints of the line segment with the shortest length.
129. The program of claim 128, wherein the computer, in applying the test:
(a) isolates portions of the data set corresponding to other 2D cross-sections of the individual component, all lying in planes parallel to the initial 2D cross-section;
(b) for each of the other cross-sections, identifies data elements that correspond to endpoints of a line segment representing an end of the cross-section; and
(c) defines a solid surface that contains all of the identified data elements.
130. The program of claim 128, wherein the computer, in applying the test:
(a) first creates an initial curve that is roughly perpendicular to the latitudinal axis of the individual component and that is fitted between the surfaces of the 2D cross-section on which the endpoints of the line segments lie;
(b) establishes a set of initial line segments that are roughly perpendicular to the curve and to the latitudinal axis and that have endpoints lying on the surfaces of the individual component;
(c) pivots each initial line segment about a point at which the initial line segment intersects the curve until the initial line segment has its shortest possible length;
(d) locates a midpoint for each of the initial line segments after pivoting; and
(e) creates a refined curve that passes through all of the midpoints and that is roughly normal to all of the line segments.
131. The program of claim 128, wherein the computer, in applying the test, also:
(a) establishes an initial line segment by creating a line that intersects the initial 2D cross-section, such that the initial line segment is bounded by endpoints that lie on surfaces of the individual component;
(b) establishes at least one additional line segment parallel to and spaced a predetermined distance from a previously established line segment; and (c) for each additional line segment, locates a midpoint of the additional line segment and pivots the additional line segment about the midpoint until the additional line segment has its shortest possible length.
132. The program of claim 131, wherein the computer, in applying the test, also:
(a) establishes a curve that is fitted among the midpoints of the additional line segments;
(b) establishes a new set of line segments that are perpendicular to the curve;
(c) locates midpoints for each of the line segments in the new set; and (d) pivots each line segment in the new set about its midpoint until the line segment has its shortest possible length.
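Claims 131 and 132 describe pivoting each line segment about a point until the segment reaches its shortest possible length. As a minimal, hypothetical sketch (not the patented implementation), the pivot sweep can be modeled by standing in an ellipse for the 2D cross-section and minimizing the chord length over pivot angles; the ellipse geometry and all names here are invented for illustration:

```python
import math

def chord_length(theta, a=3.0, b=1.0):
    # Length of a chord through the center of an ellipse (semi-axes a, b)
    # at angle theta; a stand-in for a segment spanning a 2D cross-section.
    r = (a * b) / math.sqrt((b * math.cos(theta))**2 + (a * math.sin(theta))**2)
    return 2.0 * r

def pivot_to_shortest(n=1800):
    # "Pivot" the segment about its midpoint: sweep the angle over a half
    # turn in n steps and keep the orientation with the shortest length.
    step = math.pi / n
    return min((chord_length(i * step), i * step) for i in range(n))

length, angle = pivot_to_shortest()
```

The shortest chord of the ellipse is its minor axis, so the sweep settles at the angle where the segment spans the narrowest part of the cross-section.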
133. The program of claim 127, wherein the individual component is a tooth and the relative minimum value of the initial 2D cross-section lies on an interproximal surface of the tooth.
134. The program of claim 133, wherein the computer, in identifying the initial 2D cross-section, isolates elements of the data set that correspond to 2D cross-sections of the tooth lying in parallel planes between the roots and the occlusal surface of the tooth.
135. The program of claim 134, wherein the computer, in identifying the initial 2D cross-section, identifies adjacent ones of the 2D cross-sections in which the interproximal surface of the tooth is obscured by gum tissue in one of the adjacent cross-sections and is not obscured by gum tissue in the other adjacent cross-section.
136. The program of claim 135, wherein the computer, in identifying the initial 2D cross-section, selects as the initial 2D cross-section the adjacent cross-section in which the interproximal surface of the tooth is not obscured by gum tissue.
137. The program of claim 134, wherein the computer, in identifying the initial 2D cross-section, identifies for each of the isolated cross-sections a contour line that outlines the shape of the dentition in that cross-section.
138. The program of claim 137, wherein the computer, in identifying the initial 2D cross-section, applies a test to each of the isolated cross-sections to identify those cross-sections in which the interproximal surface of the tooth is not obscured by gum tissue.
139. The program of claim 138, wherein the computer, in applying the test to each of the isolated cross-sections, calculates the rate of curvature of the contour line.
140. The program of claim 138, wherein the computer, in identifying the initial 2D cross-section, selects as the initial 2D cross-section the isolated cross-section that lies closest to the roots of the tooth and in which the interproximal surface of the tooth is not obscured by gum tissue.
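Claim 139 tests each contour line by calculating its rate of curvature. As a hedged illustration, assuming nothing about the patent's actual numerics: discrete curvature can be approximated by the turning angle at each contour point, and the sharpest turn flags a candidate cusp. The contour below is synthetic.

```python
import math

def turning_angle(p0, p1, p2):
    # Discrete curvature proxy at p1: the turning angle between the
    # incoming and outgoing contour edges.
    a1 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    a2 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    return abs((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)

def sharpest_point(contour):
    # Index of the contour point where the rate of curvature is greatest.
    n = len(contour)
    return max(range(n), key=lambda i: turning_angle(
        contour[i - 1], contour[i], contour[(i + 1) % n]))

# Synthetic closed contour: a 12-gon with one point pushed outward (a cusp).
contour = []
for k in range(12):
    r = 2.0 if k == 5 else 1.0
    t = math.radians(30 * k)
    contour.append((r * math.cos(t), r * math.sin(t)))

cusp_index = sharpest_point(contour)
```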
141. The program of claim 127, wherein the computer, in applying the test, identifies two elements of the data set that define endpoints of a line segment spanning the relative minimum width of the initial 2D cross-section.
142. The program of claim 141, wherein the computer, in applying the test, defines for each endpoint a neighborhood containing a predetermined number of elements of the data set near the endpoint in the initial 2D cross-section.
143. The program of claim 142, wherein the computer, in applying the test, identifies an additional 2D cross-section of the individual component in a plane parallel and adjacent to the initial 2D cross-section, where the additional 2D cross-section also has a continuous, latitudinal width with a relative minimum value occurring at one end of the cross-section.
144. The program of claim 143, wherein the computer, in applying the test, identifies two elements of the data set that define endpoints of a line segment spanning the relative minimum width of the additional 2D cross-section by:
(a) defining two neighborhoods of data elements, each containing elements of the data set that are adjacent to the data elements contained in the neighborhoods defined for the initial 2D cross-section;
and (b) identifying one data element in each neighborhood that corresponds to one of the endpoints of the line segment spanning the relative minimum width of the additional 2D cross-section.
145. The program of claim 144, wherein the computer also establishes a solid surface that is fitted among line segments spanning the relative minimum widths of the parallel 2D cross-sections.
146. The program of claim 145, wherein the individual component to be modeled is a tooth and the solid surface represents an interproximal surface of the tooth.
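Claims 142-144 constrain the endpoint search in each new cross-section to a neighborhood around the endpoints found in the previous cross-section. A minimal sketch under invented geometry (the point lists and radius are illustrative, not from the patent):

```python
# Each slice holds candidate boundary points; the endpoint in slice k+1 is
# searched only within a small neighborhood of the endpoint found in slice k.
def track_endpoint(slices, start, radius=1.5):
    path = [start]
    for pts in slices[1:]:
        prev = path[-1]
        near = [p for p in pts
                if (p[0]-prev[0])**2 + (p[1]-prev[1])**2 <= radius**2]
        # pick the neighborhood point closest to the previous endpoint
        path.append(min(near,
                        key=lambda p: (p[0]-prev[0])**2 + (p[1]-prev[1])**2))
    return path

slices = [
    [(0.0, 0.0), (5.0, 5.0)],
    [(0.4, 0.1), (5.0, 4.0)],
    [(0.9, 0.3), (4.0, 4.0)],
]
path = track_endpoint(slices, start=(0.0, 0.0))
```

The tracked endpoints across the parallel cross-sections are what claim 145's solid surface would then be fitted among.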
147. A computer program, stored on a tangible storage medium, for use in creating a digital model of a tooth in a patient's dentition, the program including executable instructions that, when executed by a computer, cause the computer to:
(a) receive a three-dimensional (3D) data set representing the patient's dentition;
(b) apply a test to identify data elements that represent an interproximal margin between two teeth in the dentition;
(c) apply another test to select data elements that lie on one side of the interproximal margin for inclusion in the digital model.
148. The program of claim 147, wherein the computer creates a set of 2D planes that intersect the dentition roughly perpendicular to an occlusal plane of the dentition, each 2D plane including data elements that form a 2D
cross-section of the dentition.
149. The program of claim 148, wherein the computer identifies the 2D plane with the smallest cross-sectional area.
150. The program of claim 149, wherein the computer rotates the 2D
plane with the smallest cross-sectional area to at least one other orientation to form at least one other 2D cross-section of the dentition.
151. The program of claim 150, wherein the computer selects the orientation that gives the rotated plane its smallest possible cross-sectional area.
152. The program of claim 151, wherein the computer identifies data elements that represent the selected orientation of the rotated plane as lying on an interproximal margin.
153. The program of claim 150, wherein the computer rotates the plane about two orthogonal lines passing through its center point.
154. The program of claim 149, wherein the computer creates a set of additional 2D planes in the vicinity of the 2D plane with the smallest cross-sectional area.
155. The program of claim 154, wherein the computer identifies the plane in the set of additional planes that has the smallest cross-sectional area.
156. The program of claim 155, wherein the computer rotates the plane with the smallest cross-sectional area to at least one other orientation to form at least one other 2D cross-section of the dentition.
157. The program of claim 156, wherein the computer selects the orientation that produces the 2D cross-section with the smallest possible area.
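Claims 149-151 and 154-157 amount to a coarse-to-fine search: first pick the sampling plane with the smallest cross-sectional area, then perturb (rotate) that plane and keep the orientation that minimizes the area further. The sketch below substitutes a synthetic `area()` function for a real cross-section measurement; everything here is a hypothetical stand-in:

```python
# Synthetic stand-in for measuring the dentition's cross-sectional area at a
# given plane position along the arch and tilt angle; the true minimum sits
# at position 2.0, tilt 0.1.
def area(position, tilt):
    return 1.0 + (position - 2.0)**2 + 4.0 * (tilt - 0.1)**2

# Stage 1 (claim 149): plane with the smallest area among coarse samples.
positions = [i * 0.5 for i in range(11)]
best_pos = min(positions, key=lambda p: area(p, 0.0))

# Stage 2 (claims 150-151): rotate that plane and keep the orientation
# giving the smallest possible cross-sectional area.
tilts = [i * 0.02 - 0.2 for i in range(21)]
best_tilt = min(tilts, key=lambda t: area(best_pos, t))
```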
158. The program of claim 148, wherein the computer, in creating the set of 2D planes, creates an initial plane near one end of the dentition.
159. The program of claim 158, wherein the computer selects a point in the dentition that is a predetermined distance from the initial plane and creates a second plane that includes the selected point.
160. The program of claim 159, wherein the second plane is roughly parallel to the initial plane.
161. The program of claim 159, wherein the computer rotates the second plane to at least one additional orientation to form at least one additional 2D cross-section of the dentition.
162. The program of claim 161, wherein the computer selects the orientation that produces the 2D cross-section with the smallest cross-sectional area.
163. The program of claim 161, wherein the computer selects a point that is a predetermined distance from the second plane and creates a third plane that includes the selected point.
164. The program of claim 163, wherein the computer rotates the third plane to at least one other orientation to create at least one additional 2D
cross-section of the dentition.
165. The program of claim 163, wherein the computer creates additional planes, each including a point that is a predetermined distance from a preceding plane, until the other end of the dentition is reached.
166. The program of claim 165, wherein the computer identifies at least one plane having a local minimum in cross-sectional area.
167. The program of claim 165, wherein the computer identifies a centerpoint of the cross-section in each of the planes and creates a curve that fits among the identified centerpoints.
168. The program of claim 167, wherein the computer creates a set of additional 2D planes along the curve, where the curve is roughly normal to each of the additional planes, and where each of the additional planes is roughly perpendicular to the occlusal plane.
169. The program of claim 168, wherein the computer identifies at least one of the additional planes that has a local minimum in cross-sectional area.
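Claim 166's planes with a local minimum in cross-sectional area mark candidate interproximal margins. Assuming the per-plane areas have already been measured, a minimal local-minimum scan might look like this (the area profile is invented):

```python
def local_minima(areas):
    # Planes whose cross-sectional area is smaller than both neighbors are
    # candidate interproximal margins between adjacent teeth.
    return [i for i in range(1, len(areas) - 1)
            if areas[i] < areas[i-1] and areas[i] < areas[i+1]]

# Synthetic area profile along the arch: dips between "teeth".
areas = [9, 7, 4, 6, 8, 5, 2, 5, 9, 8, 3, 7]
margin_planes = local_minima(areas)
```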
170. A computer program, stored on a tangible storage medium, for use in creating a digital model of a tooth in a patient's dentition, the program including executable instructions that, when executed by a computer, cause the computer to:
(a) receive a 3D data set representing at least a portion of the patient's dentition, including at least a portion of a tooth and gum tissue surrounding the tooth;
(b) apply a test to identify data elements lying on a gingival boundary that occurs where the tooth and the gum tissue meet; and (c) apply a test to the data elements lying on the boundary to identify other data elements representing portions of the tooth.
171. The program of claim 170, wherein the computer, in applying the test to identify data elements on the gingival boundary, creates an initial 2D
plane that intersects the dentition roughly perpendicular to an occlusal plane of the dentition and that includes data elements representing an initial cross-sectional surface of the dentition.
172. The program of claim 171, wherein the computer locates a cusp in the initial cross-sectional surface.
173. The program of claim 172, wherein the computer, in locating the cusp, calculates the rate of curvature of the initial cross-sectional surface at selected points on the cross-sectional surface.
174. The program of claim 173, wherein the computer, in locating the cusp, identifies the point at which the rate of curvature is greatest.
175. The program of claim 172, wherein the computer creates a second 2D plane that is roughly parallel to the initial 2D plane and that includes data elements representing a second cross-sectional surface of the dentition.
176. The program of claim 175, wherein the computer locates a cusp in the second cross-sectional surface.
177. The program of claim 176, wherein the computer, in locating the cusp in the second cross-sectional surface, defines a neighborhood of data elements around the cusp in the initial cross-sectional surface and projects the neighborhood onto the second cross-sectional surface.
178. The program of claim 177, wherein the computer, in locating the cusp in the second cross-sectional surface, searches for the cusp only within the neighborhood projected onto the second cross-sectional surface.
179. The program of claim 171, wherein the computer locates two cusps in the initial cross-sectional surface.
180. The program of claim 179, wherein the computer creates a second 2D plane that is roughly parallel to the initial 2D plane and that includes data elements representing a second cross-sectional surface of the dentition.
181. The program of claim 180, wherein the computer locates two cusps in the second cross-sectional surface.
182. The program of claim 181, wherein the computer, in locating the cusps in the second cross-sectional surface, defines two neighborhoods of data elements around the two cusps in the initial cross-sectional surface and projects the neighborhoods onto the second cross-sectional surface.
183. The program of claim 182, wherein each neighborhood projected onto the second cross-sectional surface includes data elements representing portions of the tooth and data elements representing the gum tissue surrounding the tooth.
184. The program of claim 183, wherein the data elements representing the tooth include voxels of one color and the data elements representing the gum tissue include voxels of another color.
185. The program of claim 183, wherein the computer, in locating the cusps in the second cross-sectional surface, locates the pair of data elements representing gum tissue that lie closest together, where each of the two neighborhoods projected onto the second cross-sectional surface includes one of the data elements in the pair.
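Claim 185 selects, from the two projected neighborhoods, the pair of gum-tissue data elements lying closest together, one from each neighborhood. A brute-force sketch over illustrative 2D points (not the patent's data representation):

```python
# Pick the closest cross-pair: one point from each neighborhood of
# gum-tissue data elements, minimizing squared distance between them.
def closest_cross_pair(nbhd_a, nbhd_b):
    return min(((a, b) for a in nbhd_a for b in nbhd_b),
               key=lambda ab: (ab[0][0]-ab[1][0])**2 + (ab[0][1]-ab[1][1])**2)

gum_a = [(0, 0), (1, 2), (2, 3)]   # gum-tissue elements, neighborhood A
gum_b = [(5, 5), (3, 3), (2, 6)]   # gum-tissue elements, neighborhood B
pair = closest_cross_pair(gum_a, gum_b)
```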
186. The program of claim 170, wherein the computer, in applying the test to identify data elements on the gingival boundary, creates a series of roughly parallel 2D planes, each intersecting the dentition roughly perpendicular to an occlusal plane of the dentition, and each including data elements that represent a cross-sectional surface of the dentition.
187. The program of claim 186, wherein the cross-sectional surface in each 2D plane includes two cusps that roughly identify the locations of the gingival boundary.
188. The program of claim 187, wherein the computer identifies the cusps in each cross-sectional surface.
189. The program of claim 188, wherein the computer, in identifying the cusps, locates the cusps in one of the planes and then confines the search for cusps in an adjacent plane to a predetermined area in the vicinity of the identified cusps.
190. The program of claim 187, wherein the computer allows a human user to select data elements that roughly identify the locations of the cusps in a selected one of the cross-sectional areas.
191. The program of claim 190, wherein the computer searches for cusps in the selected cross-sectional area and confines the search to data elements lying within a predetermined area in the vicinity of the data elements selected by the human user.
192. The program of claim 190, wherein the computer searches for cusps in an adjacent cross-sectional area and confines the search to data elements lying within a predetermined area in the vicinity of the data elements selected by the human user.
193. A computer-implemented method for use in manipulating a digital model of a patient's dentition, the method comprising:
obtaining a three-dimensional (3D) digital model of the patient's dentition; and analyzing the dentition model to determine the orientation of at least one axis of the dentition model automatically.
194. The method of claim 193, further comprising creating an Oriented Bounding Box (OBB) around the dentition model.
195. The method of claim 194, wherein the dentition model has a z-axis that extends in a direction in which the OBB has minimum thickness.
196. The method of claim 195, wherein the z-axis extends from a bottom surface of the dentition model to a top surface of the model, and wherein the method includes automatically identifying the top and bottom surfaces of the dentition model.
197. The method of claim 196, wherein one of the surfaces is substantially flat and another of the surfaces is textured, and wherein identifying the top and bottom surfaces includes:
creating one or more planes that are roughly normal to the z-axis;
and creating line segments that extend between the one or more planes and the top and bottom surfaces of the dentition model.
198. The method of claim 197, wherein identifying the top and bottom surfaces includes identifying the surface for which all of the line segments are of one length as being the flat surface.
199. The method of claim 197, wherein identifying the top and bottom surfaces includes identifying the surface for which the line segments have varying lengths as being the textured surface.
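Claims 197-199 distinguish the cast's flat base from the textured tooth side by the lengths of line segments dropped from a reference plane: near-constant lengths mean the flat surface, varying lengths mean the textured one. A minimal sketch with made-up segment lengths and an assumed tolerance:

```python
# Classify a surface from the spread of line-segment lengths measured
# between a reference plane and that surface.
def classify(lengths, tol=1e-6):
    return "flat" if max(lengths) - min(lengths) <= tol else "textured"

bottom = [2.0, 2.0, 2.0, 2.0]   # distances to the cast's flat base
top    = [5.1, 4.3, 6.0, 4.8]   # distances to cusps and fissures
bottom_kind = classify(bottom)
top_kind = classify(top)
```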
200. The method of claim 193, wherein analyzing the dentition model includes selecting a two-dimensional (2D) plane that contains the axis and an arch-shaped cross section of the dentition model and identifying the orientation of the axis in this plane.
201. The method of claim 200, wherein the arch-shaped cross section is roughly symmetrical about the axis.
202. The method of claim 201, wherein analyzing the dentition model includes:
identifying a point at each end of the arch-shaped cross section;
creating a line segment that extends between the identified points;
and identifying the orientation of the axis as being roughly perpendicular to the line segment.
203. The method of claim 202, wherein identifying a point at each end of the arch includes:
selecting a point that lies within an area surrounded by the arch-shaped cross section;
creating a line segment that extends between the selected point and an edge of the 2D plane;
sweeping the line segment in a circular manner around the selected point; and identifying points at the ends of the arch-shaped cross section at which the sweeping line segment begins intersecting the cross section of the dentition model and stops intersecting the cross section of the dentition model.
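Claims 202-203 find the arch ends by sweeping a line segment about a point inside the arch and noting where it starts and stops intersecting the cross section; the axis is then taken roughly perpendicular to the chord between those ends. A sketch on a synthetic semicircular arch, where the pivot, sampling, and angle convention are all assumptions:

```python
import math

def sweep_angle(p, pivot):
    # Polar angle of p about the pivot, measured from the straight-down
    # direction so the sweep never wraps while crossing the arch.
    return (math.atan2(p[1] - pivot[1], p[0] - pivot[0])
            + math.pi / 2) % (2 * math.pi)

# Synthetic arch: points on the upper half of the unit circle.
arch = [(math.cos(math.radians(d)), math.sin(math.radians(d)))
        for d in range(0, 181, 5)]
pivot = (0.0, 0.2)    # a point inside the area the arch surrounds

ordered = sorted(arch, key=lambda p: sweep_angle(p, pivot))
start, end = ordered[0], ordered[-1]        # the two ends of the arch
chord = (end[0] - start[0], end[1] - start[1])
axis = (-chord[1], chord[0])    # axis roughly perpendicular to the chord
```

On this semicircle the ends come out at (1, 0) and (-1, 0), so the recovered axis is vertical, as claim 202 expects for an arch symmetrical about the axis.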
204. The method of claim 201, wherein analyzing the dentition model includes identifying the orientation of another axis that is roughly perpendicular to the identified axis.
205. A computer-implemented method for use in creating a digital model of an individual component of a patient's dentition, the method comprising:
obtaining a 3D digital model of the patient's dentition;
identifying points in the dentition model that lie on an interproximal margin between adjacent teeth in the patient's dentition; and using the identified points to create a cutting surface for use in separating portions of the dentition model representing the adjacent teeth.
206. The method of claim 205, further comprising displaying 2D cross sections of the dentition model and receiving input from a human operator identifying approximate points at which the interproximal margin between the adjacent teeth meets gingival tissue.
207. The method of claim 206, wherein the dentition model includes a 3D volumetric model of the patient's dentition and the input from the human operator identifies two voxels in the volumetric model.
208. The method of claim 207, further comprising defining a neighborhood of voxels around each of the two voxels identified by the human operator, where each neighborhood includes voxels representing the dentition model and voxels representing a background image.
209. The method of claim 208, further comprising applying a computer-implemented test to select a pair of voxels, both representing the background image, that lie closest together, where each neighborhood contains one of the voxels.
210. The method of claim 207, further comprising automatically identifying voxels on another 2D cross section that represent the interproximal margin.
211. The method of claim 210, wherein automatically identifying voxels on another 2D cross section includes:
defining a neighborhood of voxels around each of the selected voxels, where each neighborhood includes voxels representing the dentition model and voxels representing a background image;
projecting the neighborhoods onto the other 2D cross section; and selecting two voxels in the projected neighborhoods that represent the interproximal margin.
212. The method of claim 211, wherein selecting two voxels in the projected neighborhoods includes selecting a pair of voxels, both representing the background image, that lie closest together, where each of the neighborhoods contains one of the voxels.
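Claims 208-209 define a neighborhood around each operator-picked voxel and select the closest pair of background voxels, one from each neighborhood. The following toy 2D label grid (1 = dentition, 0 = background) illustrates the idea; the grid contents and neighborhood size are invented:

```python
# Toy slice of a volumetric model: 1 = dentition voxel, 0 = background.
grid = [
    [1, 1, 0, 0, 1, 1],
    [1, 1, 1, 0, 1, 1],
    [1, 1, 1, 0, 1, 1],
]

def background_in_nbhd(center, k=1):
    # Background voxels inside the (2k+1)x(2k+1) neighborhood of center.
    r0, c0 = center
    return [(r, c)
            for r in range(max(r0 - k, 0), min(r0 + k + 1, len(grid)))
            for c in range(max(c0 - k, 0), min(c0 + k + 1, len(grid[0])))
            if grid[r][c] == 0]

def closest_pair(c1, c2):
    # Closest background pair, one voxel from each neighborhood.
    return min(((a, b) for a in background_in_nbhd(c1)
                       for b in background_in_nbhd(c2)),
               key=lambda ab: (ab[0][0]-ab[1][0])**2 + (ab[0][1]-ab[1][1])**2)

pair = closest_pair((0, 1), (2, 4))
```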
213. A computer-implemented method for use in creating a digital model of an individual component of a patient's dentition, the method comprising:
displaying an image of a dentition model;
receiving input from a human operator identifying points in the image representing a gingival line at which a tooth in the dentition model meets gingival tissue; and using the identified points to create a cutting surface for use in separating the tooth from the gingival tissue in the dentition model.
214. The method of claim 213, wherein the cutting surface extends roughly perpendicular to an occlusal plane in the dentition model.
215. The method of claim 214, wherein creating the cutting surface includes projecting at least a portion of the gingival line onto a plane that is roughly parallel to the occlusal plane.
216. The method of claim 215, wherein creating the surface includes creating a surface that connects the gingival line to the projection.
217. The method of claim 215, further comprising creating the plane by fitting the plane among the points on the gingival line.
218. The method of claim 217, further comprising shifting the plane away from the tooth in a direction that is roughly normal to the plane.
219. The method of claim 218, wherein shifting the plane includes creating a line segment that includes a point near the center of the tooth and that is roughly perpendicular to the plane.
220. The method of claim 219, wherein the length of the line segment is approximately equal to the length of a tooth root.
221. The method of claim 219, further comprising creating a sphere that has a radius equal to the length of the line segment and that is centered on the point near the center of the tooth.
222. The method of claim 221, wherein shifting the plane includes moving the plane along the line segment so that the plane is tangential to the sphere.
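Claims 217-222 fit a plane among the gingival-line points, then shift it away from the tooth along its normal by roughly a root length, leaving it tangent to a sphere of that radius centered in the tooth. A deliberately simplified sketch with a horizontal plane fit and invented, non-anatomical dimensions:

```python
# Gingival-line points (x, y, z); the fitted plane is taken horizontal, so
# a least-squares fit reduces to averaging the z coordinates.
gingival = [(0, 0, 5.1), (1, 0, 4.9), (0, 1, 5.0), (1, 1, 5.0)]
plane_z = sum(p[2] for p in gingival) / len(gingival)

center = (0.5, 0.5, 5.0)   # point near the center of the tooth crown
root_len = 12.0            # nominal tooth-root length (claim 220)

# Shift the plane along the -z normal, away from the crown; it now touches
# the sphere of radius root_len about center (claims 221-222).
shifted_z = center[2] - root_len
```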
223. The method of claim 222, further comprising receiving instructions from a human operator to slide the plane to a new position along the sphere.
224. The method of claim 213, wherein the cutting surface extends roughly parallel to an occlusal plane in the dentition model.
225. The method of claim 224, wherein the input received from the human operator identifies points that form two 3D curves representing gingival lines at which teeth in the dentition model meet gum tissue on both the buccal and lingual sides of the dentition model.
226. The method of claim 225, wherein creating the cutting surface includes fitting a surface among the points lying on the two curves.
227. The method of claim 225, wherein creating the surface includes, for each tooth, identifying a point lying between the two curves and creating surface triangles having vertices at the identified point and at points on the two curves.
228. The method of claim 227, wherein identifying the point includes averaging, for each tooth, x, y and z coordinate values of the points on portions of the two curves adjacent to the tooth.
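Claim 228 obtains the per-tooth point by averaging the x, y and z coordinates of nearby points on the buccal and lingual gingival curves, and claim 227 fans surface triangles from that point out to the curves. A sketch over invented coordinates:

```python
# Points on the buccal and lingual gingival curves adjacent to one tooth.
buccal  = [(0.0, 2.0, 1.0), (1.0, 2.0, 1.2), (2.0, 2.0, 1.0)]
lingual = [(0.0, 0.0, 0.8), (1.0, 0.0, 1.0), (2.0, 0.0, 0.8)]

# Claim 228: the point between the curves is the coordinate-wise average.
pts = buccal + lingual
center = tuple(sum(p[i] for p in pts) / len(pts) for i in range(3))

# Claim 227: surface triangles with a vertex at the center point and the
# other two vertices on adjacent curve points.
triangles = [(center, curve[i], curve[i + 1])
             for curve in (buccal, lingual)
             for i in range(len(curve) - 1)]
```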
229. The method of claim 225, further comprising creating a surface that represents tooth roots.
230. The method of claim 229, wherein creating the surface representing tooth roots includes projecting points onto a plane that is roughly parallel to the occlusal plane.
231. The method of claim 230, wherein creating the surface includes connecting points on the two curves to the projected points.
232. The method of claim 231, further comprising using the surface to separate portions of the dentition model representing the tooth roots from portions representing gingival tissue.
233. The method of claim 232, further comprising connecting the portions of the dentition model representing the tooth roots to the portion representing the tooth.
234. A computer program, stored on a tangible storage medium, for use in manipulating a digital model of a patient's dentition, the program comprising executable instructions that, when executed by a computer, cause the computer to:
obtain a three-dimensional (3D) digital model of the patient's dentition; and analyze the dentition model to determine the orientation of at least one axis of the dentition model automatically.
235. The program of claim 234, wherein the computer creates an Oriented Bounding Box (OBB) around the dentition model.
236. The program of claim 235, wherein the dentition model has a z-axis that extends in a direction in which the OBB has minimum thickness.
237. The program of claim 236, wherein the z-axis extends from a bottom surface of the dentition model to a top surface of the model, and wherein the computer automatically identifies the top and bottom surfaces of the dentition model.
238. The program of claim 237, wherein one of the surfaces is substantially flat and another of the surfaces is textured, and wherein, in identifying the top and bottom surfaces, the computer:
creates one or more planes that are roughly normal to the z-axis;
and creates line segments that extend between the one or more planes and the top and bottom surfaces of the dentition model.
239. The program of claim 238, wherein, in identifying the top and bottom surfaces, the computer identifies the surface for which all of the line segments are of one length as being the flat surface.
240. The program of claim 238, wherein, in identifying the top and bottom surfaces, the computer identifies the surface for which the line segments have varying lengths as being the textured surface.
241. The program of claim 234, wherein, in analyzing the dentition model, the computer selects a two-dimensional (2D) plane that contains the axis and an arch-shaped cross section of the dentition model and identifies the orientation of the axis in this plane.
242. The program of claim 241, wherein the arch-shaped cross section is roughly symmetrical about the axis.
243. The program of claim 242, wherein, in analyzing the dentition model, the computer:
identifies a point at each end of the arch-shaped cross section;
creates a line segment that extends between the identified points;
and identifies the orientation of the axis as being roughly perpendicular to the line segment.
244. The program of claim 243, wherein, in identifying a point at each end of the arch, the computer:
selects a point that lies within an area surrounded by the arch-shaped cross section;
creates a line segment that extends between the selected point and an edge of the 2D plane;
sweeps the line segment in a circular manner around the selected point; and identifies points at the ends of the arch-shaped cross section at which the sweeping line segment begins intersecting the cross section of the dentition model and stops intersecting the cross section of the dentition model.
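Claims 243–244 locate the two ends of the arch-shaped cross section by sweeping a line segment around an interior point, then take the axis to be perpendicular to the chord joining those ends. A minimal 2D Python sketch (the angle-sweep implementation and point layout are assumptions, not the patent's exact procedure):

```python
import math

def arch_ends(center, arch_points):
    """Claim 244 sketch: sweeping a ray around `center`, the arch is first
    and last intersected at the points of extreme sweep angle."""
    ang = lambda p: math.atan2(p[1] - center[1], p[0] - center[0])
    pts = sorted(arch_points, key=ang)
    return pts[0], pts[-1]

def axis_from_arch_ends(p1, p2):
    """Claim 243 sketch: the axis is roughly perpendicular to the line
    segment between the two arch ends; returns a unit 2D direction."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    n = math.hypot(dx, dy)
    return (-dy / n, dx / n)   # segment direction rotated by 90 degrees
```

For a semicircular arch sampled from (1, 0) to (-1, 0), the recovered axis points along the y direction, perpendicular to the chord between the ends.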
245. The program of claim 242, wherein, in analyzing the dentition model, the computer identifies the orientation of another axis that is roughly perpendicular to the identified axis.
246. A computer program, stored on a tangible storage medium, for use in creating a digital model of an individual component of a patient's dentition, the program comprising executable instructions that, when executed by a computer, cause the computer to:
obtain a 3D digital model of the patient's dentition;
identify points in the dentition model that lie on an inter-proximal margin between adjacent teeth in the patient's dentition; and use the identified points to create a cutting surface for use in separating portions of the dentition model representing the adjacent teeth.
247. The program of claim 246, wherein the computer displays 2D cross sections of the dentition model and receives input from a human operator identifying approximate points at which the interproximal margin between the adjacent teeth meets gingival tissue.
248. The program of claim 247, wherein the dentition model includes a 3D volumetric model of the patient's dentition and the input from the human operator identifies two voxels in the volumetric model.
249. The program of claim 248, wherein the computer defines a neighborhood of voxels around each of the two voxels identified by the human operator, where each neighborhood includes voxels representing the dentition model and voxels representing a background image.
250. The program of claim 249, wherein the computer automatically selects a pair of voxels, both representing the background image, that lie closest together, where each neighborhood contains one of the voxels.
251. The program of claim 248, wherein the computer automatically identifies voxels on another 2D cross section that represent the interproximal margin.
252. The program of claim 251, wherein, in automatically identifying voxels on another 2D cross section, the computer:
defines a neighborhood of voxels around each of the selected voxels, where each neighborhood includes voxels representing the dentition model and voxels representing a background image;
projects the neighborhoods onto the other 2D cross section; and selects two voxels in the projected neighborhoods that represent the inter-proximal margin.
253. The program of claim 252, wherein, in selecting two voxels in the projected neighborhoods, the computer selects a pair of voxels, both representing the background image, that lie closest together, where each of the neighborhoods contains one of the voxels.
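Claims 249–250 and 253 pick the interproximal-margin points as the closest pair of background voxels drawn from two neighborhoods. A minimal Python sketch, assuming each neighborhood is a mapping from 2D voxel coordinates to a label (the dict representation and label names are illustrative):

```python
from itertools import product

def closest_background_pair(nbhd_a, nbhd_b):
    """Claims 250/253 sketch: from two voxel neighborhoods, each mapping
    (x, y) -> 'model' or 'background', return the pair of background
    voxels, one per neighborhood, that lie closest together."""
    bg_a = [v for v, lab in nbhd_a.items() if lab == "background"]
    bg_b = [v for v, lab in nbhd_b.items() if lab == "background"]

    def d2(p, q):  # squared Euclidean distance between voxel centers
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    return min(product(bg_a, bg_b), key=lambda pq: d2(*pq))
```

The returned pair marks where the interproximal gap pinches tightest between the two teeth on that cross section; claims 251–252 then project the neighborhoods onto the next cross section and repeat.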
254. A computer program, stored on a tangible storage medium, for use in creating a digital model of an individual component of a patient's dentition, the program comprising executable instructions that, when executed by a computer, cause the computer to:
display an image of a dentition model;
receive input from a human operator identifying points in the image representing a gingival line at which a tooth in the dentition model meets gingival tissue; and use the identified points to create a cutting surface for use in separating the tooth from the gingival tissue in the dentition model.
255. The program of claim 254, wherein the cutting surface extends roughly perpendicular to an occlusal plane in the dentition model.
256. The program of claim 255, wherein, in creating the cutting surface, the computer projects at least a portion of the gingival line onto a plane that is roughly parallel to the occlusal plane.
257. The program of claim 256, wherein, in creating the surface, the computer creates a surface that connects the gingival line to the projection.
258. The program of claim 256, wherein the computer creates the plane by fitting the plane among the points on the gingival line.
259. The program of claim 258, wherein the computer shifts the plane away from the tooth in a direction that is roughly normal to the plane.
260. The program of claim 259, wherein, in shifting the plane, the computer creates a line segment that includes a point near the center of the tooth and that is roughly perpendicular to the plane.
261. The program of claim 260, wherein the length of the line segment is approximately equal to the length of a tooth root.
262. The program of claim 260, wherein the computer creates a sphere that has a radius equal to the length of the line segment and that is centered on the point near the center of the tooth.
263. The program of claim 262, wherein, in shifting the plane, the computer moves the plane along the line segment so that the plane is tangential to the sphere.
264. The program of claim 263, wherein the computer receives instructions from a human operator to slide the plane to a new position along the sphere.
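Claims 258–263 fit a plane among the gingival-line points, then shift it along a tooth-root-length segment until it is tangent to a sphere centered near the tooth's center. A minimal Python sketch of the shift-and-tangency step, assuming the fitted unit normal and center point are already available (function names are illustrative):

```python
def shift_cutting_plane(center, unit_normal, root_length):
    """Claims 259-263 sketch: move the fitted plane away from the tooth
    along its unit normal until it is tangent to a sphere of radius
    `root_length` centered on `center`. Returns a point on the shifted
    plane; the plane keeps the same normal."""
    return tuple(c - root_length * n for c, n in zip(center, unit_normal))

def distance_to_plane(point, plane_point, unit_normal):
    """Signed distance from `point` to the plane through `plane_point`;
    tangency to the sphere means this equals the sphere's radius."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, unit_normal))
```

After the shift, the distance from the tooth center to the plane equals the root length, so the plane just clears the modeled root; claim 264's operator adjustment slides the tangent point around the sphere.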
265. The program of claim 264, wherein the cutting surface extends roughly parallel to an occlusal plane in the dentition model.
266. The program of claim 265, wherein the input received from the human operator identifies points that form two 3D curves representing gingival lines at which teeth in the dentition model meet gum tissue on both the buccal and lingual sides of the dentition model.
267. The program of claim 266, wherein, in creating the cutting surface, the computer fits a surface among the points lying on the two curves.
268. The program of claim 267, wherein, in creating the surface, the computer, for each tooth, identifies a point lying between the two curves and creates surface triangles having vertices at the identified point and at points on the two curves.
269. The program of claim 268, wherein, in identifying the point, the computer averages, for each tooth, x, y and z coordinate values of the points on portions of the two curves adjacent to the tooth.
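Claims 268–269 build the cutting surface per tooth from a center point, obtained by averaging the x, y and z coordinates of the adjacent buccal and lingual gingival-curve points, and triangles fanning from that point to the curves. A minimal Python sketch (point-list layout is an assumption):

```python
def tooth_center(buccal_pts, lingual_pts):
    """Claim 269 sketch: the per-tooth point between the two curves is
    the average of the x, y and z coordinates of the gingival-curve
    points adjacent to the tooth."""
    pts = list(buccal_pts) + list(lingual_pts)
    n = float(len(pts))
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def fan_triangles(center, curve_pts):
    """Claim 268 sketch: surface triangles with one vertex at the center
    point and the other two at consecutive points along one curve."""
    return [(center, curve_pts[i], curve_pts[i + 1])
            for i in range(len(curve_pts) - 1)]
```

Fanning from each tooth's center to both curves yields a closed strip of triangles spanning the buccal and lingual gingival lines over that tooth.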
270. The program of claim 266, wherein the computer creates a surface that represents tooth roots.
271. The program of claim 270, wherein, in creating the surface representing tooth roots, the computer projects points onto a plane that is roughly parallel to the occlusal plane.
272. The program of claim 271, wherein, in creating the surface, the computer connects points on the two curves to the projected points.
273. The program of claim 272, wherein the computer uses the surface to separate portions of the dentition model representing the tooth roots from portions representing gingival tissue.
274. The program of claim 273, wherein the computer connects the portions of the dentition model representing the tooth roots to the portion representing the tooth.
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16927698A | 1998-10-08 | 1998-10-08 | |
US09/169,276 | 1998-10-08 | ||
US09/264,547 | 1999-03-08 | ||
US09/264,547 US7063532B1 (en) | 1997-06-20 | 1999-03-08 | Subdividing a digital dentition model |
US09/311,941 | 1999-05-14 | ||
US09/311,941 US6409504B1 (en) | 1997-06-20 | 1999-05-14 | Manipulating a digital dentition model to form models of individual dentition components |
PCT/US1999/023532 WO2000019935A1 (en) | 1998-10-08 | 1999-10-08 | Manipulating a digital dentition model to form models of individual dentition components |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2346256A1 true CA2346256A1 (en) | 2000-04-13 |
Family
ID=27389631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002346256A Abandoned CA2346256A1 (en) | 1998-10-08 | 1999-10-08 | Manipulating a digital dentition model to form models of individual dentition components |
Country Status (7)
Country | Link |
---|---|
US (3) | US6409504B1 (en) |
EP (1) | EP1119312B1 (en) |
JP (1) | JP3630634B2 (en) |
AU (1) | AU6422999A (en) |
CA (1) | CA2346256A1 (en) |
TW (1) | TW471960B (en) |
WO (1) | WO2000019935A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10467746B2 (en) | 2013-03-11 | 2019-11-05 | Carestream Dental Technology Topco Limited | Method for producing teeth surface from x-ray scan of a negative impression |
Families Citing this family (280)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6409504B1 (en) * | 1997-06-20 | 2002-06-25 | Align Technology, Inc. | Manipulating a digital dentition model to form models of individual dentition components |
US11026768B2 (en) | 1998-10-08 | 2021-06-08 | Align Technology, Inc. | Dental appliance reinforcement |
US8821158B1 (en) | 1999-10-14 | 2014-09-02 | Geodigm Corporation | Method and apparatus for matching digital three-dimensional dental models with digital three-dimensional cranio-facial CAT scan records |
US6989471B2 (en) * | 2000-02-15 | 2006-01-24 | The Procter & Gamble Company | Absorbent article with phase change material |
US6633789B1 (en) * | 2000-02-17 | 2003-10-14 | Align Technology, Inc. | Effiicient data representation of teeth model |
US7373286B2 (en) * | 2000-02-17 | 2008-05-13 | Align Technology, Inc. | Efficient data representation of teeth model |
WO2001074268A1 (en) * | 2000-03-30 | 2001-10-11 | Align Technology, Inc. | System and method for separating three-dimensional models |
EP2204136B1 (en) * | 2000-04-19 | 2013-08-28 | OraMetrix, Inc. | Orthodontic archwire |
US7027642B2 (en) * | 2000-04-28 | 2006-04-11 | Orametrix, Inc. | Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects |
KR100373818B1 (en) * | 2000-08-01 | 2003-02-26 | 삼성전자주식회사 | Real size display system |
US7040896B2 (en) * | 2000-08-16 | 2006-05-09 | Align Technology, Inc. | Systems and methods for removing gingiva from computer tooth models |
US6386878B1 (en) * | 2000-08-16 | 2002-05-14 | Align Technology, Inc. | Systems and methods for removing gingiva from teeth |
US6915178B2 (en) | 2000-09-06 | 2005-07-05 | O'brien Dental Lab, Inc. | Dental prosthesis manufacturing process, dental prosthesis pattern & dental prosthesis made thereby |
US7050876B1 (en) | 2000-10-06 | 2006-05-23 | Phonak Ltd. | Manufacturing methods and systems for rapid production of hearing-aid shells |
EP2039321B1 (en) | 2000-11-08 | 2013-01-02 | Institut Straumann AG | Surface recording and generation |
US7580846B2 (en) | 2001-01-09 | 2009-08-25 | Align Technology, Inc. | Method and system for distributing patient referrals |
US7080979B2 (en) * | 2001-04-13 | 2006-07-25 | Orametrix, Inc. | Method and workstation for generating virtual tooth models from three-dimensional tooth data |
US6733289B2 (en) | 2001-07-30 | 2004-05-11 | 3M Innovative Properties Company | Method and apparatus for selecting a prescription for an orthodontic brace |
US20040243538A1 (en) * | 2001-09-12 | 2004-12-02 | Ralf Alfons Kockro | Interaction with a three-dimensional computer model |
CA2465102A1 (en) * | 2001-10-31 | 2003-05-08 | Imagnosis Inc. | Medical simulation apparatus and method for controlling 3-dimensional image display in the medical simulation apparatus |
WO2003057041A1 (en) * | 2002-01-14 | 2003-07-17 | Cadent Ltd. | Method and system for imaging a patient's teeth arrangement |
US7387511B2 (en) * | 2002-01-22 | 2008-06-17 | Geodigm Corporation | Method and apparatus using a scanned image for automatically placing bracket in pre-determined locations |
US7347686B2 (en) * | 2002-01-22 | 2008-03-25 | Geodigm Corporation | Method and apparatus using a scanned image for marking bracket locations |
US7245750B2 (en) * | 2002-01-22 | 2007-07-17 | Geodigm Corporation | Method and apparatus for automatically determining the location of individual teeth within electronic model images |
US7716024B2 (en) * | 2002-04-29 | 2010-05-11 | Geodigm Corporation | Method and apparatus for electronically generating a color dental occlusion map within electronic model images |
EP1539019B1 (en) * | 2002-07-22 | 2012-05-30 | Cadent Ltd. | A method for defining a finish line of a dental prosthesis |
US7773802B2 (en) * | 2002-07-26 | 2010-08-10 | Olympus Corporation | Image processing system with multiple imaging modes |
US7756327B2 (en) * | 2002-07-26 | 2010-07-13 | Olympus Corporation | Image processing system having multiple imaging modes |
US7033327B2 (en) | 2002-09-13 | 2006-04-25 | 3M Innovative Properties Company | Method of determining the long axis of an object |
US20080311535A1 (en) | 2007-05-04 | 2008-12-18 | Ormco Corporation | Torque Overcorrection Model |
AU2003277136A1 (en) | 2002-09-26 | 2004-04-19 | Ormco Corporation | Custom orthodontic appliance system and method |
ATE380997T1 (en) * | 2002-10-18 | 2007-12-15 | Aepsilon Rechteverwaltungs Gmb | DEVICES AND METHOD FOR SURFACE DETECTION AND FOR PRODUCING DENTAL PROSTHESIS PARTS |
US7600999B2 (en) * | 2003-02-26 | 2009-10-13 | Align Technology, Inc. | Systems and methods for fabricating a dental template |
US20040166462A1 (en) | 2003-02-26 | 2004-08-26 | Align Technology, Inc. | Systems and methods for fabricating a dental template |
WO2004100067A2 (en) * | 2003-04-30 | 2004-11-18 | D3D, L.P. | Intra-oral imaging system |
US7228191B2 (en) * | 2003-05-02 | 2007-06-05 | Geodigm Corporation | Method and apparatus for constructing crowns, bridges and implants for dental use |
US7695278B2 (en) * | 2005-05-20 | 2010-04-13 | Orametrix, Inc. | Method and system for finding tooth features on a virtual three-dimensional model |
JP4571625B2 (en) * | 2003-05-05 | 2010-10-27 | ディーフォーディー テクノロジーズ エルエルシー | Imaging by optical tomography |
US7004754B2 (en) * | 2003-07-23 | 2006-02-28 | Orametrix, Inc. | Automatic crown and gingiva detection from three-dimensional virtual model of teeth |
US7261533B2 (en) * | 2003-10-21 | 2007-08-28 | Align Technology, Inc. | Method and apparatus for manufacturing dental aligners |
US7354270B2 (en) | 2003-12-22 | 2008-04-08 | Align Technology, Inc. | Surgical dental appliance |
CN101699850A (en) * | 2004-01-23 | 2010-04-28 | 奥林巴斯株式会社 | Image processing system and camera |
US9492245B2 (en) | 2004-02-27 | 2016-11-15 | Align Technology, Inc. | Method and system for providing dynamic orthodontic assessment and treatment profiles |
US7824346B2 (en) * | 2004-03-11 | 2010-11-02 | Geodigm Corporation | Determining condyle displacement utilizing electronic models of dental impressions having a common coordinate system |
US7702492B2 (en) | 2004-03-11 | 2010-04-20 | Geodigm Corporation | System and method for generating an electronic model for a dental impression having a common coordinate system |
US7481647B2 (en) * | 2004-06-14 | 2009-01-27 | Align Technology, Inc. | Systems and methods for fabricating 3-D objects |
DE602005009432D1 (en) * | 2004-06-17 | 2008-10-16 | Cadent Ltd | Method and apparatus for color forming a three-dimensional structure |
US20060073433A1 (en) * | 2004-06-18 | 2006-04-06 | Anderson Michael C | Thermoforming plastic sheets for dental products |
WO2006009747A1 (en) * | 2004-06-18 | 2006-01-26 | Dentsply International Inc. | Prescribed orthodontic activators |
CA2575302C (en) * | 2004-07-26 | 2012-04-10 | Dentsply International Inc. | Method and system for personalized orthodontic treatment |
US7322824B2 (en) * | 2004-08-17 | 2008-01-29 | Schmitt Stephen M | Design and manufacture of dental implant restorations |
CA2580481C (en) * | 2004-09-14 | 2014-04-08 | Dentsply International Inc. | Notched pontic and system for fabricating dental appliance for use therewith |
US20060069591A1 (en) * | 2004-09-29 | 2006-03-30 | Razzano Michael R | Dental image charting system and method |
DE102004051165B3 (en) * | 2004-10-20 | 2006-06-08 | Willytec Gmbh | Method and device for generating data sets for the production of dental prostheses |
US7384266B2 (en) * | 2004-11-02 | 2008-06-10 | Align Technology, Inc. | Method and apparatus for manufacturing and constructing a physical dental arch model |
US7357634B2 (en) * | 2004-11-05 | 2008-04-15 | Align Technology, Inc. | Systems and methods for substituting virtual dental appliances |
US6976627B1 (en) | 2004-11-12 | 2005-12-20 | Align Technology, Inc. | Identification of units in customized production |
ES2370405T3 (en) * | 2004-11-17 | 2011-12-15 | Dentsply International, Inc. | PLASTIC SHEETS FOR THERMOCONFORMING OF DENTAL PRODUCTS. |
US7819662B2 (en) * | 2004-11-30 | 2010-10-26 | Geodigm Corporation | Multi-component dental appliances and a method for constructing the same |
US7442040B2 (en) * | 2005-01-13 | 2008-10-28 | Align Technology, Inc. | Template for veneer application |
GB0514554D0 (en) * | 2005-07-15 | 2005-08-24 | Materialise Nv | Method for (semi-) automatic dental implant planning |
US20070026358A1 (en) * | 2005-07-26 | 2007-02-01 | Schultz Charles J | Two-phase invisible orthodontics |
US7747418B2 (en) * | 2005-12-09 | 2010-06-29 | Leu Ming C | Computer aided dental bar design |
US7840042B2 (en) * | 2006-01-20 | 2010-11-23 | 3M Innovative Properties Company | Superposition for visualization of three-dimensional data acquisition |
US7698014B2 (en) * | 2006-01-20 | 2010-04-13 | 3M Innovative Properties Company | Local enforcement of accuracy in fabricated models |
US8366442B2 (en) * | 2006-02-15 | 2013-02-05 | Bankruptcy Estate Of Voxelogix Corporation | Dental apparatus for radiographic and non-radiographic imaging |
US8043091B2 (en) * | 2006-02-15 | 2011-10-25 | Voxelogix Corporation | Computer machined dental tooth system and method |
ES2282037B1 (en) * | 2006-03-08 | 2008-09-16 | Juan Carlos Garcia Aparicio | MANUFACTURING PROCEDURE FOR DIGITALLY DESIGNED REMOVABLE DENTAL PROSTHESES AND SYSTEM REQUIRED FOR SUCH PURPOSE. |
US7746339B2 (en) * | 2006-07-14 | 2010-06-29 | Align Technology, Inc. | System and method for automatic detection of dental features |
US7844356B2 (en) | 2006-07-19 | 2010-11-30 | Align Technology, Inc. | System and method for automatic construction of orthodontic reference objects |
US7844429B2 (en) | 2006-07-19 | 2010-11-30 | Align Technology, Inc. | System and method for three-dimensional complete tooth modeling |
US7690917B2 (en) | 2006-08-17 | 2010-04-06 | Geodigm Corporation | Bracket alignment device |
US20080050692A1 (en) * | 2006-08-22 | 2008-02-28 | Jack Keith Hilliard | System and method for fabricating orthodontic aligners |
US8038444B2 (en) | 2006-08-30 | 2011-10-18 | Align Technology, Inc. | Automated treatment staging for teeth |
US8442283B2 (en) * | 2006-08-30 | 2013-05-14 | Anatomage Inc. | Patient-specific three-dimensional dentition model |
WO2008030965A2 (en) * | 2006-09-06 | 2008-03-13 | Voxelogix Corporation | Methods for the virtual design and computer manufacture of intra oral devices |
US8044954B2 (en) * | 2006-09-22 | 2011-10-25 | Align Technology, Inc. | System and method for automatic construction of tooth axes |
US7835811B2 (en) * | 2006-10-07 | 2010-11-16 | Voxelogix Corporation | Surgical guides and methods for positioning artificial teeth and dental implants |
US8602780B2 (en) | 2006-10-16 | 2013-12-10 | Natural Dental Implants, Ag | Customized dental prosthesis for periodontal or osseointegration and related systems and methods |
US8454362B2 (en) * | 2006-10-16 | 2013-06-04 | Natural Dental Implants Ag | Customized dental prosthesis for periodontal- or osseointegration, and related systems and methods |
US9539062B2 (en) | 2006-10-16 | 2017-01-10 | Natural Dental Implants, Ag | Methods of designing and manufacturing customized dental prosthesis for periodontal or osseointegration and related systems |
US10426578B2 (en) | 2006-10-16 | 2019-10-01 | Natural Dental Implants, Ag | Customized dental prosthesis for periodontal or osseointegration and related systems |
US7708557B2 (en) * | 2006-10-16 | 2010-05-04 | Natural Dental Implants Ag | Customized dental prosthesis for periodontal- or osseointegration, and related systems and methods |
US7711447B2 (en) | 2006-10-20 | 2010-05-04 | Align Technology, Inc. | System and method for automated generating of a cutting curve on a surface |
EP2117750A2 (en) * | 2006-11-07 | 2009-11-18 | Geodigm Corporation | Sprue formers |
DE102006061134A1 (en) * | 2006-12-22 | 2008-06-26 | Aepsilon Rechteverwaltungs Gmbh | Process for the transport of dental prostheses |
DE102006061143A1 (en) * | 2006-12-22 | 2008-07-24 | Aepsilon Rechteverwaltungs Gmbh | Method, computer-readable medium and computer relating to the manufacture of dental prostheses |
US20090148816A1 (en) * | 2007-01-11 | 2009-06-11 | Geodigm Corporation | Design of dental appliances |
WO2008086526A2 (en) | 2007-01-11 | 2008-07-17 | Geodigm Corporation | Design of dental appliances |
US20080228303A1 (en) * | 2007-03-13 | 2008-09-18 | Schmitt Stephen M | Direct manufacture of dental and medical devices |
DE102007012584A1 (en) * | 2007-03-13 | 2008-09-18 | Paul Weigl | Method for controlling preparation of a prepared tooth by CAD method |
US7878805B2 (en) | 2007-05-25 | 2011-02-01 | Align Technology, Inc. | Tabbed dental appliance |
US8562338B2 (en) | 2007-06-08 | 2013-10-22 | Align Technology, Inc. | Treatment progress tracking and recalibration |
US8075306B2 (en) | 2007-06-08 | 2011-12-13 | Align Technology, Inc. | System and method for detecting deviations during the course of an orthodontic treatment to gradually reposition teeth |
WO2009006273A2 (en) * | 2007-06-29 | 2009-01-08 | 3M Innovative Properties Company | Synchronized views of video data and three-dimensional model data |
US8738394B2 (en) | 2007-11-08 | 2014-05-27 | Eric E. Kuo | Clinical data file |
US7865259B2 (en) | 2007-12-06 | 2011-01-04 | Align Technology, Inc. | System and method for improved dental geometry representation |
JP5250251B2 (en) * | 2007-12-17 | 2013-07-31 | イマグノーシス株式会社 | Medical imaging marker and its utilization program |
CA2713122A1 (en) | 2008-01-23 | 2009-07-30 | Sensable Technologies, Inc. | Haptically enabled dental modeling system |
DE102008006048A1 (en) * | 2008-01-25 | 2009-07-30 | Straumann Holding Ag | Method for modeling an individual denture |
US8108189B2 (en) | 2008-03-25 | 2012-01-31 | Align Technologies, Inc. | Reconstruction of non-visible part of tooth |
WO2009140582A2 (en) * | 2008-05-16 | 2009-11-19 | Geodigm Corporation | Method and apparatus for combining 3d dental scans with other 3d data sets |
US8092215B2 (en) | 2008-05-23 | 2012-01-10 | Align Technology, Inc. | Smile designer |
US9492243B2 (en) | 2008-05-23 | 2016-11-15 | Align Technology, Inc. | Dental implant positioning |
US8172569B2 (en) | 2008-06-12 | 2012-05-08 | Align Technology, Inc. | Dental appliance |
EP2313868B1 (en) * | 2008-07-18 | 2016-03-16 | Vorum Research Corporation | Method, apparatus, signals and media for producing a computer representation of a three-dimensional surface of an appliance for a living body |
KR100971762B1 (en) * | 2008-08-28 | 2010-07-26 | 주식회사바텍 | Method and apparatus for generating virtual teeth, and the recording media storing the program performing the said method |
US8152518B2 (en) | 2008-10-08 | 2012-04-10 | Align Technology, Inc. | Dental positioning appliance having metallic portion |
US9642678B2 (en) | 2008-12-30 | 2017-05-09 | Align Technology, Inc. | Method and system for dental visualization |
US20100291505A1 (en) * | 2009-01-23 | 2010-11-18 | Curt Rawley | Haptically Enabled Coterminous Production of Prosthetics and Patient Preparations in Medical and Dental Applications |
US20100192375A1 (en) | 2009-02-02 | 2010-08-05 | Remedent Nv | Method for producing a dentist tool |
US8640338B2 (en) | 2009-02-02 | 2014-02-04 | Viax Dental Technologies, LLC | Method of preparation for restoring tooth structure |
US8292617B2 (en) | 2009-03-19 | 2012-10-23 | Align Technology, Inc. | Dental wire attachment |
CA2755555C (en) * | 2009-03-20 | 2018-09-11 | 3Shape A/S | System and method for effective planning, visualization, and optimization of dental restorations |
US20110044512A1 (en) * | 2009-03-31 | 2011-02-24 | Myspace Inc. | Automatic Image Tagging |
US20100296742A1 (en) * | 2009-05-22 | 2010-11-25 | Honeywell International Inc. | System and method for object based post event forensics in video surveillance systems |
US8765031B2 (en) | 2009-08-13 | 2014-07-01 | Align Technology, Inc. | Method of forming a dental appliance |
US8896592B2 (en) | 2009-08-21 | 2014-11-25 | Align Technology, Inc. | Digital dental modeling |
EP2306400B1 (en) | 2009-09-04 | 2015-02-11 | Medicim NV | Method for digitizing dento-maxillofacial objects |
US8348669B1 (en) | 2009-11-04 | 2013-01-08 | Bankruptcy Estate Of Voxelogix Corporation | Surgical template and method for positioning dental casts and dental implants |
US8708697B2 (en) | 2009-12-08 | 2014-04-29 | Align Technology, Inc. | Tactile objects for orthodontics, systems and methods |
US8457772B2 (en) * | 2010-02-10 | 2013-06-04 | Biocad Medical, Inc. | Method for planning a dental component |
US9934360B2 (en) * | 2010-02-10 | 2018-04-03 | Biocad Medical, Inc. | Dental data planning |
US9241774B2 (en) | 2010-04-30 | 2016-01-26 | Align Technology, Inc. | Patterned dental positioning appliance |
US9211166B2 (en) | 2010-04-30 | 2015-12-15 | Align Technology, Inc. | Individualized orthodontic treatment index |
US8837774B2 (en) * | 2010-05-04 | 2014-09-16 | Bae Systems Information Solutions Inc. | Inverse stereo image matching for change detection |
US9179988B2 (en) | 2010-05-25 | 2015-11-10 | Biocad Medical, Inc. | Dental prosthesis connector design |
EP3669819B1 (en) | 2010-07-12 | 2022-04-13 | 3Shape A/S | 3d modeling of an object using textural features |
US8712733B2 (en) * | 2010-09-17 | 2014-04-29 | Biocad Medical, Inc. | Adjusting dental prostheses based on soft tissue |
US20120114199A1 (en) * | 2010-11-05 | 2012-05-10 | Myspace, Inc. | Image auto tagging method and application |
JP5959539B2 (en) * | 2011-02-18 | 2016-08-02 | スリーエム イノベイティブ プロパティズ カンパニー | Orthodontic digital setup |
WO2012149961A1 (en) * | 2011-05-03 | 2012-11-08 | Fujitsu Limited | Computer - implemented method of simplifying a complex part in a geometric model |
US9037439B2 (en) | 2011-05-13 | 2015-05-19 | Align Technology, Inc. | Prioritization of three dimensional dental elements |
RU2615120C2 (en) | 2011-05-26 | 2017-04-03 | Вайэкс Дентал Текнолождиз, Ллк | Dental instrument and guide devices |
FR2977469B1 (en) | 2011-07-08 | 2013-08-02 | Francois Duret | THREE-DIMENSIONAL MEASURING DEVICE USED IN THE DENTAL FIELD |
US8842904B2 (en) | 2011-07-21 | 2014-09-23 | Carestream Health, Inc. | Method for tooth dissection in CBCT volume |
US8761493B2 (en) | 2011-07-21 | 2014-06-24 | Carestream Health, Inc. | Method and system for tooth segmentation in dental images |
US9129363B2 (en) | 2011-07-21 | 2015-09-08 | Carestream Health, Inc. | Method for teeth segmentation and alignment detection in CBCT volume |
US8929635B2 (en) | 2011-07-21 | 2015-01-06 | Carestream Health, Inc. | Method and system for tooth segmentation in dental images |
US8849016B2 (en) * | 2011-07-21 | 2014-09-30 | Carestream Health, Inc. | Panoramic image generation from CBCT dental images |
US9403238B2 (en) | 2011-09-21 | 2016-08-02 | Align Technology, Inc. | Laser cutting |
US9375300B2 (en) | 2012-02-02 | 2016-06-28 | Align Technology, Inc. | Identifying forces on a tooth |
US9220580B2 (en) | 2012-03-01 | 2015-12-29 | Align Technology, Inc. | Determining a dental treatment difficulty |
US9414897B2 (en) | 2012-05-22 | 2016-08-16 | Align Technology, Inc. | Adjustment of tooth position in a virtual dental model |
US20140067334A1 (en) | 2012-09-06 | 2014-03-06 | Align Technology Inc. | Method and a system usable in creating a subsequent dental appliance |
US9123147B2 (en) * | 2012-09-07 | 2015-09-01 | Carestream Health, Inc. | Imaging apparatus for display of maxillary and mandibular arches |
US9364296B2 (en) | 2012-11-19 | 2016-06-14 | Align Technology, Inc. | Filling undercut areas of teeth relative to axes of appliance placement |
US9135498B2 (en) * | 2012-12-14 | 2015-09-15 | Ormco Corporation | Integration of intra-oral imagery and volumetric imagery |
US10617489B2 (en) | 2012-12-19 | 2020-04-14 | Align Technology, Inc. | Creating a digital dental model of a patient's teeth using interproximal information |
WO2014135695A1 (en) | 2013-03-08 | 2014-09-12 | 3Shape A/S | Visualising a 3d dental restoration on a 2d image |
US9740989B2 (en) | 2013-03-11 | 2017-08-22 | Autodesk, Inc. | Techniques for slicing a 3D model for manufacturing |
US11024080B2 (en) | 2013-03-11 | 2021-06-01 | Autodesk, Inc. | Techniques for slicing a 3D model for manufacturing |
US10054932B2 (en) * | 2013-03-11 | 2018-08-21 | Autodesk, Inc. | Techniques for two-way slicing of a 3D model for manufacturing |
US9754412B2 (en) | 2013-03-11 | 2017-09-05 | Autodesk, Inc. | Techniques for slicing a 3D model for manufacturing |
US9972083B2 (en) * | 2013-04-22 | 2018-05-15 | Carestream Dental Technology Topco Limited | Detection of tooth fractures in CBCT image |
US9510757B2 (en) * | 2014-05-07 | 2016-12-06 | Align Technology, Inc. | Identification of areas of interest during intraoral scans |
GB2541818A (en) * | 2014-06-19 | 2017-03-01 | Halliburton Energy Services Inc | Forming facsimile formation core samples using three-dimensional printing |
US9626462B2 (en) * | 2014-07-01 | 2017-04-18 | 3M Innovative Properties Company | Detecting tooth wear using intra-oral 3D scans |
EP3682807A1 (en) * | 2014-09-16 | 2020-07-22 | Sirona Dental, Inc. | Methods, systems, apparatuses, and computer programs for processing tomographic images |
US10449016B2 (en) | 2014-09-19 | 2019-10-22 | Align Technology, Inc. | Arch adjustment appliance |
US9610141B2 (en) | 2014-09-19 | 2017-04-04 | Align Technology, Inc. | Arch expanding appliance |
US9744001B2 (en) | 2014-11-13 | 2017-08-29 | Align Technology, Inc. | Dental appliance with cavity for an unerupted or erupting tooth |
US10504386B2 (en) | 2015-01-27 | 2019-12-10 | Align Technology, Inc. | Training method and system for oral-cavity-imaging-and-modeling equipment |
US10248883B2 (en) | 2015-08-20 | 2019-04-02 | Align Technology, Inc. | Photograph-based assessment of dental treatments and procedures |
US10339649B2 (en) * | 2015-09-11 | 2019-07-02 | Carestream Dental Technology Topco Limited | Method and system for hybrid mesh segmentation |
US11554000B2 (en) | 2015-11-12 | 2023-01-17 | Align Technology, Inc. | Dental attachment formation structure |
US11931222B2 (en) | 2015-11-12 | 2024-03-19 | Align Technology, Inc. | Dental attachment formation structures |
US11596502B2 (en) | 2015-12-09 | 2023-03-07 | Align Technology, Inc. | Dental attachment placement structure |
US11103330B2 (en) | 2015-12-09 | 2021-08-31 | Align Technology, Inc. | Dental attachment placement structure |
BR112018017121A2 (en) | 2016-02-24 | 2018-12-26 | 3Shape As | detection and monitoring of the development of a dental condition |
NL2016800B1 (en) * | 2016-05-19 | 2017-12-05 | Umc Utrecht Holding Bv | Method of positioning an interventional device. |
WO2017218947A1 (en) | 2016-06-17 | 2017-12-21 | Align Technology, Inc. | Intraoral appliances with sensing |
EP3471653B1 (en) | 2016-06-17 | 2021-12-22 | Align Technology, Inc. | Orthodontic appliance performance monitor |
KR102443088B1 (en) | 2016-07-27 | 2022-09-14 | 얼라인 테크널러지, 인크. | Intraoral scanner with dental diagnostics |
US11291532B2 (en) | 2016-07-27 | 2022-04-05 | James R. Glidewell Dental Ceramics, Inc. | Dental CAD automation using deep learning |
WO2018038748A1 (en) | 2016-08-24 | 2018-03-01 | Carestream Health, Inc. | Method and system for hybrid mesh segmentation |
EP3308738B1 (en) | 2016-10-13 | 2020-04-22 | a.tron3d GmbH | Method for cleaning up virtual representation of objects |
US10888395B2 (en) * | 2016-10-28 | 2021-01-12 | Align Technology, Inc. | Mold and aligner with cut line markings |
EP4295748A3 (en) | 2016-11-04 | 2024-03-27 | Align Technology, Inc. | Methods and apparatuses for dental images |
US11559378B2 (en) | 2016-11-17 | 2023-01-24 | James R. Glidewell Dental Ceramics, Inc. | Scanning dental impressions |
US20200015936A1 (en) | 2016-11-30 | 2020-01-16 | Carestream Dental Technology Topco Limited | Method and system for braces removal from dentition mesh |
WO2018102702A1 (en) | 2016-12-02 | 2018-06-07 | Align Technology, Inc. | Dental appliance features for speech enhancement |
US11376101B2 (en) | 2016-12-02 | 2022-07-05 | Align Technology, Inc. | Force control, stop mechanism, regulating structure of removable arch adjustment appliance |
US10993783B2 (en) | 2016-12-02 | 2021-05-04 | Align Technology, Inc. | Methods and apparatuses for customizing a rapid palatal expander |
WO2018102809A1 (en) | 2016-12-02 | 2018-06-07 | Align Technology, Inc. | Palatal expanders and methods of expanding a palate |
US10548700B2 (en) | 2016-12-16 | 2020-02-04 | Align Technology, Inc. | Dental appliance etch template |
US10792127B2 (en) | 2017-01-24 | 2020-10-06 | Align Technology, Inc. | Adaptive orthodontic treatment |
US10779718B2 (en) | 2017-02-13 | 2020-09-22 | Align Technology, Inc. | Cheek retractor and mobile device holder |
US11007035B2 (en) | 2017-03-16 | 2021-05-18 | Viax Dental Technologies Llc | System for preparing teeth for the placement of veneers |
JP7018604B2 (en) * | 2017-03-16 | 2022-02-14 | 東芝エネルギーシステムズ株式会社 | Subject positioning device, subject positioning method, subject positioning program and radiation therapy system |
EP4241725A3 (en) | 2017-03-20 | 2023-11-01 | Align Technology, Inc. | Generating a virtual depiction of an orthodontic treatment of a patient |
US10613515B2 (en) | 2017-03-31 | 2020-04-07 | Align Technology, Inc. | Orthodontic appliances including at least partially un-erupted teeth and method of forming them |
USD843417S1 (en) * | 2017-04-19 | 2019-03-19 | Navix International Limited | Display screen or portion thereof with icon |
US11045283B2 (en) | 2017-06-09 | 2021-06-29 | Align Technology, Inc. | Palatal expander with skeletal anchorage devices |
US10639134B2 (en) | 2017-06-26 | 2020-05-05 | Align Technology, Inc. | Biosensor performance indicator for intraoral appliances |
US10885521B2 (en) | 2017-07-17 | 2021-01-05 | Align Technology, Inc. | Method and apparatuses for interactive ordering of dental aligners |
CN111107806B (en) | 2017-07-21 | 2022-04-19 | 阿莱恩技术有限公司 | Jaw profile anchoring |
CN116327391A (en) | 2017-07-27 | 2023-06-27 | 阿莱恩技术有限公司 | System and method for treating orthodontic appliances by optical coherence tomography |
US11633268B2 (en) | 2017-07-27 | 2023-04-25 | Align Technology, Inc. | Tooth shading, transparency and glazing |
US11116605B2 (en) | 2017-08-15 | 2021-09-14 | Align Technology, Inc. | Buccal corridor assessment and computation |
WO2019036677A1 (en) | 2017-08-17 | 2019-02-21 | Align Technology, Inc. | Dental appliance compliance monitoring |
CA3072981A1 (en) | 2017-08-31 | 2019-03-07 | 3Shape A/S | Volume rendering using surface guided cropping |
US10813720B2 (en) | 2017-10-05 | 2020-10-27 | Align Technology, Inc. | Interproximal reduction templates |
CN111565668B (en) | 2017-10-27 | 2022-06-07 | 阿莱恩技术有限公司 | Substitute occlusion adjusting structure |
CN116602778A (en) | 2017-10-31 | 2023-08-18 | 阿莱恩技术有限公司 | Dental appliance with selective bite loading and controlled tip staggering |
CN115252177A (en) | 2017-11-01 | 2022-11-01 | 阿莱恩技术有限公司 | Automated therapy planning |
US10997727B2 (en) | 2017-11-07 | 2021-05-04 | Align Technology, Inc. | Deep learning for tooth detection and evaluation |
US11534974B2 (en) | 2017-11-17 | 2022-12-27 | Align Technology, Inc. | Customized fabrication of orthodontic retainers based on patient anatomy |
TWI644655B (en) | 2017-11-23 | 2018-12-21 | 勘德股份有限公司 | Digital dental mesh segmentation method and digital dental mesh segmentation device |
EP3716885B1 (en) | 2017-11-30 | 2023-08-30 | Align Technology, Inc. | Orthodontic intraoral appliances comprising sensors |
US11432908B2 (en) | 2017-12-15 | 2022-09-06 | Align Technology, Inc. | Closed loop adaptive orthodontic treatment methods and apparatuses |
US10980613B2 (en) | 2017-12-29 | 2021-04-20 | Align Technology, Inc. | Augmented reality enhancements for dental practitioners |
ES2907213T3 (en) | 2018-01-26 | 2022-04-22 | Align Technology Inc | Diagnostic intraoral scanning and tracking |
US11007040B2 (en) | 2018-03-19 | 2021-05-18 | James R. Glidewell Dental Ceramics, Inc. | Dental CAD automation using deep learning |
US11937991B2 (en) | 2018-03-27 | 2024-03-26 | Align Technology, Inc. | Dental attachment placement structure |
KR20200141498A (en) | 2018-04-11 | 2020-12-18 | 얼라인 테크널러지, 인크. | Releasable palate dilator |
US11154381B2 (en) | 2018-05-08 | 2021-10-26 | Align Technology, Inc. | Automatic ectopic teeth detection on scan |
US11026766B2 (en) | 2018-05-21 | 2021-06-08 | Align Technology, Inc. | Photo realistic rendering of smile image after treatment |
US11020206B2 (en) | 2018-05-22 | 2021-06-01 | Align Technology, Inc. | Tooth segmentation based on anatomical edge information |
JP7261245B2 (en) * | 2018-05-29 | 2023-04-19 | メディシム ナームロゼ ベンノートチャップ | Methods, systems, and computer programs for segmenting pulp regions from images |
US10217237B1 (en) | 2018-06-21 | 2019-02-26 | 3D Med Ag | Systems and methods for forming a desired bend angle in an orthodontic appliance |
US11389131B2 (en) | 2018-06-27 | 2022-07-19 | Denti.Ai Technology Inc. | Systems and methods for processing of dental images |
US11395717B2 (en) | 2018-06-29 | 2022-07-26 | Align Technology, Inc. | Visualization of clinical orthodontic assets and occlusion contact shape |
US11553988B2 (en) | 2018-06-29 | 2023-01-17 | Align Technology, Inc. | Photo of a patient with new simulated smile in an orthodontic treatment review software |
US11464604B2 (en) | 2018-06-29 | 2022-10-11 | Align Technology, Inc. | Dental arch width measurement tool |
US10996813B2 (en) | 2018-06-29 | 2021-05-04 | Align Technology, Inc. | Digital treatment planning by modeling inter-arch collisions |
EP4331532A2 (en) | 2018-06-29 | 2024-03-06 | Align Technology, Inc. | Providing a simulated outcome of dental treatment on a patient |
US10835349B2 (en) | 2018-07-20 | 2020-11-17 | Align Technology, Inc. | Parametric blurring of colors for teeth in generated images |
US10251729B1 (en) | 2018-08-31 | 2019-04-09 | 3D Med Ag | Intra-oral device |
US11534272B2 (en) | 2018-09-14 | 2022-12-27 | Align Technology, Inc. | Machine learning scoring system and methods for tooth position assessment |
EP3847628A1 (en) | 2018-09-19 | 2021-07-14 | Arbrea Labs Ag | Marker-less augmented reality system for mammoplasty pre-visualization |
US11151753B2 (en) | 2018-09-28 | 2021-10-19 | Align Technology, Inc. | Generic framework for blurring of colors for teeth in generated images using height map |
US11654001B2 (en) | 2018-10-04 | 2023-05-23 | Align Technology, Inc. | Molar trimming prediction and validation using machine learning |
US10315353B1 (en) | 2018-11-13 | 2019-06-11 | SmileDirectClub LLC | Systems and methods for thermoforming dental aligners |
EP3673863A1 (en) | 2018-12-28 | 2020-07-01 | Trophy | 3d printing optimization using clinical indications |
EP3673864A1 (en) | 2018-12-28 | 2020-07-01 | Trophy | Tooth segmentation using tooth registration |
EP3673862A1 (en) | 2018-12-28 | 2020-07-01 | Trophy | Dental model superimposition using clinical indications |
US11478334B2 (en) | 2019-01-03 | 2022-10-25 | Align Technology, Inc. | Systems and methods for nonlinear tooth modeling |
US11007042B2 (en) | 2019-02-06 | 2021-05-18 | Sdc U.S. Smilepay Spv | Systems and methods for marking models for dental aligner fabrication |
US10482192B1 (en) | 2019-02-12 | 2019-11-19 | SmileDirectClub LLC | Systems and methods for selecting and marking a location on a dental aligner |
EP3949888A4 (en) * | 2019-03-28 | 2022-12-21 | DIO Corporation | Dental image registration device and method |
US11707344B2 (en) | 2019-03-29 | 2023-07-25 | Align Technology, Inc. | Segmentation quality assessment |
US11357598B2 (en) | 2019-04-03 | 2022-06-14 | Align Technology, Inc. | Dental arch analysis and tooth numbering |
US11540906B2 (en) | 2019-06-25 | 2023-01-03 | James R. Glidewell Dental Ceramics, Inc. | Processing digital dental impression |
US11534271B2 (en) | 2019-06-25 | 2022-12-27 | James R. Glidewell Dental Ceramics, Inc. | Processing CT scan of dental impression |
US11622843B2 (en) | 2019-06-25 | 2023-04-11 | James R. Glidewell Dental Ceramics, Inc. | Processing digital dental impression |
US11676701B2 (en) | 2019-09-05 | 2023-06-13 | Pearl Inc. | Systems and methods for automated medical image analysis |
US10984529B2 (en) * | 2019-09-05 | 2021-04-20 | Pearl Inc. | Systems and methods for automated medical image annotation |
US11651494B2 (en) | 2019-09-05 | 2023-05-16 | Align Technology, Inc. | Apparatuses and methods for three-dimensional dental segmentation using dental image data |
US11083411B2 (en) * | 2019-09-06 | 2021-08-10 | Sdc U.S. Smilepay Spv | Systems and methods for user monitoring |
US11273008B2 (en) | 2019-12-04 | 2022-03-15 | Oxilio Ltd | Systems and methods for generating 3D-representation of tooth-specific appliance |
US10717208B1 (en) | 2019-12-04 | 2020-07-21 | Oxilio Ltd | Methods and systems for thermoforming orthodontic aligners |
US10695146B1 (en) | 2019-12-04 | 2020-06-30 | Oxilio Ltd | Systems and methods for determining orthodontic treatments |
US10631956B1 (en) | 2019-12-04 | 2020-04-28 | Oxilio Ltd | Methods and systems for making an orthodontic aligner having fixing blocks |
US10631954B1 (en) | 2019-12-04 | 2020-04-28 | Oxilio Ltd | Systems and methods for determining orthodontic treatments |
US10726949B1 (en) | 2019-12-05 | 2020-07-28 | Oxilio Ltd | Systems and methods for generating 3D-representation of tooth-specific platform for dental appliance |
AU2020408811A1 (en) * | 2019-12-18 | 2022-05-26 | Vita Zahnfabrik H. Rauter Gmbh & Co. Kg | Method for defining at least one boundary surface inside an artificial tooth element |
US11903793B2 (en) | 2019-12-31 | 2024-02-20 | Align Technology, Inc. | Machine learning dental segmentation methods using sparse voxel representations |
US11055789B1 (en) | 2020-01-17 | 2021-07-06 | Pearl Inc. | Systems and methods for insurance fraud detection |
US10751149B1 (en) | 2020-02-18 | 2020-08-25 | Oxilio Ltd | Method of determining deformation of gingiva |
US10898298B1 (en) | 2020-04-08 | 2021-01-26 | Oxilio Ltd | Systems and methods for determining orthodontic treatment |
US10856954B1 (en) | 2020-05-19 | 2020-12-08 | Oxilio Ltd | Systems and methods for determining tooth center of resistance |
US11026767B1 (en) | 2020-07-23 | 2021-06-08 | Oxilio Ltd | Systems and methods for planning an orthodontic treatment |
US10950061B1 (en) | 2020-07-23 | 2021-03-16 | Oxilio Ltd | Systems and methods for planning an orthodontic treatment |
US20220023002A1 (en) | 2020-07-23 | 2022-01-27 | Align Technology, Inc. | Intelligent photo guidance for dentition capture |
US10945812B1 (en) | 2020-07-24 | 2021-03-16 | Oxilio Ltd | Systems and methods for planning an orthodontic treatment |
CN111991106B (en) * | 2020-08-17 | 2021-11-23 | 苏州瀚华智造智能技术有限公司 | Automatic tooth socket cutting line generation method and application |
US11544846B2 (en) | 2020-08-27 | 2023-01-03 | James R. Glidewell Dental Ceramics, Inc. | Out-of-view CT scan detection |
USD958170S1 (en) | 2020-09-08 | 2022-07-19 | Arkimos Ltd | Display screen or portion thereof with graphical user interface |
US10993782B1 (en) | 2020-09-08 | 2021-05-04 | Oxilio Ltd | Systems and methods for determining a tooth trajectory |
US11864970B2 (en) | 2020-11-06 | 2024-01-09 | Align Technology, Inc. | Accurate method to determine center of resistance for 1D/2D/3D problems |
US11058515B1 (en) | 2021-01-06 | 2021-07-13 | Arkimos Ltd. | Systems and methods for forming dental appliances |
US11191618B1 (en) | 2021-01-06 | 2021-12-07 | Arkimos Ltd | Systems and methods for forming a dental appliance |
US11166787B1 (en) | 2021-01-06 | 2021-11-09 | Arkimos Ltd | Orthodontic attachment systems and methods |
US11197744B1 (en) | 2021-01-06 | 2021-12-14 | Arkimos Ltd | Method and system for generating interdental filler models |
US11116606B1 (en) | 2021-01-06 | 2021-09-14 | Arkimos Ltd. | Systems and methods for determining a jaw curve |
US11776677B2 (en) | 2021-01-06 | 2023-10-03 | Pearl Inc. | Computer vision-based analysis of provider data |
US11055850B1 (en) | 2021-01-06 | 2021-07-06 | Oxilio Ltd | Systems and methods for tooth segmentation |
US11278377B1 (en) | 2021-06-02 | 2022-03-22 | Oxilio Ltd | System and method for performing digital separation of teeth |
WO2023068760A1 (en) * | 2021-10-18 | 2023-04-27 | 주식회사 메디트 | Data processing device and data processing method |
Family Cites Families (129)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2467432A (en) * | 1943-07-23 | 1949-04-19 | Harold D Kesling | Method of making orthodontic appliances and of positioning teeth |
US3407500A (en) | 1966-05-06 | 1968-10-29 | Peter C. Kesling | Tooth positioner |
US3660900A (en) | 1969-11-10 | 1972-05-09 | Lawrence F Andrews | Method and apparatus for improved orthodontic bracket and arch wire technique |
US3600808A (en) | 1970-01-22 | 1971-08-24 | James Jackson Reeve | Anterior root-torquing auxiliary wire |
US3860803A (en) | 1970-08-24 | 1975-01-14 | Diecomp Inc | Automatic method and apparatus for fabricating progressive dies |
US3683502A (en) | 1970-09-14 | 1972-08-15 | Melvin Wallshein | Orthodontic systems |
US3916526A (en) * | 1973-05-10 | 1975-11-04 | Fred Frank Schudy | Method and apparatus for orthodontic treatment |
US3922786A (en) | 1974-01-30 | 1975-12-02 | Joseph L Lavin | Method and apparatus for forming and fitting orthodontic appliances |
US3983628A (en) | 1975-01-24 | 1976-10-05 | Raul Acevedo | Dental articulator, new bite registration guide, and diagnostic procedure associated with stereodont orthodontic study model |
US3950851A (en) * | 1975-03-05 | 1976-04-20 | Bergersen Earl Olaf | Orthodontic positioner and method for improving retention of tooth alignment therewith |
US4014096A (en) * | 1975-03-25 | 1977-03-29 | Dellinger Eugene L | Method and apparatus for orthodontic treatment |
JPS5358191A (en) * | 1976-11-05 | 1978-05-25 | Osamu Yoshii | Method of producing dental correction treating instrument using silicone resin material |
US4348178A (en) * | 1977-01-03 | 1982-09-07 | Kurz Craven H | Vibrational orthodontic appliance |
US4195046A (en) * | 1978-05-04 | 1980-03-25 | Kesling Peter C | Method for molding air holes into a tooth positioning and retaining appliance |
US4324547A (en) | 1978-09-16 | 1982-04-13 | Vishay Intertechnology, Inc. | Dentistry technique |
US4253828A (en) | 1979-04-09 | 1981-03-03 | Coles Donna C | Orthodontic appliance |
DE2936847A1 (en) * | 1979-09-12 | 1981-03-19 | Paul Dr. 6054 Rodgau Heitlinger | METHOD FOR PRODUCING DENTAL PROSTHESES AND DEVICE FOR IMPLEMENTING THE METHOD |
US4575805A (en) * | 1980-12-24 | 1986-03-11 | Moermann Werner H | Method and apparatus for the fabrication of custom-shaped implants |
DE3203937C2 (en) * | 1982-02-05 | 1985-10-03 | Luc Dr. 4150 Krefeld Barrut | Method and device for machine restoration or correction of at least one tooth or for machine preparation of at least one tooth for a fixed prosthetic restoration and for machine production of the fixed prosthetic restoration |
FR2525103B1 (en) * | 1982-04-14 | 1985-09-27 | Duret Francois | IMPRESSION TAKING DEVICE BY OPTICAL MEANS, PARTICULARLY FOR THE AUTOMATIC PRODUCTION OF PROSTHESES |
US4663720A (en) * | 1984-02-21 | 1987-05-05 | Francois Duret | Method of and apparatus for making a prosthesis, especially a dental prosthesis |
US4500294A (en) | 1983-10-03 | 1985-02-19 | Epic International Corporation | Method and device for detecting dental cavities |
US4526540A (en) | 1983-12-19 | 1985-07-02 | Dellinger Eugene L | Orthodontic apparatus and method for treating malocclusion |
DE3415006A1 (en) * | 1984-04-19 | 1985-11-07 | Helge Dr. 8000 München Fischer-Brandies | DENTAL PROCESS AND DEVICE FOR BENDING AND TURNING A WIRE PIECE |
US4798534A (en) | 1984-08-03 | 1989-01-17 | Great Lakes Orthodontic Laboratories Inc. | Method of making a dental appliance |
US4575330A (en) | 1984-08-08 | 1986-03-11 | Uvp, Inc. | Apparatus for production of three-dimensional objects by stereolithography |
US4609349A (en) | 1984-09-24 | 1986-09-02 | Cain Steve B | Active removable orthodontic appliance and method of straightening teeth |
US4591341A (en) | 1984-10-03 | 1986-05-27 | Andrews Lawrence F | Orthodontic positioner and method of manufacturing same |
US4664626A (en) | 1985-03-19 | 1987-05-12 | Kesling Peter C | System for automatically preventing overtipping and/or overuprighting in the begg technique |
US4763791A (en) * | 1985-06-06 | 1988-08-16 | Excel Dental Studios, Inc. | Dental impression supply kit |
GB2176402B (en) | 1985-06-20 | 1989-04-19 | Craig Med Prod Ltd | Wound management appliance for use on the human skin |
US4936862A (en) | 1986-05-30 | 1990-06-26 | Walker Peter S | Method of designing and manufacturing a human joint prosthesis |
CH672722A5 (en) | 1986-06-24 | 1989-12-29 | Marco Brandestini | |
CA1284040C (en) | 1986-06-26 | 1991-05-14 | Peter C. Kesling | Edgewise bracket to provide both free crown tipping and a predetermineddegree of root uprighting |
US4877398A (en) | 1987-04-16 | 1989-10-31 | Tp Orthodontics, Inc. | Bracket for permitting tipping and limiting uprighting |
US4676747A (en) | 1986-08-06 | 1987-06-30 | Tp Orthodontics, Inc. | Torquing auxiliary |
US4983334A (en) | 1986-08-28 | 1991-01-08 | Loren S. Adell | Method of making an orthodontic appliance |
DE3700237A1 (en) * | 1987-01-07 | 1988-07-21 | Bekum Maschf Gmbh | CO-EXTRUSION HEAD |
US4755139A (en) | 1987-01-29 | 1988-07-05 | Great Lakes Orthodontics, Ltd. | Orthodontic anchor appliance and method for teeth positioning and method of constructing the appliance |
US4850864A (en) * | 1987-03-30 | 1989-07-25 | Diamond Michael K | Bracket placing instrument |
US4850865A (en) | 1987-04-30 | 1989-07-25 | Napolitano John R | Orthodontic method and apparatus |
US4856991A (en) | 1987-05-05 | 1989-08-15 | Great Lakes Orthodontics, Ltd. | Orthodontic finishing positioner and method of construction |
US5186623A (en) | 1987-05-05 | 1993-02-16 | Great Lakes Orthodontics, Ltd. | Orthodontic finishing positioner and method of construction |
US4836778A (en) | 1987-05-26 | 1989-06-06 | Vexcel Corporation | Mandibular motion monitoring system |
DE3723555C2 (en) * | 1987-07-16 | 1994-08-11 | Steinbichler Hans | Process for the production of dentures |
NL8702391A (en) * | 1987-10-07 | 1989-05-01 | Elephant Edelmetaal Bv | METHOD FOR MANUFACTURING A DENTAL CROWN FOR A TEETH PREPARATION USING A CAD-CAM SYSTEM |
US4793803A (en) * | 1987-10-08 | 1988-12-27 | Martz Martin G | Removable tooth positioning appliance and method |
US4880380A (en) | 1987-10-13 | 1989-11-14 | Martz Martin G | Orthodonture appliance which may be manually installed and removed by the patient |
US5035813A (en) * | 1988-05-27 | 1991-07-30 | Union Oil Company Of California | Process and composition for treating underground formations penetrated by a well borehole |
US4941826A (en) | 1988-06-09 | 1990-07-17 | William Loran | Apparatus for indirect dental machining |
US5100316A (en) * | 1988-09-26 | 1992-03-31 | Wildman Alexander J | Orthodontic archwire shaping method |
US5055039A (en) | 1988-10-06 | 1991-10-08 | Great Lakes Orthodontics, Ltd. | Orthodontic positioner and methods of making and using same |
US4935635A (en) * | 1988-12-09 | 1990-06-19 | Harra Dale G O | System for measuring objects in three dimensions |
US5011405A (en) | 1989-01-24 | 1991-04-30 | Dolphin Imaging Systems | Method for determining orthodontic bracket placement |
JPH04504510A (en) * | 1989-01-24 | 1992-08-13 | ドルフィン イメージング システムス インコーポレーテッド | Method and device for creating craniometric images |
US4889238A (en) | 1989-04-03 | 1989-12-26 | The Procter & Gamble Company | Medicament package for increasing compliance with complex therapeutic regimens |
US4975052A (en) * | 1989-04-18 | 1990-12-04 | William Spencer | Orthodontic appliance for reducing tooth rotation |
US5128870A (en) * | 1989-06-09 | 1992-07-07 | Regents Of The University Of Minnesota | Automated high-precision fabrication of objects of complex and unique geometry |
US5027281A (en) * | 1989-06-09 | 1991-06-25 | Regents Of The University Of Minnesota | Method and apparatus for scanning and recording of coordinates describing three dimensional objects of complex and unique geometry |
US5184306A (en) * | 1989-06-09 | 1993-02-02 | Regents Of The University Of Minnesota | Automated high-precision fabrication of objects of complex and unique geometry |
US5121333A (en) * | 1989-06-09 | 1992-06-09 | Regents Of The University Of Minnesota | Method and apparatus for manipulating computer-based representations of objects of complex and unique geometry |
US5257203A (en) * | 1989-06-09 | 1993-10-26 | Regents Of The University Of Minnesota | Method and apparatus for manipulating computer-based representations of objects of complex and unique geometry |
JPH039712U (en) * | 1989-06-20 | 1991-01-30 | ||
SE467392B (en) * | 1989-08-03 | 1992-07-13 | Richard Berg Ab | DEVICE FOR SEPARATION OF SAND AND OTHER HEAVIER PARTICLES FROM A LIQUID |
US5395238A (en) | 1990-01-19 | 1995-03-07 | Ormco Corporation | Method of forming orthodontic brace |
US5139419A (en) * | 1990-01-19 | 1992-08-18 | Ormco Corporation | Method of forming an orthodontic brace |
US5431562A (en) | 1990-01-19 | 1995-07-11 | Ormco Corporation | Method and apparatus for designing and forming a custom orthodontic appliance and for the straightening of teeth therewith |
US5447432A (en) | 1990-01-19 | 1995-09-05 | Ormco Corporation | Custom orthodontic archwire forming method and apparatus |
US5454717A (en) | 1990-01-19 | 1995-10-03 | Ormco Corporation | Custom orthodontic brackets and bracket forming method and apparatus |
US5474448A (en) | 1990-01-19 | 1995-12-12 | Ormco Corporation | Low profile orthodontic appliance |
US5533895A (en) | 1990-01-19 | 1996-07-09 | Ormco Corporation | Orthodontic appliance and group standardized brackets therefor and methods of making, assembling and using appliance to straighten teeth |
US5368478A (en) | 1990-01-19 | 1994-11-29 | Ormco Corporation | Method for forming jigs for custom placement of orthodontic appliances on teeth |
US5440326A (en) | 1990-03-21 | 1995-08-08 | Gyration, Inc. | Gyroscopic pointer |
US5562448A (en) | 1990-04-10 | 1996-10-08 | Mushabac; David R. | Method for facilitating dental diagnosis and treatment |
US5452219A (en) | 1990-06-11 | 1995-09-19 | Dentsply Research & Development Corp. | Method of making a tooth mold |
US5340309A (en) | 1990-09-06 | 1994-08-23 | Robertson James G | Apparatus and method for recording jaw motion |
SE468198B (en) * | 1990-12-12 | 1992-11-23 | Nobelpharma Ab | PROCEDURE AND DEVICE FOR MANUFACTURE OF INDIVIDUALLY DESIGNED THREE-DIMENSIONAL BODIES USEFUL AS DENTURES, PROSTHESES, ETC |
US5139429A (en) | 1991-02-15 | 1992-08-18 | Hubbell Incorporated | Electrical connector lockout device |
US5131844A (en) * | 1991-04-08 | 1992-07-21 | Foster-Miller, Inc. | Contact digitizer, particularly for dental applications |
US5131843A (en) * | 1991-05-06 | 1992-07-21 | Ormco Corporation | Orthodontic archwire |
US5145364A (en) | 1991-05-15 | 1992-09-08 | M-B Orthodontics, Inc. | Removable orthodontic appliance |
US5176517A (en) | 1991-10-24 | 1993-01-05 | Tru-Tain, Inc. | Dental undercut application device and method of use |
JPH05269146A (en) * | 1992-03-23 | 1993-10-19 | Nikon Corp | Extracting method for margin line at the time of designing crown |
US5273429A (en) | 1992-04-03 | 1993-12-28 | Foster-Miller, Inc. | Method and apparatus for modeling a dental prosthesis |
US5384862A (en) * | 1992-05-29 | 1995-01-24 | Cimpiter Corporation | Radiographic image evaluation apparatus and method |
FR2693096B1 (en) | 1992-07-06 | 1994-09-23 | Deshayes Marie Josephe | Process for modeling the cranial and facial morphology from an x-ray of the skull. |
US5542842A (en) * | 1992-11-09 | 1996-08-06 | Ormco Corporation | Bracket placement jig assembly and method of placing orthodontic brackets on teeth therewith |
WO1994010935A1 (en) * | 1992-11-09 | 1994-05-26 | Ormco Corporation | Custom orthodontic appliance forming method and apparatus |
US5456600A (en) * | 1992-11-09 | 1995-10-10 | Ormco Corporation | Coordinated orthodontic archwires and method of making same |
US5336198A (en) * | 1993-02-01 | 1994-08-09 | Innova Development Corp. | Hypodermic syringe with needle retraction feature |
US5528735A (en) | 1993-03-23 | 1996-06-18 | Silicon Graphics Inc. | Method and apparatus for displaying data within a three-dimensional information landscape |
SE501411C2 (en) * | 1993-07-12 | 1995-02-06 | Nobelpharma Ab | Method and apparatus for three-dimensional body useful in the human body |
CN1054737C (en) | 1993-07-12 | 2000-07-26 | 欧索-泰公司 | A multi-racial preformed orthodontic treatment appliance |
SE501410C2 (en) | 1993-07-12 | 1995-02-06 | Nobelpharma Ab | Method and apparatus in connection with the manufacture of tooth, bridge, etc. |
NL9301308A (en) * | 1993-07-26 | 1995-02-16 | Willem Frederick Van Nifterick | Method of securing a dental prosthesis to implants in a patient's jawbone and using means thereof. |
US5382164A (en) | 1993-07-27 | 1995-01-17 | Stern; Sylvan S. | Method for making dental restorations and the dental restoration made thereby |
US5338198A (en) | 1993-11-22 | 1994-08-16 | Dacim Laboratory Inc. | Dental modeling simulator |
SE502427C2 (en) | 1994-02-18 | 1995-10-16 | Nobelpharma Ab | Method and device utilizing articulator and computer equipment |
US5621648A (en) | 1994-08-02 | 1997-04-15 | Crump; Craig D. | Apparatus and method for creating three-dimensional modeling data from an object |
US5880961A (en) * | 1994-08-02 | 1999-03-09 | Crump; Craig D. | Apparatus and method for creating three-dimensional modeling data from an object |
US5549478A (en) * | 1995-01-20 | 1996-08-27 | Mcguire; David | Universal trailer light locator |
US5549476A (en) | 1995-03-27 | 1996-08-27 | Stern; Sylvan S. | Method for making dental restorations and the dental restoration made thereby |
JP3672966B2 (en) | 1995-04-14 | 2005-07-20 | 株式会社ユニスン | Method and apparatus for creating dental prediction model |
US5645421A (en) | 1995-04-28 | 1997-07-08 | Great Lakes Orthodontics Ltd. | Orthodontic appliance debonder |
US5655653A (en) * | 1995-07-11 | 1997-08-12 | Minnesota Mining And Manufacturing Company | Pouch for orthodontic appliance |
JP3727660B2 (en) * | 1995-07-21 | 2005-12-14 | カデント・リミテッド | Method for obtaining a three-dimensional tooth image |
US5742700A (en) * | 1995-08-10 | 1998-04-21 | Logicon, Inc. | Quantitative dental caries detection system and method |
US5671136A (en) * | 1995-12-11 | 1997-09-23 | Willhoit, Jr.; Louis E. | Process for seismic imaging measurement and evaluation of three-dimensional subterranean common-impedance objects |
US5725376A (en) | 1996-02-27 | 1998-03-10 | Poirier; Michel | Methods for manufacturing a dental implant drill guide and a dental implant superstructure |
US6382975B1 (en) * | 1997-02-26 | 2002-05-07 | Technique D'usinage Sinlab Inc. | Manufacturing a dental implant drill guide and a dental implant superstructure |
US5799100A (en) * | 1996-06-03 | 1998-08-25 | University Of South Florida | Computer-assisted method and apparatus for analysis of x-ray images using wavelet transforms |
US5725378A (en) * | 1996-08-16 | 1998-03-10 | Wang; Hong-Chi | Artificial tooth assembly |
JPH1075963A (en) * | 1996-09-06 | 1998-03-24 | Nikon Corp | Method for designing dental prosthetic appliance model and medium recording program for executing the method |
AUPO280996A0 (en) * | 1996-10-04 | 1996-10-31 | Dentech Investments Pty Ltd | Creation and utilization of 3D teeth models |
JP2824424B2 (en) * | 1996-11-07 | 1998-11-11 | 株式会社エフ・エーラボ | 3D machining method |
US6217334B1 (en) * | 1997-01-28 | 2001-04-17 | Iris Development Corporation | Dental scanning method and apparatus |
US5879158A (en) * | 1997-05-20 | 1999-03-09 | Doyle; Walter A. | Orthodontic bracketing system and method therefor |
US5866058A (en) * | 1997-05-29 | 1999-02-02 | Stratasys Inc. | Method for rapid prototyping of solid models |
EP0990224B1 (en) * | 1997-06-17 | 2002-08-28 | BRITISH TELECOMMUNICATIONS public limited company | Generating an image of a three-dimensional object |
AU744385B2 (en) * | 1997-06-20 | 2002-02-21 | Align Technology, Inc. | Method and system for incrementally moving teeth |
US5975893A (en) * | 1997-06-20 | 1999-11-02 | Align Technology, Inc. | Method and system for incrementally moving teeth |
US6409504B1 (en) * | 1997-06-20 | 2002-06-25 | Align Technology, Inc. | Manipulating a digital dentition model to form models of individual dentition components |
US6183248B1 (en) * | 1998-11-30 | 2001-02-06 | Muhammad Chishti | System and method for releasing tooth positioning appliances |
US6152731A (en) * | 1997-09-22 | 2000-11-28 | 3M Innovative Properties Company | Methods for use in dental articulation |
US5934288A (en) * | 1998-04-23 | 1999-08-10 | General Electric Company | Method and apparatus for displaying 3D ultrasound data using three modes of operation |
US6089868A (en) * | 1998-05-14 | 2000-07-18 | 3M Innovative Properties Company | Selection of orthodontic appliances |
US6190165B1 (en) * | 1999-03-23 | 2001-02-20 | Ormco Corporation | Plastic orthodontic appliance having mechanical bonding base and method of making same |
US6350120B1 (en) * | 1999-11-30 | 2002-02-26 | Orametrix, Inc. | Method and apparatus for designing an orthodontic apparatus to provide tooth movement |
US6524101B1 (en) * | 2000-04-25 | 2003-02-25 | Align Technology, Inc. | System and methods for varying elastic modulus appliances |
- 1999
- 1999-05-14 US US09/311,941 patent/US6409504B1/en not_active Expired - Lifetime
- 1999-10-08 AU AU64229/99A patent/AU6422999A/en not_active Abandoned
- 1999-10-08 WO PCT/US1999/023532 patent/WO2000019935A1/en active Application Filing
- 1999-10-08 JP JP2000573298A patent/JP3630634B2/en not_active Expired - Lifetime
- 1999-10-08 EP EP99951884A patent/EP1119312B1/en not_active Expired - Lifetime
- 1999-10-08 CA CA002346256A patent/CA2346256A1/en not_active Abandoned
- 1999-12-28 TW TW088117637A patent/TW471960B/en not_active IP Right Cessation
2002
- 2002-03-12 US US10/099,310 patent/US7110594B2/en not_active Expired - Lifetime
- 2002-10-15 US US10/271,665 patent/US7123767B2/en not_active Expired - Lifetime
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10467746B2 (en) | 2013-03-11 | 2019-11-05 | Carestream Dental Technology Topco Limited | Method for producing teeth surface from x-ray scan of a negative impression |
Also Published As
Publication number | Publication date |
---|---|
US20020037489A1 (en) | 2002-03-28 |
EP1119312B1 (en) | 2008-06-25 |
WO2000019935A1 (en) | 2000-04-13 |
EP1119312A4 (en) | 2007-05-16 |
US20030039389A1 (en) | 2003-02-27 |
TW471960B (en) | 2002-01-11 |
US6409504B1 (en) | 2002-06-25 |
WO2000019935A9 (en) | 2000-08-31 |
US20020102009A1 (en) | 2002-08-01 |
JP2003521014A (en) | 2003-07-08 |
US7123767B2 (en) | 2006-10-17 |
JP3630634B2 (en) | 2005-03-16 |
AU6422999A (en) | 2000-04-26 |
EP1119312A1 (en) | 2001-08-01 |
US7110594B2 (en) | 2006-09-19 |
Similar Documents
Publication | Title |
---|---|
US7110594B2 (en) | Manipulating a digital dentition model to form models of individual dentition components |
US7063532B1 (en) | Subdividing a digital dentition model | |
US7247021B2 (en) | Subdividing a digital dentition model | |
EP2026285B1 (en) | Teeth viewing system | |
US7134874B2 (en) | Computer automated development of an orthodontic treatment plan and appliance | |
CN110087577B (en) | Method and system for removing braces from dentition grid | |
US9161824B2 (en) | Computer automated development of an orthodontic treatment plan and appliance | |
US7905725B2 (en) | Clinician review of an orthodontic treatment plan and appliance | |
US8135569B2 (en) | System and method for three-dimensional complete tooth modeling | |
JP2019524367A (en) | Method and system for hybrid mesh segmentation | |
KR20190037241A (en) | METHOD AND SYSTEM FOR REMOVING DIETARY METAL BRASCE |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FZDE | Discontinued | |