WO2001033501A1 - A knowledge-engineering protocol-suite - Google Patents
- Publication number
- WO2001033501A1 (PCT/US2000/028319)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- layer
- relationships
- sets
- modeling
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G05B19/41885—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by modeling, simulation of the manufacturing system
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31338—Design, flexible manufacturing cell design
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31339—From parameters, build processes, select control elements and their connection
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31353—Expert system to design cellular manufacturing systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32345—Of interconnection of cells, subsystems, distributed simulation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/33—Director till display
- G05B2219/33027—Artificial neural network controller
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/33—Director till display
- G05B2219/33079—Table with functional, weighting coefficients, function
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45232—CMP chemical mechanical polishing of wafer
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- the present invention generally relates to knowledge-engineering, to search-space organizational validation therein, and to protocol-suites for use therewith.
- the present invention relates to synergistically combining knowledge bases of disparate resolution data-sets, such as by actual or simulated integrating of lower resolution expert-experience based model-like templates to higher resolution empirical data-capture dense quantitative search-spaces.
- the present invention may also be understood to relate to knowledge-engineering embodiments where this synergetic combining is beneficially accomplished, such as in control systems, command control systems, command control communications systems, computational apparatus associated with the aforesaid, and to quantitative modeling and measuring tools used therewith. Equivalently, the present invention may be understood to relate to domains in which this synergetic combining is applied, such as design and fabrication of semiconductors, integrated circuits, medical treatment modalities, social engineering models, corporate management enterprise systems, transactional modifications for financial business practices, or substantially any other organized modality of practice or information; technological, bio-physical, mercantile, social, etc.
GENERAL BACKGROUND OF THE INVENTION
- data-sets of assorted characters relate to data-sets that differ with respect to data structure complexity, to data resolution, to data quantification, or to any combination thereof.
- Data structure complexity, data resolution, and data quantification may each relate to one-dimensional metrics or to multi-parametric characterizations.
- data structure complexity generally relates to the local interconnectivity between the data element being characterized and other data elements, and similarly to the global interconnectivity between any data-set that includes this data element and other data-sets.
- a root node in a binary tree locally has two child branches of its own, and similarly may globally have many relationships that relate it to root nodes of other data structures.
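The local/global distinction above can be sketched in code. This is an illustrative example, not taken from the patent; the class and field names are invented for the illustration.

```python
# A binary-tree node whose "local" complexity is its two child branches, and
# whose "global" complexity is the set of relationships linking it to root
# nodes of other data structures.

class TreeNode:
    def __init__(self, label):
        self.label = label
        self.left = None          # local child branch
        self.right = None         # local child branch
        self.global_links = []    # relationships to roots of other data-sets

    def local_complexity(self):
        """Number of local child branches."""
        return sum(child is not None for child in (self.left, self.right))

    def global_complexity(self):
        """Number of cross-data-set relationships."""
        return len(self.global_links)

root = TreeNode("workflow-root")
root.left = TreeNode("sub-process A")
root.right = TreeNode("sub-process B")
other_root = TreeNode("supplier data-set root")
root.global_links.append(other_root)

print(root.local_complexity())   # 2
print(root.global_complexity())  # 1
```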
- data resolution (hereinafter "resolution") generally relates to an embedded relational concept wherein data-sets and proper data subsets are identified. The subset has a higher resolution than the superset, in that detailed data is placed in the subset while overview data is placed in the superset.
- a superset may be a workflow overview organization, while subsets contain detailed charts of productivity measurements for each station in the workflow process.
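The workflow example above can be sketched as nested data: the superset holds overview data, while each subset holds higher-resolution detail. Station names and throughput figures below are fabricated for illustration.

```python
# Lower-resolution superset: the workflow overview.
# Higher-resolution subsets: detailed productivity measurements per station.

workflow_overview = {
    "station-1": {"throughput_per_hour": [97, 102, 99, 101]},
    "station-2": {"throughput_per_hour": [88, 91, 90, 87]},
}

def overview(superset):
    """Overview (low-resolution) data: station names only."""
    return sorted(superset)

def detail(superset, station):
    """Higher-resolution data held in the subset for one station."""
    return superset[station]["throughput_per_hour"]

print(overview(workflow_overview))             # ['station-1', 'station-2']
print(detail(workflow_overview, "station-2"))  # [88, 91, 90, 87]
```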
- quantification generally relates to a common sense notion of measurement precision.
- physical measurements typically carry some known precision, e.g. velocity in mm/sec or pH to four decimal places.
- in market surveys it is common to measure customer satisfaction using perhaps two to five select-only-one categories. While the average over a large number of surveyed customers may reach the same numerical precision as a physical measurement over a perhaps smaller number of samplings, common sense still says that the physical measurement is a more realistic quantification than the survey result.
- Database management and knowledge-engineering represent a class of computer-implemented strategies for addressing such problems.
- Database management relates to organizational tools for establishing and maintaining data-sets of assorted character. For example, Boyce-Codd normal forms address tradeoff issues of efficiency and redundancy in very large purpose-specific data banks. However, database management does not address how to best benefit from knowledge that is held in these data banks.
- Modeling may be generally described as a low complexity topological graph describing node relations, wherein each node corresponds to a data structure of empirical data. These nodes homogeneously relate to a lower resolution and to like quantification, while the associated data structures disparately relate to higher resolution, and to homogeneously like quantification within each data structure but not necessarily between data structures.
- the model is then used to simulate how the modeled system might react to a hypothetical perturbation of some of the empirical data.
- modeling is applied in situations where there are many variables having complex interactions, especially where some of these interactions must be described using non-linear equations or using random variation functional components. Modeling is also applied in situations where visualizations, of the variables and their interactions, are believed to contribute to understanding aspects of the system being modeled.
- the simplest models posit a pair-wise functional relationship between variables, such that each variable is a node of the topological graph and the pair-wise relationship describes the low complexity.
- the higher resolution data-sets then are used to describe an empirical manifold in the multi-dimensional space, as described by the pair-wise functionally orthogonal variables.
- Ordinary algebra, calculus, or statistics is then applied to simulate hypothetical empirical situations.
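The simplest model class described above can be sketched as a pair-wise functional relationship fitted to empirical data, then used to simulate a hypothetical perturbation with ordinary algebra and statistics. The variable names and data points below are fabricated for illustration.

```python
# Pair-wise model: fit y = a*x + b over an empirical manifold by ordinary
# least squares, then simulate a hypothetical (perturbed) input.

temperature = [20.0, 25.0, 30.0, 35.0, 40.0]   # empirical variable x
yield_pct   = [91.0, 92.5, 94.0, 95.5, 97.0]   # empirical variable y

n = len(temperature)
mean_x = sum(temperature) / n
mean_y = sum(yield_pct) / n
a = sum((x - mean_x) * (y - mean_y)
        for x, y in zip(temperature, yield_pct)) \
    / sum((x - mean_x) ** 2 for x in temperature)
b = mean_y - a * mean_x

def simulate(x_hypothetical):
    """Simulate the modeled system's response to a hypothetical input."""
    return a * x_hypothetical + b

# Extrapolate the fitted relationship to a perturbed temperature:
print(round(simulate(50.0), 1))  # 100.0
```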
- a more complex class of models posits multivariate functional relationship between assorted combinatorial groupings (n-tuples) of variables, wherein the aggregate of relationships join all of the variables into a single topological graph.
- higher resolution data-sets then are used to describe an empirical manifold for each relationship between the assorted combinatorial groupings of variables.
- Integrating a relational rule set with ordinary algebra, calculus, or statistics then allows hypothetical empirical situations to be simulated.
- a most complex class of models posits embedding of either or both of the above described models within nodes of the more complex class of models. The designing and integrating of relational rules then becomes a cumbersome task that depends on the level of genius of the design team, especially for computer implementations.
- the classes of hypothetical empirical situations to be simulated are generally limited by the structure of the design.
- expert systems shift the focus of the simulation from the empirical data manifolds to the designing and integrating of relational rules. Since it is presumed that the experts have subsumed the empirical manifolds, simulating hypothetical empirical situations at the manifold level is replaced by simulating a higher complexity topological graph describing node relations. Expert systems then become a most complex class of models that are critically limited by the structure of their design. Methodologically, the only way to improve an expert system is by implementing a longitudinal study of interviewing experts and integrating their changes of mind and mood.
- another class of modeling tools, called process control models, has been developed.
- the complexity of functional relationships between variables is grouped as a single node for each station in a process, and the topological graph of node relationships is according to the complexity of the process being modeled.
- each station in the process is internally amenable to any of the above modeling methodologies including expert systems, albeit as constrained by the inputs and outputs for each station.
- the overall process is likewise amenable to benefit from using any of the above modeling methodologies including expert systems, albeit as constrained by the topology of the process.
- process control focuses simulation and decision resources on a limited class of optimization hypotheses that are constrained by the topology of the process.
- Process control models are chosen in circumstances where the overall process is pragmatically optimized by locally optimizing the process at each station. Furthermore, for most applications, process control focuses simulation and decision resources on a limited class of optimization hypotheses that are constrained by using the simplest modeling techniques for each station. For this reason, statistical process control tools, neural network tools, and similar tools have become popular, in that they can be facilely applied to any station, as if that station were isolated from factors at other stations.
- SPC statistical process control
- gross statistically derived threshold-type limits are assigned individually for metrics associated with inputs or outputs at a station; wherein each of these metrics was considered in isolation, in conceptually similar ways to that used in the simplest class of modeling and simulation.
- an SPC station may assemble two primitive components C1 and C2 together to form an aggregated component C3.
- Each of these components has statistically defined acceptable tolerance limits for at least one measurable aspect of the component; C1 (min, max), C2 (min, max), and C3 (min, max).
- when out-of-specification C3 components are produced, the process control engineer first decides either to stop the process or to let the process continue. Typically, the process is stopped when the result is potentially catastrophic, such as in nuclear power plant SPC or in chemical synthesis of essential therapeutic drugs. Otherwise, the process control engineer may elect to let the process continue, even though the resultant out-of-specification C3 components may be worth much less than in-specification C3 components.
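The SPC scheme above can be sketched in code. The component names C1, C2, and C3 follow the example in the text; the numeric tolerance limits, and the assumption that C3's measured aspect is simply the sum of C1's and C2's, are fabricated for illustration.

```python
# Threshold-type tolerance limits assigned individually per metric, each
# considered in isolation, as in the SPC method described above.

LIMITS = {
    "C1": (9.8, 10.2),   # (min, max) for a measurable aspect of C1
    "C2": (4.9, 5.1),
    "C3": (14.8, 15.2),  # aggregated component C3
}

def in_spec(component, measurement):
    lo, hi = LIMITS[component]
    return lo <= measurement <= hi

def check_assembly(c1_value, c2_value):
    """Check each metric in isolation, as the SPC method does."""
    c3_value = c1_value + c2_value   # simplistic aggregation assumption
    return {
        "C1": in_spec("C1", c1_value),
        "C2": in_spec("C2", c2_value),
        "C3": in_spec("C3", c3_value),
    }

# Both primitives individually in spec, yet their aggregate is out of spec:
print(check_assembly(10.15, 5.08))  # {'C1': True, 'C2': True, 'C3': False}
```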
- in neural networks, high-resolution empirical data is accumulated and correlated with low-resolution decision data, substantially in order to define limits like those that were defined in the SPC method.
- Neural networks are used in situations where setting specification threshold limits for inputs is excessively complex, often because input variables being measured are highly interdependent, and simultaneously where setting threshold limits for outputs is well understood or at least easy to define.
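A toy sketch of the neural-network usage described above: high-resolution input measurements are correlated with a low-resolution (pass/fail) decision, instead of hand-setting input threshold limits. The single-neuron perceptron below, and its fabricated training data, are illustrative only; real applications use larger networks.

```python
# (pressure, temperature) -> 1 if the output was in spec, else 0.
samples = [((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.1, 0.4), 0),
           ((0.8, 0.9), 1), ((0.3, 0.2), 0), ((0.7, 0.95), 1)]

w = [0.0, 0.0]   # learned weights
bias = 0.0
rate = 0.1

def predict(x):
    s = w[0] * x[0] + w[1] * x[1] + bias
    return 1 if s > 0 else 0

# Perceptron learning rule: adjust weights on each misclassification.
for _ in range(100):
    for x, target in samples:
        err = target - predict(x)
        w[0] += rate * err * x[0]
        w[1] += rate * err * x[1]
        bias += rate * err

# The network has learned the decision limits from the correlated data:
print(all(predict(x) == t for x, t in samples))  # True
```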
- OSI Open Systems Interconnection
- ISO International Organization for Standardization
- the (OSI) reference model offers a seven-layer model structure defining the "ideal" network communication architecture. This model allows communication software to be broken into modules. Each layer provides services needed by the next layer in a way that frees the upper layer from concern about how these services are provided. This simplifies the design of each layer. With the emergence of open systems, the OSI model set rules that would allow different manufacturers to build products that would seamlessly interact. One of the key areas of importance is the interoperability of network technologies. As a result, this model was designed for the development of network protocols. Although no protocol has yet been developed using this model, it has come to be accepted as a standard way of describing and categorizing existing protocols.
- the ISO model defines seven layers, providing a logical grouping of the network functions. This model is good for teaching, and for planning the implementation of a computer network. Furthermore, dividing functionality in defined layers has the advantage that different parts of the network can be provided from different vendors and still work together. When describing the different layers, one starts from the bottom and proceeds up through the upper layers. This is because some of the functionality and problems of the higher layers result from properties of the lower layers.
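The seven-layer grouping described above can be sketched as a simple ordered list, read bottom-up as the text recommends. The layer names come from the sections that follow; the helper function is illustrative.

```python
# The seven OSI layers, ordered bottom-up.
OSI_LAYERS = [
    (1, "Physical"),
    (2, "Data-link"),
    (3, "Network"),
    (4, "Transport"),
    (5, "Session"),
    (6, "Presentation"),
    (7, "Application"),
]

def layer_name(number):
    return dict(OSI_LAYERS)[number]

# Each layer provides services needed by the layer above it:
for number, name in OSI_LAYERS:
    above = layer_name(number + 1) if number < 7 else "the user application"
    print(f"Layer {number} ({name}) serves {above}")
```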
- the network stack used in the Internet illustrates the fact that a network is usually not implemented exactly as described in the OSI model.
- One protocol stack in use is referred to as the TCP/IP (Transmission Control Protocol/Internet Protocol) stack.
- OSI Physical Layer (layer 1)
- This layer provides mechanical, electrical, functional, and procedural means to activate and deactivate physical transmission connections between data-links.
- This layer is concerned with the encoding and decoding of digital bits (1s and 0s) between network interfaces. It is typically a function of the interface card, rather than a software utility.
- OSI Data-link Layer (layer 2) deals with getting data packets on and off the wire, error detection and correction, and retransmission.
- This layer is generally broken into two sub-layers: The LLC (Logical Link Control) on the upper half, which does the error checking; and the MAC (Medium Access Control) on the lower half, which deals with getting the data on and off the wire.
- the data link layer is concerned with the transmission of packets from one network interface card to another, based on the physical address of the interface cards. Typical data link protocols are Token Ring and Ethernet.
- the device driver that comes with the network interface card typically enables these protocols. The device driver will be loaded in a specific order with the other protocol programs.
- the data link layer is a point-to-point protocol, much like an airline flight. If you have a direct flight, one plane can get you to your final destination. However, if you have a connecting flight, the plane gets you to your connection point, and another will get you from there to your destination, but it's up to you to make the connection yourself. Bridges operate at this layer.
- OSI Network Layer (layer 3) makes certain that a packet sent from one device to another actually gets there in a reasonable period of time. Routing and flow controls are performed here. This is the lowest layer of the OSI model that can remain unaware of the physical network. This layer provides a means of connectionless-mode transmission among transport entities. It makes transport entities independent of routing and relay considerations associated with connectionless-mode transmission.
- the network layer is concerned with the end-to-end delivery of messages. It operates on the basis of network addresses that are global in nature. Using the airline example, the network layer makes sure that all the connecting flights are made, so that you will actually arrive at your final destination.
- Network layer protocols include the IPX portion of the Netware IPX /SPX protocol and the IP portion of the TCP/IP protocol stack. Routers operate at this level.
- OSI Transport Layer (layer 4) makes sure the lower three layers are doing their job correctly, and provides a transparent, logical data stream between the end user and the network service being used. This is the lowest layer that provides local user services. This layer provides transparent data transfer between sessions and relieves them of concern about achieving reliable and cost-effective data transfer.
- SUPER-UX supports Transmission Control Protocol (TCP) and User Datagram Protocol (UDP).
- TCP Transmission Control Protocol
- UDP User Datagram Protocol
- the transport layer is concerned with issues such as the safe, intact arrival of messages. It makes the receiver aware that it is going to receive a message, ensures that it does get it, and can control the flow of the message if the receiver is getting it too fast, or re-transmit portions that arrive garbled. In our airline analogy, suppose you are flying your children to Grandma's house unaccompanied.
- the data link layer planes will make their flights. A small fee will ensure that network layer ground attendants get your kids from one flight to their connection.
- the transport layer will call Grandma to let her know they are coming and what their luggage looks like, and will expect a call from Grandma when she has them safe and sound.
- Typical transport layer protocols are the SPX portion of Netware SPX /IPX and the TCP portion of TCP/IP.
- OSI Session Layer (layer 5) is where communications between applications across a network are controlled. Testing for out-of-sequence packets and handling two-way communication are handled here.
- This layer provides the services needed by protocols in the presentation layer to organize and synchronize their dialogue and manage data exchange.
- the session layer is the layer that manages all the activities of the layers below it. It does this by establishing what is called a virtual connection. Essentially a virtual connection is established when a transmitting station exchanges messages with the receiving station, and tells it to set up and maintain a communications link. This is similar to what happens when you log into the network. Once you have logged in, a connection is maintained throughout the course of your user session until you log out, even though you may not be accessing the network continuously.
- OSI Presentation Layer (layer 6) is where differences in data representation are dealt with. For example, UNIX-style line endings (LF only) might be converted to MS-DOS style (CRLF), or EBCDIC to ASCII character sets.
- This layer manages the representation of the information that application layer protocols either communicate or reference during communication.
- the presentation layer's function is to establish a common data format between communicating nodes. It is responsible for formatting the data in a way the receiving node can understand. It may also perform data translation between different data formats. Examples of data format differences include byte ordering (should it be read from left to right, or vice versa) and character set (ASCII characters or IBM's EBCDIC character set), as well as differences in numeric representation.
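The presentation-layer concerns named above (byte ordering, line endings, character sets) can each be sketched with the Python standard library; cp500 is one of several EBCDIC code pages and stands in here for "IBM's EBCDIC character set".

```python
import struct

# Byte ordering: the same 32-bit integer packed big-endian ("left to right")
# and then read back little-endian gives a very different value.
raw = struct.pack(">I", 1)
print(struct.unpack("<I", raw)[0])  # 16777216

# Line endings: UNIX-style (LF) converted to MS-DOS style (CRLF).
unix_text = "line one\nline two\n"
dos_text = unix_text.replace("\n", "\r\n")
print(dos_text.count("\r\n"))       # 2

# Character set: ASCII text round-tripped through an EBCDIC code page.
ebcdic = "HELLO".encode("cp500")
print(ebcdic != b"HELLO")           # True: the byte values differ from ASCII
print(ebcdic.decode("cp500"))       # HELLO
```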
- OSI Application Layer (layer 7) is where the user applications software lies. Such issues as file access and transfer, virtual terminal emulation, inter-process communication, and the like are handled here. This layer serves as the window between corresponding application processes that are exchanging information.
- the application layer provides the user-accessible services of the network. These services include such things as network file transfer and management, remote job initiation and control, virtual terminal sessions with attached hosts, electronic mail services, and network directory services.
- Knights Technology creates software systems that allow engineers to collect, correlate, analyze, and report essential FAB data and to try to determine sources of semiconductor yield loss and wafer defects.
- Knights has several programs and an encyclopedic trouble-shooting guide.
- Knights gives its clients a very sophisticated but un-integrated tool kit. It can only leave the client dimly aware of the need for one smoothly running global system that would employ the various pieces of software readily available today, oversee production parameters, and make adjustments automatically as necessary.
- Knights' product also suffers from the built-in limitation of being defect oriented. It is true they might be successful in correcting random yield loss, but they will completely miss the cause of systematic yield loss.
- ObjectSpace produces Advanced Process Control (APC) software technology, a system that enables Run-to-Run control and fault detection applications in the factory. The client would be better served if such software were not limited to a one-variable adjustment.
- APC Advanced Process Control
- ObjectSpace leaves the industry in need of a software technology that has a more global view of the fabrication process and incorporates "wafer history " into a more dynamic, self-correcting system.
- Pattern Domain applies statistical measures of primary and secondary parameters or production data either as collected in real time during on line production or after completion of several production cycles.
- Their Pattern software detects and warns operators when abnormal process conditions occur.
- Pattern's analysis capabilities can enable engineers to scan large volumes of data with the hope of identifying exceptional regions requiring further analysis and to assist engineers in identifying causes.
- the aforementioned software does not provide possible solutions, nor does it automatically expand its scope of analysis from the data that it collates.
- Semy has a supervisory system and metrology tools that collect data from Advanced Run-to-Run Control closed loop systems. Based on the physical measurements derived from the metrology tools, user selected process parameters are automatically modified to keep the process centered.
- This application can be used to control a single step in the process using a feedback technique or it can automatically adjust a subsequent step based on the results of a previous step using a feed forward technique.
- the automatic adjustments are limited to the narrow parameters of the process recipes within specific limits established by the process engineer. This limits the trouble-shooting to a local target, without taking into consideration wafer history, and leaves the user with a static model that cannot implement past data analysis into the present model.
- HPL San Jose Gateway Plaza, 2033 Gateway Place; San Jose, Calif.
- HPL offers a package of four standard Failure Analysis Navigation and Visualization solutions.
- Their software provides integrated access to data that harbors yield loss cause information, product and process engineering and design data, in-line fabrication data, test data, and other data; with the ability to add new data without changing application software.
- User interactive modes of operation of their software include some systematic correlation of information, which "drills down" to root causes of failures and yield limiters. When there is an alarm, the engineers and design experts must come in and, using a mining tool, locate the defect and make the necessary adjustment. The system would be more effective if the model possessed a self-learning mode that would, in future alarm situations, be able to point to possible defect areas and suggest solutions, and in so doing would be able to save valuable time and increase yield levels.
- KLA-Tencor Corporation 160 Rio Robles; San Jose, Calif., U.S.A. (www.kla.com)
- KLA-Tencor manufactures a combination of hardware and software systems that have application in identifying and helping to reduce defects in integrated circuit fabrication.
- the KLA-Tencor yield management consultants must decide where and how much to sample. This methodology of FAB yield evaluation, paired with certain defect source analysis techniques, hopefully may lead to a rapid isolation of a defect source. Once the FAB parameters have been breached, the defect becomes more readily observable, measured, and located by the engineers, if they can correctly interpret the software analysis and recommendations. They are saddled with the same limitation and narrowness of view as Knights in that they are defect oriented.
- Triant Technologies Inc. 20 Townsite Road; Nanaimo, BC, Canada (www.triant.com)
- Triant Technologies Inc.'s focus is on improving overall equipment effectiveness by providing solutions that increase equipment uptime, minimize the use of test wafers, accrue useful data on process problem areas, and reduce scrap.
- the company's monitoring components range from a data collection system to a real-time multivariate modeling system. The collected data is stored for both on- and off-line visualization. Both gross and subtle equipment faults are detected by the employment of set point and model-based monitoring and alarming. These technologies reduce false alarms and thus allow the process engineers to determine the source and cause of the fault. If the problem is not in the fabrication equipment, then the speed in which the correction is made is no longer in the hands of Triant's technologies, but in the hands of other yield management and fault detection and analysis tools.
- Triant apparently believes that modeling tools have reached their limit in terms of effectiveness. Because of this, their tools employ models that are in the main defect driven, and manual rather than automatic in their operational mode. Thus, their tools are, in the end, static models lacking a self-learning ability, unable to suggest possible solutions once an actual process alarm has been rung.
- Yield Dynamics markets a suite of seven products in yield analysis; including data viewing, charting and analysis, wafer map data, data mining and advanced statistical tools. In the area of statistics they provide an option for multivariate analysis by adding a suite of advanced statistical algorithms to their product; allowing for the viewing of many parameters simultaneously; and, hopefully, uncovering relationships that standard univariate techniques are unable to capture due to their complicated interdependencies.
- the increase in analysis tools, and the accumulation of more and more data, cries out for an APC model that is more automated and dynamic; a self-learning empirical model that provides a more holistic view of the fabrication process, incorporates an ability to point to the possible causes of a detected deviation, and suggests solutions.
- the state of the art model is one that is capable of gathering larger and larger amounts of data, which the engineers are forced to dig their way through with their "mining tools" in search of possible solutions.
- sample illustrative examples in the General Background of the Invention section related to (firstly) combining consumers' perceptions of fruit and vegetable quality with the agronomists' data capture universe; and, (secondly) to combining demographic and actuarial databases with personal medical records and medical research data.
- additional typical illustrative examples may be categorized according to nine discrete classification regions in the matrix. These nine regions are designated according to the parameters "complexity" and "quantification", and therein (for each parameter) according to an initial subjective assessment categorization of High, Middle, or Low.
- a measure of directed-graph topology size, such as the total number of nodes in a model representation, and of ranges of branching ratios therein, such as inputs and outputs for a given node.
- a measure of nodes or of relationships in the directed-graph topology, characterizing whether variables therein are numerically measured to a predetermined degree of precision (High or Middle) or not (Low).
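The complexity measure described above (total node count and ranges of branching ratios) can be computed directly from a directed-graph model. The graph below, with fabrication-flavored node names, is fabricated for illustration.

```python
# A small directed-graph model: node -> list of nodes it feeds (its outputs).
model = {
    "design": ["mask", "recipe"],
    "mask":   ["litho"],
    "recipe": ["litho", "etch"],
    "litho":  ["etch"],
    "etch":   [],
}

def complexity_metrics(graph):
    """Total node count and (min, max) branching ratios: outputs and inputs."""
    out_degree = {node: len(targets) for node, targets in graph.items()}
    in_degree = {node: 0 for node in graph}
    for targets in graph.values():
        for t in targets:
            in_degree[t] += 1
    return {
        "total_nodes": len(graph),
        "out_degree_range": (min(out_degree.values()), max(out_degree.values())),
        "in_degree_range": (min(in_degree.values()), max(in_degree.values())),
    }

print(complexity_metrics(model))
# {'total_nodes': 5, 'out_degree_range': (0, 2), 'in_degree_range': (0, 2)}
```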
- the method of the present invention will be understood as having greater utility than methods of the prior art.
- This improved utility is because the present method operates over a broader domain of problems.
- this improved utility is because the present method allows problems to be defined according to a plurality of perspectives.
- this improved utility is because the present method provides a convenient protocol suite compartmentalization for conceptualizing relationships between the perspectives and relevant empirical data sets, and therein provides facile tools for understanding and developing these relationships.
- each of the forthcoming examples should be appreciated as representing a juxtaposition of perspectives with empirical data sets that heretofore demanded a large-scale custom-built software system.
- This example relates to a network of events starting from a discussion about an initial design concept and concluding when a packaged semiconductor from a batch of substantially identical semiconductors is quality categorized by an end of process testing system.
- This network of events includes inter-relations between hundreds of thousands of related steps, sub-steps and variables.
- this network of events includes upgrading CAD/CAM tools, apparatus in a fabrication facility (FAB), changing specifications to sub-contractors or suppliers, or even building a new fabrication facility.
- This example relates to a network of events starting from a discussion about an initial design concept, continues with the eventual testing of a newly manufactured vehicle, and concludes when all of the sales and maintenance reports are studied against the actual design and manufacture.
- this network of events includes inter-relations between hundreds of thousands of related steps, sub-steps and variables.
- this network of events includes upgrading CAD/CAM tools, apparatus in assembly plants, changing specifications to sub-contractors or suppliers, or even building a new assembly plant.
- the network of events of examples 1a or 1b should properly be represented as a model having a very large number of nodes wherein each node has complex inter-relationships (edges) with sometimes-large numbers of other nodes.
- many of the variables need to be recorded to high degrees of precision, both in the specifications and as measured at many stages in the fabrication (or manufacture) process.
- a very large network of events of this type is dealt with, in the prior art, by dividing the network into many substantially independent sub-networks (often long chains of nodes) and applying disparate tools to different sub-networks.
- the design discussions may be managed using project management time tables and documentation version control indexes.
- sections of the fabrication may be managed using statistical process control techniques and design of experiment paradigms.
- the final results may be aggregated using gross measures of batch yield, customer satisfaction, and corporate profitability.
- Embodiments of the method of the present invention allow this fragmented management of a single network to be modeled and considered both as a global symbiotic milieu model and as an ensemble of synergetic separable local sub-models.
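As a purely illustrative sketch (all class, sub-model, and node names are invented), the dual view described above, one global model over the whole network alongside separable local sub-models, might be organized as:

```python
# Illustrative sketch: the same network of events held both as one global
# ("symbiotic milieu") model and as an ensemble of separable local
# sub-models, as described above. Names are hypothetical.
class SubModel:
    def __init__(self, name, nodes):
        self.name, self.nodes = name, set(nodes)

class GlobalModel:
    def __init__(self, submodels):
        self.submodels = submodels
    def all_nodes(self):
        # The global view: the union of every sub-model's nodes.
        return set().union(*(s.nodes for s in self.submodels))
    def local(self, name):
        # The separable local view: one sub-model in isolation.
        return next(s for s in self.submodels if s.name == name)

design = SubModel("design", ["spec", "cad"])
fab = SubModel("fab", ["etch", "anneal"])
model = GlobalModel([design, fab])
print(sorted(model.all_nodes()))  # ['anneal', 'cad', 'etch', 'spec']
```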
- Embodiments of the method of the present invention provide modalities whereby an individual may be related to a plurality of data-sets that describe him, or his ancestors, or persons having a profile-resemblance to him, or groups to which at least one of the aforesaid belong. These embodiments may portray this individual in his relations to these other individuals and groups. Furthermore, these embodiments may then quantitatively posit and quantitatively test hypotheses about the individual or about groups of individuals. This may provide many new opportunities for superior results in managing health care for individuals, in managing public health policy, in improving actuarial table precision, etc.
- This example relates to the process of improving the health of a patient, regardless of whether the patient is sick or healthy.
- the first stage of this process includes combining subjective observations by a patient, objective observations by that patient's medical-service professionals, and quantitative clinical pathology metrics for the patient.
- the second stage of this process includes a definitive analysis of the patient's health, a prognosis for that patient, a strategy to improve the patient's health, and-often-a follow-up procedure that iterates another pairing of these first and second stages.
- embodiments of the method of the present invention may be configured to resemble an overly conservative physician who performs the pruning of the data graph topology and the applying of the disparate precision data.
- These method-enabled pruning and filtering operations should save the skilled physician considerable time when positing a patient specific health improvement strategy.
- these method-enabled operations should permit the physician to expend greater consideration on the actual object of the process, achieving a best possible health improvement for a specific patient.
- Example: This example relates to a classic operations research problem wherein all available substantially external data about a casualty is juxtaposed against available medical resources (facilities, supply, personnel, etc), in order to classify the casualty as destined for initial treatment: immediately, as soon as possible after those classified immediately are treated, or eventually.
- This problem essentially attempts to transform a topologically complex set of interrelated physiological observations into a simple decision result.
- Existing triage models, while attempting to consider these interrelated physiological observations "scientifically," usually focus on the actual decision that needs to be made, given the limited medical resources of the actual situation. Therefore, seemingly external considerations (such as medical-treatment success statistics, short-term and long-term costs, and expected resultant life "quality") often dominate in choosing a triage decision model.
- Embodiments of the method of the present invention may be used to integrate physiological data and actual casualty data with existing triage models in order to test if any of these models objectively deliver the results that they expect to deliver.
- embodiments of the present invention may be used to derive new triage models, test-simulate them, and compare them to known field-tested triage models.
- This example relates to the well-appreciated problem of comparatively evaluating and proportionately compensating employees.
- Embodiments of the method of the present invention may be applied to organize data about workflow, skills, evaluation, etc. Thereafter, the present method may be used to test the fleeting dogmatic axioms of management, to posit more individualized alternatives, and to quantitatively validate these alternatives.
- Tests and models remain less successful in the selection of suitable candidates for a particular employment slot than a competent manager's intuition and experience.
- Embodiments of the present invention may incorporate into a model the intuition and experience of many managers, and by doing so improve the results.
- embodiments of the present invention may validate presumptions about relationships between the multitude of variables suggested by these managers.
- Existing occupational tracking models, operating substantially independently of experienced managers, cannot conclusively prove that their respective evaluation methods do not emphasize factors that may be at cross-purposes to the apparent objective.
- embodiments of the present invention may be applied to validating accepted conjectures relating tracking to metrics, and may furthermore be applied to testing new propositional relationships.
- Macroeconomics relates to integrating data about what is produced, its costs, who consumes it, and what they pay for it. Weather, international conflicts, and their effects on the marketplace are all normally measured down to the penny at the end of any given period, and are thus readily classifiable as quantifiably high.
- Graphs and models representing scarcity, opportunity costs, production possibilities, supply and demand, output, national income, budgets, deficits, the national debt, inflation, unemployment, foreign exchange, balance of payments, and supply-side economics are some of the many aspects that go into making up a fiscal policy; and this fiscal policy is the static model that nations use to navigate the very dynamic inter-relations of international economics.
- Embodiments of the present invention may be applied to allow a greater understanding of day-to-day, or even hour-to-hour, changes, with suggested relationships pointing to their meaning and significance; for embodiments of the present invention may be multivariate and dynamic (self-learning), with a potential to validate independent values and to make the necessary adjustments in a more robust manner.
- Adjustments, feedback, and feed-forward are applicable modalities of intervention (a benefit deriving from the high quantify-ability of the present example), which may be applied in real time instead of at the end of set reporting periods, as in known classical cases.
- Robust modalities of embodiments of the present invention are beneficially distinguishable over known models, which operate with substantially preset parameters.
- embodiments of the present invention would not only allow for real time monitoring but also be able to make predictions and suggest possible financial adjustments or corrections; be they at the level of transaction policies of nations, investment management strategies of consortiums, or management of personal financial portfolios.
- this robust facility may be especially useful in today's electronic-transaction financial-market environment where volatile instruments, such as futures and derivatives, are more actively traded.
- Embodiments of the present invention would give the experimental physicist an added advantage of narrowing down a wide, but not overwhelming, array of variables. Furthermore, embodiments of the present invention would allow the experimental physicist to discover from previous experiment-models the common modalities and their hierarchy of importance through a dynamic feedback and feed-forward analysis processes.
- the main tool in an experiment is an angle-resolved photoelectron spectrometer in the UV range
- detailed and highly precise information can be obtained about valence states in a volume, surface states, resonances, or chemical shifts of core levels with this device.
- embodiments of the present invention utilizing the information of past models and the precise results of recent experiments, would first remove irrelevant variables and add previously neglected ones based on the self-learning enablement and dynamism of newly generated model linkages. This enhancement would lead to a greater ability to predict, with ever-increasing accuracy, the results of future experiments, and would be able to eliminate unnecessary ones and in so doing, would save valuable time and considerable funds.
- a topological graph representation of Psychiatric Behavior Intervention models is characterized by being of intermediate complexity, as compared to the aforementioned examples.
- the metrics used in this field are usually of intermediate quantification.
- Embodiments of the present invention would enable a model to self-correct and remain relevant in its interpretation of the social patterns that are constantly evolving, as well as allowing for greater individualization.
- the intuitive fear of aggressive behavior and violence on the part of the medical care community might give more weight to certain variables than they deserve, and at the same time overlook others that play a greater role than previously realized.
- COCE Craik-O'Brien-Cornsweet effect
- luminance profile i.e., in distributions of absolute measurements of reflected light
- brightness i.e., in the subjective perception of lightness and darkness
- the two regions are identical in terms of the objective property of luminance profile, but one looks darker than the other does.
- the difference in brightness between rectangles depends upon the difference in luminance at the borders.
- Effects such as COCE present problems which it is the business of theoretical work in vision to solve. Any viable model of the human visual system is constrained in the sense that its output should correspond to the percept when its input corresponds to the stimulus.
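As a rough numeric sketch of the stimulus just described (all values invented; this is not a vision model), the COCE luminance profile, identical in the interior of both regions but with an opposing cusp at the border, can be generated as:

```python
# Illustrative sketch of a Craik-O'Brien-Cornsweet luminance profile: two
# regions with identical interior luminance, separated by a sharp opposing
# gradient ("cusp") at the border. Away from the border the objective
# profiles match, yet the two regions are perceived as differing in
# brightness. Parameter values are invented.
def coce_profile(n=10, base=0.5, cusp=0.2):
    left = [base] * n
    right = [base] * n
    left[-1] = base + cusp   # light edge of the cusp, just left of border
    right[0] = base - cusp   # dark edge of the cusp, just right of border
    return left + right

p = coce_profile()
print(p[0] == p[-1])  # True: far from the border the two regions match
```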
- embodiments of the present invention would be able to expand the scope of the experiment by bringing in and comparing variables from many experiments on perception, based on different social and cultural groupings, and how these particular brightness curves compare to those describing luminance in the original experiment. There might, of course, be missing variables in the measuring of luminance as well, which might be the hidden factor for why one rectangle appears brighter when there is no apparent measurable difference.
- a model enhanced by an embodiment of the present invention could take into account that the measure of particular luminance may possess undetected differences. For how is it possible that the eyes and mind of one person, being so different from those of another, perceive the same difference in brightness, given that brightness is a psychological effect and the psychology of one person is so different from that of another?
- Marketers might not know how exactly to sell the clients a product, but they can sell advertising just by claiming they have the "know how.”
- Embodiments of the present invention would allow for a better, long-term, quantitative view of past and projected modalities of advertising. For example, to quantitatively test a conjecture stating that, today, Internet advertising may have more in common with old-fashioned billboard highway advertising than with more contemporary television advertising. Alternatively, embodiments of the present invention could quantify the validity of a conjecture stating that, fifty years ago, success was a function of what type of product was being sold and of what type of consumer was going to buy it. Embodiments of the present invention could have both a feedback as well as a feed-forward adjustment that could take in new information, such as economic changes, and relate them to how they are affecting consumer habits; thereby improving the success of ongoing advertising campaigns.
- This may be represented as an intermediate complexity topology graph (process diagram), since not only are there multiple media advertising conduits, each directed to an audience having a different distribution of personal profiles, and each having a different rate structure; but each media has an innately different quotient of effectiveness within each sub-population.
- quanta for effectiveness are generally only accessible through limited sampling studies. For example, some small percentage of subscribers to a service, or purchasers of a product, might agree to provide some response data about how they came to buy the service or product, and therein might remember which advertisement helped them to decide; if an advertisement helped them to decide at all.
- Embodiments of the present invention may be useful to help integrate diverse aspects of data collection and to validate portfolio management models therewith.
- a topological graph of a combustion engine's process is very nearly a non-branching chain of events. Measured fuel and measured air are mixed and ignited under controlled conditions, so that the resultant rapid oxidation events are converted into mechanical energy and into exhaust. Nevertheless, every measurement and virtually every aspect of the controlled conditions may be captured as data to arbitrarily high precision.
- TEC Total Engine Control
- the TEC-I series of engine control units consists of a Direct Fire Unit, which holds the coils, and a TEC controller, which holds the injector drive circuits and control logic. This configuration represents an improvement for extremely powerful engines with multiple injectors at each cylinder.
- these hand built and custom configured systems are purpose built special order items. Using a known model dictates that specific engine input and condition specifications will deliver a certain output. When these expectations are not met, the engineers must investigate where the fault lies. What is needed is a model that measures the expected output of standard variables in the combustion system against its actual output, rather than a model that only predicts overall system output.
- Embodiments of the present invention would be able to establish a more global model that would help to increase the optimization of a whole combustion system. At that point, more esoteric and overlooked variables could begin to be added to the present invention model, in its empirical self-learning capacity. Thus, the present invention would allow for an improvement of individual engine controllers, which today is only attempted with labor-intensive human intervention; almost on the level of the engine craftsman.
- Cow Life Cycle: One might describe a Cow Life Cycle example equally well in terms similar to any other management or process description.
- a topological graph of a cow life cycle is likewise very nearly a non-branching chain of events.
- Measured fodder and measured genetic makeup are integrated and developed under controlled conditions, resulting in milk or meat, or leather or offspring, or other byproducts. Nevertheless, every measurement and virtually every aspect of the controlled conditions may be captured as data to arbitrarily high precision. Accordingly, embodiments of the present invention are useful to model aspects of the cow life cycle: for individual cows, for genetically alike cows, for individual breeds of cows, for conjectural mixed breeds of cows, etc. Furthermore, these models may be developed to manage real-time aspects of the cow's management.
- any sub-set of a larger process-control problem relating to an Assembly or Service Process would benefit from improved process control.
- these examples include modeling small portions of those described in examples (1a) and (1b) above. When the "small portions" reach the resolution of individual items of manufacture or fabrication equipment, then this example begins to resemble classical Statistical Process Control (SPC) methods or the like.
- SPC Statistical Process Control
- Embodiments of the present invention may be applied to improve the performance of individual units or aggregates of units; all of which are subsets of the larger respective process. Nevertheless, embodiments of the present invention integrate the subsets in all their detail and not only according to external metrics of their respective performance.
- This example relates to any nexus of complex processes that is simplified into a low topological complexity representation, with an intermediate scale metrification of associated variables.
- Embodiments of the present invention can validate the relative significance of elements of the respective representation. This transforms any initial recommended strategy for success into a qualified portrayal of the actual weighting of significance that empirically successful strategies actually employ. Simply stated, "wouldn't it be nice to know" which pieces of common advice actually yield significant results.
- the present example relates to gathering, analyzing, and beneficially using the results from consumer-satisfaction surveys. Since consumers will only answer short, simplistic surveys, wherein there is usually much room for misunderstanding, and since no better way has been discovered to glean a description of the consumers' actual impressions, consumer research is restricted to understanding a system that is limited to low-resolution quality process maps (topological graphs), having a quantitative basis which is likewise of lowest-order metrics. Since these surveys form the basis of countless corporate decisions, it would be beneficial to improve the quality of conclusions that can be derived from such systems. Embodiments of the present invention may prove to be of great value in validating models of such systems and in improving these systems, to become better instruments for accomplishing their intended purposes.
- (9b) Complexity Measure Low & Quantify-Ability Measure Low — Voting Preference — Example:
- the present example relates to a class of lowest complexity with lowest quantify-ability, because it is commonly never known how any individual actually voted nor which configuration of factors actually determined his actual vote. Nevertheless, it is common practice to spend lots of money to influence the voting habits of the populace.
- obscure modalities of electioneering, e.g., planting trees in public areas, or providing free car tune-ups
- the present invention may be applied to developing and analyzing such models, even if they are focused to test peculiar speculations.
- the knowledge-engineering protocol-suite of the present invention provides the context to define, develop, integrate, and test such tools. More specifically, the present invention provides such tools embodied as methods, systems, and apparatus for search-space organizational validation; and as other appurtenances developed for use with the knowledge-engineering protocol-suite.
- Layer 1 of the protocol suite of the present invention A physical layer for interfacing with apparatus.
- Layer 2 of the protocol suite of the present invention A data-link layer for facilitating data-communications within any of these Layers 1-7, or between any plurality of these Layers 1-7.
- Layer 3 of the protocol suite of the present invention A network layer for maintaining transactional access to data ensembles (e.g. an index of data related to empirical contents from Layer 1 ).
- Layer 4 of the protocol suite of the present invention A transport layer for organizing and maintaining token correspondences and adjacency lists, wherein are represented network layer relationships between the data sets or between elements in the data sets (e.g. a tabular organization for maintaining relationships between indexed data or data categories in Layer 3).
- Layer 5 of the protocol suite of the present invention A session layer for validating the transport layer represented relationships (e.g. a convergence for checking elements of model relationships from Layer 4 with indexed empirical data from Layer 3), and for simulating alternative transport layer relationships (from Layers 6 or 7).
- Layer 6 of the protocol suite of the present invention A presentation layer for designing and executing experimental session layer simulations, evaluations thereof, and modifications thereto (e.g. a propositional logic formation region wherein alternative or supplemental relationships to those maintained in Layer 4 may be articulated and passed to Layer 5 for testing against indexed empirical data from Layer 3).
- Layer 7 of the protocol suite of the present invention An application layer for prioritizing n-tuple strategy dynamics of presentation layer transactions, (e.g. a combinatoric set formation region wherein the entire collection of all possible Layer 4 permutations are considered with a specific view to considering what the most productive order for their evaluation might be; and this order is used to pass substantially one at a time to Layer 6 on an as available for testing basis).
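The seven layers enumerated above can be summarized in a minimal structural sketch; the Python enumeration below is only an illustration of the layering (names taken from the text), not part of the protocol-suite itself:

```python
# Minimal structural sketch of the seven-layer knowledge-engineering
# reference model described above. The class is illustrative only; the
# layer names and numbers follow the text.
from enum import IntEnum

class KELayer(IntEnum):
    PHYSICAL = 1      # interfacing with apparatus
    DATA_LINK = 2     # data-communications among Layers 1-7
    NETWORK = 3       # transactional access to data ensembles
    TRANSPORT = 4     # token correspondences and adjacency lists
    SESSION = 5       # validating / simulating transport relationships
    PRESENTATION = 6  # designing experimental session-layer simulations
    APPLICATION = 7   # prioritizing n-tuple strategy dynamics

print([layer.name for layer in KELayer])
```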
- Empirical Controller A layer-based embodiment for controlling the underlying Layer 1 interconnected devices and apparatus, including sensors, actuators, etc.
- Knowledge Tree A composite topological graph constructed from Layer 4 contents (as input via Layer 1). These contents generally include a process map (either derived from a single source or composed from fragmentary process maps), and expert suggested relationships between "nodes” (herein called interconnection cells), in the topological graph; such as "causal" relationships suggested by experts, or relationships proposed in Layers 6 or 7 and subsequently validated against empirical data; also used to describe a graphical presentation of same.
- Interconnection Cell a node in the topological graph "Knowledge Tree” wherein is represented inputs and outputs from the process map and metrics and relationships suggested by experts or Layers 6 or 7.
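A minimal data-structure sketch of a Knowledge Tree of Interconnection Cells, as defined above, might look as follows (all cell names, inputs, outputs, and relationships are invented for illustration):

```python
# Illustrative sketch: the Knowledge Tree as a directed graph whose nodes
# ("Interconnection Cells") record process inputs and outputs, and whose
# edges record expert-suggested (e.g. causal) relationships.
class InterconnectionCell:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

class KnowledgeTree:
    def __init__(self):
        self.cells, self.edges = {}, []
    def add_cell(self, cell):
        self.cells[cell.name] = cell
    def suggest(self, cause, effect):
        # An expert-suggested relationship between two cells.
        self.edges.append((cause, effect))
    def downstream(self, name):
        return [e for c, e in self.edges if c == name]

kt = KnowledgeTree()
kt.add_cell(InterconnectionCell("etch", ["voltage"], ["line_width"]))
kt.add_cell(InterconnectionCell("test", ["line_width"], ["yield"]))
kt.suggest("etch", "test")
print(kt.downstream("etch"))  # ['test']
```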
- The Empirical Controller (E-C) concept consists of several components, described in the sequel.
- the qualitative component of the invention that integrates physical knowledge and logical understandings into a homogenetic knowledge structure is called the Knowledge Tree (K-T)
- K-T The Knowledge Tree is displayed graphically as a directed network with nodes, which are called Interconnection Cells. These cells express the local relationship between input and output process parameter measurements.
- the POEM algorithmic approach is applied to obtain (from process measurement data) the precise quantitative relationship at each cell.
- Each Interconnection Cell is converted to an Interconnection Model or Model, in short.
- the Model contains the quantitative relationships between input and output.
- the Knowledge Tree together with this quantitative layer yields the Empirical Model.
- the Empirical Model serves as a multivariable characterization of the process being described, and can be used to predict and control process behavior.
- ADM operates and analyzes the Empirical Model to determine solutions that best meet the specified objectives and constraints.
- the entire three-tier structure, consisting of the ADM, the Empirical Model, and the Knowledge Tree, is referred to as the Empirical Controller.
- the Empirical Controller is a generic learning and thinking system, which performs Empirical Control.
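As a highly simplified sketch of this three-tier arrangement, the fragment below converts one Interconnection Cell into a quantitative Interconnection Model and collects such models into an Empirical Model. A trivial least-squares fit stands in for the POEM algorithmic approach, and all data are invented:

```python
# Illustrative sketch only: a trivial least-squares fit stands in for the
# POEM approach that converts an Interconnection Cell (qualitative
# input->output relationship) into an Interconnection Model (quantitative
# relationship); the ensemble of fitted cells forms the Empirical Model.
def fit_cell(xs, ys):
    """Least-squares slope/intercept for one cell's input->output data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

class EmpiricalModel:
    def __init__(self):
        self.models = {}  # cell name -> (slope, intercept)
    def learn_cell(self, name, xs, ys):
        self.models[name] = fit_cell(xs, ys)
    def predict(self, name, x):
        a, b = self.models[name]
        return a * x + b

em = EmpiricalModel()
# Invented process-measurement data for one cell.
em.learn_cell("etch", xs=[1.0, 2.0, 3.0], ys=[2.1, 4.0, 6.1])
print(round(em.predict("etch", 4.0), 2))  # 8.07
```

An Automated Decision-Maker would then operate over such an Empirical Model to find settings that best meet specified objectives and constraints.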
- Eden A conceptual cluster of EVEs; equivalently a meta-interconnection cell representing a contiguous Knowledge sub-Tree contained therein.
- POEM a general methodology used for validating individual or contiguous Interconnection Cells in the Knowledge Tree; for example, using a known modeling tool for the specific device or apparatus represented by the Interconnection Cell, or using SPC, IPC, APC, etc.
- the product that contains the E-C technology and is the realization of the Automated Decision-Maker is referred to as the Adam (Automated Decision-Maker).
- Adam serves as a global Decision-Maker tool, encompassing the entire process.
- the E-C technology when embodied in a product and used for intermediate process control of work groups or equipment clusters is referred to as the Eden (Empirical Decision Enabling Network).
- the E-C technology when embodied in a product and used for troubleshooting, optimization and control at the processing equipment or measuring tool level is referred to as the Eve (Equipment Variable Evaluator).
- the present invention relates to a knowledge-engineering protocol-suite for facilitating open systems interconnection transactions in a multi-layer knowledge-engineering reference model substantially having
- Layer 1 a physical layer for interfacing with apparatus
- Layer 2 a data-link layer for facilitating data-communications within any of these Layers 1-7, or between any plurality of these Layers 1-7;
- Layer 3 a network layer for maintaining transactional access to data ensembles
- Layer 4 a transport layer for organizing and maintaining token correspondences and adjacency lists wherein are represented network layer relationships between the data sets or between elements in the data sets;
- Layer 5 a session layer for validating the transport layer represented relationships and for simulating alternative transport layer relationships
- Layer 6 a presentation layer for designing and executing experimental session layer simulations, evaluations thereof and modifications thereto;
- Layer 7 an application layer for prioritizing n-tuple strategy dynamics of presentation layer transactions; wherein the knowledge-engineering protocol-suite includes:
- the present invention relates to programs for facilitating open systems interconnection transactions in the frame of reference of a multi-layer knowledge-engineering reference model using a knowledge-engineering protocol-suite.
- these programs are embodied for use in a structured system of data-logic processors (e.g. knowledge-engineering workstation, computer, process-management computer).
- these programs are embodied for use in a distributed asynchronous system of process-modeling computers.
- the structured system functionally is a substantially hierarchical (graph directed) organization of the same method embodied programs as those of the distributed system.
- actual method embodied programs conforming to the knowledge-engineering suite may be embodied differently for each system variety.
- the knowledge-engineering protocol suite of the present invention provides a conceptual organization that is built on the same framework as the familiar OSI model, and is facilely applied to disparate applications; such as those that differ greatly with respect to
- the embodied programs of the present invention generally include search-space organizational validation for such disparate applications, and also other higher knowledge-engineering functions.
- programs provide a synergistic combining of knowledge bases of disparate-resolution data-sets, such as by actual or simulated integration of lower-resolution, expert-experience-based, model-like templates with higher-resolution, data-capture-dense quantitative empirical search-spaces.
- the knowledge-engineering protocol suite of the present invention may be applied to disparate applications; such as manufacturing systems, control systems, command control systems, or command control communications systems. Furthermore, the suite may be applied to computational apparatus associated with these applications, and to the task of providing appropriate quantitative modeling and measuring tools for these applications.
- the present invention also relates to a search-space organizational validation method substantially complying with a knowledge-engineering protocol-suite, the method including the steps of:
- correlated empirical data-sets may be derived from sensors of layer 1, conveyed via a communications conduit facility of layer 2, and stored in a memory media of layer 3. More specifically, correlated empirical data-sets generally include raw input, process, or output data from a specific machine or a specific organism, or from a plurality of specific machines or a plurality of specific organisms, or from a conceptual characterization thereof, or from a simulation of a model relating thereto.
- a specific machine may be an identified etching machine, or an identified annealing oven in a semiconductor fabrication facility, or an identified locomotive engine, or an identified component or sub-system of a specific machine.
- a specific organism may be an identified individual person, or an identified dairy cow or racehorse, or an identified strain of genetically substantially identical bacteria, or an identified organ or part of an organ or specific part of any of the aforesaid specific organisms.
- a plurality of specific machines may be a stage in an identified industrial process facility wherein more than one functionally identical specific machines divide a portion of a common input into a parallel process and thereafter into a common output.
- a semiconductor fabrication facility may divide workflow at a specific stage among a group of annealing ovens, presumably because annealing is a time-consuming process while other stages of the fabrication are more "instant".
- This type of "plurality of specific machines" generally occurs at any stage in an industrial process that would otherwise impose a delay on the entire process, unless such parallel processing is precluded by an excessively economically costly machine.
- a plurality of specific organisms may be a human family, a herd of dairy cows, or even a fermentation vat.
- a conceptual characterization thereof may be a household, a grocery store in a chain of grocery stores, an elementary school, or a class therein.
- a simulation of a model relating thereto may be from an annealing-oven modeling, from a line-width etching modeling, from a modeling of public health and epidemic factorizations therein, from a dairy-herd management modeling, from a social modeling of parameters in elementary education, etc.
- interrelated nodes of graph-directed, expertise-suggested, data-set relationships generally relate to quantitative or qualitative "axioms," which are either accepted as true in a specific domain of applied knowledge, or are postulated by at least one "expert" according to his long-felt suspicions.
- Such axioms may include: "Etching line width is primarily dependent on certain specific voltage settings of the etching station," or "An individual cow's milk production is dependent on three specific environmental factors, and four specific nutritional factors," or "The fuel efficiency of a locomotive engine seems to degrade when there has been a lot of up-hill acceleration or a lot of down-hill braking.”
- These expertise-suggested data-set relationships are stored on a memory media of layer 3; however, these relationships are embodied into a topological graph using facilities in layer 4 of the present protocol-suite.
- a predetermined measure of inclusion generally relates to a logical intersection between the first plurality of empirical data-sets (associated with layer 1 of the present protocol), and the second plurality of expertise-suggested relationships (associated with layer 3 of the present protocol).
- a predetermined inclusion specifically relates to a topological sub-graph of relationships that can be validated by virtue of having a sufficient pool of empirical records, which can falsify and test each relationship in the sub-graph, according to its respective observed empirical truth.
- predetermined in this context relates to a sufficiency for validating according to some statistical metric of certification (e.g., within a first or second standard deviation of the average), or some blanket assertion (e.g., this can't happen, this always happens, or this usually acts in some prescribed fashion).
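As an illustrative sketch only (the function names, the sample pool, and the choice of k are assumptions, not part of the specification), the two certification styles named above — a k-sigma statistical metric and a blanket assertion — might be tested as follows:

```python
from statistics import mean, stdev

def certify_k_sigma(records, value, k=2):
    """True if `value` lies within k standard deviations of the record average."""
    if len(records) < 2:
        return False  # not a sufficient pool of empirical records
    mu, sigma = mean(records), stdev(records)
    return abs(value - mu) <= k * sigma

def certify_blanket(records, predicate):
    """Blanket assertion: 'this always happens' over the empirical pool."""
    return all(predicate(r) for r in records)

pool = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]
print(certify_k_sigma(pool, 10.05))            # within 2 sigma of the average
print(certify_blanket(pool, lambda r: r > 0))  # 'this always happens'
```

A relationship certified by neither style would lack the sufficiency that "predetermined" denotes here.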
- the data-set resolution of particulars in the first plurality being greater than or equal to that of particulars in the second plurality relates to a situation where the topological complexity of the expertise-suggested relationships is no greater than that of the supporting data.
- the most common acceptable modeling situation describes a small number of inter-related variables that can be tested against a large collection of empirical data.
- a less coherent class of modeling exists when each individual instantiation must be tested against substantially the entire empirical data collection, and this occurs when trying to diagnose and treat an individual patient or when trying to tune an individual racing car, etc.
- to describe a model that captures more relationships than there are n-tuples of empirical data-sets is outside the scope of the present invention.
- a vantage of a presumption of validity relates to using empirical data in its current form. While many appurtenances may be applied to filter or normalize data, the present invention does not perform these operations.
- the present invention may be used to characterize an empirical data-set as being statistically distant from other like data-sets.
- the present invention may also be used to characterize an individual data instance within a data-set as being statistically distant from other like data-instances. However, these characterizations are of secondary importance in the context of the objects of the present invention.
- the operational postulate of the present invention is that a model, as composed from individual or collective expertise, may be validated and improved, when considered in juxtaposition to empirical data.
- a data anomaly is an object of study and analysis, not a target for correction.
- the present invention has an object of finding out what relations characterize this empirical anomaly. It may be that this anomaly is a false representation of the empirical reality. Alternatively, it may be that this anomaly is a statistically rare representative instance of some combination of relationships that might contribute to broadening understanding in the context of a system under study. It is a salient feature of the present invention to disclose and investigate such rare representative instances. Therefore, it would be at cross-purposes to the present invention to automatically filter out the very instances that might be most productive to improving knowledge of a system under study.
- a validity-metric relates to a synthetic scale assignment that is derived when a relationship, or aggregation of relationships, is quantitatively evaluated, according to the empirical data.
- the metric may reflect a reality that an expertise-suggested relationship is completely supported by the data; or that the relationship only accounts for, or correlates with, some measurable part of the data; or that the relationship is not supported by the data; or even that the data supports a relationship contrary to that suggested by an "expert."
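A minimal sketch of such a synthetic scale, under the assumption of a monotone expertise-suggested relationship (all identifiers are hypothetical): the score is +1 when the data fully supports the suggested direction, 0 when it gives no support, and negative when the data supports the contrary relationship.

```python
def validity_metric(pairs, suggested_sign=+1):
    """Score an expertise-suggested monotone relationship against (x, y) data."""
    agree = disagree = 0
    for (x1, y1), (x2, y2) in zip(pairs, pairs[1:]):
        dx, dy = x2 - x1, y2 - y1
        if dx == 0 or dy == 0:
            continue
        # does the observed direction match the expert's suggested direction?
        if (dx * dy > 0) == (suggested_sign > 0):
            agree += 1
        else:
            disagree += 1
    total = agree + disagree
    return 0.0 if total == 0 else (agree - disagree) / total

data = [(1, 2.0), (2, 2.9), (3, 4.1), (4, 5.0)]
print(validity_metric(data, suggested_sign=+1))  # fully supported: 1.0
print(validity_metric(data, suggested_sign=-1))  # contrary suggestion: -1.0
```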
- n-tuple relates to a "multiple of n" ("n" being two or more).
- an n-tuple relates to one or more relations, between two or more nodes, in a directed graph representation for the expertise-suggested data-set relationships.
- the method steps of the search-space organizational validation method relate to:
- the present invention further relates to a program storage device readable by a logic-machine, tangibly embodying a program of instructions, executable by the logic-machine (e.g., a data-logic processor or a process-modeling computer), to perform method steps for validating a search-space organization, substantially complying with a knowledge-engineering protocol-suite, these method steps including:
- the present invention relates to a process-modeling computer for use in a distributed asynchronous system of process-modeling computers, substantially according to a knowledge-engineering protocol-suite, the process-modeling computer logically having three active-units, wherein each active-unit has at least one virtual computer processor associated therewith, and wherein the active-units are capable of mutual data-communications interaction, and wherein the process-modeling computer includes:
- the present invention relates to a distributed asynchronous system of process-modeling computers substantially complying with a knowledge-engineering protocol-suite, the system of process-modeling computers including: 1st) at least one process-modeling terminal wherein at least one of the terminals includes a program storage device as described (above);
- the present invention relates to a knowledge-engineering protocol-suite for facilitating open systems interconnection transactions in a seven-layer reference model.
- This knowledge-engineering protocol-suite includes: either firstly a process modeling computer for relating layers 1-3, secondly a search-space organizational validation method for relating layers 3-5, and thirdly a knowledge-engineering work station for relating layers 5-7; or equivalently a distributed asynchronous system of process modeling computers for relating layers 1-7.
- the seven-layer reference model for facilitating open systems interconnection transactions is defined in the context of the present invention as having: a seven layer knowledge-engineering protocol-suite wherein:
- Layer 1 relates to embodiments of a physical layer from which data about physical input, process, or output attributes, is collected or targeted.
- the physical layer may be tied to a physical machine such as a process-controlled machine.
- the physical layer may be tied to a data input terminal through which input, process, or output data may be collected.
- the physical layer may be tied to a data output terminal (or a printer) through which input, process, or output transactions may be targeted, reports generated, work-orders authorized, process-control parameters modified, etc.
- the physical layer is tied to an accessible data storage media.
- Layer 2 relates to embodiments of a data-link layer for data-communications (including, for example, ISO OSI model type data-communications per se, internet, intranet, WAN, LAN, and DBMS).
- Layer 3 relates to embodiments of a data-set network layer having therein the first plurality data-sets, the second plurality data sets, and other data banks, which may yield content that can be manually or automatically transformed into the aforesaid pluralities.
- Layer 4 relates to a transport layer wherein token correspondence (adjacency list) constructions are mapped within each plurality and between sets of the pluralities.
- Layer 5 relates to a session layer wherein validation or simulation of the layer 4 mappings may be run on layer 3 data, or as an on-the-fly control system on layer 1 data.
- Layer 6 relates to a presentation layer wherein design of experiments may be articulated for specific sessions.
- Layer 7 relates to an application layer wherein a broader construction of experimental strategy may be articulated such as an n-tuple strategy. Furthermore, in the context of more preferred scale embodiments of the present invention, the knowledge-engineering protocol-suite pertains to:
- Figure 1 illustrates systems complying with a knowledge-engineering protocol-suite;
- Figure 2 illustrates apparatus included in the systems of Figure 1;
- Figure 3 illustrates optional layer 2 protocols for use in the systems of Figure 1;
- Figure 4 illustrates useful data-ensembles in the context of the systems of Figure 1;
- Figure 5 illustrates localization of graph-theoretic orderings in the context of the systems of Figure 1;
- Figure 6 illustrates a program storage device;
- Figure 7 illustrates an article of manufacture;
- Figure 8 illustrates a process-modeling computer;
- Figure 9 illustrates a distributed asynchronous system of process-modeling computers;
- Figure 10 illustrates a method of search-space organizational validation;
- Figures 11-15 illustrate variations of the method of Figure 10;
- Figures 16-19 illustrate variations of the methods of Figures 14-15;
- Figures 20-23 illustrate further variations of the method of Figure 10;
- Figures 24-26 illustrate variations of the method of Figure 23;
- Figure 27 illustrates another variation option for use with the method of Figure 10;
- Figure 28 illustrates a variation option for use with the method of Figure 13;
- Figure 29 illustrates still another useful variation for use with the method of Figure 10;
- Figure 30 portrays a typical schematic knowledge-tree representation example;
- Figure 31A portrays a set-up for a schematic analysis diagram for SPC;
- Figure 31B portrays a typical schematic analysis diagram;
- Figure 32 portrays an analysis diagram for a conditional SPC example; and
- Figure 33 portrays a diagram for a conditional SPC example.
- Appendix 1 presents software code, on microfiche, from which potentially executable code can be derived for running a prototype of a system embodying aspects of the present invention; and includes therein an organized collection of source code, documentation thereof, sample menus, and other working appurtenances that have been developed for use therewith; and
- Appendix 2 presents, also on microfiche, source-code-independent descriptive notes and other working papers that have been written in the course of the development of the prototype of Appendix 1, especially according to the most recent preferred enabling embodiment.
- Figure 1 relates to a knowledge-engineering protocol-suite for facilitating open systems interconnection transactions in a multi-layer knowledge-engineering reference model substantially having:
- Layer 1 (1/1) a physical layer for interfacing with apparatus (e.g. 2/1);
- Layer 2 (1/2) a data-link layer for data-communications within and between the layers;
- Layer 3 (1/3) a network layer for maintaining transactional access to data ensembles;
- Layer 4 (1/4) a transport layer for organizing and maintaining token correspondences and adjacency lists wherein are represented network layer relationships between the data sets or between elements in the data sets;
- Layer 5 (1/5) a session layer for validating the transport layer represented relationships and for simulating alternative transport layer relationships;
- Layer 6 (1/6) a presentation layer for designing and executing experimental session layer simulations, evaluations thereof and modifications thereto;
- Layer 7 (1/7) an application layer for prioritizing n-tuple strategy dynamics of presentation layer transactions; wherein the knowledge-engineering protocol-suite includes: either a structured system (1/8) having at least one process-management computer (1/9) with a program (1/10) for relating Layers 1-3, at least one computer (1/11) embodying a search-space organizational validation method program (1/12) for relating Layers 3-5, and at least one knowledge-engineering workstation (1/13) with a program (1/14) for relating Layers 5-7; or equivalently a distributed asynchronous system (1/15) of process-modeling computers (1/16) (1/16a) with programs (1/17) (1/17a) for relating Layers 1-7.
- Figure 2 relates to the protocol-suite, as was illustrated in Figure 1, wherein the process-management computer or a process-modeling computer includes apparatus (2/1) interfacing with the physical layer, used by the process-management computer or by the distributed asynchronous system of process-modeling computers, and these apparatus are selected from data-communications devices (2/2) or process-control machines (2/3), and the data-communications devices are for input (2/4) or data storage (2/5) or output (2/6), and the process-control machines have sensors (2/7) or program storage (2/8) or actuators (2/9).
- Figure 3 relates to the protocol-suite as was illustrated in Figure 1 wherein any said program (e.g. (1/10) (1/17) (1/17a)) relating to the data-link layer, used by the process-management computer (e.g. (1/9)) or by the computer (e.g. (1/11)) embodying a search-space organizational validation method, or by the knowledge-engineering workstation (e.g. (1/13)), or by the distributed asynchronous system (e.g. (1/15)) of process-modeling computers (e.g. (1/16) (1/16a)), and used for facilitating data-communications within any of the layers 1-7 or between any plurality of the layers 1-7 as required therein, includes at least one data communications protocol (3/1) selected from the list:
- Figure 4 relates to the protocol-suite as was illustrated in Figures 1 and 2 wherein any said program (e.g. (1/10) (1/12) (1/17) (1/17a)) relating to the network layer, used by the process-management computer (e.g. (1/9)) or by the computer (e.g. (1/11)) embodying a search-space organizational method, or by the distributed asynchronous system (e.g. (1/15)) of process-modeling computers (e.g. (1/16) (1/16a)), and used for maintaining transactional access to data ensembles (4/1), includes in said data ensembles
- Figure 5 relates to the protocol-suite as was illustrated in Figure 1 wherein any said program (e.g. (1/14) (1/17) (1/17a)) relating to the application layer, used by the knowledge-engineering workstation (e.g. (1/13)) or by the distributed asynchronous system (e.g. (1/15)) of process-modeling computers (e.g. (1/16) (1/16a)), and used for (5/1) prioritizing n-tuple strategy dynamics of presentation layer transactions as required therein, includes performing graph-theoretic orderings (5/2) of elements or of sets, and said orderings are performed sequentially, in parallel, concurrently, synchronously, asynchronously, heuristically, or recursively.
- Figure 6 relates to a program storage device (6/1) readable by a logic-machine (6/2), tangibly embodying a program (e.g. (1/12) in Figure 1) of instructions executable by the logic-machine to perform method steps for validating a search-space organization substantially complying with a knowledge-engineering protocol-suite, said method steps including:
- Figure 7 relates to an article of manufacture (7/1) including a computer usable medium (7/2) having computer readable program code (7/3) embodied therein for validating a search-space organization and substantially complying with a knowledge-engineering protocol-suite, the computer readable program code in said article of manufacture including: computer readable program code (7/4) for causing a computer to organize a search-space for a first plurality of correlated empirical data-sets, by mapping a second plurality of interrelated nodes of graph-directed expertise-suggested data-set relationships onto the first plurality of correlated empirical data-sets, at least until the second plurality of nodes and relationships substantially includes a predetermined measure of particulars in the first plurality data-sets, wherein the data-set resolution of particulars in the first plurality is greater than or equal to that of particulars in the second plurality; and computer readable program code (7/5) for causing the computer to validate the search-space from a vantage of a presumption of validity for the first plurality of data-sets, by simulating a validity-metric for an n-tuple of directed graph components in the mapped second plurality or measuring if each input to a node of the n-tuple significantly contributes to that node's output.
- Figure 8 relates to a process-modeling computer (1/16) for use in a distributed asynchronous system (e.g. (1/15)) of process-modeling computers substantially according to a knowledge-engineering protocol-suite, the process-modeling computer logically having three active-units (8/1) (8/2) (8/3) wherein each active-unit has at least one virtual computer processor associated therewith (8/12) (8/29) (8/39) and wherein the active-units are capable of mutual data-communications interaction, and the process-modeling computer includes: a first active-unit (8/1) of the three active-units, and said first active-unit is further capable of data-communications interaction with sensors (e.g. (2/7)) or actuators (e.g. (2/9)).
- Figure 9 relates to a distributed asynchronous system (1/15 in Figure 1) of process-modeling computers substantially complying with a knowledge-engineering protocol-suite, the system of process-modeling computers including: at least one process-modeling terminal (9/1) (9/1a) wherein at least one of the terminals includes a program storage device (6/1) as was illustrated in figure 6;
- a plurality of process-modeling computers (1/16) (1/16a) wherein each computer is as was illustrated in greater detail in figure 8; a data-communications interaction conduit (9/2) providing sufficient transactional data exchange services between the plurality of process-modeling computers; between at least one of the process-modeling terminals and the plurality of process-modeling computers; and between the process-modeling terminals.
- Figure 10 relates to a search-space organizational validation method (10/1) substantially complying with a knowledge-engineering protocol-suite, the method including the steps of: organizing (10/2) a search-space for a first plurality of correlated empirical data-sets, by mapping (10/3) a second plurality of interrelated nodes of graph-directed expertise-suggested data-set relationships onto the first plurality of correlated empirical data-sets, at least until there is a predetermined measure of inclusion by the second plurality of nodes and relationships of particulars in the first plurality data-sets, wherein the data-set resolution of particulars in the first plurality is greater than or equal to that of particulars in the second plurality; and validating (10/4) the search-space from a vantage of a presumption of validity for the first plurality of data-sets, by simulating (10/5) a validity-metric for an n-tuple of directed graph components in the mapped second plurality, or measuring (10/6) if each input to a node of the n-tuple significantly contributes to that node's output.
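The organizing and validating steps can be sketched as follows; all identifiers (`organize`, `validate`, the toy data-sets, and the agreement metric) are illustrative assumptions, not the claimed method itself:

```python
def organize(datasets, relationships, measure=1.0):
    """Keep only relationships whose nodes are covered by empirical data-sets,
    requiring a predetermined measure of inclusion."""
    mapped = [(s, d) for (s, d) in relationships if s in datasets and d in datasets]
    coverage = len(mapped) / len(relationships)
    if coverage < measure:
        raise ValueError(f"insufficient inclusion: {coverage:.0%}")
    return mapped

def validate(datasets, mapped, metric):
    """Presume the data valid; score each mapped relationship with a validity-metric."""
    return {(s, d): metric(datasets[s], datasets[d]) for (s, d) in mapped}

def agree(xs, ys):
    """Toy metric: fraction of steps where both series move in the same direction."""
    steps = list(zip(zip(xs, xs[1:]), zip(ys, ys[1:])))
    return sum((b > a) == (d > c) for (a, b), (c, d) in steps) / len(steps)

sets = {"voltage": [1, 2, 3], "line_width": [2.1, 4.0, 6.2]}
edges = [("voltage", "line_width")]        # an expertise-suggested relationship
scores = validate(sets, organize(sets, edges), metric=agree)
print(scores)  # {('voltage', 'line_width'): 1.0}
```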
- Figure 11 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes defining (11/1) substantially every node in the second plurality to have at least one graph-directed input and at least one graph-directed output.
- Figure 12 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes defining (12/1) substantially every node in the second plurality to have only one graph-directed output.
- Figure 13 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes standardizing (13/1) a format representation for nodes or relationships in the second plurality.
- Figure 14 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes representing (14/1) graph-directed data-set relationships using expertise-suggested initial weightings.
- Figure 15 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes representing (15/1) graph-directed data-set relationships using initial weightings based on statistical process-control generated distribution functions.
- Figure 16 relates to the method as was illustrated in Figure 14 wherein validating (10/4) includes, for at least one weighted directed graph component in the directed graph of second plurality components, improving (16/1) the weighted component using a validity-metric proportional directed graph component weighting.
- Figure 17 relates to the method as was illustrated in Figure 15 wherein validating (10/4) includes, for at least one weighted directed graph component in the directed graph of second plurality components, improving (17/1) the weighted component using a validity-metric proportional directed graph component weighting.
- Figure 18 relates to the method as was illustrated in Figure 16 wherein validating (10/4) includes generating (18/1) a conditional statistical process-control distribution function and convoluting (18/2) the conditional distribution function with the present weightings.
- Figure 19 relates to the method as was illustrated in Figure 17 wherein validating (10/4) includes generating (19/1) a conditional statistical process-control distribution function and convoluting (19/2) the conditional distribution function with the present weightings.
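One plausible discrete reading of "convoluting the conditional distribution function with the present weightings" is a plain discrete convolution followed by renormalization; the interpretation and all names here are assumptions, not the specified computation:

```python
def convolve(weights, conditional):
    """Discrete convolution of two distributions, renormalized to sum to 1."""
    out = [0.0] * (len(weights) + len(conditional) - 1)
    for i, w in enumerate(weights):
        for j, c in enumerate(conditional):
            out[i + j] += w * c
    total = sum(out)
    return [v / total for v in out]

present_weightings = [0.2, 0.5, 0.3]   # current directed-graph component weightings
conditional_spc = [0.1, 0.8, 0.1]      # conditional SPC distribution function
print(convolve(present_weightings, conditional_spc))
```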
- Figure 20 relates to the method as was illustrated in Figure 10 wherein validating (10/4) includes, for at least one directed graph component in the directed graph of second plurality components, assigning (20/1) a validity-metric proportional directed graph component weighting.
- Figure 21 relates to the method as was illustrated in Figure 10 wherein validating (10/4) includes, for at least one validity-metric above a threshold value, adding (21/1) a virtual directed graph component to the second plurality.
- Figure 22 relates to the method as was illustrated in Figure 10 wherein validating (10/4) includes, for at least one validity-metric below a threshold value, deleting (22/1) a directed graph component from the second plurality.
- Figure 23 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes updating (23/1) the first plurality of correlated empirical data-sets.
- Figure 24 relates to the method as was illustrated in Figure 23 wherein updating (23/1) includes modifying (24/1) at least one real-time empirical data-set.
- Figure 25 relates to the method as was illustrated in Figure 23 wherein mapping (10/3) includes activating (25/1) an alarm when an updated empirical value is outside of a threshold range.
- Figure 26 relates to the method as was illustrated in Figure 23 wherein validating (10/4) includes generating (26/1) a report having recorded therein an updated empirical value that is outside of a threshold range.
- Figure 27 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes accumulating (27/1) empirical data using a data mining engine.
- Figure 28 relates to the method as was illustrated in Figure 13 wherein standardizing (13/1) a format representation for nodes or relationships in the second plurality includes either providing (28/1) for substantially each node in the second plurality: at least one input token, a process token, and at least one output token; or providing (28/2) for substantially each relationship in the second plurality: a first process token, a linkage token, and a next process token.
- There are two different ways of representing a topological graph of nodes and/or edges; or, in the example of the Knowledge-Tree, of interconnection cells and relationships.
- This holds whether the graph represents actual interconnections or not.
- The tabular representation is either: for substantially each node in the second plurality, a table including at least one input token, a process token, and at least one output token; or, for substantially each relationship in the second plurality, a table including a first process token, a linkage token, and a next process token.
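The two tabular representations can be sketched as follows; the field names and example tokens are assumptions chosen to mirror the text:

```python
from typing import NamedTuple

class NodeRow(NamedTuple):          # one table row per node
    input_tokens: tuple             # at least one input token
    process_token: str
    output_tokens: tuple            # at least one output token

class RelationshipRow(NamedTuple):  # one table row per relationship (edge)
    first_process: str
    linkage: str
    next_process: str

node_table = [
    NodeRow(("wafer_in",), "Expose", ("wafer_exposed",)),
    NodeRow(("wafer_exposed",), "Etch", ("wafer_etched",)),
]
edge_table = [
    RelationshipRow("Expose", "proven-causal", "Etch"),
]

# an adjacency list can be recovered from the relationship table:
adjacency = {r.first_process: [r.next_process] for r in edge_table}
print(adjacency)  # {'Expose': ['Etch']}
```

Either table alone carries the topology; the node form is convenient when processes dominate, the relationship form when linkages do.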
- Figure 29 relates to the method as was illustrated in Figure 10 wherein mapping (10/3) includes defining (29/1) a correspondence in the search-space, between the second plurality of interrelated nodes and a process model representation, by performing the steps of:
- Figure 30 portrays a typical schematic knowledge-tree representation (30/1) example containing interconnection cells and incorporating graph-directed linkages from an underlying process map description and other proven or alleged causal factor graph-directed linkages; and Figures 30, 31A-B, 32, and 33 portray a typical schematic analysis diagram for a conditional SPC example, wherein the domain of the interconnection cell inputs and the range of the interconnection cell outputs are differentiated into higher-precision discrete subsets than are classically represented using SPC, and the combinations of inputs are n-tupled and correlated to substantially each of the output subsets.
- In FIG. 30 there is seen a schematic representation of an example of a knowledge-tree, referenced generally 50.
- This knowledge-tree 50 contains modules or interconnection cells, referenced 51-61.
- Linkages, represented by arrows such as, for example, those referenced 64 and 66, connect interconnection cells 51-61. These linkages are based upon various factors as described in the following. Linkages between interconnection cells normally include those based upon actual steps in a manufacturing process. For example, linkage 64, connecting interconnection cells 51 and 52, represents the transition between a first shown manufacturing step and a second shown manufacturing step.
- Linkages further normally include those based upon proven causal relationships.
- Proven causal relationships are defined as those relationships for which there is empirical evidence such that changes in the parameter or metric of the source or input interconnection cell produce significant changes in the output of the destination interconnection cell.
- Significant changes are defined as those that produce differences greater than a certain previously determined amount. These changes may be determined by, but are not limited to, those greater than two sigmas (2σ) of the calculated standard deviation of the values designated as in-specification.
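Assuming the 2σ rule above, a hypothetical significance test might look like this (all names and sample values are illustrative):

```python
from statistics import mean, stdev

def significant_change(in_spec_outputs, new_output, sigmas=2.0):
    """True if new_output differs from the in-specification mean by more than
    `sigmas` standard deviations of the in-spec values."""
    mu, sigma = mean(in_spec_outputs), stdev(in_spec_outputs)
    return abs(new_output - mu) > sigmas * sigma

baseline = [5.0, 5.1, 4.9, 5.0, 5.2, 4.8]
print(significant_change(baseline, 5.05))  # False: within 2 sigma
print(significant_change(baseline, 6.0))   # True: beyond 2 sigma
```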
- Linkage 72 connects interconnection cells "Expose" 52 and "Etch" 56. Data (empirical evidence) exists which proves that changing a metric of the Expose manufacturing step will produce a significant change in the output obtained by subsequently executing the Etch manufacturing step.
- Linkages may still further include those based upon alleged causal relationships. These relationships are usually, but not limited to, those relationships suggested by professed experts in the manufacturing process, or in some portion thereof. An example of such a relationship is shown in Figure 30 (30/1) by arrow 74, wherein arrow 74 is seen to connect interconnection cells Bake 54 and Resist Strip 59. Linkages of this type may be tentatively established, and added to the knowledge-tree, on any basis whatsoever: real, imagined, supposed, or otherwise. It is an integral part of the invention to later test and/or validate these linkages.
- the term Knowledge-Tree is used to include the various types of linkages described above. A Knowledge-Tree that includes only manufacturing steps is equivalent to a process map, a term familiar to those skilled in the art. The term Knowledge-Tree in fact indicates a process map that has been modified to include other types of linkages, such as, but not limited to, those described above.
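A Knowledge-Tree of this kind might be sketched as an edge-labelled adjacency structure; the linkage labels follow the three types described above, and `cell_51` is a placeholder, since the excerpt does not name interconnection cell 51:

```python
knowledge_tree = {
    # (source cell, destination cell): linkage type
    ("cell_51", "Expose"): "process-step",       # e.g. linkage 64
    ("Expose", "Etch"): "proven-causal",         # e.g. linkage 72
    ("Bake", "Resist Strip"): "alleged-causal",  # e.g. linkage 74
}

def process_map(tree):
    """A Knowledge-Tree restricted to manufacturing steps is a process map."""
    return {k: v for k, v in tree.items() if v == "process-step"}

print(process_map(knowledge_tree))
```

Restricting the structure to "process-step" linkages recovers the underlying process map; the proven and alleged linkages are the modifications that make it a Knowledge-Tree.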
- In FIG. 31A (31/1), there is seen a graphic representation of a feed-forward optimization process which is divided into two sections.
- six such variables or manufacturing steps are represented by bars 81-86.
- Each of the six bars 81-86 is in turn divided into three sections.
- bar 81 is divided into an upper section 92, a middle section 94 and a lower section 96.
- These upper, middle, and lower sections (92, 94, and 96; respectively), are also assigned arbitrary letters in order to further facilitate graphic representation of some inputs to the manufacturing process.
- the upper section 92 is assigned a letter-A, 102; the middle section 94 is assigned a letter-B, 104; and, the lower section 96 is assigned a letter-C, 106.
- the letters A, B, and C are also used to designate the upper, middle, and lower sections, respectively; of bars 82-86. It should be noted that the choice of three letters and three sections is also completely arbitrary and has been made solely in order to simplify description.
- Each bar in section 31/2 is used, at this point, to represent a single, complete interconnection cell, such as the Expose cell 52 (Figure 30). What is postulated here is that this bar is part of a conditional statistical process control and that some factor associated therewith is involved in a direct causal relationship in a change in the interrelated output produced in this manufacturing process. Factors affecting each interconnection cell are at least three-fold. Such types of factors include, but are not limited to, input/internal factors, cell output/external factors, and adjacent, but non-proximate, factors. This last-mentioned type is normally associated with those alleged factors described above. All of these types are further described below.
- the letters A, B, and C are arbitrary, they represent specific subjective value ranges for each of the input variables represented by bars 81-86.
- the "A” or upper sections of each of the bars 81-86 represent input values above or greater than some pre-determined upper specification limit.
- the "B” or middle sections of each of the bars 81-86 represent input values within some pre-determined specification limits.
- the “C” or lower sections of each of the bars 81-86 represent input values below or less than some pre-determined lower specification limit.
- sections B represent a range of input values known to produce a high yield of usable products when utilized throughout the manufacturing process. This range of values has usually been determined by statistical process-control generated distribution functions.
- the invention uses these statistical process-control, hereinafter referred to as SPC, distribution functions to "map out" an initial range of in-specification values.
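The A/B/C banding against pre-determined specification limits described above can be sketched in code. This is a minimal illustrative sketch, not code from the Microfiche appendix; the function and parameter names (`classify_input`, `lsl`, `usl`) are assumptions for illustration.

```python
def classify_input(value, lsl, usl):
    """Assign the arbitrary letter bands of FIG. 31A: 'A' for values
    above the upper specification limit (USL), 'C' for values below
    the lower specification limit (LSL), and 'B' for in-specification
    values."""
    if value > usl:
        return "A"
    if value < lsl:
        return "C"
    return "B"

# Banding a five-step run yields letter combinations such as BCCAB.
run = [2.0, 0.5, 0.5, 5.0, 2.0]
label = "".join(classify_input(v, lsl=1.0, usl=4.0) for v in run)
```

With the illustrative limits above, the five-step run yields the combination BCCAB.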
- a curved line 120 representing the bell-shaped curve itself.
- Curved line 120 is intersected by two straight lines: an upper (as depicted) line 112 and a lower (as depicted) line 114.
- Straight lines 112 and 114 are associated with three-lettered labels 122 and 124, respectively.
- Three-lettered label 122, which is designated USL, represents an upper in-specification limit; and three-lettered label 124, which is designated LSL, represents a lower in-specification limit.
- Specification limits can be set in a variety of fashions. These include utilizing empirical data, consulting with process engineers, referencing textbook values, as well as using arbitrary values. These values may also be set to customer-customized limits as required. For example, for a military specification, a limit of "use a ¼″ nut" may be modified to: "use a 6.35 ± 0.01 mm nut."
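Of the fashions listed above, deriving limits from empirical data is the one most directly supported by SPC distribution functions. A hedged sketch follows, assuming the conventional mean plus/minus k standard deviations rule; the patent does not prescribe a specific formula.

```python
import statistics

def spc_limits(samples, k=3.0):
    """Provisional lower and upper specification limits derived from
    empirical data as mean +/- k standard deviations.  The factor k,
    like the limits themselves, could instead come from process
    engineers, textbook values, or customer requirements."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mu - k * sigma, mu + k * sigma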
- the present embodiment of the invention includes the possibility of employing SPC evaluation of empirical data and, in addition, provides ways to validate "expertise-suggested” (read: knowledgeable process engineer recommended), information.
- the middle in-specification range, which is area 94 for bar 1, 81.
- the actual values may reflect an input very close to the "A" area of each bar. For example, a value for the first process variable represented by dot 134. If this barely in-specification choice of input values is repeated throughout the manufacturing process, an unsatisfactory product may result. Processes for evaluating input values and for methods of returning measured response values back to within acceptable limits are part of the function of the present invention and are described below.
- modifications may be made to the integration cells or to the linkages between the cells. Modifications to the integration cells are referred to as internal modifications and involve changing the actual input to the relevant cell. For example, this change may be altering an oven temperature or the thickness or quality of a raw material.
- modifications effected at the level of linkages between integration cells are based upon what is received by any given cell in the manufacturing process from a cell earlier in the process. Changes may be made based upon the relevancy of the known interrelationships between cells.
- Validation data is defined as that which verifies that a change in a specific input process variable directly results in a corresponding change in output response.
- Convolution data identifies what specific change in the validated input process variable produces that change.
- Convolution data may further quantify the specific desired change required as well as the extent that the measured response will be affected.
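The distinction between validation data and convolution data can be illustrated with a simple regression sketch. The correlation and least-squares slope used here are stand-ins for whatever metrics an implementation actually adopts; the function name is an assumption.

```python
def validate_and_quantify(inputs, responses):
    """Return (r, slope).  r plays the role of validation data,
    confirming that a change in the input variable corresponds to a
    change in the measured response; the least-squares slope plays
    the role of convolution data, quantifying how large a response
    change a given input change produces."""
    n = len(inputs)
    mx, my = sum(inputs) / n, sum(responses) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(inputs, responses))
    sxx = sum((x - mx) ** 2 for x in inputs)
    syy = sum((y - my) ** 2 for y in responses)
    r = sxy / (sxx * syy) ** 0.5
    slope = sxy / sxx
    return r, slope
```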
- line segment 98 represents using an input metric for the fourth variable, herein represented by bar 84, that is above the pre-determined "acceptable" in-specification value.
- line segment 98 can be represented by the letter combination BCCAB, wherein each of the letters represents that portion of each of the respective input metrics that was utilized in this particular input process. It is a significant part of the presently described preferred embodiment of the invention that all possible combinations of input metrics can be represented by n-tuples of such letter combinations. It is, of course, understood that the respective combinations must be expanded to include all of the metrics of any given multi-dimensional manifold of the orthogonal system defined by any given set of variables.
- Such sets of data can be used by the system in many ways, some of which are described herein.
- the system can generate all possible, theoretical combinations of n-tuples of input variables.
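Generating every theoretical combination of banded input metrics is a straightforward Cartesian product. A sketch, assuming the three-letter banding of FIG. 31A; the function name is illustrative.

```python
from itertools import product

def all_letter_combinations(n, letters="ABC"):
    """Enumerate all len(letters)**n theoretical n-tuples of banded
    input metrics, e.g. 'BCCAB' for a five-variable process."""
    return ["".join(combo) for combo in product(letters, repeat=n)]
```

For five variables this yields 3**5 = 243 combinations, among them BCCAB and AAAAA.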
- the system included in the invention can then further display the measured responses expected to be obtained from all of those inputs, or from any selected or desired part thereof. This can be done in at least two ways: by statistical analysis of measured responses already in the system's databases, and/or by applying modeling, predicting, and simulation functions to the present data.
- FIG. 32 there is seen a series of graphs depicting the measured responses obtained for a given set of n-tuples of input metrics.
- In section 32/1, the five-letter combination AAAAA, 152, represents a particular example of such an n-tuple. Recalling briefly section 31/2 of Figure 31A will show that this combination of input metrics represents a case wherein all of the inputs were out of specification.
- the third step of the manufacturing process is proceeding at an adjusted rate or range, but sufficient correction cannot be made in, or at the completion of, this step.
- Such an inability may be process oriented.
- the oven temperature does not extend above a certain range, or the raw material thickness is limited by current supplies or state of the art manufacturing techniques.
- the system, however, has accessed the data appearing in sections 32/2 and 32/3, showing that changes in the fourth process variable can be made to bring the majority 156, or even the totality 158, of measured response values into the in-specification output range.
- section 32/3 represents a sample of measured responses that may not be empirically, statistically valid.
- the system further includes the ability to analyze this data.
- the invention includes the ability to analyze the data depicted in this figure for greater cost-effectiveness. This may be in terms of a savings in the actual expense associated with the purchase of a particularly costly raw material; or the time saved in, for example, reducing the length of time allowed for a specific process to occur.
- Appendix 1 presents software code on Microfiche, from which potentially executable code can be derived, for running a prototype of a system embodying aspects of the present invention; and includes therein an organized collection of source code, documentation thereof, sample menus, and other working appurtenances that have been developed for use therewith. Appendix 2 presents, also on the Microfiche, source-code-independent descriptive notes and other working papers that have been written in the course of the development of the prototype of Appendix 1, especially according to the most recent preferred enabling embodiment.
- The index for the Microfiche Appendix is:
- the attached Microfiche presents software for generating therewith executable code, for running a prototype of a system embodying aspects of the present invention; and thereby relates to an "Environment":
- This system was developed under the MS NT 4.0 operating system.
- The database is Sybase SQL Anywhere version 5.5 in a stand-alone configuration.
- Connection to the database is via ODBC.
- The source code is written in PowerBuilder version 6.5.
- Database Report: Detailed Data Base Structure of the Global Yield Enhancement System. This document contains a data-structure diagram with detailed lists describing the tables of the database. An example is code generated by the software, labeled ins_batch_010, which is set up to keep track of a particular batch of semiconductors from the beginning to the end of production.
- the Eden environment is defined by three principal components: 1. Server, 2. Administrator, 3. Client. The three are clearly outlined in function and inter-relation.
- POEM IPC Process Outcome Empirical Modeler-Intelligent Process Control
- the MES collects data in real time from the manufacturing floor, whereas Eden acquires additional data from other sources, for example: E-Tests (electrical tests), the Sort (the end of the process, where every chip is checked), data entered by an operator, etc.
- Eden Administrator defines outputs and inputs; this is the model (Knowledge Tree).
- Eden can relate every measurement from MES to its related function. (The models are built from the functions and every function has its own conditions for alerts.)
- Client gives a succinct description of GUI for POEM IPC: a display of data of specific functions to a user, in real time, showing specification limits from MES etc. and Eden on-line optimization that can automatically compute a prediction for a batch allowing a user to adjust an input thus optimizing the output (displayed in window).
- Alerts are issued in various ways (on-screen, e-mail, etc.), and accompanied by wizard guides. This is followed by a section on Eden Processes: including client connect, new measured data, and alert solution hints.
- the Eden system has three tiers: 1. Database; 2. Application Server, i.e. the engine that does overall computing and manages user connections; 3. Client, i.e. the human-machine interface of the product. Eden needs three types of Clients: 1. Regular, for engineers or operators; 2. System Manager, for defining functions, alerts, users, etc.; 3. System Manager for administrative tasks, to define backups and restore, clean data, etc.
- Eden's Functions and Advantages relates to Eden's base: a core technology known as Knowledge Tree and Inter Connection Cell (KT defines the dependencies of the various parameters by containing all the Inter Connection Cells).
- KT Knowledge Tree and Inter Connection Cell
- the document shows that, on the basis of this technology, Eden's implementation in Advanced Process Control and Health Monitoring will substantially improve wafer fabrication by providing better control and analysis tools.
- the document also includes a description of Eden's Architecture, which includes both interactive and background components.
- the interactive component which the user interacts with, sets and updates data of the manufacturing process, builds KT, maintains (reading system messages, backup and restore), enters manually measured data, displays KT, displays system model, displays prediction/optimization, displays alerts and possible solutions, displays statistical charts and information.
- the background component builds and updates models of the Inter Connection Cells, saves new measured data in database, issues alerts for existing and expected problems while offering solutions when possible.
- This document displays some of the windows that are used in the present invention's systems and includes a technical description of each window. For example, there is a window titled Knowledge Tree; it has a name and a type (in this case it is a sheet, i.e. a multiple-document window) and is called from the menu > Adam > Knowledge Tree. The argument is given; in this case, there is none. Finally, the description: in the left tree the user can navigate through the process flow... can select output and... output is displayed in the right tree... displayed as well are the parameters that affect this output, etc.
- Power Point Overview Presentation — Emphasis Poem (Process Outcome Empirical Modeler)
- ADM Automated Decision Making integrates the Knowledge Tree, the Process Model with the GUI.
- the Knowledge Tree is used for the automatic extraction of relevant data from the central database.
- ADAM provides several types of analysis tools. The first are control tools for on-line analysis of process. The second is an algorithm (this document gives a description of the various algorithms used by the present invention, with emphasis on the one based on discretizing, used for creating the Empirical Model) that makes for faster and easier troubleshooting.
- the third are process and device optimization and characterization tools that allow for automated multi-dimensional modeling of a process on-line without performing experiments (an advanced form of DOE, design of experiments) and automated Robust Optimization involving the nominal setting of process parameters.
- This document gives an explanation and field of Adam (Automated Decision Maker) and Eve (Equipment Variable Evaluator), two products that can stand alone but can be configured together with Eve's communication manager Eden (Empirical Decision Enabling Network) to form a three tier intelligent Empirical Control (diagram included.)
- This document deals mainly with the present invention's application in the field of semiconductor fabrication.
- the Adam Global Control, Optimization (including Robust Optimization), and Troubleshooting product can be operated in either an automated mode or in a human decision and intervention mode.
- Adam analyzes data from the engineering and manufacturing database point of view of its internal Empirical Model....
- the process of updating the decision making model is called Empirical Control (empowered by a dynamic multidimensional learning model).
- the Eve performs automatic control, optimization, and troubleshooting at the equipment level like Adam performs these functions at the global process level and Eden at the intermediate level for process control of work groups or equipment clusters.
- This document also explains the enhanced optimization and troubleshooting potential of the products, as well as presenting product options available now or in the near future, including the Adam semiconductor analysis pack, the Adam semiconductor data warehouse, and the Techo-Eco modeling product from the Or Suite of Products.
- Empirical Controller begins by giving a general description of the Empirical Controller and its potential embodiments in such varied fields as communications, design of experiments and other operations research, automated control of enterprise, process machines, measuring equipment etc.
- the document then goes on to describe the Empirical Controller as a generic learning and thinking system which performs Empirical Control and being a three tier structure consisting of Knowledge Tree, Empirical Model and ADM (Automated Decision Maker or Adam, which in non-automated environments provides natural language instructions to engineers or operators who then intervene with system or process operations to achieve objectives within defined constraints.)
- the three describe, model, and control the behavior of complex interrelated processes.
- the component, of the present invention that integrates physical knowledge and logical understanding into a homogenetic knowledge structure is called the Knowledge Tree.
- the second component, the Empirical Model sits above the Knowledge Tree and integrates data using various analysis tools to create quantified functional relations in the aforementioned homogenetic knowledge structure.
- the Empirical Model can be used to predict and control system (process) behavior.
- the ADM sits on top of the Empirical Model and operates and analyzes it to determine solutions that best meet specified objectives and constraints.
- the Empirical Model is updated automatically as a function of new data collected by the Process Outcome Empirical Modeler (POEM), which is the core analysis tool of the ADM.
- POEM Process Outcome Empirical Modeler
- the POEM algorithmic approach generates automatically a set of functional relationships between inputs and outputs of each Interconnection Cell in the Knowledge Tree describing a process.
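As an illustrative sketch only (the Microfiche documents describe the actual POEM algorithms), one simple way to realize such automatically generated input-to-output relationships is to group historical records by their banded input combination and summarize the measured response for each group. The record format and function name here are assumptions.

```python
from collections import defaultdict
import statistics

def poem_relation(records):
    """Build an empirical relation for one Interconnection Cell.
    records are assumed (letter_combination, measured_response)
    pairs; the result maps each observed combination of banded
    inputs to its mean measured response."""
    groups = defaultdict(list)
    for combo, response in records:
        groups[combo].append(response)
    return {combo: statistics.fmean(vals) for combo, vals in groups.items()}
```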
- Empirical Controller embedded in the three-tier Adam, Eden (Empirical Decision Enabling Network), and Eve (Equipment Variable Evaluator) product configuration is able to control large complex processes, bringing global process-control relationships down to the lowest operational levels for optimum decision making and control.
- the Eve performs automatic control, optimization, and troubleshooting at the equipment level like Adam performs these functions at the global process level and Eden at the intermediate level for process control of work groups or equipment clusters.
- the Empirical Model is built around the actual and not theoretical system to be controlled. 2. There is greater optimization with the Empirical Model. 3.
- the Empirical Model can enable not only feedback and feedforward capability but automatic self-control as well. 4.
- the Empirical Model adapts automatically to system changes as a result of multivariable changes between variables, based on POEM. 5. When used in an engine-control application, the Empirical Controller controls each cylinder. An Empirical Model for each cylinder grants greater efficiency and optimization. 6. Based on claim 5: increased engine durability and reliability. 7. Focused data-mining tool. 8. Model construction from data without statistical assumptions. 9. Puts into a unified framework all available information and knowledge pieces about the process.
- This document is an internal memo between the inventors. It is an analysis and comparison of products that appear to be competitive with the present invention.
- the document lists three fundamental advantages of the technology of the present invention over the competition in processing systems and particularly in its embodiment of semiconductor health monitoring, yield management, and SPC (Statistical Process Control).
- the first advantage is that the present invention's Knowledge Tree is formed from know-how regarding non-quantified relationships as opposed to quantified relationships.
- the second advantage is the Process Outcome Empirical Modeler (Poem). As an embodiment of the present invention, it provides a substantially more accurate predictive tool than is available on the market today and at the same time it is simpler to implement.
- a Protocol of the present invention is designed to recall and use past relationships to determine by analogy future behavior. Further, a Protocol of the present invention, unlike neural networks, does not impose inappropriate mathematical models on data.
- the third advantage is that a Protocol of the present invention is a more effective decision-making tool in process control because it has technology that can detect when a process is moving, or likely to move, out of control and, by receiving information in advance (feedback), it can successfully intervene (feedforward) and bring the process back into control before there is an excursion.
- Empirical Controller Empirical Controller 2.
- POEM Process Outcome Empirical Modeler
- POEM SPC application; 4. POEM SPC for engines and for other applications unable to use SPC now; 5. Automatic Design of Experiments in Empirical Controller (online tool); 6. Automatic Updating of the Knowledge Tree; 7. Automatic Creation of the Knowledge Tree; 8. Adam (Automated Decision Maker), Eve (Equipment Variable Evaluator), and Eden (Empirical Decision Enabling Network) Applications (products).
- Process Mapping a homogenetic (deriving from a substantially similar template, format, disclosure structure, etc.) integration of physical and logical means... to describe complicated systems.
- Process Outcome Predictor POP
- On-line Optimization including Robust Optimization - calling for on-line Robust Analysis with operating data and modifying the "Model" periodically, e.g. engine and semiconductor applications. 4.
- Automated Generation of the 1st-cut Process Map: the IDM system would provide means for the analyst to update and customize the auto-generated PM.
- the PM for ADS, that is, a very similar embodiment of the later Empirical Controller and its various components. Also, the PM for ADS is described as being a very effective application of the present invention when integrated into the Process Outcome Empirical Modeler (POEM) and used for enhancing semiconductor fabrication.
- POEM SPC Process Output Empirical Modeler and Statistical Process Control This document describes the application of POEM and its suggestion of a Conditional Statistical Process Control; a more sensitive and precise form of SPC because it relates to the specific class behavior of the variables. The document gives the POEM methodology for calculating the functional relationship between the input values and the output value (including diagram). The document ends with a list of claims describing the advantages of POEM over conventional SPC.
- This document is basically an expanded and edited version of the previous POEM document (POEM SPC - 06/02/99 01:43p).
- POEM Process Outcome Empirical Modeler
- the goal, as described in this document, of the Process Output Empirical Modeler is to utilize process data to uncover the functional relationship between the input and the output.
- POEM plays a central role in transforming the qualitative Knowledge Tree to a quantitative Empirical Model.
- the concept described in this document is directly applied to significantly improve conventional SPC by introducing POEM SPC.
- the last part of the document gives a more detailed description of the POEM algorithms than the previous POEM document.
- This memo concerns a meeting for the presentation of two examples of embodiments of the present invention in the fields of semiconductor fabrication and agriculture (growing of vegetables).
- the memo signifies a change of approach in the way the invention would be presented in order to better elucidate its particular uniqueness.
- Attached to this memo is a document stating five claims that were to be the basis of the discussion at said meeting.
- Also attached to this memo is another document entitled Uniqueness of Knowledge Tree, which gives an example of integrating disciplinary and heuristic means into a homogenetic Knowledge Tree in the field of agriculture.
- This document also contains two Knowledge Tree maps: one for semiconductor FAB and another for the growing of vegetables in a standard agricultural framework.
- KT Knowledge Tree
- KC Knowledge Cell
- Interconnection Cell This document includes a list of five claims concerning the Knowledge Tree (KT) and Knowledge Cell (KC - and in later embodiments Interconnection Cell).
- KT describes in non-quantitative terms a homogenetic relationship pattern between input and output variables... so that a control unit can utilize the model derived from the KT as a basis for making auto-control decisions.
- 2. (based on 1) When the KT is used as a data-analysis tool to build a model of a system or process, without human intervention, it adequately describes the behavior of said system or process. 3. (based on 1 & 2) Wherein KCs describe individual physical and logical components and inter-relations in the KT. 4. (based on 1, 2, & 3) Wherein knowledge used to build the KT is derived from process flow diagrams etc., as well as other appropriate disciplinary and heuristic knowledge structures. 5. (based on 1, 2, 3, & 4) Wherein each Knowledge Cell is able to be used as a data analysis tool... able to build a model of
- the present invention relates to improving the quality of process control by using expert knowledge, which facilitates constructing a topological process graph (oftentimes a directed graph; also referred to as a process map) from the descriptions of at least one expert, or even from a composite collection of interviews with many involved workers (e.g. in situations where not even one expert study has ever been conducted).
- Figure 30 portrays a typical schematic knowledge-tree representation example of all or part of such a model.
- Other sample representations may be constructed automatically by running the prototype (of Appendix 1) on a sample database (also in Appendix 1) or on another database of equivalent form.
- the present invention allows the composite model construction to be used in a novel way. Initially, it is important to validate the composite model. Testing each link in the composite model against actual empirical data accomplishes this validation. In the event that a statistically inadequate quantum of empirical data is available, the model may be tested against simulation data which was seeded by the empirical data, or, in the worst case, by theoretical suppositions.
- Validation of each link may be expressed quantitatively. For example, a correlation represented by a link between two nodes may be supported by all available data (100% validated), by some lesser plurality of the data, not at all, or even in opposition to the actual empirical data. At this stage, quantitative validation may be used to
- the 7-layer model of the present invention is applied to a system having two interconnected processing machines: a cutting machine (cutter) followed by a polishing machine (polisher).
- the process map is of simple linear directed graph topology having initial input connected to the cutter connected in turn to the polisher connected in turn to the final output.
- an initial Knowledge Tree presents two input factors to the cutter, two interim factors between the cutter and the polisher, two output factors from the polisher, and furthermore two actuator inputs to the cutter and an additional two actuator inputs to the polisher.
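The two-machine example above can be sketched as a plain data structure. All node and factor names below are illustrative placeholders, not identifiers from the appendix source code.

```python
# Linear directed-graph process map: input -> cutter -> polisher -> output.
knowledge_tree = {
    "cutter": {"inputs": ["input_1", "input_2"],
               "actuators": ["cut_act_1", "cut_act_2"],
               "feeds": "polisher"},
    "polisher": {"inputs": ["interim_1", "interim_2"],
                 "actuators": ["pol_act_1", "pol_act_2"],
                 "feeds": "final_output"},
}

def downstream_chain(tree, start):
    """Walk the linear topology from a starting node to the final
    output, reproducing the cutter-then-polisher process map."""
    chain = [start]
    while chain[-1] in tree:
        chain.append(tree[chain[-1]]["feeds"])
    return chain
```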
- the sensors and actuators are connected (e.g. directly or via a LAN) to Layer 1.
- the description of the process map and of the expert suggested relationships are contained in data sets input on a first data storage device and likewise connected to Layer 1.
- data sets containing data collected by the sensors and actuators are stored on a second data storage device and likewise connected to Layer 1.
- the index for each of the data sets is maintained in Layer 3; initialization and updates being provided using the services of Layer 2.
- an initial Knowledge tree is assembled from the index of the first storage device.
- Validation of the Knowledge Tree is performed by computing a causality metric between respective inputs and outputs of each interconnection cell, the data being provided using the index of the second data storage device.
- This computing may be performed using standard SPC or using conditional SPC of the present invention or using substantially any of the appropriate prior art methods as described for other uses in the Background Section.
- Insufficiently valid inputs, interim measurements, or outputs may be deleted from the Knowledge Tree.
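A minimal sketch of this validation-and-pruning step follows, using absolute correlation as a stand-in causality metric (the text above allows standard SPC, conditional SPC, or other appropriate prior-art methods); the function names and link format are assumptions.

```python
def causality_metric(xs, ys):
    """Absolute Pearson correlation between recorded input and output
    series; zero-variance series are treated as unsupported."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    if sxx == 0 or syy == 0:
        return 0.0
    return abs(sxy / (sxx * syy) ** 0.5)

def validate_tree(links, series, threshold=0.5):
    """Keep only (input, output) links whose metric over the recorded
    data meets the threshold; the survivors form the first-version
    validated Knowledge Tree."""
    return [(a, b) for a, b in links
            if causality_metric(series[a], series[b]) >= threshold]
```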
- the result is a first version validated Knowledge Tree, which may be used as an Empirical Controller for on-line alarm or report generation from the operating cutting and polishing process.
- a process control engineer may consider this Knowledge Tree and propose that a measurement recorded as input to the cutter may hypothetically be of significance as input to the polisher.
- This hypothetical input to the polisher's interconnection cell of the Knowledge Tree is articulated in Layer 6 and directed to Layer 5 for testing, using the empirical data services provided by lower layers.
- Layer 6 may be used to actually modify and test a change in one of the actuators, since the method of the present invention is neither limited nor restricted to theoretical-type simulations and their respective validations. In either scenario, if the results are of greater validity and productivity than those of the present validated Knowledge Tree, then the Knowledge Tree may be modified to reflect these results, and the cutting and polishing process modified accordingly.
- a combinatorial algorithm in Layer 7 may be used to articulate all possible combinations and relationships between sensors and actuators; the considerations of how most effectively to generate or consider these combinations may be strategically input; and, in the presence of surplus computational resources, these combinations may be evaluated in Layer 6 (where it is certified that they have not yet or recently been considered), and thereafter forwarded to Layer 5 for actual testing-again either (preferably) against existing empirical data, or (alternatively) by altering the actual process of cutting and polishing.
- The use of n-tuplings defines the actual productive intersection region more precisely.
- a proliferation of n-tuplings, achieved by increasing the number of discrete regions in each respective parametric representation (e.g. within the mean plus/minus 2 or 3 standard deviations), will increase the respective yield for the process, using the conditional SPC methods of the present invention.
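Increasing the number of discrete regions per parameter can be sketched as a simple banding function. The span of mean plus/minus three standard deviations and the function name are assumptions for illustration.

```python
def band_index(value, mean, sd, half_width=3.0, n_bands=3):
    """Map a measurement to one of n_bands equal-width regions
    spanning mean +/- half_width standard deviations; values outside
    the span clamp to the outermost bands.  Raising n_bands gives the
    finer conditional-SPC intersection regions discussed above."""
    lo = mean - half_width * sd
    hi = mean + half_width * sd
    if value < lo:
        return 0
    if value >= hi:
        return n_bands - 1
    step = (hi - lo) / n_bands
    return int((value - lo) // step)
```

With n_bands=3 this reproduces the coarse A/B/C banding; larger values of n_bands refine it without changing the scheme.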
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP00970863A EP1245003A1 (en) | 1999-10-31 | 2000-10-13 | A knowledge-engineering protocol-suite |
AU80182/00A AU8018200A (en) | 1999-10-31 | 2000-10-13 | A knowledge-engineering protocol-suite |
TW090110128A TWI235935B (en) | 2000-06-07 | 2001-04-27 | A knowledge-engineering protocol-suite system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL132663 | 1999-10-31 | ||
IL13266399A IL132663A (en) | 1999-10-31 | 1999-10-31 | Knowledge-engineering protocol-suite |
US09/588,681 | 2000-06-07 | ||
US09/588,681 US6952688B1 (en) | 1999-10-31 | 2000-06-07 | Knowledge-engineering protocol-suite |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2001033501A1 true WO2001033501A1 (en) | 2001-05-10 |
Family
ID=26323892
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2000/028319 WO2001033501A1 (en) | 1999-10-31 | 2000-10-13 | A knowledge-engineering protocol-suite |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1245003A1 (en) |
AU (1) | AU8018200A (en) |
WO (1) | WO2001033501A1 (en) |
2000
- 2000-10-13 AU AU80182/00A patent/AU8018200A/en not_active Abandoned
- 2000-10-13 EP EP00970863A patent/EP1245003A1/en not_active Withdrawn
- 2000-10-13 WO PCT/US2000/028319 patent/WO2001033501A1/en not_active Application Discontinuation
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5257384A (en) * | 1991-09-09 | 1993-10-26 | Compaq Computer Corporation | Asynchronous protocol for computer system manager |
US5668810A (en) * | 1995-04-26 | 1997-09-16 | Scientific-Atlanta, Inc. | Data transmission protocol method and apparatus |
US5764915A (en) * | 1996-03-08 | 1998-06-09 | International Business Machines Corporation | Object-oriented communication interface for network protocol access using the selected newly created protocol interface object and newly created protocol layer objects in the protocol stack |
US5739919A (en) * | 1996-05-17 | 1998-04-14 | Nko, Inc. | Point of presence (POP) for digital facsimile network |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8504620B2 (en) | 2000-11-30 | 2013-08-06 | Applied Materials, Inc. | Dynamic subject information generation in message services of distributed object systems |
US7698012B2 (en) | 2001-06-19 | 2010-04-13 | Applied Materials, Inc. | Dynamic metrology schemes and sampling schemes for advanced process control in semiconductor processing |
US8070909B2 (en) | 2001-06-19 | 2011-12-06 | Applied Materials, Inc. | Feedback control of chemical mechanical polishing device providing manipulation of removal rate profiles |
US8694145B2 (en) | 2001-06-19 | 2014-04-08 | Applied Materials, Inc. | Feedback control of a chemical mechanical polishing device providing manipulation of removal rate profiles |
US8005634B2 (en) | 2002-03-22 | 2011-08-23 | Applied Materials, Inc. | Copper wiring module control |
US9088831B2 (en) | 2008-10-22 | 2015-07-21 | Rakuten, Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US9094721B2 (en) | 2008-10-22 | 2015-07-28 | Rakuten, Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US9420340B2 (en) | 2008-10-22 | 2016-08-16 | Rakuten, Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US9712868B2 (en) | 2011-09-09 | 2017-07-18 | Rakuten, Inc. | Systems and methods for consumer control over interactive television exposure |
CN117434912A (en) * | 2023-12-21 | 2024-01-23 | 宁晋县润博达医疗防护用品有限公司 | Method and system for monitoring production environment of non-woven fabric product |
CN117434912B (en) * | 2023-12-21 | 2024-02-20 | 宁晋县润博达医疗防护用品有限公司 | Method and system for monitoring production environment of non-woven fabric product |
Also Published As
Publication number | Publication date |
---|---|
AU8018200A (en) | 2001-05-14 |
EP1245003A1 (en) | 2002-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6952688B1 (en) | Knowledge-engineering protocol-suite | |
Matel et al. | An artificial neural network approach for cost estimation of engineering services | |
TWI235935B (en) | A knowledge-engineering protocol-suite system | |
Kibira et al. | Methods and tools for performance assurance of smart manufacturing systems | |
Thakkar et al. | Selection of third-party logistics (3PL): a hybrid approach using interpretive structural modeling (ISM) and analytic network process (ANP) | |
Greasley et al. | Enhancing discrete-event simulation with big data analytics: A review | |
Padayachee | An interpretive study of software risk management perspectives. | |
Ho et al. | An intelligent forward quality enhancement system to achieve product customization | |
Patterson et al. | Six sigma applied throughout the lifecycle of an automated decision system | |
Mohammadrezaytayebi et al. | Introducing a system dynamic–based model of quality estimation for construction industry subcontractors’ works | |
WO2001033501A1 (en) | A knowledge-engineering protocol-suite | |
Villarreal-Zapata et al. | Intelligent system for selection of order picking technologies | |
Hauke et al. | Individuals and their interactions in demand planning processes: an agent-based, computational testbed | |
CN115422402A (en) | Engineering prediction analysis method | |
Zidi et al. | A new approach for business process reconfiguration under uncertainty using Dempster-Shafer Theory | |
Rossit et al. | Knowledge representation in Industry 4.0 scheduling problems | |
Moutinho et al. | Strategic diagnostics and management decision making: a hybrid knowledge‐based approach | |
Miller | A conceptual framework for interdisciplinary decision support project success | |
Sader | An experimental approach to total quality management in the context of Industry 4.0 | |
Weidele et al. | AutoDOViz: Human-Centered Automation for Decision Optimization | |
Sajid | A methodology to build interpretable machine learning models in organizations | |
Ziv et al. | Improving nonconformity responsibility decisions: a semi-automated model based on CRISP-DM | |
Tse et al. | Solving complex logistics problems with multi-artificial intelligent system | |
Wissuchek et al. | Survey and Systematization of Prescriptive Analytics Systems: Towards Archetypes from a Human-Machine-Collaboration Perspective | |
Mehrez et al. | The meta-model of OR/MS |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| WWE | Wipo information: entry into national phase | Ref document number: 2000970863. Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 2000970863. Country of ref document: EP |
| REG | Reference to national code | Ref country code: DE. Ref legal event code: 8642 |
| WWW | Wipo information: withdrawn in national office | Ref document number: 2000970863. Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: JP |