Proceedings of the 2010 Industrial Engineering Research Conference

An Agent-Based Approach to Enhance Bio-Manufacturing Quality Control Using Data Mining
Tzu-Liang (Bill) Tseng1, Richard Chiou2, Chun-Che Huang3 and Johnny C. Ho4
1 Department of Industrial Engineering, The University of Texas at El Paso, USA
2 Applied Engineering Technology, Drexel University, USA
3 Department of Information Management, National Chi Nan University, Taiwan
4 Turner College of Business and Computer Science, Columbus State University, USA

Abstract
Quality Control (QC) is a process employed to ensure a certain level of quality in a product or service. One QC technique is to predict product quality from product features. However, traditional QC techniques have drawbacks: they depend heavily on the collection and analysis of data and frequently must deal with uncertainty. To improve the effectiveness of the QC process, this paper proposes an agent-based hybrid approach that incorporates data mining techniques such as rough set theory (RST). Under the agent-based framework, each agent performs one or more functions during the QC process. An empirical case study in bio-manufacturing shows that the proposed solution approach holds great promise for QC processes.

Keywords
Quality control, Agent technology, Rough set theory, Bio-manufacturing

1. Introduction
To date, quality has become one of the major manufacturing strategies and perhaps the single most important way to achieve success in a highly competitive manufacturing market. High-quality production provides advantages such as reduced scrap or re-machining cost and increased market share. To ensure quality in a machining process, it is important to respond to the dynamic environment quickly. According to the literature, decision rules are valuable in supporting QC procedures when variations occur in the machining process. Consequently, an effective prediction model that utilizes significant features of part quality is required in contemporary manufacturing, the so-called e-manufacturing. Traditionally, statistical process control (SPC) seeks to control and minimize variation in the manufacturing process. However, with SPC it is very difficult to set up the best manufacturing specifications in plants with complex sequential processes [1]. Moreover, statistical methods are not able to handle linguistic variables or uncertain and incomplete information. Therefore, a hybrid data mining approach for a QC system, which integrates rough set theory, a fuzzy logic system, and a genetic algorithm, is proposed and applied. To enhance the effectiveness and extensibility of the data mining approach, improvements are required in how information is gathered, managed, distributed, and delivered to decision-makers. Consequently, agent technology is a promising tool for enhancing the approach. In this paper, an agent-based approach is proposed to augment part quality control. Under the agent-technology-based framework, three main stages are identified and constructed for the QC prediction system: (1) Stage I, the quality control rule induction stage: a Rough Set (RS) based approach is used to select significant features and derive decision rules. (2) Stage II, the process variation modeling stage: after significant features are identified in Stage I, a Fuzzy Set (FS) approach is used because it can model and compensate for process variations effectively. (3) Stage III, the solution optimization stage: at this stage, a Genetic Algorithm (GA) is used to train the membership functions obtained at Stage II in order to optimize the fuzzy solution.
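The following minimal sketch, which is not part of the original paper, illustrates how the three stages described above could be chained. All class and function names (RuleInductionStage, predict_quality, etc.) are hypothetical placeholders for the staged architecture, not the authors' implementation.

    # Hypothetical sketch of the three-stage QC prediction pipeline described above.
    class RuleInductionStage:          # Stage I: rough-set-based feature/rule induction
        def run(self, records):
            # placeholder outputs: significant features and decision rules
            significant_features = ["F4", "F6", "F9"]
            decision_rules = [{"if": {"F4": 1, "F6": 1}, "then": "Good Part"}]
            return significant_features, decision_rules

    class ProcessVariationStage:       # Stage II: fuzzy modeling of process variation
        def run(self, features, records):
            return {"membership_functions": {f: "triangular" for f in features}}

    class SolutionOptimizationStage:   # Stage III: GA tuning of the fuzzy model
        def run(self, fuzzy_model):
            fuzzy_model["tuned"] = True
            return fuzzy_model

    def predict_quality(records):
        features, rules = RuleInductionStage().run(records)
        fuzzy_model = ProcessVariationStage().run(features, records)
        return SolutionOptimizationStage().run(fuzzy_model)

    if __name__ == "__main__":
        print(predict_quality(records=[]))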

2. Literature Survey
The traditional way of achieving and ensuring quality standards is mainly via statistical process control (SPC) procedures [7]. However, it is very difficult to set up the best manufacturing specifications for SPC by executing design of experiments (DOE) in plants that have large equipment or sequential processes [1]. Moreover, in sequential manufacturing processes, product quality is influenced by many factors that involve causal relationships and interact with each other. The conventional SPC and six sigma techniques must respect several statistical assumptions, such as normality of the distributions of the variables and constant variance of the variables, and it is hard to meet all these assumptions in practice. Furthermore, the uncertainty (i.e., variation) of vague observations is essentially non-statistical in nature, and hence such observations may not adequately support the random variation assumption inherent in statistical quality control methods. Current statistical approaches also have difficulty analyzing qualitative information, such as characterizing a qualitative variable at several levels. Therefore, the final solutions derived from standard statistical techniques may not be optimal, because these methodologies are not able to learn from historical data.
The literature shows that data mining is capable of improving quality control. Compared to standard statistical tools, which use a population-based approach, RST uses an individual, object-model-based approach, which makes it a very good tool for analyzing quality control problems [3]. FST has demonstrated its ability in a number of applications, especially for the control of complex non-linear systems that may be difficult to model analytically. The Genetic Algorithm (GA) operates on a population of solutions rather than a single solution [2]. A data mining approach combining variable precision rough sets and fuzzy sets produces a model that is more capable of handling noisy, uncertain, and complicated problems than its individual components [6], [7]. Based on the aforementioned deficiencies of current statistical approaches, and to resolve their drawbacks in quality control, a hybrid data mining approach which integrates rough set theory, fuzzy set theory, a genetic algorithm, and agent-based technology is proposed. Thus, the proposed approach is expected to provide a way to optimize prediction for the lowest defective rate.

3. Solution Approaches to the QC Problem
In this section, the proposed approach, which integrates the essence of RST, FST, and GA and provides solution alternatives to QC issues in conjunction with agent technology, is described. The holistic multi-agent environment and architecture, the quality control rule induction, the fuzzy rule system for part quality prediction and process variation modeling, and the solution optimization through the genetic algorithm are presented in Sections 3.1-3.4, respectively.

3.1 The agent-based hybrid approach
The multi-agent environment, which aims to enhance communication effectiveness for the quality prediction of machined parts, is presented in Figure 1. Three stages and eleven types of agents are constructed. Each stage has several agents and may use domain resources; the domain resources include not only databases and jobs but also other agents. In this agent-based system, each agent is able to perform one or more services. A service corresponds to some problem-solving activity in the QC process. Each service is managed by one agent, although the execution of its sub-services may involve a number of other agents; a sub-service, for example, may itself be managed by another agent. The latter case allows a nested (hierarchical) agent system to be constructed in which higher-level agents realize their functionality through lower-level agents (the lower-level agents have the same structure as the higher-level agents and can, therefore, have sub-agents as well as jobs). The nesting of services can be arbitrarily complex, and at the topmost level the entire business process can ultimately be viewed as a service. Since agents are autonomous, there are no control dependencies between them. Thus, if an agent requires a service, it cannot simply instruct another agent to start that service. Rather, the agents must come to a mutually acceptable agreement about the terms and conditions under which the desired service will be performed. The mechanism for making agreements is negotiation: a joint decision-making process in which the parties verbalize their (possibly contradictory) demands and then move towards agreement by a process of concession or a search for new alternatives.
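The sketch below, which is not from the paper, illustrates the nested agent/service structure and negotiation idea described above. The Agent class, its methods, and the toy concession rule are all assumed for illustration only.

    # Illustrative sketch of agents managing services, delegating to lower-level
    # agents, and starting a service only after a negotiated agreement.
    class Agent:
        def __init__(self, name, sub_agents=None):
            self.name = name
            self.sub_agents = sub_agents or []   # lower-level agents (nested hierarchy)

        def request_service(self, provider, service, terms):
            # Agents are autonomous: the service starts only if negotiation succeeds.
            agreed = provider.negotiate(terms)
            return provider.perform(service) if agreed else None

        def negotiate(self, terms):
            # Toy concession rule standing in for a real negotiation protocol.
            return terms.get("priority", 0) >= 1

        def perform(self, service):
            # A higher-level agent may realize a service through its sub-agents.
            results = [sub.perform(service) for sub in self.sub_agents]
            return {self.name: service, "delegated": results}

    if __name__ == "__main__":
        reduct_agent = Agent("reduct_generation")
        stage1 = Agent("rule_induction_stage", sub_agents=[reduct_agent])
        coordinator = Agent("coordinator")
        print(coordinator.request_service(stage1, "derive_rules", {"priority": 2}))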

Figure 1: Environment of the proposed agent-based approach

3.2 The feature & rule extraction stage and its solution procedure
To reduce the complexity of the original data records, the reduct generation agent is developed. To obtain significant features from the data set, the rule extraction agent is constructed. A procedure is also required to validate the derived reduct rules against the testing set; to this end, the rule-validation agent is provided. The three types of agents are illustrated in the following subsections.

3.2.1 The reduct generation agent
A reduct, which is generated by the reduct generation agent of the feature & rule extraction stage, is a minimal sufficient subset of features that provides the same quality of discriminating concepts as the original set of features. Most rough set based approaches may generate more than one reduct for an object. This paper adopts the reduct generation procedure proposed by [4].

3.2.2 The rule extraction agent
To extract significant features from useful reduct rules, the rule extraction agent is proposed based on the heuristic algorithm developed by Tseng and Huang (2006). With this algorithm, feature sets are used for predicting an object's outcome based on the training set. The rule extraction procedure consists of the following steps:
Step 1. Define a proper feature set and a target file. The data set is randomly divided into the training set and the testing set. This step is critical for obtaining high-accuracy outcomes from the algorithm.
Step 2. Examine each object in the set for completeness. If an object is incomplete, delete it from the file; otherwise, repeat through all objects.
Step 3. Determine the final rules from the candidate decision rules generated by the reduct generation procedure.
Step 4. If the candidate reduct rules are satisfied, transfer the reduct rules to Step 5; otherwise, go to Step 3.
Step 5. After the RST approach is applied, domain experts and knowledge workers select the features that appear with high frequency in the premises of the decision rules as significant features. The significant features will be used in the fuzzy logic system illustrated in the next section.
Step 6. Stop and output the results (i.e., the potential rules for further validation).

3.2.3 The rule-validation agent
To validate the decision rules derived from the reduct rules using a threshold, which is determined by the domain expert, the rule-validation agent is provided and its validation procedure is presented next (Figure 2).
Step 1. Compare each reduct rule derived from the rule-extraction algorithm with each new object from the test set. Repeat the comparison of the reduct rules with objects from the test set until no reduct rule is left.
Step 2. Calculate how many objects are matched with each rule.
Step 3. Calculate the accuracy of each rule as the number of correctly matched objects (for each rule) divided by the sum of the correctly matched and incorrectly matched objects, and go to Step 4.
Step 4. If the accuracy of the rule is greater than a predefined threshold value (e.g., 60%), go to Step 5; otherwise, remove the rule, restore the objects associated with the unsatisfied rule, and go to Step 3.
Step 5. Collect all of the satisfied reduct rules. If all of the candidate decision rules are satisfied, go to Step 6; otherwise, go to Step 4.
Step 6. Stop and output the results.
Finally, the critical rules are validated by the validation agent, the significant features are extracted, and the decision rules are generated.

Figure 2: The procedure of the rule validation agent
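A minimal sketch of the rule-validation step described above is given below. It assumes decision rules are stored as {"if": {feature: value, ...}, "then": label} dictionaries and test objects as {feature: value, ..., "label": label}; these encodings and the 60% threshold are illustrative, and only the accuracy-and-threshold logic follows the text.

    # Keep a rule only if its accuracy on the matched test objects meets the threshold.
    def rule_matches(rule, obj):
        return all(obj.get(f) == v for f, v in rule["if"].items())

    def validate_rules(rules, test_set, threshold=0.60):
        validated = []
        for rule in rules:
            matched = [obj for obj in test_set if rule_matches(rule, obj)]
            if not matched:
                continue                       # no evidence in the test set; skip the rule
            correct = sum(1 for obj in matched if obj["label"] == rule["then"])
            accuracy = correct / len(matched)  # correct / (correct + incorrect)
            if accuracy >= threshold:
                validated.append((rule, len(matched), accuracy))
        return validated

    if __name__ == "__main__":
        rules = [{"if": {"F4": 1, "F6": 1}, "then": "Good Part"}]
        test_set = [{"F4": 1, "F6": 1, "label": "Good Part"},
                    {"F4": 1, "F6": 1, "label": "Good Part"},
                    {"F4": 0, "F6": 1, "label": "Defective"}]
        print(validate_rules(rules, test_set))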

3.3 The quality prediction stage
The proposed hybrid FLS takes as input the significant features from the feature & rule extraction stage and predicts the part quality using an inference engine. A fuzzy set is defined on a universe of discourse X and is characterized by a membership function μF(x) that takes on values in the interval [0, 1]. The FLS consists of four components: fuzzifier, rules, inference, and defuzzifier. To develop the FLS, the quality prediction stage therefore involves four agents: the fuzzifier agent converts crisp numbers to fuzzy sets; the rule determination agent defines fuzzy rules corresponding to all level combinations of the significant features; the inference agent infers the prediction based on the fuzzy rules; and the defuzzification agent maps the fuzzy set from the inference agent into a crisp result.

3.4 The optimization stage
The GA agent is used to search for the optimal solution of the quality prediction. Since the GA is suited to self-learning and self-organization, it is effective in searching for the optimal solution of the FLS. The purpose of GA adaptation is to adapt the membership function of each fuzzy rule such that the inference agent can predict more accurately; a good fuzzy rule base is determined by well-fitted membership functions and fuzzy rules. There are two different ways to use the GA agent: one is to fix the fuzzy rules and adapt the membership functions, and the other is to fix the membership functions and adapt the fuzzy rules. In this paper, the former method is considered, because adapting the fuzzy rules becomes complicated when the number of rules is much larger than the number of input variables. The fuzzy input membership functions remain the same, while the fuzzy output membership functions are adapted to minimize errors. The GA adaptation starts with approximate control rules derived from the empirical models and refines them through a learning process when process variations occur. All the fuzzy rules are encoded as chromosomes. The solution procedure for the GA agent is illustrated as follows:
Step 1. Initiate the population according to the principle of gene encoding.
Step 2. Determine the number of individuals for the next stage. If the number of individuals is too low, the evolution is slow; if the number is huge, the computation becomes complicated.
Step 3. At the evaluation stage, the purpose of evaluation is to provide a standard for selection. The initial individuals are tested extensively and provide feedback, which expresses the performance of an individual as its fitness. For example, the evaluation function would rate the chromosomes as follows:
eval(v1) = f(x1) = e1   (1)
where the chromosome v represents the real value x and e is the result of the fitness function eval(v). The GA fitness function is given as follows:
Min E(i) = 1/2 Σ (yi − di)^2   (2)
where E(i) is the error between the actual defective rate and the fuzzy output, yi is the fuzzy output, and di is the resulting part quality.
Step 4. At the selection stage, choose individuals as parents according to their fitness values; generally, a higher fitness value is favored.
Step 5. Determine the mutation probability to mutate the new population members randomly.
Step 6. Cross over, in turn, two members of the selected part of the population to form a new population member.
Step 7. Produce the next generation of the population. Repeat Step 3 to Step 7 until a satisfactory solution is obtained.
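The small, self-contained sketch below illustrates the GA adaptation idea described above: the fuzzy rules are fixed and the GA searches output membership parameters that minimize E = 1/2 Σ (yi − di)^2. The one-gene-per-rule encoding, the toy inference, and the population and mutation settings are illustrative assumptions, not the paper's exact design.

    import random

    TRAINING = [({"F4": 0.2, "F6": 0.8}, 0.1),   # (significant-feature inputs, observed quality d_i)
                ({"F4": 0.7, "F6": 0.3}, 0.6),
                ({"F4": 0.9, "F6": 0.9}, 0.8)]

    def fuzzy_output(chromosome, x):
        # Toy inference: weight each rule's output level (the chromosome genes)
        # by a crude firing strength computed from the inputs.
        w1, w2 = x["F4"], 1.0 - x["F6"]
        total = (w1 + w2) or 1e-9
        return (w1 * chromosome[0] + w2 * chromosome[1]) / total

    def fitness_error(chromosome):
        # E = 1/2 * sum((y_i - d_i)^2), as in Equation (2) above.
        return 0.5 * sum((fuzzy_output(chromosome, x) - d) ** 2 for x, d in TRAINING)

    def evolve(pop_size=20, generations=50, mutation=0.1):
        population = [[random.random(), random.random()] for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness_error)          # evaluation + selection
            parents = population[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2]                    # crossover
                child = [g + random.uniform(-mutation, mutation) for g in child]  # mutation
                children.append(child)
            population = parents + children
        return min(population, key=fitness_error)

    if __name__ == "__main__":
        best = evolve()
        print("best genes:", best, "error:", fitness_error(best))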

4. Case Study
This case study, conducted by the authors, illustrates the methodology presented in Section 3 by implementing the integrated data mining approach in the bio-manufacturing (BM) process called the "Dip-Spinning Coating" process. Section 4.1 describes the background of the problem and the features that impact the quality of the final product at ABC Inc. The remainder of Section 4 solves the quality control problem by applying the methodology discussed in this paper and analyzes the computational results.

4.1 Background and problem description
The bio-manufacturing company, ABC Inc., plans to investigate the features which impact the quality of the abdominal aortic aneurysm model and to develop a prediction model for the selected features (see Figure 3) during the Dip-Spinning Coating process. The features of the Dip-Spinning Coating process include Dipping orientation (F1), Curing temperature (F2), Curing time (F3), Numbers of dipping (F4), Rotation speed (F5), Prototype mold (F6), Withdrawn rate (F7), Silicone solution viscosity (F8), and Diameter of the mold (F9), while the output feature is whether the part is a good part or defective. The analysts of the BM department of ABC Inc. were responsible for extracting reliable and concise decision rules from the given process features. The analysts were also required to determine which attributes are significant to the derived rules and to prove that the rules are valid in determining the relationship between the features and the quality of the final product. In other words, the rules with strong evidence, i.e., rules supported by more examples and with as few attributes as possible, are the focus of this project, as is the development of the prediction model.

Figure 3: Example abdominal aortic aneurysm model manufactured in the UTEP Keck Center for a biomedical device manufacturer, showing the geometric computer model of the patient-specific anatomy (left) and the resulting flexible models manufactured for cardiovascular device deployment testing (right)

4.2 Computational results
All data sets, comprising 65 customer order records, are cleansed and the incomplete records are removed by the data collection agent. The 65 records were reduced to 55, since some data in 10 of the orders were incomplete. The 55 records are divided into two groups: one is the training data set, which is used to derive the decision rules, and the other is the testing data set, which is used to verify the decision rules. The results of reduct generation and the validated decision rules are presented below. Note that two values are listed in the brackets: the first is the number of supporting objects and the second is the accuracy.
(1) IF (F3 = 0) AND (F4 = 1) AND (F9 = 1) THEN (Product = "Good Part") [10, 95%]
(2) IF (F1 = 1) AND (F2 = 2) THEN (Product = "Good Part") [8, 89%]
(3) IF (F4 = 0) AND (F9 = 3) THEN (Product = "Defective") [5, 75%]
(4) IF (F3 = 2) AND (F4 = 3) AND (F7 = 1) THEN (Product = "Good Part") [4, 75%]
(5) IF (F2 = 0) AND (F3 = 1) THEN (Product = "Good Part") [3, 82%]
(6) IF (F4 = 1) AND (F6 = 1) THEN (Product = "Good Part") [3, 90%]
(7) IF (F1 = 2) AND (F6 = 1) AND (F9 = 1) THEN (Product = "Defective") [4, 80%]

4.3 The results at the quality prediction stage
At the feature & rule extraction stage, Feature 4 (Numbers of dipping), Feature 6 (Prototype mold), and Feature 9 (Diameter of the mold) are found to be significant in the fabrication of the part, since they are selected, using the Rough Set software developed at The University of Texas at El Paso, from the premises of the eight decision rules. These three features are different in nature. According to their different levels of uncertainty, different types (e.g., Type I and Type II) of Fuzzy Logic System (FLS) are desired for further investigation. The membership function of the Type I FLS is applied to Feature 6, while the membership function of the Type II FLS is used for Features 4 and 9, since Features 4 and 9 are more similar to each other and their cohesion contains more uncertainty than Feature 6. In this case, the triangular membership function is applied, since it is simple and suitable for most conditions. Since two different types of membership functions are incorporated into the FLS, the combined FLS is called a Hybrid FLS.

4.4 Evaluation of Type I and Hybrid FLS using the GA agent
To enable a comparison with the Hybrid system, the Type I FLS was applied as a baseline. Since the construction of the fuzzy membership functions and the derivation of the rules depend on the expert's knowledge and experience, a subjective judgment might lead to inaccuracy. Consequently, the software is used to refine the original fuzzy membership functions with the empirical data. There are three cases in the comparison of prediction performance: (1) Type I FLS (after GA training), (2) Hybrid FLS (before GA training), and (3) Hybrid FLS (after GA training). Figure 4 shows the performance of these three cases. In the case study, the proposed approach provides higher accuracy of quality prediction than the other similar approaches, since the curve generated by the hybrid approach (after training) is closer to the real quality curve under most conditions.

Figure 4: Comparison of prediction performance of three cases of FLS (quality vs. part number for the real quality and for Case I: Type I, after training; Case II: Hybrid, before training; Case III: Hybrid, after training)

5. Conclusions
In this paper, the hybrid approach is developed through a three-stage approach in an environment that includes twelve agents, each with its own functionality and unique solution procedure. The proposed solution approach has the advantage of compensating for the weaknesses of traditional quality techniques; in the case study, the traditional techniques applied alone produce non-optimum results. The decision rules generated during the feature & rule extraction stage are able to provide decision support for quality improvement in manufacturing processes. The outcomes generated at the feature & rule extraction stage are significant features, which can be used to model and compensate for process variations effectively through the FLS. Finally, the GA approach searches for the optimum solutions by incorporating the constructed FLS. After this investigation, the operators, process designers, quality engineers, and inspectors can focus on the selected features, since the overall part quality will be improved through intensive care of these significant features. The rules derived from the data set provide an indication of how to study this problem further and pave a path for effective further investigation. This paper forms the basis for solving many other similar problems that occur in manufacturing industries.

Acknowledgements
This work was supported by the US National Science Foundation (CCLI Phase I DUE-0737539) and the US Department of Education (Award #P116B080100A). The authors wish to express sincere gratitude for this financial support.

References
[1] Boo, K., Sang, and Deok, 1999, "Intelligent process control in manufacturing industry with sequential processes," International Journal of Production Economics, Vol. 60-61, pp. 583-590.
[2] Goldberg, D.E., 1989, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA.
[3] Kusiak, A., 2001, "Rough Set Theory: A data mining tool for semiconductor manufacturing," IEEE Transactions on Electronics Packaging Manufacturing, Vol. 24, No. 1, pp. 44-50.
[4] Pawlak, Z., 1991, Rough Sets: Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers, Boston.
[5] Tseng, T.-L. (Bill), and Huang, C.-C., 2006, "Rough set-based approach to feature selection in customer relationship management," Omega, In Press, Corrected Proof, Available online.
[6] Tseng, T.-L. (Bill), Kwon, Y., and Ertekin, Y.M., 2005, "Feature-based rule induction in machining operation using rough set theory for quality assurance," Robotics and Computer-Integrated Manufacturing, Vol. 21(6), pp. 559-567.
[7] Zhai, Luo, M., Xie, M., and Xu, 2004, "Fuzzy control for manufacturing quality based on variable precision rough set," Intelligent Control and Automation, pp. 2347-2351.
