
Guojian Cheng

School of Computer Science, Xi’an Shiyou University, Xi’an 710065, Shaanxi, China. gjcheng@xsyu.edu.cn

Lan Zeng

School of Computer Science, Xi’an Shiyou University, Xi’an 710065, Shaanxi, China. zenglan531@126.com

Shiyou Lian

School of Computer Science, Xi’an Shiyou University, Xi’an 710065, Shaanxi, China. lsy7622@126.com

Abstract—In this paper, we present a method based on rough set attribute reduction and support vector machine regression that can be used to predict three important petroleum reservoir parameters: porosity, permeability and saturation. First, we use rough set theory to reduce the attributes of the sampling dataset, selecting the decision-relevant attributes to constitute a new sampling dataset. Second, we use Support Vector Regression (SVR) to train on the data and establish the predicting model, after which the test data are predicted. The experimental results show that the method achieves a better fitting result, reduces the computational complexity of SVR on the training dataset, and improves the accuracy of the predicted reservoir physical parameters. The implementation of the method can provide a foundation for decision making in reservoir development.

Keywords-Rough Set; Support Vector Regression; Saturation; Porosity; Permeability

I. INTRODUCTION

Porosity, permeability and saturation are the three main petrophysical parameters used to evaluate a reservoir, and their variation in space is of great significance for the exploration and development of oil reservoirs. How to obtain accurate values of these three parameters is therefore an important problem: predicting the reservoir parameters accurately can improve the efficiency of exploration and provide an important foundation for drawing up a reasonable production plan. In addition to the many empirical formulas introduced by geological experts, there are also newer methods such as multivariate regression and neural networks, all of which can be used to predict porosity, permeability and saturation. Rough Set (RS) theory can be used to reduce the attribute dimensionality, that is, to select the necessary condition attributes and thereby reduce the dimension of the sampling dataset; moreover, the Support Vector Machine (SVM) can solve nonlinear problems effectively. We therefore use a method based on the combination of RS and SVM to establish the predicting model of the reservoir parameters.

II. ATTRIBUTE REDUCTION BY ROUGH SET

The theory of RS was introduced by the Polish scientist Pawlak [1] for dealing with uncertainty and ambiguity, and it can be used in fields such as data mining and pattern recognition. The idea of attribute reduction in RS is to find the important attributes among the condition attributes so as to constitute the best combination of reduced attributes. It can be expressed mathematically as follows. Given an information system S = (A, B, C, D), where A = {x1, x2, ..., xn} is the universe of discourse and B = E ∪ F is a finite set of attributes with E ∩ F = ∅ (C and D denote the value domain and the information function, respectively); E is the set of condition attributes and F is the set of decision-making attributes. The decision-making table consists of the condition attributes and the decision-making attributes. If ai ∈ E and POS_E(F) = POS_(E − {ai})(F), then ai is called an unnecessary attribute in E; otherwise it is necessary. According to the theory of RS, condition-attribute reduction takes the following steps:

Step 1: Calculate the equivalence classes of the condition attributes and the decision-making attributes;
Step 2: Calculate the approximation sets of each decision-making attribute;
Step 3: Calculate the value of POS_E(F);
Step 4: Divide the set of condition attributes and calculate each POS_E′(F). If POS_E(F) = POS_E′(F), then the condition attributes E′ form a reduction of the decision-making table; otherwise they do not.

All attribute reductions of the decision-making table can be obtained by the above calculation steps.
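The positive-region test in the steps above can be sketched in code. The sketch below is illustrative, not the paper's implementation; the attribute names (`depth`, `gamma`, `porosity`) and the toy samples are hypothetical stand-ins for the well-logging attributes:

```python
# Hypothetical sketch of positive-region attribute reduction (Steps 1-4).
# Attribute names and toy samples are illustrative stand-ins, not the
# paper's well-logging data.

def partition(samples, attrs):
    """Step 1: group sample indices into equivalence classes induced by attrs."""
    classes = {}
    for i, s in enumerate(samples):
        classes.setdefault(tuple(s[a] for a in attrs), set()).add(i)
    return list(classes.values())

def positive_region(samples, cond_attrs, decision):
    """Steps 2-3: indices whose equivalence class is pure w.r.t. the decision."""
    pos = set()
    for eq in partition(samples, cond_attrs):
        if len({samples[i][decision] for i in eq}) == 1:  # lower approximation
            pos |= eq
    return pos

def reduce_attributes(samples, cond_attrs, decision):
    """Step 4: drop each attribute whose removal preserves the positive region."""
    reduct = list(cond_attrs)
    full_pos = positive_region(samples, reduct, decision)
    for a in list(reduct):
        trial = [x for x in reduct if x != a]
        if trial and positive_region(samples, trial, decision) == full_pos:
            reduct = trial  # a is unnecessary: POS_E(F) == POS_(E - {a})(F)
    return reduct

# Toy decision table: "porosity" is fully determined by "depth" alone,
# so "gamma" is a dispensable condition attribute.
samples = [
    {"depth": 0, "gamma": 0, "porosity": 0},
    {"depth": 1, "gamma": 0, "porosity": 1},
    {"depth": 0, "gamma": 1, "porosity": 0},
    {"depth": 1, "gamma": 1, "porosity": 1},
]
print(reduce_attributes(samples, ["depth", "gamma"], "porosity"))  # ['depth']
```

On the toy table, removing `gamma` leaves the positive region unchanged, so only `depth` survives; on real well-logging data the same test drives the reduction from 15 attributes down to 9.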

III. SUPPORT VECTOR MACHINE REGRESSION MODEL

The regression problem of SVM is similar to the classification problem; their difference lies in the values of the output variables [2]. Given a set of sampling data {(x1, y1), ..., (xm, ym)}, where xi ∈ Rⁿ, yi ∈ R, i = 1, 2, ..., m, the output variable y takes the value −1 or 1 in a classification problem, whereas in a regression problem y can take any real number. The regression problem can be described in mathematical language as follows:

978-1-4244-6349-7/10/$26.00 © 2010 IEEE
2010 2nd International Conference on Computer Engineering and Technology, Volume 2

Given a set of sampling data

S = {(x1, y1), ..., (xm, ym)} ∈ Rⁿ × R   (1)

where xi ∈ Rⁿ and yi ∈ R, the sampling data consist of m samples with n-dimensional input vectors. The goal of regression is to find a function f(x) on Rⁿ that makes y = f(x) fit the target values. We first consider the problem of linear regression [3], whose goal is to obtain a linear function of the following form:

f(x) = ω · x + b   (2)

where ω is a parameter vector, (ω · x) is the inner product of ω and x, and b is a threshold value. SVR can be reduced to an optimization problem by minimizing the following function:

min(ω, b)  (1/2)‖ω‖² + c Σ(i=1..m) (ξi + ξi*)   (3)
s.t.  f(xi) − yi ≤ ε + ξi*,  yi − f(xi) ≤ ε + ξi,  ξi, ξi* ≥ 0,  i = 1, ..., m

where ξi and ξi* are relaxation factors, ε is an error limit set in advance, and c is a penalty coefficient; it is a compromise factor between the training error rate and the generalization ability and is used to control the complexity of the model. We adopt Lagrange multipliers to transform this optimization problem into the corresponding dual problem (the two problems are dual to each other), so that formula (3) is transformed into solving the maximum of the following function:

L(α, α*) = −(1/2) Σ(i,j=1..m) (αi − αi*)(αj − αj*) K(xi, xj) + Σ(i=1..m) yi (αi − αi*) − ε Σ(i=1..m) (αi + αi*)   (4)
s.t.  Σ(i=1..m) (αi − αi*) = 0,  0 ≤ αi, αi* ≤ c,  i = 1, ..., m

where αi and αi* are the Lagrange multipliers. By introducing the kernel function approach, the regression function given by formula (2) can be written as:

f(x) = Σ(i=1..m) (αi − αi*) K(x, xi) + b   (5)

This is the target function of SVR, where K in formula (5) stands for the kernel function.

IV. WELL LOGGING PARAMETERS PREDICTING BY COMBINATION OF RS AND SVR

A. Attribute Reduction of Well Logging and Samples Selection

In this paper, we adopt a well-logging sampling dataset from an oil field in the northeast of China and carry out predicting research on the petroleum reservoir parameters. The three reservoir parameters to be predicted are porosity, permeability and saturation. In order to predict them accurately, we first reduce the attributes so as to decrease the time and space complexity of training. The sampling dataset contains 135 samples, and the number of attribute parameters is 15, including well depth, borehole diameter, thickness, resistivity, compensated density, natural gamma, gamma ray and so on. According to the attribute-reduction steps mentioned above, the attribute dimension is decreased from 15 to 9, and a new sampling dataset is established. Because of the lack of sufficient well-logging samples, we select different samples when predicting the three parameters: for porosity predicting, the number of training samples is 90 and the number of testing samples is 39; for permeability predicting, the numbers of training and testing samples are 90 and 41 respectively; for saturation predicting, we select 94 samples as training samples and 41 samples as testing samples.

B. Well Logging Reservoir Parameters Predicting Flowchart

The predicting flowchart of the three parameters is shown in Figure 1.

Figure 1. Flowchart of well logging for parameters predicting.

C. Model Construction

Since the Radial Basis Function (RBF) can solve complex nonlinear and function-approximation problems, we select the RBF to construct the discriminant function of the SVM. The expression of the RBF is:

K(x, xi) = exp(−γ‖x − xi‖²)   (6)
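The kernel expansion of formulas (5)–(7) can be illustrated directly in code. The support vectors, dual coefficients (αi − αi*), γ and b below are hypothetical placeholders, not values fitted to the paper's well-logging data:

```python
# Sketch of the SVR target function of formulas (5)-(7):
#   f(x) = sum_i (alpha_i - alpha_i*) K(x, xi) + b,
#   K(x, xi) = exp(-gamma * ||x - xi||^2).
# Support vectors, dual coefficients, gamma and b are hypothetical
# placeholders, not fitted values.
import numpy as np

def rbf_kernel(x, xi, gamma):
    """Formula (6): K(x, xi) = exp(-gamma * ||x - xi||^2)."""
    return np.exp(-gamma * np.sum((x - xi) ** 2))

def svr_predict(x, support_vectors, dual_coef, b, gamma):
    """Formula (7): kernel expansion over the support vectors plus threshold b."""
    return sum(c * rbf_kernel(x, sv, gamma)
               for c, sv in zip(dual_coef, support_vectors)) + b

support_vectors = np.array([[0.0, 0.0], [1.0, 1.0]])
dual_coef = [0.5, -0.5]   # (alpha_i - alpha_i*); sums to 0, as required by (4)
b, gamma = 0.1, 1.0

print(svr_predict(np.array([0.0, 0.0]), support_vectors, dual_coef, b, gamma))
# 0.5 * 1 - 0.5 * exp(-2) + 0.1 ≈ 0.5323
```

At a query point coinciding with a support vector the kernel evaluates to 1, and distant support vectors contribute exponentially less, which is what makes the RBF suitable for local nonlinear approximation.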

Taking (6) into (5), we obtain the function of SVR used by the model:

f(x) = Σ(i=1..m) (αi − αi*) exp(−γ‖x − xi‖²) + b   (7)

In the process of training, we select the appropriate parameters through a series of cross-validations in order to obtain the best mean square deviation and correlation coefficient. After comprehensive comparison, the training parameters for porosity, permeability and saturation predicting are shown in Table 1:

TABLE I. THE TRAINING PARAMETERS FOR POROSITY, PERMEABILITY AND SATURATION PREDICTING

Reservoir parameter | Kernel parameter γ | Penalty factor c | Error control parameter ε
Porosity            | 0.25               | 1                | 0.05
Saturation          | 0.0025             | 1260             | 0.00001
Permeability        | 0.5                | 10000            | 0.35

D. Experimental Results and Analysis

The fitting results of porosity and saturation are shown in Figures 2 and 3. As can be seen from Figures 2 and 3, a good fit between the predicted values and the real values is obtained. The mean square error (MSE) is 0.198161 for porosity and 0.053218 for saturation.

Figure 2. The curve fitting result for porosity predicting.

Figure 3. The curve fitting result for saturation predicting.

The fitting result of permeability is shown in Figure 4.

Figure 4. The curve fitting result for permeability predicting.

In order to compare the advantages and disadvantages of the two methods, we adopt an Artificial Neural Network (ANN) and the combination of RS and SVR (RS-SVR) respectively to predict permeability; the permeability sampling dataset in the experiment is the same for both methods. The predicting results of the 19 testing samples are listed in Table 2, and the absolute errors calculated for the 40 permeability samples are shown in Figure 5.

Figure 5. The comparative result of absolute error for permeability predicting.

As can be seen from Table 2 and Figure 5, although some data points predicted by the ANN are better, the global errors of the ANN are bigger than those of RS-SVR, and the fitting result of RS-SVR is better than that of the ANN. The MSE for the methods of ANN and RS-SVR is 0.231589 and 0.060138 respectively, so the predicting result of RS-SVR is much better than that of the ANN.
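The parameter selection by cross-validation described above can be sketched with scikit-learn's grid search. The synthetic data and the grid values below are illustrative stand-ins, not the paper's dataset or its final parameter choices:

```python
# Sketch of cross-validated parameter selection over gamma, c and epsilon.
# Synthetic data and grid values are illustrative stand-ins only.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(42)
X = rng.uniform(size=(90, 9))              # 90 samples, 9 reduced attributes
y = X @ rng.uniform(size=9) + 0.05 * rng.normal(size=90)

grid = {
    "gamma": [0.0025, 0.25, 0.5],          # kernel parameter
    "C": [1, 1260, 10000],                 # penalty factor c
    "epsilon": [0.00001, 0.05, 0.35],      # error control parameter
}
# 5-fold cross-validation over all 27 parameter combinations,
# scored by mean square error as in the paper's model selection.
search = GridSearchCV(SVR(kernel="rbf"), grid, cv=5,
                      scoring="neg_mean_squared_error")
search.fit(X, y)
print(search.best_params_)
```

One fitted model per parameter combination and fold keeps the comparison fair; the combination with the best cross-validated MSE is then retrained on the full training set before predicting the testing samples.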

V. CONCLUSIONS

A method based on the combination of rough set and support vector regression is presented for reservoir parameters predicting, by analyzing the characteristics of rough set, support vector regression and well-logging sampling data. By using the algorithm of RS attribute reduction, the input dimension of SVR and the computational complexity of the training process can be reduced, and good predicting accuracy can be obtained by using the method of SVR. The experimental results show that reservoir parameters predicting by the combination of RS and SVR achieves much better generalization. Thus we can further understand the structure of a petroleum reservoir and supply better decision making for reservoir exploration and exploitation.

ACKNOWLEDGMENT

This work is supported by the National Natural Science Foundation of China (NSFC, Grant 40872087).

TABLE II. THE COMPARATIVE RESULTS OF THE RS-SVR AND ANN (for testing samples 1–19: true value of permeability, predictive values of RS-SVR and ANN, and absolute errors of RS-SVR and ANN)

REFERENCES

[1] Wang Guo-yin. Rough Sets Theory and Knowledge Acquisition [M]. Xi’an: Xi’an Jiao-tong University Press, 2001.
[2] Bai Peng, Zhang Xi-bin, et al. Support Vector Machine and its Application in Mixed Gas Infrared Spectrum Analysis [M]. Xi’an: Xi-dian University Press, 2008.
[3] Qi Xiao-ming. A Survey on Rough Set Theory [J]. Metrology Technology.
[4] Cheng Hua, Xue Lei, et al. Forecast method of logging physical property parameters based on LS-SVM [J]. Journal of Petrochemical Universities, vol. 19, December 2006.
[5] Zhang Guang-qun, Wang Hang-jun, Fang Lu-ming. Implementation and application of attribute reduction algorithms [J]. Computer Engineering and Design, vol. 28, 2007.
[6] Deng Shao, Sun Guo-qiang, et al. A Forecasting Model of Productivity Contribution in Single Zone of Multiple-Zone Production Based on Support Vector Regression [J]. Computer Engineering and Applications, vol. 43, February 2007.
[7] Jong-Se Lim, Jungwhan Kim. Reservoir Porosity and Permeability Estimation from Well Logs using Fuzzy Logic and Neural Networks [J]. Society of Petroleum Engineers, 2004.
[8] W. H. Al-Bazzaz and Y. W. Al-Mehanna. Permeability Modeling Using Neural-Network Approach for Complex Mauddud-Burgan Carbonate Reservoir [J]. Society of Petroleum Engineers, 2007.

