
How we can improve nuclear data
evaluations using neural network
simulation techniques

A.Yu. Konobeyev, U. Fischer, P.E. Pereslavtsev

KIT – University of the State of Baden-Württemberg and
Large-scale Research Center of the Helmholtz Association
www.kit.edu
Objective

- to outline the possible uses and advantages of artificial neural networks for nuclear data evaluation
- to discuss the future development of methods for data evaluation using neural networks

JEFF Meeting. April 17-19, 2013 2


JEFF/EAF session
Neural networks
Human brain: neurons sending activation signals

http://de.123rf.com/

http://wps.prenhall.com

Algorithmic version: artificial neural network

Warren McCulloch (neurophysiologist) and Walter Pitts (logician), 1943

A model of a neuron
http://www.emilstefanov.net/Projects/NeuralNetworks.aspx
ftp://ftp.hampson-russell.com

Artificial neural network: neurons sending activation signals

Key components:
• the interconnections between different layers of neurons
• the learning process for updating the weights of the interconnections
• the activation function converting a neuron's weighted input to its output

A model of a complicated network (http://www.doc.ic.ac.uk); a model of a simple feedforward network (http://mechanicalforex.com)

Learning in an artificial neural network

Example: the back-propagation algorithm

• initialize the network weights, e.g. using uniformly distributed random numbers
• present the examples to the network and calculate the outputs
• compute the error

  E = (1/N) Σₖ₌₁ᴺ (1/p) Σⱼ₌₁ᵖ (Yⱼ − Dⱼ)²

  p: number of neurons in the output layer, N: number of examples

• update the weights backward

  Δwⱼᵢ = −η ∂E/∂wⱼᵢ

ftp://ftp.hampson-russell.com

Result: approximation of a function relating multiple input and output data
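The steps above can be sketched in code. The talk performs its fits in R; as a language-neutral illustration, here is a minimal back-propagation loop in plain Python (the network size, learning rate, and toy data are ours, not from the talk): one hidden layer, sigmoid activations, squared error, and weight updates Δw = −η ∂E/∂w.

```python
import math
import random

def train_network(samples, n_hidden=4, lr=0.5, epochs=3000, seed=1):
    """Train a one-hidden-layer feedforward net by back-propagation.
    samples: list of (x, d) pairs with scalar input x and target d in (0, 1)."""
    rng = random.Random(seed)
    # step 1: initialize the weights with uniformly distributed random numbers
    w1 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]  # input -> hidden
    b1 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]  # hidden -> output
    b2 = rng.uniform(-0.5, 0.5)

    def sig(z):
        return 1.0 / (1.0 + math.exp(-z))

    def forward(x):
        # step 2: present an example and calculate the output
        h = [sig(w1[i] * x + b1[i]) for i in range(n_hidden)]
        y = sig(sum(w2[i] * h[i] for i in range(n_hidden)) + b2)
        return h, y

    for _ in range(epochs):
        for x, d in samples:
            h, y = forward(x)
            # step 3: error gradient at the output for E = 1/2 (y - d)^2
            delta_out = (y - d) * y * (1.0 - y)
            # step 4: update the weights backward, delta_w = -lr * dE/dw
            for i in range(n_hidden):
                delta_h = delta_out * w2[i] * h[i] * (1.0 - h[i])
                w2[i] -= lr * delta_out * h[i]
                w1[i] -= lr * delta_h * x
                b1[i] -= lr * delta_h
            b2 -= lr * delta_out

    return lambda x: forward(x)[1]

# toy data: a smooth target on [0, 1] (purely illustrative)
data = [(i / 10.0, 0.2 + 0.6 * (i / 10.0)) for i in range(11)]
net = train_network(data)
mse = sum((net(x) - d) ** 2 for x, d in data) / len(data)
```

The mean squared error over the training examples shrinks as the updates accumulate, which is the whole content of the algorithm; everything else (more layers, other activations) is elaboration.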


Application of artificial neural networks

Financial
  Stock Market Prediction
  Credit Worthiness
  Credit Rating
  Bankruptcy Prediction
  Property Appraisal
  Fraud Detection
  Price Forecasts
  Economic Indicator Forecasts

Medical
  Medical Diagnosis
  Detection and Evaluation of Medical Phenomena
  Patient's Length of Stay Forecasts
  Treatment Cost Estimation

Industrial
  Process Control
  Quality Control
  Temperature and Force Prediction

Science
  Pattern Recognition
  Recipes and Chemical Formulation Optimization
  Chemical Compound Identification
  Physical System Modeling
  Ecosystem Evaluation
  Polymer Identification
  Recognizing Genes
  Botanical Classification
  Signal Processing: Neural Filtering
  Biological Systems Analysis
  Ground Level Ozone Prognosis

Data Mining
  Prediction
  Classification
  Change and Deviation Detection
  Knowledge Discovery
  Response Modeling
  Time Series Analysis

Sales and Marketing
  Sales Forecasting
  Targeted Marketing
  Service Usage Forecasting
  Retail Margins Forecasting

Operational Analysis
  Retail Inventories Optimization
  Scheduling Optimization
  Managerial Decision Making
  Cash Flow Forecasting

HR Management
  Employee Selection and Hiring
  Employee Retention
  Staff Scheduling
  Personnel Profiling

Energy
  Electrical Load Forecasting
  Energy Demand Forecasting
  Short and Long-Term Load Estimation
  Predicting Gas/Coal Index Prices
  Power Control Systems
  Hydro Dam Monitoring

http://www.alyuda.com

Nuclear physics
The potential of NN simulations has not yet been realized

Applications to date are rather limited:
  nuclear mass systematics
  special problems of high-energy physics

Advantages of using NNs:
  approximation and prediction of extremely complex functions, correlations, and interconnections

Computations

STATISTICA package: www.statsoft.com
Wolfram Mathematica: www.wolfram.com
MATLAB: www.mathworks.de
"R" package: www.r-project.org

no "exotic" knowledge is required

"R" is used in the following examples

Example 1

Generalized superfluid model: nuclear level density

U_eff = E_x − E_cond + δ_shift

δ_shift reflects "possible shortcomings of global systematics of pairing correlation function and collective enhancement coefficients"

• the use of a systematics parameter for the calculation of the nuclear level density is unavoidable
• the quality of the δ_shift systematics strongly influences the predicted cross-sections

Simple parameterization: δ_shift = a₁A + a₂,
avoiding the uncertainty of a "pure" mathematical approximation

δ_shift: U_eff = U + δ_shift (A.V. Ignatyuk, R. Capote, IAEA TECDOC-1506)


[Figure: δ_shift (MeV), from −2.0 to 2.0, vs. mass number (0–250); points: data; curve: commonly used approximation]

Systematic deviations and possible interconnections/correlations in the data motivate a neural network simulation
Polynomial of power 44, for which Σ(NN) = Σ(Pol)

[Figure: δ_shift (MeV), from −2.0 to 2.0, vs. mass number (0–250); points: data; curves: polynomial fit and NN fit]

[Figure: sum of squared deviations Σᵢ (δᵢ − δᵢ^approx)², from 20 to 55, vs. power of the polynomial (0–100); curves: polynomial fit and NN (2); δ_shift entering U_eff]
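The polynomial side of this comparison, minimizing Σᵢ (δᵢ − δᵢ^approx)², can be reproduced in outline. Below is a least-squares polynomial fit via the normal equations in plain Python; the synthetic data and the degree are our illustrative choices, not the actual δ_shift values fitted in the talk.

```python
import random

def polyfit_lsq(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    n = degree + 1
    # normal-equation matrix A[i][j] = sum x^(i+j), rhs b[i] = sum y x^i
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def poly_eval(coef, x):
    return sum(c * x ** i for i, c in enumerate(coef))

# synthetic "delta_shift vs. mass number" data: smooth trend plus scatter
rng = random.Random(0)
xs = [a / 250.0 for a in range(40, 250, 5)]  # scaled mass number
ys = [0.5 - 1.5 * x + 1.2 * x * x + rng.gauss(0, 0.05) for x in xs]

coef = polyfit_lsq(xs, ys, degree=4)
ss = sum((y - poly_eval(coef, x)) ** 2 for x, y in zip(xs, ys))
```

Raising the degree always lowers ss on the fitted points, which is exactly why the slides track the sum of squared deviations against polynomial power before comparing with the NN.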

Polynomial of power 100

[Figure: δ_shift (MeV), from −2.0 to 2.0, vs. mass number (0–250); points: data; curves: polynomial of power 100 and NN (2)]

Example 2

Systematics of (n,p) reaction cross-sections at 14.5 MeV

Evaluated cross-sections using measured data for 125 nuclei with A > 39

The best formula (KIT, 2006):

σ(n,p) = π r₀² (A^{1/3} + 1)² exp[−A^{0.5} (4.4785 S − 4.7174×10⁻² V − 0.27407)],  Z ≤ 50

σ(n,p) = π r₀² (A^{1/3} + 1)² A^{2/3} (0.75718 − 0.61348 S − 0.1511),  Z > 50

with S = (N − Z)/A
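Both branches of the systematics share the geometric factor π r₀² (A^{1/3} + 1)² and correlate the cross-section with the asymmetry parameter S = (N − Z)/A plotted on the next slide. A sketch of these two ingredients in Python; the radius parameter r₀ = 1.2 fm and the unit conversion are our assumptions for illustration, not values from the talk.

```python
import math

FM2_TO_MB = 10.0  # 1 fm^2 = 10 mb

def geometric_factor_mb(A, r0_fm=1.2):
    """pi * r0^2 * (A^(1/3) + 1)^2 converted to millibarn.
    r0_fm = 1.2 fm is an assumed typical nuclear radius parameter."""
    return math.pi * r0_fm ** 2 * (A ** (1.0 / 3.0) + 1.0) ** 2 * FM2_TO_MB

def asymmetry(N, Z):
    """Relative neutron excess S = (N - Z) / A, the correlating
    variable plotted against the (n,p) cross-section."""
    return (N - Z) / (N + Z)

# e.g. 56Fe: N = 30, Z = 26
s_fe = asymmetry(30, 26)
geo_fe = geometric_factor_mb(56)
```

The geometric factor sets the barn-scale prefactor, while the exponential or polynomial in S carries the strong isotopic dependence that both the formula and the NN must reproduce.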

The best semi-empirical formula vs. NN

[Figure, two panels at En = 14.5 MeV: (n,p) cross-section (mb), 10⁻¹ to 10³, vs. (N − Z)/A from 0.00 to 0.20; experimental data compared with the best semi-empirical formula (left panel) and the NN (right panel); deviation measures of 152.4 and 830.0 are shown on the panels]
Applications

• complex relationships between input and output data
• hidden structure of the data and "systematic deviations"
• interconnections between input data

Systematics of model parameters and cross-sections,
e.g. maximal values of excitation functions (EAF)

Approximations
Search for correlations between data
An alternative to common methods
  GLSM, UMC: calc + exp → eval
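The GLSM combination of calculation and experiment can be illustrated in its simplest scalar form, inverse-variance weighting; the numbers below are hypothetical, and real evaluations propagate full covariance matrices rather than single variances.

```python
def gls_combine(calc, var_calc, exp, var_exp):
    """Scalar generalized-least-squares update: combine a model calculation
    with an experimental value by inverse-variance weighting."""
    k = var_calc / (var_calc + var_exp)  # gain pulling toward the experiment
    value = calc + k * (exp - calc)
    variance = var_calc * var_exp / (var_calc + var_exp)
    return value, variance

# hypothetical example: calculated cross-section 100 +- 20 mb, measured 80 +- 10 mb
val, var = gls_combine(100.0, 20.0 ** 2, 80.0, 10.0 ** 2)
```

The evaluated value lands between calculation and experiment, weighted by their uncertainties, and its variance is smaller than either input; an NN-based systematics would enter such a scheme as an improved "calc" term.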

Conclusion

• the possible applications of neural networks for the evaluation of nuclear data were discussed
• the advantages of the method were pointed out
