College of Business
Summer II-2016

Course Name: Risk Analysis and Modeling

Course Code: BUSN 415


Submission Date:

Summer II

20 August 2016

Student Name: Abdulla Alhammadi

Student ID: S0000000635

Section:

Total Marks: 20

Weightage: 20%




This is an individual assessment. Group work is not allowed.

The University strictly observes its plagiarism policy; similarity of up to 20% only is permitted.

SUBMIT the Assignment through the STUDENT PORTAL.

ONLY Assignments submitted through the Student Portal will be marked.

Assignments submitted after the DUE DATE will not be ENTERTAINED.



QUESTION 1. WHAT IS THE DIFFERENCE BETWEEN RISK AND UNCERTAINTY?

Risk is defined as the situation of winning or losing something of worth, whereas uncertainty is a condition in which there is no knowledge about future events: the outcomes are unknown because the future is unpredictable. Risk can be measured and quantified through theoretical models, but it is not possible to measure uncertainty in quantitative terms. In risk, the potential outcomes are known and probabilities can be assigned to a set of circumstances, which is not possible in the case of uncertainty. Risk can be controlled if proper measures are taken, and it can be minimized by taking essential precautions; uncertainty, by contrast, is beyond the control of the person or organization and cannot be minimized.

QUESTION 2. WHAT IS THE MAJOR LIMITATION OF ONLY USING MONTE CARLO SIMULATION TO PERFORM RISK ANALYSIS?

One of the major difficulties of Monte Carlo simulations of proteins in an explicit solvent is the difficulty of carrying out large-scale moves. Any move that significantly alters the internal coordinates of the protein without also moving the solvent particles will most likely result in a large overlap of atoms and, consequently, the rejection of the trial configuration. Simulations using an implicit solvent do not suffer from these drawbacks; as a result, coarse-grained protein models are the most popular systems in which Monte Carlo techniques are used. Moreover, because one does not solve Newton's equations of motion, no dynamical information can be gathered from a conventional Monte Carlo simulation. There is also no standard, freely available program for the Monte Carlo simulation of proteins, because the choice of which Monte Carlo moves to apply, and the rates at which they are attempted, vary with the specific problem one is interested in, although we note that a Monte Carlo module has recently been added to CHARMM.
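The trial-move/rejection mechanism described above can be illustrated with a minimal Metropolis Monte Carlo sampler on a toy one-dimensional harmonic potential. This is a sketch for illustration only: the function names and the potential are our own assumptions, not part of CHARMM or any protein package.

```python
import math
import random

def metropolis_1d(energy, x0, step, n_steps, beta=1.0, seed=42):
    """Minimal Metropolis Monte Carlo sampler on a 1-D potential.
    Trial moves that raise the energy are accepted only with the
    Boltzmann probability, mirroring the rejection of overlapping
    trial configurations described above."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    accepted = 0
    samples = []
    for _ in range(n_steps):
        x_trial = x + rng.uniform(-step, step)     # propose a local move
        e_trial = energy(x_trial)
        if e_trial <= e or rng.random() < math.exp(-beta * (e_trial - e)):
            x, e = x_trial, e_trial                # accept the trial move
            accepted += 1
        samples.append(x)                          # rejected moves repeat x
    return samples, accepted / n_steps

# Harmonic "potential": samples concentrate near the minimum at 0.
samples, acc_rate = metropolis_1d(lambda x: x * x, x0=5.0, step=0.5, n_steps=20000)
```

Note that the sampler yields equilibrium statistics but, as stated above, no dynamical information: the move counter is not a physical time axis.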

QUESTION 3. WHAT ARE THE DIFFERENCES BETWEEN TIME-SERIES FORECASTING TECHNIQUES AND NONLINEAR EXTRAPOLATION?

Time-series forecasting methods produce forecasts based entirely on historical values. They are widely used in business situations where forecasts of a year or less are required, and the time-series techniques used in ezForecaster are particularly suited to sales, finance, marketing, and production planning. Time-series techniques have the advantage of relative simplicity, but certain factors need to be taken into consideration. Extrapolation, in contrast, extends known data into the unknown: often there is not enough data obtained from experimentation, and we need to extend, or extrapolate, from known data to values beyond the known range. The extrapolation method can also be applied in the interior reconstruction problem. Often the extrapolation is linear.
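The contrast can be sketched with generic stand-ins (ezForecaster's internal algorithms are not public, and these function names are our own): a one-step-ahead forecast from historical values by simple exponential smoothing, versus nonlinear extrapolation by evaluating the quadratic through the last three points beyond the observed range.

```python
def exp_smoothing_forecast(series, alpha=0.3):
    """One-step-ahead forecast from historical values only:
    simple exponential smoothing (a basic time-series technique)."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def quadratic_extrapolate(series, steps_ahead=1):
    """Nonlinear extrapolation: evaluate the quadratic through the
    last three points (Lagrange form) beyond the observed range."""
    y0, y1, y2 = series[-3], series[-2], series[-1]
    t = 2 + steps_ahead  # the last three points sit at t = 0, 1, 2
    return y0 * (t - 1) * (t - 2) / 2 - y1 * t * (t - 2) + y2 * t * (t - 1) / 2

data = [t * t for t in range(6)]      # 0, 1, 4, 9, 16, 25: a quadratic trend
print(quadratic_extrapolate(data))    # → 36.0, continuing the curvature
print(exp_smoothing_forecast(data))   # lags the trend, staying well below 25
```

On curved data, the history-weighted forecast lags the trend, while the nonlinear extrapolation follows the curvature beyond the known range.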

QUESTION 4. EXPLAIN WHAT EACH OF THE FOLLOWING TERMS MEANS:

Time-series analysis
A time series is a series of data points indexed (or graphed) in time order. Most commonly, a time series is a sequence taken at successive, equally spaced points in time; thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

Regression analysis
Regression analysis is a statistical tool for the investigation of relationships between variables. Usually, the investigator seeks to ascertain the causal effect of one variable upon another: the effect of a price increase upon demand, for example, or the effect of changes in the money supply upon the inflation rate. To explore such issues, the investigator assembles data on the underlying variables of interest and employs regression to estimate the quantitative effect of the causal variables upon the variable that they influence. The investigator also typically assesses the "statistical significance" of the estimated relationships, that is, the degree of confidence that the true relationship is close to the estimated relationship.

Ordinary least squares
Ordinary Least Squares, or OLS, is one of the simplest methods of linear regression. The goal of OLS is to closely "fit" a function to the data; it does so by minimizing the sum of squared errors from the data.

Heteroskedasticity
Heteroskedasticity, in statistics, occurs when the standard deviations of a variable, monitored over a specific amount of time, are not constant. Heteroskedasticity often arises in two forms: conditional and unconditional. Conditional heteroskedasticity identifies non-constant volatility when future periods of high and low volatility cannot be identified. Unconditional heteroskedasticity applies when future periods of high and low volatility can be identified.
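"Minimizing the sum of squared errors" has a closed-form solution in the one-regressor case; a minimal sketch (illustrative data and function name of our own):

```python
def ols_fit(x, y):
    """Fit y = a + b*x by ordinary least squares: the closed-form
    solution that minimizes the sum of squared errors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx            # slope
    a = my - b * mx          # intercept
    return a, b

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x
a, b = ols_fit(x, y)             # slope close to 2, intercept close to 0
```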

Autocorrelation
Autocorrelation is a characteristic of data in which the correlation between the values of the same variable is based on related objects. It violates the assumption of instance independence, which underlies most conventional models. It generally exists in those kinds of data sets in which the data, instead of being randomly selected, come from the same source. It is therefore a kind of disturbance in the data, and if present, the statistical inferences made about the data may not be reliable.

Multicollinearity
Multicollinearity is a state of very high intercorrelations or inter-associations among the independent variables.

ARIMA
ARIMA is a statistical analysis model that uses time-series data to predict future trends. It is a form of regression analysis that seeks to predict future movements along the seemingly random walk taken by stocks and the financial market by examining the differences between values in the series instead of using the actual data values. Lags of the differenced series are referred to as "autoregressive," and lags within forecasted data are referred to as "moving average." This model type is generally referred to as ARIMA(p, d, q), with the integers referring to the autoregressive, integrated, and moving-average parts of the data set, respectively. ARIMA modeling can take into account trends, seasonality, cycles, errors, and non-stationary aspects of a data set when making forecasts.
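Lag-k autocorrelation can be computed directly as the correlation of a series with a shifted copy of itself; a small sketch (the function name and the trending series are our own illustration):

```python
def autocorr(series, lag=1):
    """Sample autocorrelation at a given lag: the correlation of the
    series with a copy of itself shifted back by `lag` periods."""
    n = len(series)
    mean = sum(series) / n
    var = sum((y - mean) ** 2 for y in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, n))
    return cov / var

trend = list(range(50))          # a strongly trending series
print(autocorr(trend, lag=1))    # → 0.94: high first-order autocorrelation
```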

QUESTION 5. EXPLAIN WHY THE ESTIMATED MODEL WILL BE FLAWED IF EACH OF THE FOLLOWING IS NOT DETECTED PROPERLY OR CORRECTED FOR IN THE MODEL:

Heteroskedasticity
Several tests exist to check for the presence of heteroskedasticity. The simplest technique is to graphically plot each independent variable against the dependent variable, as illustrated earlier in the chapter. Another approach is to apply one of the most widely used tests, White's test, in which the test is based on the null hypothesis of no heteroskedasticity against an alternative hypothesis of heteroskedasticity of some unknown general form. The test statistic is computed by an auxiliary or secondary regression, where the squared residuals or errors from the first regression are regressed on all possible (and nonredundant) cross products of the regressors. These tests are also applicable for testing misspecifications and nonlinearities.

Autocorrelation
One very simple approach to test for autocorrelation is to graph the time series of a regression equation's residuals. If these residuals exhibit some cyclicality, then autocorrelation exists. Another, more robust approach to detect autocorrelation is the use of the Durbin–Watson statistic, which estimates the potential for a first-order autocorrelation, that is, whether a particular time-series variable is correlated with itself one period prior. Many time-series data tend to be autocorrelated with their historical occurrences. This relationship can exist for multiple reasons, including the variables' spatial relationships (similar time and space), prolonged economic shocks and events, psychological inertia, smoothing, seasonal adjustments of the data, and so forth. The Durbin–Watson test also identifies model misspecification.
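The Durbin–Watson statistic itself is simple to compute from residuals; a sketch with two illustrative residual series of our own:

```python
import random

def durbin_watson(residuals):
    """Durbin-Watson statistic, roughly 2*(1 - r1): values near 2
    suggest no first-order autocorrelation; values near 0 suggest
    strong positive autocorrelation in the residuals."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals)
    return num / den

rng = random.Random(7)
independent = [rng.gauss(0, 1) for _ in range(1000)]   # well-behaved residuals
trending = [0.01 * t for t in range(1000)]             # autocorrelated residuals
print(durbin_watson(independent))   # close to 2
print(durbin_watson(trending))      # close to 0
```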

Multicollinearity
Multicollinearity exists when there is a linear relationship between the independent variables. When this occurs, the regression equation cannot be estimated at all. In near-collinearity situations, the estimated regression equation will be biased and provide inaccurate results. This situation is especially true when a stepwise regression approach is used, where the statistically significant independent variables may be thrown out of the regression mix earlier than expected, resulting in a regression equation that is neither efficient nor accurate.

QUESTION 6. CRITICALLY EXPLAIN THE FOLLOWING ECONOMETRIC MODELS:

Vector Autoregression Model (VAR)
The vector autoregression (VAR) model is one of the most successful, flexible, and easy-to-use models for the analysis of multivariate time series. It is a natural extension of the univariate autoregressive model to dynamic multivariate time series. The VAR model has proven to be especially useful for describing the dynamic behavior of economic and financial time series and for forecasting. It often provides superior forecasts to those from univariate time-series models and elaborate theory-based simultaneous equations models. Forecasts from VAR models are quite flexible because they can be made conditional on the potential future paths of specified variables in the model.

Generalized Autoregressive Conditional Heteroskedasticity Model (GARCH)
The GARCH family generalizes the autoregressive conditional heteroskedasticity (ARCH) model. An ARCH model of order one is a time series ε_t given at each instant by ε_t = σ_t w_t, where w_t is discrete white noise with zero mean and unit variance, and σ_t^2 is given by σ_t^2 = α_0 + α_1 ε_{t-1}^2, where α_0 and α_1 are parameters of the model.
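The ARCH(1) equations above can be simulated directly; a minimal sketch (the function name and parameter values are our own illustrative choices):

```python
import math
import random

def simulate_arch1(alpha0, alpha1, n, seed=1):
    """Simulate an ARCH(1) process: eps_t = sigma_t * w_t, with
    sigma_t^2 = alpha0 + alpha1 * eps_{t-1}^2 and w_t ~ N(0, 1)."""
    rng = random.Random(seed)
    eps = [0.0]
    for _ in range(n):
        sigma2 = alpha0 + alpha1 * eps[-1] ** 2   # conditional variance
        eps.append(math.sqrt(sigma2) * rng.gauss(0, 1))
    return eps[1:]

# Volatility clusters, but the unconditional variance is
# alpha0 / (1 - alpha1) = 0.4 for these parameters.
series = simulate_arch1(alpha0=0.2, alpha1=0.5, n=5000)
sample_var = sum(e * e for e in series) / len(series)
```

The simulated sample variance should hover near the theoretical unconditional variance, while the conditional variance changes from period to period, which is exactly the heteroskedastic behavior the model is designed to capture.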

Exponential Generalized Autoregressive Conditional Heteroskedasticity Model (EGARCH)
EGARCH models benefit from having no parameter restrictions, so the possible instabilities of optimization routines are reduced. On the other hand, the theoretical properties of QML estimators of EGARCH models have not been clarified to any great extent.

Generalized Autoregressive Conditional Heteroskedasticity in Mean Model (GARCH-M)
In finance, the return of a security may depend on its volatility (risk). To model such phenomena, the GARCH-in-mean (GARCH-M) model adds a heteroskedasticity term into the mean equation.

QUESTION 7. WHAT IS A STOCHASTIC PROCESS (E.G. BROWNIAN MOTION)? CRITICALLY EXPLAIN.

In probability theory, a stochastic process, or sometimes random process (the widely used term), is a collection of random variables; it is often used to represent the evolution of some random value, or system, over time. It is the probabilistic counterpart to a deterministic process (or deterministic system), and it is one of the most popular objects of study in probability. Instead of describing a process that can evolve in only one way (as in the case, for example, of solutions of an ordinary differential equation), in a stochastic or random process there is some indeterminacy: even if the initial condition (or starting point) is known, there are several (often infinitely many) directions in which the process may evolve; it is, in other words, a process involving the operation of chance. For instance, in radioactive decay every atom is subject to a fixed probability of breaking down in any given time interval. More generally, a stochastic process refers to a family of random variables indexed against some other variable or set of variables. Some basic types of stochastic processes include Markov processes, Poisson processes (such as radioactive decay), and time series, with the index variable referring to time and the interest being in the nature of changes of the variables with respect to time. This indexing can be either discrete or continuous.
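Brownian motion, the example named in the question, can be sampled by accumulating independent Gaussian increments. The sketch below (our own illustrative code) shows the indeterminacy described above: two paths with the same initial condition evolve differently.

```python
import random

def brownian_path(n_steps, dt=1.0, seed=None):
    """Sample one path of standard Brownian motion: a running sum of
    independent Gaussian increments with variance dt, starting at 0."""
    rng = random.Random(seed)
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(0.0, dt ** 0.5))
    return path

# Same initial condition, two different evolutions: the indeterminacy
# that distinguishes a stochastic process from a deterministic one.
p1 = brownian_path(100, seed=1)
p2 = brownian_path(100, seed=2)
```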

QUESTION 8. IF YOU KNOW THAT TWO SIMULATED VARIABLES ARE CORRELATED BUT DO NOT HAVE THE RELEVANT CORRELATION VALUE, SHOULD YOU STILL GO AHEAD AND CORRELATE THEM IN A SIMULATION?

The correlation coefficient is a measure of the strength and direction of the relationship between two variables, and it can take on any value between –1.0 and +1.0. That is, the correlation coefficient can be decomposed into its direction or sign (positive or negative relationship between two variables) and the magnitude or strength of the relationship (the higher the absolute value of the correlation coefficient, the stronger the relationship). It is important to note that correlation does not imply causation: two completely unrelated random variables might display some correlation. For example, sunspot activity and events in the stock market are correlated, but this does not imply any causation between the two. There are two general types of correlations: parametric and nonparametric.

Pearson's correlation coefficient is the most common correlation measure and is usually referred to simply as the correlation coefficient. Pearson's correlation is a parametric measure, meaning that it requires both correlated variables to have an underlying normal distribution and that the relationship between the variables be linear. When these conditions are violated, which is often the case in Monte Carlo simulation, the nonparametric counterparts become more important. Spearman's rank correlation and Kendall's tau are the two nonparametric alternatives. The Spearman correlation is the most commonly used and is the most appropriate when applied in the context of Monte Carlo simulation: there is no dependence on normal distributions or linearity, meaning that correlations between variables with different distributions can be applied. In order to compute the Spearman correlation, first rank all the x and y variable values and then apply Pearson's correlation. In the case of Risk Simulator, the correlation used is the more robust nonparametric Spearman's rank correlation. However, to simplify the simulation process and to be consistent with Excel's correlation function, the correlation inputs required are Pearson's correlation coefficients; Risk Simulator will then apply its own algorithms to convert them into Spearman's rank correlation, thereby simplifying the process.
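The two-step recipe above (rank the values, then apply Pearson's correlation to the ranks) can be sketched directly. This is a generic implementation of our own, not Risk Simulator's actual algorithm.

```python
def rank(values):
    """Average ranks (1-based); ties receive the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of the 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    """Pearson's product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman's rank correlation: Pearson's correlation of the ranks."""
    return pearson(rank(x), rank(y))

x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 25]     # monotone but nonlinear in x
print(spearman(x, y))     # ≈ 1.0: a perfect monotone relationship
```

On a monotone but nonlinear pair, Spearman's measure is ≈ 1 even though the relationship is not linear, which is exactly why it is preferred when the parametric assumptions are violated.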
QUESTION 9. COMPARE AND CONTRAST A DISCRETE VERSUS A CONTINUOUS DECISION VARIABLE WHEN USED IN AN OPTIMIZATION UNDER UNCERTAINTY.

Discrete variables are countable in a finite amount of time. For example, you can count the change in your pocket, and you can count the money in your bank account. You could also count the amount of money in everyone's bank account; it might take you a long time to count that last item, but the point is, it is still countable. Continuous variables, in contrast, would (literally) take forever to count: you would get to "forever" and never finish counting them. For example, take age. You can't count "age." Why not? Because it would literally take forever: you could be 25 years, 10 months, 2 days, 5 hours, 4 seconds, 4 milliseconds, 8 nanoseconds, 99 picoseconds… and so forth. You could, however, turn age into a discrete variable and then count it.

Some models only make sense if the decision variables take on values from a discrete set, often a subset of integers, while other models contain variables that can take on any real value. Models with discrete variables are discrete optimization problems; models with continuous variables are continuous optimization problems. Continuous optimization problems tend to be easier to solve than discrete optimization problems: the smoothness of the functions means that the objective function and constraint function values at a point x can be used to infer information about points in a neighborhood of x. However, improvements in algorithms coupled with advancements in computing technology have dramatically increased the size and complexity of discrete optimization problems that can be solved efficiently. Continuous optimization algorithms are also important in discrete optimization because many discrete optimization algorithms generate a sequence of continuous subproblems.
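The difference can be made concrete with a toy objective (entirely our own illustration): a continuous minimizer exploits smoothness to shrink the search interval, while the discrete version can only enumerate the allowed set.

```python
def ternary_search_min(f, lo, hi, tol=1e-9):
    """Continuous optimization of a unimodal function: smoothness lets
    function values at interior points shrink the search interval."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

def enumerate_min(f, candidates):
    """Discrete optimization: with no neighborhood information to
    exploit, simply evaluate every allowed value."""
    return min(candidates, key=f)

f = lambda x: (x - 2.3) ** 2                # hypothetical objective
x_cont = ternary_search_min(f, 0.0, 10.0)   # converges to about 2.3
x_disc = enumerate_min(f, range(0, 11))     # best integer candidate: 2
```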

QUESTION 10. HOW TO GET THE RISK ANALYSIS ACCEPTED IN AN ORGANIZATION? EXPLAIN CRITICALLY.

Management is responsible for defining an organization's acceptable level of risk, and the security practitioner must understand this process and be able to indicate to management how underlying security threats can negatively affect business objectives. Acceptance of residual risks that result from risk treatment has to take place at the level of the executive management of the organization. To this extent, risk acceptance concerns the communication of residual risks to the decision makers; this can be done by communicating the outcome of risk treatment to the management of the organization. In other words, residual risks are considered as risks that the management of the organization knowingly takes; the higher the accepted residual risks, the less the work involved in treating risks (and inversely). In the connected inventories, risk acceptance is considered an optional process, positioned between risk treatment and risk communication, because it can be covered by both the risk treatment and risk communication processes. One reason for explicitly mentioning risk acceptance is the need to draw management's attention to this issue, which might otherwise simply be a communicative activity, as it might be a decision criterion for certain types of organizations (e.g. in the financial and insurance sector, in critical infrastructure protection, etc.). Risk acceptance has also been covered within the evaluation of methods and tools, and the level and extent of accepted risks comprise one of the major parameters of the risk management process. Once accepted, risks will be monitored: in the regular phases and activities of the risk management processes, the severity of these risks will be measured over time. This does not mean, however, that once accepted, the risks will not change in forthcoming repetitions of the risk management life-cycle; in the event that new assertions are made or changing technical conditions are identified, risks that have been accepted need to be reconsidered.