
1. How do you define Econometrics and Financial Econometrics?

 Econometrics is the use of statistical methods on quantitative data to develop
theories or test existing hypotheses in economics or finance. Econometrics relies
on techniques such as regression models and null hypothesis testing, and can also
be used to forecast future economic or financial trends.
 Financial econometrics is the application of statistical methods to financial market
data. It differs from other forms of econometrics because the emphasis is usually
on analyzing the prices of financial assets traded in competitive, liquid markets.
2. What is the value of Econometrics in Finance?
 Financial econometrics and statistics have become very important tools for
empirical research in both finance and accounting. Econometric methods are
important tools for research in asset pricing, corporate finance, options and
futures, and financial accounting.
3. How do you understand population in the context of statistics, and what are the
major categories of population? What is sampling and what is a sample?
 A population is the entire group that you want to draw conclusions about; the
major categories are finite populations (whose members can be counted) and
infinite populations. Sampling is the process of selecting a subset of the
population, and a sample is the specific group that you will collect data from. The
size of the sample is always less than the total size of the population.
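The distinction can be sketched in a few lines of Python (the population here is simulated, purely for illustration):

```python
import random

random.seed(42)

# Population: the entire group of interest (here, 10,000 simulated daily returns).
population = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Sampling: selecting a subset; the sample is the group we actually collect data from.
sample = random.sample(population, k=100)

print(len(population), len(sample))  # the sample is always smaller than the population
```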
4. List and define each measure of descriptive statistics.
 Descriptive statistics are broken down into measures of central tendency and
measures of variability (spread). Measures of central tendency include the mean,
median, and mode, while measures of variability include standard deviation,
variance, minimum and maximum variables, kurtosis, and skewness.
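These measures can be computed with Python's standard library (skewness is computed by hand, since the `statistics` module does not provide it):

```python
import statistics as st

data = [2, 4, 4, 4, 5, 5, 7, 9]

# Measures of central tendency
mean = st.mean(data)      # 5
median = st.median(data)  # 4.5
mode = st.mode(data)      # 4

# Measures of variability (spread)
variance = st.pvariance(data)  # 4
sd = st.pstdev(data)           # 2
lo, hi = min(data), max(data)  # 2, 9

# Skewness: the third standardized moment, computed manually
n = len(data)
skew = sum((x - mean) ** 3 for x in data) / (n * sd ** 3)
print(mean, median, mode, variance, sd, lo, hi, round(skew, 3))
```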
5. What does the Central Limit Theorem state?
 The central limit theorem states that if you have a population with mean μ and
standard deviation σ and take sufficiently large random samples from the
population with replacement, then the distribution of the sample means will be
approximately normally distributed.
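A quick simulation illustrates this: even though the underlying population here is exponential (far from normal), the sample means cluster around μ with spread close to σ/√n:

```python
import random
import statistics as st

random.seed(0)

# Population: exponential with mean mu = 1 and sigma = 1 (clearly non-normal).
def sample_mean(n):
    return st.mean(random.expovariate(1.0) for _ in range(n))

# Draw many sample means, each from a sufficiently large sample (n = 50).
means = [sample_mean(50) for _ in range(2000)]

# CLT: the sample means are approximately normal around mu = 1,
# with standard deviation close to sigma / sqrt(n) = 1 / sqrt(50), about 0.141.
print(round(st.mean(means), 2), round(st.stdev(means), 2))
```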
6. What is model? What are the steps involved in forming an econometric model?
 Econometric models are statistical models used in econometrics. An econometric
model specifies the statistical relationship that is believed to hold between the
various economic quantities pertaining to a particular economic phenomenon.
 Steps in Carrying Out an Empirical Study
 Selection of a Hypothesis or an Observed Phenomenon
 Establishing the Objectives of the Study
 Developing an Economic Model
 Developing an Econometric Model
 Estimating the Values of Coefficients
 Data Analysis and Validation
7. What are the major data types we use in econometric analysis? Which data type is
common in the financial sector? Give an example of a problem that could be tackled
using such a type of data.
 There are three types of data: time series, cross-section, and a combination of the
two, called pooled data.
 Time series data is the most common in the financial sector, since asset prices,
exchange rates, interest rates, and trading volumes are recorded over time. A
typical problem tackled with time series data is forecasting a stock's future return
or volatility from its past values.
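Tiny illustrations of the three data types (all numbers hypothetical):

```python
# Time series: one unit observed over several periods (e.g., a bank's ROA by year).
time_series = {2020: 5.1, 2021: 5.4, 2022: 6.0}

# Cross-section: many units observed at a single point in time.
cross_section = {"BankA": 5.1, "BankB": 4.2, "BankC": 6.3}

# Pooled data: many units observed over several periods.
pooled = {("BankA", 2020): 5.1, ("BankA", 2021): 5.4,
          ("BankB", 2020): 4.2, ("BankB", 2021): 4.5}

print(len(time_series), len(cross_section), len(pooled))  # 3 3 4
```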
8. What is a regression model? How is it different from correlation?
 A regression model is a frequently applied statistical technique that serves as a
basis for studying and characterizing a system of interest, by formulating a
mathematical model of the relation between a response variable y and a set of q
explanatory variables x1, x2, …, xq.
 Both correlation and regression concern the relationship between two variables,
say x and y, but they answer different questions: correlation measures the degree
(strength) of the association, whereas regression estimates how one variable
affects another.
9. What is a disturbance term? Describe the reasons for the inclusion of the disturbance term.
 The disturbance term u captures the part of y that cannot be explained by x.
Usually, x is assumed to be nonstochastic; x is said to be nonstochastic when its
values are fixed in repeated samples.
 The reasons a disturbance term u is necessary are as follows: (a) there are some
unpredictable elements of randomness in human responses, (b) the effect of a large
number of omitted variables is contained in u, (c) there is measurement error in y,
or (d) the functional form of f(x) is not known in general.
10. Describe the assumptions concerning disturbance terms and their interpretation.
 Assumption 1: Linear relationship — the model is linear in its parameters and the
disturbances have zero mean, E(u) = 0.
 Assumption 2: Independence — the disturbances are uncorrelated with one
another, cov(ui, uj) = 0 for i ≠ j.
 Assumption 3: Homoscedasticity — the disturbances have a constant variance,
var(u) = σ².
 Assumption 4: Normality — the disturbances are normally distributed,
u ~ N(0, σ²).

11. What is statistical inference? What tests do you need to conduct for these purposes?
 Statistical inference involves hypothesis testing (evaluating some idea about a population
using a sample) and estimation (estimating the value or potential range of values of some
characteristic of the population based on that of a sample).
 Given a hypothesis about a population, for which we wish to draw inferences, statistical
inference consists of (first) selecting a statistical model of the process that generates the
data and (second) deducing propositions from the model.
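As a sketch of hypothesis testing, here is a one-sample t statistic computed by hand (the data are hypothetical):

```python
import math
import statistics as st

# H0: mu = 5 versus H1: mu != 5, tested on a small hypothetical sample.
data = [5.1, 4.9, 5.6, 5.2, 4.8, 5.4, 5.3, 5.0]
n = len(data)

# t = (sample mean - hypothesized mean) / standard error of the mean.
t = (st.mean(data) - 5.0) / (st.stdev(data) / math.sqrt(n))
df = n - 1

print(round(t, 3), df)
# Reject H0 at the 5% level only if |t| exceeds the critical value t(0.025, 7) ≈ 2.365.
```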
12. What are Ordinary Least Squares? Why is it so called?
 Least squares on y is often called ordinary least squares (OLS); it is one of the
earliest formal statistical estimation procedures, developed around 1800 by
Legendre and Gauss. It is equivalent to minimizing the squared L2 norm,
||y − f(X)||².
 Ordinary Least Squares (OLS) is one of the simplest methods of linear regression.
Its goal is to fit a function to the data as closely as possible, which it does by
minimizing the sum of squared errors.
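A minimal from-scratch sketch: for simple regression, the slope and intercept that minimize the sum of squared errors have the closed forms b = Σ(x−x̄)(y−ȳ)/Σ(x−x̄)² and a = ȳ − b·x̄ (the data below are hypothetical):

```python
# Simple-regression OLS by hand on a small hypothetical data set.
x = [1, 2, 3, 4, 5]
y = [1.2, 1.9, 3.2, 3.8, 5.1]

xbar = sum(x) / len(x)
ybar = sum(y) / len(y)

# Closed-form OLS estimates for slope (b) and intercept (a).
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
    / sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar

# The quantity OLS minimizes: the sum of squared errors.
sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
print(round(b, 2), round(a, 2), round(sse, 3))
```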
13. What is an Event Study and give a practical example?
 An event study is a statistical method to assess the impact of an event on the value of a
firm. For example, the announcement of a merger between two business entities can be
analyzed to see whether investors believe the merger will create or destroy value. The
event study was invented by Ball and Brown (1968).
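A stylized abnormal-return calculation for an event window (the market-model parameters and returns below are hypothetical):

```python
# Market model: expected return = alpha + beta * market return.
alpha, beta = 0.0005, 1.1                       # assumed pre-estimated coefficients
market_returns = [0.002, -0.001, 0.015, 0.003]  # event-window market returns
stock_returns = [0.003, 0.000, 0.040, 0.002]    # event-window stock returns

# Abnormal return = actual return minus expected return.
abnormal = [r - (alpha + beta * m) for r, m in zip(stock_returns, market_returns)]

# Cumulative abnormal return (CAR) over the window measures the event's impact.
car = sum(abnormal)
print(round(car, 4))
```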
14. Assume you are assigned to conduct research on identifying the key determinants of
customer service quality using a SERVQUAL model? Develop an econometric model by
identifying the dependent variable and the corresponding independent variables.

 Using the SERVQUAL model, the dependent variable is perceived customer
service quality (SQ), and the independent variables are the five SERVQUAL
dimensions: tangibles, reliability, responsiveness, assurance, and empathy. The
econometric model can be written as:
SQ = β0 + β1(Tangibles) + β2(Reliability) + β3(Responsiveness) + β4(Assurance)
+ β5(Empathy) + u
where u is the disturbance term.
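A SERVQUAL-style model can be estimated by least squares; the sketch below uses synthetic dimension scores (all coefficients and data are hypothetical, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# Hypothetical 1-5 scores on the five SERVQUAL dimensions:
# tangibles, reliability, responsiveness, assurance, empathy.
X = rng.uniform(1, 5, size=(n, 5))

# Simulated "true" relationship: service quality = 0.5 + X @ beta + disturbance u.
true_beta = np.array([0.2, 0.4, 0.3, 0.25, 0.15])
y = 0.5 + X @ true_beta + rng.normal(0, 0.3, size=n)

# OLS estimates of the intercept and slopes via least squares on [1, X].
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta_hat, 2))  # estimates should be close to [0.5, 0.2, 0.4, 0.3, 0.25, 0.15]
```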


15. Define the following terms:

- Theory

 A theory is a carefully thought-out explanation for observations of the natural world that
has been constructed using the scientific method, and which brings together many facts
and hypotheses.

- Hypothesis

 A hypothesis is an assumption, an idea that is proposed for the sake of argument so that it
can be tested to see if it might be true. In the scientific method, the hypothesis is
constructed before any applicable research has been done, apart from a basic background
review.

- Degree of Freedom

 Degrees of freedom refers to the maximum number of logically independent
values, i.e., values that have the freedom to vary, in a data sample. Degrees of
freedom are commonly discussed in relation to various forms of hypothesis testing
in statistics, such as the chi-square test.
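Two concrete degree-of-freedom calculations:

```python
# Sample variance: estimating the mean first "uses up" one value, leaving n - 1 df.
data = [4, 8, 6, 5, 3]
df_variance = len(data) - 1

# Chi-square test of independence on an r x c contingency table: df = (r-1)(c-1).
rows, cols = 3, 4
df_chi2 = (rows - 1) * (cols - 1)

print(df_variance, df_chi2)  # 4 6
```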

- Normal distribution

 Normal distribution, also called Gaussian distribution, is the most common
distribution function for independent, randomly generated variables. Its familiar
bell-shaped curve is ubiquitous in statistical reports, from survey analysis and
quality control to resource allocation.
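The density behind the bell curve can be written and checked in a few lines:

```python
import math

# Gaussian density: f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) \
        / (sigma * math.sqrt(2 * math.pi))

# The curve peaks at the mean and is symmetric around it.
print(round(normal_pdf(0.0), 4))  # 0.3989, the peak of the standard normal
assert abs(normal_pdf(1.0) - normal_pdf(-1.0)) < 1e-12  # symmetry
```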

- Precision value

 Precision refers to how closely repeated measurements or estimates agree with
one another. In statistics, a more precise estimator has a smaller variance
(precision is often defined as the reciprocal of the variance), so its estimates
cluster tightly around the same value.
