
STATISTICAL METHODS

Basic Concept
Statistics is a term that refers to the acts of collecting and analyzing numerical data.
Doing statistics therefore means performing arithmetic procedures such as addition,
subtraction, multiplication, division, and other mathematical calculations. Statistics
demands much of your time and effort, for it is not merely a matter of collecting and
examining data; it also involves planning, organizing, analyzing, and interpreting data in
relation to the design of the experimental method you chose.
Statistical Methodologies
1. Descriptive Statistics
This describes a certain aspect of a data set by having you calculate the Mean,
Median, Mode, and Standard Deviation. It tells about the placement or position of one
data item in relation to the other data, the extent of the distribution or spread of the
data, and whether there are correlations or regressions between or among variables. This
kind of statistics describes only the data set at hand and does not tell anything about the population.
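To make this concrete, here is a minimal sketch of these descriptive measures in Python, using the built-in statistics module; the scores below are hypothetical values invented for illustration.

import statistics

scores = [78, 85, 85, 90, 92, 70, 88]  # hypothetical exam scores

print("Mean:", statistics.mean(scores))                # average value
print("Median:", statistics.median(scores))            # middle value when sorted
print("Mode:", statistics.mode(scores))                # most frequent value
print("Standard deviation:", statistics.stdev(scores)) # spread of the scores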
2. Inferential Statistics
Inferential statistics is a branch of statistics that focuses on conclusions,
generalizations, predictions, interpretations, hypotheses, and the like. It involves a great
deal of hypothesis testing, which requires you to perform complex and advanced
mathematical operations, so this statistical method is not as simple as descriptive
statistics. It does not focus only on the features of the data set itself, but on the
characteristics of the sample that are also true for the population from which you have
drawn the sample. Your analysis begins with the sample; then, based on your
findings about the sample, you make inferences or assumptions about the population.
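As an illustration of this sample-to-population reasoning, here is a minimal sketch of a one-sample t-test in Python; it assumes the SciPy library is installed, and the sample values and the assumed population mean of 80 are hypothetical.

from scipy import stats

# Hypothetical sample drawn from a larger population of students
sample = [78, 85, 85, 90, 92, 70, 88, 81, 76, 84]

# Test the null hypothesis that the population mean is 80
t_stat, p_value = stats.ttest_1samp(sample, popmean=80)

print("t statistic:", t_stat)
print("p-value:", p_value)  # a small p-value suggests the population mean differs from 80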
Types of Statistical Data Analysis
• Univariate Analysis – analysis of one variable
• Bivariate Analysis – analysis of two variables (independent and dependent variables)
• Multivariate Analysis – analysis of multiple relations between multiple variables
Statistical Methods of Bivariate Analysis
Bivariate analysis is carried out by means of the following methods (Argyrous 2011; Babbie 2013; Punch 2014):
1. Correlation or Covariation (correlated variation) – describes the relationship
between two variables and also tests the strength or significance of their linear
relation. In this relationship, both variables may move together (both getting high
scores or low scores) or in opposite directions (one getting a higher score while the
other gets a lower score).
2. Cross Tabulation – is also called a “crosstab” or “contingency table.” It follows
the format of a matrix (plural: matrices), which is made up of rows of numbers,
symbols, and other expressions. Like a table, a matrix arranges data in rows and
columns. By displaying the frequency and percentage distribution of data, a crosstab
shows the relationship between two variables and the effect of one variable on the
other variable. If the table compares data on only two variables, it is called a
bivariate table.

Example of a Bivariate Table: HEI Participants in the 2016 NUSP Conference
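A minimal sketch of how a bivariate table like the one captioned above could be produced with the pandas library; the institution types and responses below are hypothetical and are not the actual NUSP conference data.

import pandas as pd

# Hypothetical records: one row per participant
data = pd.DataFrame({
    "HEI_Type": ["Public", "Private", "Public", "Private", "Public", "Public"],
    "Attended_Workshop": ["Yes", "No", "Yes", "Yes", "No", "Yes"],
})

# Cross tabulation: rows = HEI type, columns = response, cells = frequency counts
counts = pd.crosstab(data["HEI_Type"], data["Attended_Workshop"], margins=True)
print(counts)

# The same table expressed as percentages of the grand total
percentages = pd.crosstab(data["HEI_Type"], data["Attended_Workshop"], normalize="all") * 100
print(percentages)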

Measure of Correlation
The following are the statistical tests to measure correlation or covariation:
1. Correlation Coefficient
This is a measure of the strength and direction of the linear relationship between
variables and likewise gives the extent of dependence between two variables,
meaning the effect of one variable on the other variable. It is determined through
the following statistical tests for the correlation coefficient (Argyrous 2011; Creswell
2014; Levin & Fox 2014); a short code sketch illustrating several of these tests appears
after this list:
• Spearman’s rho (Spearman’s ρ, also written rs) – a rank-based test used to
measure the dependence of the dependent variable on the independent variable.
• Pearson product-moment correlation (Pearson’s r or R) – measures the
strength and direction of the linear relationship between two variables, that is,
the association between interval or ratio variables.
• Chi-square – the statistical test for bivariate analysis of nominal variables,
used specifically to test the null hypothesis. It tests whether or not a
relationship exists between or among variables and gives the probability that
the observed relationship is caused by chance. It cannot, however, show the
extent of the association between two variables.
• t-test – evaluates the probability that the mean of the sample reflects the
mean of the population from which the sample was drawn. It also tests the
difference between two means: the sample mean and the population mean.
ANOVA, or analysis of variance, extends this kind of comparison to determine
whether the differences among the means of several groups are larger than
what chance variation alone would produce.
The ANOVA is of various types, such as the following:
A. One-way analysis of variance – studies the effect of one
independent variable on the dependent variable
B. ANCOVA (Analysis of Covariance) – studies the effect of the
independent variable on the dependent variable while controlling
for one or more other variables (covariates) that are correlated
with the dependent variable
C. MANCOVA (Multivariate Analysis of Covariance) – extends
ANCOVA to two or more dependent variables that are analyzed
at the same time
2. Regression
Similar to correlation, regression determines the existence of relationships
between variables, but it does more than this by determining the following (a short
code sketch appears at the end of this section):
1) which of the two variables, the independent or the dependent, can signal or
predict the value of the other variable;
2) how strong the relationship between the two variables is;
3) whether an independent variable is statistically significant as a predictor.
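The following is a minimal sketch of the correlation coefficient tests listed above (Pearson, Spearman, chi-square, t-test, and one-way ANOVA), assuming the SciPy library is available; all data values are hypothetical.

from scipy import stats

# Hypothetical paired measurements of two variables
x = [2, 4, 5, 7, 9, 11, 13]
y = [1, 3, 6, 6, 10, 12, 14]

r, p_r = stats.pearsonr(x, y)        # Pearson's r: strength and direction of the linear relationship
rho, p_rho = stats.spearmanr(x, y)   # Spearman's rho: rank-based association
print("Pearson r:", r, "Spearman rho:", rho)

# Chi-square test of independence on a hypothetical 2 x 2 contingency table
observed = [[30, 10],
            [20, 40]]
chi2, p_chi, dof, expected = stats.chi2_contingency(observed)
print("Chi-square:", chi2, "p-value:", p_chi)

# t-test for the difference between two group means, and one-way ANOVA for three groups
group_a = [80, 85, 78, 90, 86]
group_b = [70, 75, 72, 68, 74]
group_c = [88, 92, 85, 90, 91]
print(stats.ttest_ind(group_a, group_b))          # difference between two means
print(stats.f_oneway(group_a, group_b, group_c))  # differences among three or more means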
Each of these statistical tests has its own formula that, with a good
background knowledge of statistics, you may be able to follow easily. Without a
solid foundation in statistics, however, you need to read further about the subject or
hire the services of a statistician to be able to apply these tests to your research.
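As mentioned under the regression item above, here is a minimal regression sketch, again assuming the SciPy library and using hypothetical data on hours of study and exam scores.

from scipy import stats

# Hypothetical data: hours of study (independent) and exam score (dependent)
hours = [1, 2, 3, 4, 5, 6, 7, 8]
score = [52, 55, 61, 64, 70, 74, 79, 83]

result = stats.linregress(hours, score)
print("Slope:", result.slope)        # change in score for each extra hour of study
print("Intercept:", result.intercept)
print("r value:", result.rvalue)     # strength of the linear relationship
print("p-value:", result.pvalue)     # whether hours is a statistically significant predictor

# Predicted score for a student who studies 10 hours
print("Predicted score:", result.intercept + result.slope * 10)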
Written Report of Group 6
(STATISTICAL METHODS)
Submitted by:
Christian Dave Orate
Yra Louisse Taroma
Lawrence Schwarz Cepeda
Ella Loreen Zata
Brynne Matthew Rivera
Andrea Mae Magoncia
Twinkle Fadri
Alliyah Clare De Leon
