
Muhammad Arif Hasan

Roll Number - 622

Techniques of Data Analysis


1. One-way and Two-way ANOVA & MANOVA
ANOVA
ANOVA, which stands for Analysis of Variance, is a statistical test used to analyze the
differences between the means of more than two groups.
It was developed by Ronald Fisher in 1918.

Types of ANOVA
A one-way ANOVA uses one independent variable, while a two-way ANOVA uses two
independent variables.

MANOVA
A MANOVA (Multivariate Analysis of Variance) is identical to an ANOVA, except that it
uses two or more response variables. Like the ANOVA, it can be one-way or two-way.
Note: An ANOVA can also be three-way, four-way, and so on.

Types of MANOVA
The one-way multivariate analysis of variance (one-way MANOVA) is used to determine
whether there are any differences between independent groups on more than one continuous
dependent variable. In this regard, it differs from a one-way ANOVA, which only measures one
dependent variable. A two-way MANOVA adds a second independent variable; the groups it
compares must be independent, with no relationship between the participants in any of the
groups.
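To make the idea concrete, here is a minimal Python sketch of a one-way ANOVA using scipy's f_oneway on hypothetical scores from three groups (the data are invented for illustration only):

from scipy import stats

# Hypothetical test scores from three independent groups
group_a = [85, 90, 88, 75, 95]
group_b = [70, 65, 80, 72, 68]
group_c = [90, 95, 93, 88, 97]

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p-value suggests at least one group mean differs

A two-way ANOVA or a MANOVA would instead be fitted with a package such as statsmodels, where the additional independent or dependent variables enter the model formula.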

2. Factor Analysis
It is a technique used to reduce a large number of variables to a smaller number of factors.
This technique extracts the maximum common variance from all variables and puts it into a
common score.

Who introduced it?
Factor analysis was developed by the British psychologist Charles Spearman in the early 20th
century as a technique for analyzing intelligence structures.
Purpose of Factor Analysis
The purpose of factor analysis is to reduce many individual items into a fewer number of
dimensions.

Types of Factor Analysis.


1. Principal component analysis

It is the most common method used by researchers. It extracts the maximum variance and puts
it into the first factor, then removes the variance explained by the first factor and extracts the
second factor, continuing in this way until the last factor.

2. Common Factor Analysis

It is the second most favoured technique among researchers. It extracts the common variance and
puts it into factors. Unlike principal component analysis, this technique does not include all of the
variance of the variables, and it is used in SEM.

3. Exploratory Factor Analysis

Exploratory factor analysis (EFA) is a classical formal measurement model that is used when
both observed and latent variables are assumed to be measured at the interval level.
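As an illustration of the variance-extraction idea described above, the following sketch (hypothetical data, using scikit-learn's PCA) recovers two components from six observed variables generated from two underlying factors:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                        # two underlying factors
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.3 * rng.normal(size=(200, 6))   # six observed variables

pca = PCA(n_components=2).fit(X)
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
print("component loadings:")
print(np.round(pca.components_, 2))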

3. Interpretive Structural Modeling


Interpretive structural modeling (ISM) is a process that transforms unclear and poorly
articulated mental models of systems into visible, well-defined models useful for many
purposes.
Interpretive Structural Modeling (ISM) is a computer-based technique for helping small groups
develop graphical representations of complex systems.
Who Introduced ISM?
Interpretive structural modeling was first proposed by Warfield in 1973, with the aim of
analyzing complex systems.
What is the purpose of structural modeling?
Structural models show the organization and architecture of a system.

4. Grey Relational Analysis.


It defines situations with no information as black, and those with perfect information as white.
However, neither of these idealized situations ever occurs in real world problems. In fact,
situations between these extremes, which contain partial information, are described as being
grey, hazy or fuzzy. A variant of the GRA model, the Taguchi-based GRA model, is a popular
optimization method in manufacturing engineering.

Grey relational analysis (GRA) was developed by Deng Julong. It is one of the most widely
used models of grey system theory. GRA uses a specific concept of information.

The theory has been applied in various fields of engineering and management. Initially, the grey
method was adapted to effectively study air pollution [4] and subsequently used to investigate the
nonlinear multiple-dimensional model of the socio-economic activities’ impact on the city air
pollution.[5] It has also been used to study the research output and growth of countries.
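A minimal numerical sketch of the grey relational grade calculation (hypothetical alternatives and larger-is-better criteria; the distinguishing coefficient 0.5 is the usual default):

import numpy as np

# Hypothetical alternatives (rows) scored on three larger-is-better criteria (columns)
data = np.array([[0.75, 120, 8.2],
                 [0.60, 150, 7.5],
                 [0.90, 110, 9.0]])

# 1. Grey relational generating: normalise each criterion to [0, 1]
norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))
# 2. Deviation from the ideal (reference) sequence, which is 1 for every criterion
delta = np.abs(1.0 - norm)
# 3. Grey relational coefficients with distinguishing coefficient zeta = 0.5
zeta = 0.5
coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
# 4. Grey relational grade: mean coefficient per alternative (higher = closer to ideal)
grade = coef.mean(axis=1)
print("grey relational grades:", np.round(grade, 3))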

5. RIDIT Analysis
What is Ridit analysis?
RIDIT (Relative to an Identified Distribution) is a very efficient technique that can be used to
examine the Likert scale data. The outcomes from the RIDIT analysis can be used to arrange
Likert scale items either in an ascending or in a descending order based on importance.

Bross (1958) developed ridit analysis for handling ordinal data.

RIDIT analysis can be used to determine the important attributes from both expert opinion
and user perception data. RIDIT analysis is a simple tool that is closely related to
distribution-free statistical methods and is used to interpret the results of ordinal
data meaningfully.
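A small Python sketch of the ridit computation (invented Likert-scale frequencies; the ridit of a category is half its own proportion plus the proportions of all lower categories in the reference distribution):

import numpy as np

# Hypothetical Likert frequencies (1 = strongly disagree ... 5 = strongly agree)
reference = np.array([10, 25, 40, 60, 65])   # reference distribution
item      = np.array([ 5, 15, 30, 70, 80])   # item to be scored

p = reference / reference.sum()              # reference proportions
ridits = np.cumsum(p) - p / 2                # ridit of each category
mean_ridit = np.sum(item / item.sum() * ridits)
print("category ridits:", np.round(ridits, 3))
print("mean ridit:", round(mean_ridit, 3))   # > 0.5 means the item is shifted toward higher categories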
6. Data Envelopment Analysis
DEA was initiated by Charnes, Cooper and Rhodes in their seminal 1978 paper (Charnes et
al., 1978).

DEA is used to empirically measure productive efficiency of decision-making units (DMUs).


Although DEA has a strong link to production theory in economics, the method is also used
for benchmarking in operations management, whereby a set of measures is selected to
benchmark the performance of manufacturing and service operations.

Data envelopment analysis (DEA) is a nonparametric method in operations research and
economics for the estimation of production frontiers.
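A compact sketch of the input-oriented CCR envelopment model, solved as one linear program per DMU with scipy (the inputs and outputs below are hypothetical):

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 5 DMUs, 2 inputs (rows of X), 1 output (row of Y)
X = np.array([[20, 30, 40, 20, 10],
              [300, 200, 100, 200, 400]], dtype=float)
Y = np.array([[100, 80, 120, 90, 60]], dtype=float)

n = X.shape[1]
for o in range(n):
    c = np.r_[1.0, np.zeros(n)]                          # minimise theta; variables = [theta, lambdas]
    A_in = np.hstack([-X[:, [o]], X])                    # sum_j lambda_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # sum_j lambda_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
    print(f"DMU {o + 1}: efficiency = {res.x[0]:.3f}")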

7. Artificial Neural Networks.


The first artificial neural network was invented in 1958 by psychologist Frank Rosenblatt.

Definition
An artificial neural network is an attempt to simulate the network of neurons that make up a
human brain so that the computer will be able to learn things and make decisions in a
humanlike manner. ANNs are created by programming regular computers to behave as though
they are interconnected brain cells.

What are artificial neural networks used for


Artificial neural networks are designed to digitally mimic the human brain. They are currently
used for complex analyses in various fields, ranging from medicine to engineering, and
these networks can be used to design the next generation of computers.
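A minimal example of training a small feed-forward network with scikit-learn (synthetic data; one hidden layer of 16 neurons whose weights are learned by backpropagation):

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", round(net.score(X_te, y_te), 3))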
 
8. Conjoint Analysis
What is a conjoint analysis?
Definition:
Conjoint analysis is a research technique used to quantify how people value the individual
features of a product or service. A conjoint survey question shows respondents a set of
concepts, asking them to choose or rank the most appealing ones.
It is a survey-based statistical technique used in market research that helps determine how people
value the different attributes (features, functions, benefits) that make up an individual product or
service.

There are two main types of conjoint analysis: Choice-based Conjoint (CBC) Analysis and
Adaptive Conjoint Analysis (ACA).
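One simple way to estimate part-worth utilities is to regress profile ratings on dummy-coded attribute levels; the sketch below uses statsmodels with invented ratings for profiles built from two attributes:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ratings of product profiles defined by brand and price level
profiles = pd.DataFrame({
    "brand":  ["A", "A", "B", "B", "A", "B"],
    "price":  ["low", "high", "low", "high", "high", "low"],
    "rating": [9, 5, 7, 3, 6, 8],
})

model = smf.ols("rating ~ C(brand) + C(price)", data=profiles).fit()
print(model.params)   # coefficients = part-worth of each level relative to the baseline level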

9. Canonical Correlation
A canonical correlation is a correlation between two canonical or latent sets of variables. In
canonical correlation, one set of variables is treated as the independent variables and the other
set as the dependent variables.

This might be regarded as the simplest form of a latent trait model. Instead of determining the
correlation between observed variables (e.g., test scores), a canonical correlation (CR) calculates
the correlation between (a) the common latent trait(s) in a given set of two or more observed
variables and (b) the common latent traits(s) in another set of two or more observed variables. It
is like a multiple R in which both the independent variables and the dependent variables consist
of a number of different measurements. However, the CR divides the common variance between
the two sets of variables into orthogonal (i.e., uncorrelated) components, called canonical
variates.
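A short sketch with scikit-learn's CCA on simulated data, showing the correlation between paired canonical variates of two sets of variables:

import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                                      # first set of observed variables
Y = X @ rng.normal(size=(3, 2)) + 0.5 * rng.normal(size=(100, 2))  # second, related set

cca = CCA(n_components=2).fit(X, Y)
X_c, Y_c = cca.transform(X, Y)
r = [np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1] for i in range(2)]    # canonical correlations
print("canonical correlations:", np.round(r, 3))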
10. Co-integration
Nobel laureates Robert Engle and Clive Granger introduced the concept of co-integration in
1987.

What is Co-integration?
Co-integration is a statistical method used to test the correlation between two or more non-
stationary time series in the long run or for a specified period. The method helps identify long-
run parameters or equilibrium for two or more variables.
Why do we use a co-integration test?
A co-integration test checks whether two or more non-stationary time series move together in
the long run, so that long-run parameters or an equilibrium relationship between the variables
can be identified.
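A minimal sketch of the Engle-Granger co-integration test using statsmodels on two simulated random walks that share a common stochastic trend:

import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=500))        # non-stationary random walk
y = 2.0 * x + rng.normal(size=500)         # shares the same stochastic trend as x

t_stat, p_value, _ = coint(y, x)           # Engle-Granger two-step test
print(f"t = {t_stat:.2f}, p-value = {p_value:.4f}")   # small p-value -> evidence of co-integration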

11. Analytic Hierarchy Process


AHP was developed in the 1970s by Thomas L. Saaty and has since been extensively studied.

Analytic Hierarchy Process: a decision-making method that compares multiple alternatives,
each with several criteria, to help select the best option. Pairwise Comparison: the process of
comparing criteria two at a time.

It is currently used in decision making for complex scenarios, where people work together to
make decisions when human perceptions, judgments, and consequences have long-term
repercussions.
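The core calculation is the priority vector of a pairwise comparison matrix; a minimal sketch follows, using hypothetical judgments on Saaty's 1-9 scale for three criteria, with a consistency check:

import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
A = np.array([[1,   3,   5],
              [1/3, 1,   3],
              [1/5, 1/3, 1]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                              # priority vector (criteria weights)

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)         # consistency index
CR = CI / 0.58                               # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " consistency ratio:", round(CR, 3))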
12. Analytic Network Process
Analytic network process (ANP) is a mathematical theory, developed by Thomas L. Saaty, to
identify decision-making priorities of multiple variables without establishing a one-way
hierarchical relationship among decision levels. It has been successfully applied in various
areas.
13. Interpretive Ranking Process
Interpretive ranking process (IRP) is a multi-criteria decision-making method based on
paired comparison in an interpretive manner. Because of the paired comparisons, the number of
interpretations to be made for n ranking variables is n(n-1)/2 to establish dominance with
respect to each reference variable or criterion.
14. Decision Tree Analysis
Decision tree analysis involves visually outlining the potential outcomes, costs, and
consequences of a complex decision. These trees are particularly helpful for analyzing
quantitative data and making a decision based on numbers.
Five Steps of Decision Tree Analysis

1. Define the problem area for which decision making is necessary.
2. Draw a decision tree with all possible solutions and their consequences.
3. Input the relevant variables with their respective probability values.
4. Determine and allocate payoffs for each possible outcome.
5. Calculate the expected value of each decision path and choose the option with the best payoff.
What is the purpose of decision tree analysis?
Decision trees help you to evaluate your options. Decision Trees are excellent tools for helping
you to choose between several courses of action.
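A tiny worked example of the expected-value step (the probabilities and payoffs are invented): each option's expected monetary value is the probability-weighted sum of its payoffs, and the option with the best value is chosen.

# Hypothetical decision: launch a new product or keep the current one
options = {
    "launch new product":   [(0.6, 500_000), (0.4, -200_000)],   # (probability, payoff) per branch
    "keep current product": [(1.0, 150_000)],
}

emv = {name: sum(p * payoff for p, payoff in branches)
       for name, branches in options.items()}
best = max(emv, key=emv.get)
print(emv, "-> choose:", best)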
15. Data Mining
Data mining is the process of finding anomalies, patterns and correlations within large data
sets to predict outcomes. Using a broad range of techniques, you can use this information to
increase revenues, cut costs, improve customer relationships, reduce risks and more.
Four Stages of Data Mining
(1) data acquisition; (2) data cleaning, preparation, and transformation; (3) data
analysis, modeling, classification, and forecasting; and (4) reports.
16. Cluster Analysis
Cluster analysis definition.
Cluster analysis is a statistical method for processing data. It works by organizing items into
groups, or clusters, on the basis of how closely associated they are.

Streaming services often use cluster analysis to identify viewers who have similar
behavior. For example, a streaming service may collect the following data about individuals:
minutes watched per day and total viewing sessions per week.

What is a cluster analysis in statistics?


Cluster analysis is a multivariate method which aims to classify a sample of subjects (or
objects), on the basis of a set of measured variables, into a number of different groups such that
similar subjects are placed in the same group.
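A minimal k-means sketch in Python tied to the streaming-service example above (the viewer data are invented):

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical viewers: [minutes watched per day, viewing sessions per week]
viewers = np.array([[30, 2], [45, 3], [200, 14], [220, 12], [90, 7], [110, 8]])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(viewers)
print("cluster labels:", kmeans.labels_)
print("cluster centres:")
print(kmeans.cluster_centers_)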
17. Multi-Dimensional Scaling
Kruskal & Wish (1978) — the authors of one of the first multidimensional scaling books —
provide a classic treatment of the kinds of problems for which multidimensional scaling is ideal.
Multidimensional scaling refers to a family of mathematical (not statistical) models that can
be used to analyze distances between objects (e.g., health states). Information contained in a
set of data is represented by a set of points in space.

Purpose of Multidimensional Scaling


The purpose of multidimensional scaling is to map the relative location of objects using data
that show how the objects differ. Seminal work on this method was undertaken by Torgerson
(1958). A reduced version is one-dimensional scaling.

What are the uses of multidimensional scaling?

Multidimensional scaling (MDS) is used to determine whether two or more perceptual
dimensions underlie the perceived similarities between stimuli.
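A short sketch with scikit-learn: given a hypothetical dissimilarity matrix between four objects, MDS places points in two dimensions so that their distances approximate the dissimilarities:

import numpy as np
from sklearn.manifold import MDS

# Hypothetical dissimilarities between four objects (e.g., health states)
D = np.array([[0, 2, 5, 6],
              [2, 0, 4, 5],
              [5, 4, 0, 1],
              [6, 5, 1, 0]], dtype=float)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)
print(np.round(coords, 2))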

18. Correspondence Analysis


Who invented correspondence analysis?

It is a multivariate statistical tool that was first proposed in 1935 by Herman Otto Hartley.
Hartley wrote a paper on contingency tables that paved the way for Jean-Paul Benzécri to
develop the analysis technique in the 1960s that we know today.

Correspondence analysis is a statistical technique that provides a graphical representation
of cross tabulations (which are also known as cross tabs, or contingency tables). Cross
tabulations arise whenever it is possible to place events into two or more different sets of
categories, such as product and location for purchases in market research or symptom and
treatment in medical testing.
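The underlying computation is a singular value decomposition of the table of standardised residuals; a bare-bones numpy sketch on an invented product-by-location contingency table:

import numpy as np

# Hypothetical contingency table: products (rows) x locations (columns)
N = np.array([[30, 10, 20],
              [15, 25, 10],
              [ 5, 20, 35]], dtype=float)

P = N / N.sum()                                 # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)             # row and column masses
S = np.diag(r**-0.5) @ (P - np.outer(r, c)) @ np.diag(c**-0.5)   # standardised residuals

U, s, Vt = np.linalg.svd(S)
row_coords = np.diag(r**-0.5) @ U[:, :2] * s[:2]      # principal coordinates of rows
col_coords = np.diag(c**-0.5) @ Vt.T[:, :2] * s[:2]   # principal coordinates of columns
print(np.round(row_coords, 3))
print(np.round(col_coords, 3))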

19. Time Series Analysis


The innovator was John Graunt, a 17th-century London haberdasher.
A 'Time Series' is a collection of observations indexed by time. The observations each occur at
some time t, where t belongs to the set of allowed times, T.
Purpose and uses of Time Series Analysis
Time series analysis helps organizations understand the underlying causes of trends or
systemic patterns over time. Using data visualizations, business users can see seasonal trends
and dig deeper into why these trends occur. With modern analytics platforms, these
visualizations can go far beyond line graphs.
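A small statsmodels sketch that splits a simulated monthly series into trend, seasonal and residual components (the series is invented for illustration):

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2015-01", periods=60, freq="MS")
values = (np.linspace(100, 160, 60)
          + 10 * np.sin(2 * np.pi * np.arange(60) / 12)
          + np.random.default_rng(0).normal(scale=2, size=60))
series = pd.Series(values, index=idx)

result = seasonal_decompose(series, model="additive")
print(result.seasonal.head(12))        # the repeating seasonal pattern
print(result.trend.dropna().head())    # the underlying trend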

20. Econometric Analysis


Econometrics was pioneered by Lawrence Klein, Ragnar Frisch, and Simon Kuznets. All
three won the Nobel Prize in economics for their contributions.
Econometrics uses economic theory, mathematics, and statistical inference to quantify
economic phenomena. In other words, it turns theoretical economic models into useful tools for
economic policymaking.
What is the purpose of econometrics?
An econometrics analysis is used to test a hypothesis, whether it's an existing economic theory or
a brand new idea. It can also be used to forecast future financial or economic trends using current
data. This makes econometrics for finance an everyday tool for Wall Street traders and financial
analysts.
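As a toy illustration of turning a theoretical model into an estimable equation, the sketch below fits a simple consumption function by ordinary least squares with statsmodels (all numbers are simulated):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
income = rng.normal(50, 10, 200)
consumption = 5 + 0.8 * income + rng.normal(0, 3, 200)   # hypothetical consumption function

model = sm.OLS(consumption, sm.add_constant(income)).fit()
print(model.params)      # estimated intercept and marginal propensity to consume
print(model.pvalues)     # hypothesis tests on the coefficients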
21. Data Stationarity
Stationary data refers to time series data whose mean and variance do not vary across time.
The data is considered non-stationary if a strong trend or seasonality is observed in the
data.
What is meant by stationarity?
Stationarity can be defined in precise mathematical terms, but for our purpose we mean a flat
looking series, without trend, constant variance over time, a constant autocorrelation
structure over time, and no periodic fluctuations (seasonality).
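Stationarity is commonly checked with the augmented Dickey-Fuller test; a minimal statsmodels sketch on a simulated random walk (non-stationary by construction):

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
series = np.cumsum(rng.normal(size=300))    # random walk

adf_stat, p_value, *_ = adfuller(series)
print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")
# A large p-value means the unit-root null cannot be rejected, so the series is treated as
# non-stationary and is usually differenced (np.diff) before modelling.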
22. Granger Causality
The Granger causality test is a statistical hypothesis test for determining whether one time
series is useful in forecasting another, first proposed in 1969.[1] Ordinarily, regressions reflect
"mere" correlations, but Clive Granger argued that causality in economics could be tested for
by measuring the ability to predict the future values of a time series using prior values of
another time series.

Limitations
As its name implies, Granger causality is not necessarily true causality. In fact, the Granger-
causality tests fulfill only the Humean definition of causality that identifies the cause-effect
relations with constant conjunctions.
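A minimal statsmodels sketch: y is built so that it depends on the previous value of x, and the test asks whether lags of x improve the prediction of y (simulated data):

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
x = rng.normal(size=300)
y = np.r_[0.0, 0.8 * x[:-1]] + 0.1 * rng.normal(size=300)   # y depends on lagged x

data = pd.DataFrame({"y": y, "x": x})
# Tests whether lags of the second column help forecast the first column
grangercausalitytests(data[["y", "x"]], maxlag=2)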

23. Vector Error Correction Model


A Vector Error Correction Model is a co-integrated VAR model. A VECM consists of a VAR
model of order p - 1 on the differences of the variables, plus an error-correction term derived
from the known (estimated) co-integrating relationship.
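A short statsmodels sketch of a VECM fitted to two simulated series that share one co-integrating relation (k_ar_diff is the order p - 1 of the VAR on the differences):

import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(4)
trend = np.cumsum(rng.normal(size=400))               # common stochastic trend
data = pd.DataFrame({"a": trend + rng.normal(size=400),
                     "b": 0.5 * trend + rng.normal(size=400)})

res = VECM(data, k_ar_diff=1, coint_rank=1).fit()
print(res.beta)    # estimated co-integrating vector
print(res.alpha)   # error-correction (adjustment) coefficients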
24. Vector Auto Regression Model
The vector autoregressive (VAR) model is a workhorse multivariate time series model that
relates current observations of a variable to past observations of itself and past
observations of the other variables in the system.
Why is a vector auto regression model used?
Vector auto regression (VAR) is a statistical model used to capture the relationship between
multiple quantities as they change over time. VAR is a type of stochastic process model. VAR
models generalize the single-variable (univariate) autoregressive model by allowing for
multivariate time series.
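A compact statsmodels sketch that fits a two-lag VAR to two simulated (differenced) series and produces a five-step-ahead forecast:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
levels = pd.DataFrame(rng.normal(size=(200, 2)).cumsum(axis=0), columns=["gdp", "inflation"])
data = levels.diff().dropna()                  # difference to obtain stationary series

results = VAR(data).fit(2)                     # VAR(2): two lags of every variable
forecast = results.forecast(data.values[-2:], steps=5)
print(forecast)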

25. Delphi Technique


The Delphi technique is a well-established approach to answering a research question
through the identification of a consensus view across subject experts. It allows for reflection
among participants, who are able to nuance and reconsider their opinion based on the
anonymized opinions of others.

Why is it called the Delphi technique?


The Delphi technique was named after the oracle at Delphi in Ancient Greece, who was believed
to predict the future. It involves the collection and aggregation of expert opinion and was initially
used by the military to estimate the probable effects of massive atomic bombing.
26. Game Theory
Game theory is the study of mathematical models of strategic interactions among rational
agents. It has applications in all fields of social science.
Game theory is a branch of applied mathematics that provides tools for analyzing situations
in which parties, called players, make decisions that are interdependent. This
interdependence causes each player to consider the other player's possible decisions, or
strategies, in formulating strategy.
27. Formal Logic
Formal logic is the abstract study of propositions, statements, or assertively used sentences
and of deductive arguments. The discipline abstracts from the content of these elements the
structures or logical forms that they embody.
28. Thematic Content Analysis
Thematic analysis is a method of analyzing qualitative data. It is usually applied to a set of
texts, such as an interview or transcripts. The researcher closely examines the data to identify
common themes – topics, ideas and patterns of meaning that come up repeatedly.
Who proposed thematic analysis?
TA as a method was first developed by Gerald Holton, a physicist and historian of
science, in the 1970s (Merton, 1975).
What is the aim of thematic analysis?
The goal of a thematic analysis is to identify themes, i.e. patterns in the data that are
important or interesting, and to use these themes to address the research question or say
something about an issue. This is much more than simply summarizing the data; a good thematic
analysis interprets and makes sense of it.
29. Cybernetic Modeling
Cybernetics is associated with models in which a monitor compares what is happening to a
system at various sampling times with some standard of what should be happening, and a
controller adjusts the system's behaviour accordingly.
30. Simulation
A simulation is a model that mimics the operation of an existing or proposed system,
providing evidence for decision-making by being able to test different scenarios or process
changes. This can be coupled with virtual reality technologies for a more immersive experience.
A simulation uses a mathematical description, or model, of a real system in the form of a
computer program. This model is composed of equations that duplicate the functional
relationships within the real system.

Simulation, in industry, science, and education, is a research or teaching technique that
reproduces actual events and processes under test conditions. Developing a simulation is
often a highly complex mathematical process.

31. Linear Programming


Linear programming (LP), also called linear optimization, is a method to achieve the best
outcome (such as maximum profit or lowest cost) in a mathematical model whose requirements
are represented by linear relationships. Linear programming is a special case of mathematical
programming (also known as mathematical optimization).
More formally, linear programming is a technique for the optimization of a linear objective
function, subject to linear equality and linear inequality constraints. Its feasible region is
a convex polytope, which is a set defined as the intersection of finitely many half spaces, each of
which is defined by a linear inequality. Its objective function is a real-valued affine (linear)
function defined on this polyhedron. A linear programming algorithm finds a point in
the polytope where this function has the smallest (or largest) value if such a point exists.
Linear programs are problems that can be expressed in canonical form.
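A small worked example with scipy (a hypothetical product-mix problem; linprog minimises, so the profit objective is negated):

from scipy.optimize import linprog

# Maximise profit 3x + 5y subject to resource constraints
c = [-3, -5]                 # minimise -(3x + 5y)
A_ub = [[1, 0],              # hours on line 1:  x       <= 4
        [0, 2],              # hours on line 2:      2y  <= 12
        [3, 2]]              # labour hours:     3x + 2y <= 18
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print("optimal plan:", res.x, " maximum profit:", -res.fun)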
Advantages of Linear Programming

 LP is useful for the business as the decision-maker can obtain an optimum solution by
considering the effective use of scarce resources
 It is a structured technique and helps in making informed data-driven decisions
 It provides alternate solutions which the decision-maker can analyze further and finalize
based on subjective matters that also need to be considered
 LP can also be used for changing situations. Changed constraints or additional constraints
can be included in the model to get revised output.

 In finance, decisions have to be taken on where the money should be spent and from
where the company needs to get the money from to ensure that they maximize the returns
keeping the risks under acceptable control. Buying and selling bonds, managing
corporate finances, making financial decisions.
 32 .Wavelet Analysis
 Wavelet analysis is an alternative to windowed Fourier transforms that also yields a
two-dimensional plot showing strengths of variations as a function of both period
(or frequency) and time
 A wavelet is a wave-like oscillation with an amplitude that begins at zero, increases or
decreases, and then returns to zero one or more times.
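A minimal sketch with the PyWavelets package: a continuous wavelet transform of a signal whose frequency changes halfway through, giving the two-dimensional strength-of-variation map described above (the signal is invented):

import numpy as np
import pywt

t = np.linspace(0, 2, 400, endpoint=False)                      # 2 s sampled at 200 Hz
signal = np.where(t < 1, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t))

scales = np.arange(1, 64)
coefficients, frequencies = pywt.cwt(signal, scales, "morl", sampling_period=1 / 200)
# |coefficients| is the 2-D map: rows correspond to scale/frequency, columns to time
print(coefficients.shape, np.round(frequencies[:3], 2))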

Software
M-Plus
What is M-plus?
M-plus is a highly flexible, powerful statistical analysis software program that can fit an
extensive variety of statistical models using one of many estimators available.  Perhaps its
greatest strengths are in its capabilities to model latent variables, both continuous and
categorical, which underlie its flexibility.  Among the many models M-plus can fit are:

 Regression models (linear, logistic, Poisson, Cox proportional hazards, etc.)


 Factor analysis, exploratory and confirmatory
 Structural equation models
 Latent growth models
 Mixture models (latent class, latent profile, etc.)
 Longitudinal analysis (latent transition analysis, growth mixture models, etc.)
 Multilevel models
 Bayesian analysis
 E-views
 What is E-Views used for?
 Using E-Views, you can quickly and efficiently manage your data, perform econometric
and statistical analysis, generate forecasts or model simulations, and produce high quality
graphs and tables for publication or inclusion in other applications. E-Views is designed
with your workflow in mind.

How we help with E-Views


We help students with the following:

 Entering data in the panel work files of E-Views for data analysis and interpreting the outcome
 Specifying the equations according to various criteria
 Choosing an estimation method to be used for the analysis

 Primavera
 Primavera is an enterprise project portfolio management software. It includes project
management, scheduling, risk analysis, opportunity management, resource management,
collaboration and control capabilities, and integrates with other enterprise software such as
Oracle and SAP's ERP systems.[1][2] Primavera was launched in 1983 by Primavera Systems
Inc., which was acquired by Oracle Corporation in 2008.

 Primavera is advanced software that is trusted by project managers and companies from
different industries globally. It provides sophisticated solutions to plan, manage and execute
projects of any size and scale. It increases project efficiency significantly by identifying
bottlenecks and schedule overruns.

 LINGO
 LINGO is a software program used for solving simultaneous linear and nonlinear
equations and inequalities. However, another product named Lingo is the scripting language
for Macromedia Director.

 LINGO includes a user interface for interactive use and a callable programming interface that
allows you to embed LINGO in your own application. LINDO API does not have a traditional
user interface; it only has a programming interface.

 LINDO
 LINDO (Linear, Interactive, and Discrete Optimizer) is a software package for linear
programming, integer programming, nonlinear programming, stochastic programming and
global optimization. LINDO® linear, nonlinear, integer, stochastic and global programming
solvers have been used by thousands of companies worldwide to maximize profit and minimize
cost on decisions involving production planning, transportation, finance, portfolio allocation,
capital budgeting, blending, scheduling, inventory, resource allocation and more.

 Adanco
 ADANCO (“advanced analysis of composites”) is a software with graphical user
interface for variance-based structural equation modeling (SEM)[1] using among others
the partial least squares (PLS) method[2][3] including consistent PLS.[4][5] The software can
be used in empirical research to analyze primary or secondary data and test theories that
consist of relationships between scientific constructs. ADANCO runs
on Windows and macOS operating systems. ADANCO (“advanced analysis of
composites”) is a user-friendly software for variance-based structural equation modeling.
 KH-Coder
 KH Coder is an open source software for computer assisted qualitative data analysis,
particularly quantitative content analysis and text mining. It can be also used
for computational linguistics. It supports processing and etymological information of text
in several languages, such as Japanese, English, French, German, Italian, Portuguese and
Spanish. Specifically, it supports statistical analyses such as co-occurrence networks,
automated clustering and mapping, multidimensional scaling and similar
calculations.
 DEAP
 Distributed Evolutionary Algorithms in Python (DEAP) is an evolutionary
computation framework for rapid prototyping and testing of ideas.[2][3][4] It incorporates
the data structures and tools required to implement most common evolutionary
computation techniques such as genetic algorithm, genetic programming, evolution
strategies, particle swarm optimization, differential evolution, traffic flow and estimation of
distribution algorithm. It has been developed at Université Laval since 2009.[5]
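A minimal DEAP sketch of the classic OneMax toy problem (maximise the number of 1s in a 20-bit string), showing how the framework's building blocks fit together:

import random
from deap import algorithms, base, creator, tools

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_bool", random.randint, 0, 1)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_bool, 20)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", lambda ind: (sum(ind),))
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutFlipBit, indpb=0.05)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=50)
pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=20, verbose=False)
best = tools.selBest(pop, 1)[0]
print("best fitness:", best.fitness.values[0])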
 Lisrel
What is LISREL used for?
LISREL is statistical software that is used for structural equation modeling. Structural
equation models are systems of linear equations. LISREL performs simultaneous estimation of
the structural model and the measurement model; the structural model assumes that all variables
are measured without error. LISREL can be used to fit:

 measurement models,
 structural equation models based on continuous or ordinal data,
 multilevel models for continuous and categorical data using a number of link functions,
 generalized linear models based on complex survey data.

Mendeley
 What is Mendeley software used for?
 Mendeley Reference Manager is a free web and desktop reference management
application. It helps you simplify your reference management workflow so you can focus
on achieving your goals. With Mendeley Reference Manager you can: Store, organize
and search all your references from just one library.
 What kind of software is Mendeley?
 Mendeley: A free research management tool for desktop and web.
Top 10 Reference Management Software
 Mendeley.
 EndNote.
 ReadCube Papers.
 EasyBib.com.
 Zotero.
 Article Galaxy Enterprise.

 Visio
 What is Visio software used for?
 With Visio on your PC or mobile device, you can: Organize complex ideas visually. Get
started with hundreds of templates, including flowcharts, timelines, floor plans, and more.
Add and connect shapes, text, and pictures to show relationships in your data.
 Edraw Max
 Edraw Max is a 2D business technical diagramming software which helps
create flowcharts, organizational charts, mind map,[1] network diagrams, floor
plans, workflow diagrams, business charts, and engineering diagrams. The current
version, Edraw Max 11.5.0 was released in November 2021 for Microsoft
Windows, macOS, and Linux. Edraw Max is a Visio-like[2] diagramming tool

 WarpPLS

 It is a software with graphical user interface for variance-based and factor-
based structural equation modeling (SEM) using the partial least squares and factor-based
methods.[1][2] The software can be used in empirical research to analyse collected data
(e.g., from questionnaire surveys) and test hypothesized relationships. Since it runs on the
MATLAB Compiler Runtime, it does not require the MATLAB software development
application to be installed; and can be installed and used on various operating systems in
addition to Windows, with virtual installations.

 Matlab

 MATLAB® is a programming platform designed specifically for engineers and
scientists to analyze and design systems and products that transform our world. The
heart of MATLAB is the MATLAB language, a matrix-based language allowing the most
natural expression of computational mathematics. MATLAB (an abbreviation of
"MATrix LABoratory"[22]) is a proprietary multi-paradigm programming
language and numeric computing environment developed by MathWorks. MATLAB
allows matrix manipulations, plotting of functions and data, implementation
of algorithms, creation of user interfaces, and interfacing with programs written in other
languages
 SPSS
 SPSS Statistics is a statistical software suite developed by IBM for data management,
advanced analytics, multivariate analysis, business intelligence, and criminal
investigation. Long produced by SPSS Inc., it was acquired by IBM in 2009. Current
versions (post 2015) have the brand name: IBM SPSS Statistics.
 The software name originally stood for Statistical Package for the Social
Sciences (SPSS),[3] reflecting the original market, then later changed to Statistical
Product and Service Solutions.[4][5]

 Is SPSS still used?
 SPSS is a widely used program for statistical analysis in social science.

What is SPSS data set?
 This is a manufactured data set that was created to provide suitable data for the
demonstration of statistical techniques such as t-test for repeated measures, and one-way
ANOVA for repeated measures.
 AMOS
 AMOS is statistical software; the name stands for Analysis of Moment Structures. AMOS
is an added SPSS module, and is specially used for Structural Equation Modeling, path
analysis, and confirmatory factor analysis. It is also known as analysis of covariance or
causal modeling software.
 STATA
 Stata was initially developed by Computing Resource Center in California and the first
version was released in 1985.[6] In 1993, the company moved to College Station, TX and
was renamed Stata Corporation, now known as StataCorp.[1] A major release in 2003
included a new graphics system and dialog boxes for all commands.[6] Since then, a new
version has been released once every two years.[7] The current version is Stata 17,
released in April 2021.[8]

 Used by researchers for more than 30 years, Stata provides everything you need for
data science—data manipulation, visualization, statistics, and automated reporting.

NVivo
 NVivo is a software program used for qualitative and mixed-methods research.
Specifically, it is used for the analysis of unstructured text, audio, video, and image data,
including (but not limited to) interviews, focus groups, surveys, social media, and journal
articles. NVivo is used predominantly
by academic, government, health and commercial researchers across a diverse range of
fields, including social sciences such
as anthropology, psychology, communication, sociology, as well as fields such
as forensics, tourism, criminology and marketing
 SmartPLS
 SmartPLS is a software with graphical user interface for variance-based structural
equation modeling (SEM) using the partial least squares (PLS) path modeling method.
