METHODOLOGY
3.1. Introduction
In this chapter, we describe the process by which scales to measure variables were developed
and how these scales were applied to a relevant sample in order to generate data to empirically
examine the phenomena chosen for study. We also describe the characteristics of the sample
and the criteria used to establish its adequacy, as well as Structural Equation Modeling (SEM),
the analytical technique employed to analyse the data and test the hypotheses proposed in this
study.
This study critically evaluates the existing research work on the performance parameters
associated with the logistics sector. It specifically explains the role of Logistics Effectiveness
as a performance parameter in the strategy of a company in the home textile sector in order to
enhance its competitive capabilities leading to superior organizational performance.
To fulfil this purpose, certain activity indicators were identified through a critical and
reflexive engagement with the relevant and related literature. The study also searches for
relationships between the various parameters of logistics performance, as reflected in logistics
effectiveness, their impact on the competitive capabilities of organizations, and consequently
on overall organizational performance.
A pilot study was undertaken in which data were collected from 18 manufacturing companies.
Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were used to
check the validity of the developed scales, and their reliability was assessed using Cronbach's
alpha. After establishing validity and reliability, the questionnaire was administered to
logistics managers, export executives and CEOs to establish whether the hypotheses generated
were empirically sustainable. Data were collected from an adequate and representative sample
of 113 executives working in 63 home textile firms.
Research Design is the science (and art) of planning procedures for conducting studies so as to
get the most valid findings (Vogt, 1993). There are two methods generally accepted towards
research studies - Quantitative and Qualitative. In a qualitative approach, the researcher
collects, analyses and interprets data that cannot easily be quantified, going beyond the
obvious to constructs and variables that are not directly visible or measurable (Chawla
& Sondhi, 2011). A quantitative approach, on the other hand, seeks to express the data in terms
of numbers and figures that are analyzed with mathematical/ statistical methods in order to
generalize the findings to the population at large. The two approaches are however not treated
as the extreme ends of a theoretical continuum (Chawla and Sondhi 2011).
To fulfil the purpose of the present study, both qualitative and quantitative approaches were
used. The evaluation of logistics effectiveness measures follows a quantitative approach and
is presented in hard data, numbers and figures. The qualitative approach was used to analyse
how the indicators of logistics effectiveness are linked to indicators of organizational
performance such as Return on Assets.
The purpose of research is generally classified in three ways: Exploratory, Explanatory and
Descriptive. Exploratory research is undertaken for gaining an initial understanding of the
phenomena. Explanatory research is undertaken with a view to discovering and reporting
relationships among the different aspects of the phenomena. Descriptive studies are precise
measurements undertaken to describe and report the characteristics of the phenomena under
investigation (Babbie, 2004). Descriptive research also provides a framework for conclusive
research (Chawla & Sondhi, 2011). Explanatory research is likewise used when there is a need
to demonstrate the effect of one variable on another.
However, in practice the demarcation between the research designs mentioned above is not
watertight, and it is more appropriate to view them as parts of a continuum (Chawla &
Sondhi, 2011).
Following Creswell (2003), the study also adopted a sequential procedure, beginning with a
quantitative method in which concepts were tested, followed by a qualitative method involving
the study of a few cases.
The major steps adopted in the research design of this study are shown below:

1. Descriptive phase: Logistics Effectiveness, Competitive Capability, Organizational
Performance.
2. Explanatory phase: determining the interrelationships between the variables; hypothesis
testing.
3. Secondary data collection: annual reports, trade literature, industry databases.
4. Primary data collection: quantitative analysis.
5. Data analysis, followed by findings and results.
This study is concerned with three variables identified from an exhaustive survey of related
and relevant literature. After analysing the literature and the context of the study, scales were
developed for the critical variables of Logistics Effectiveness, Competitive Capability and
Organizational Performance.
In the initial round, twelve (12) items were developed for Logistics Effectiveness, eleven (11)
for Competitive Capability and fifteen (15) for Organizational Performance. All the items were
developed after an exhaustive engagement with the literature. Subsequently, after the pilot
study and exploratory factor analysis, measures with stronger measurement attributes were
retained and the others were dropped.
Scaling is a technique of measurement involving the creation of a continuum on which
measurements on objects are located (Chawla & Sondhi 2011). Likert Scaling is commonly
used in instruments measuring opinions, beliefs and attitudes (De Vellis, 2003).
The Likert scale was used to examine how strongly the respondents agree or disagree with
various propositions within a seven response category ranging from “very strongly disagree”
to “very strongly agree”. Clark & Watson (1995) agree that it is ideal to use 6 to 7 response
options as adding more responses might reduce the validity owing to uncertainty amongst the
respondents.
The questionnaire’s content validity was established through a theoretical review and
questionnaires pre-test. Questions in the study instrument were based on previous studies and
discussed with a number of executives and experts in the field.
On the basis of the above discussion, three steps were taken to ensure the content validity of
each dimension. First the initial list of potential items (questions) was compiled after an
extensive review of logistics literature and theory. Secondly, the questionnaire was examined
by the Director of a Textile Research Association during the drafting process. Thirdly, the
entire list of potential items was given to two experts from a leading textile Export Promotion
Council and three practising managers from manufacturing firms. They were asked to review
the questionnaire and to comment on the language and clarity of each question as well as the
overall format of the instrument. These experts were also requested to keep, modify and/or
delete items. They were also encouraged to provide suggestions for inclusion of additional
items, if necessary, to cover the intended domain of each variable. These inputs were gained
through structured interviews and were helpful in improving the wording, clarity and relevance
of the questionnaire.
Pilot testing refers to testing and administering the designed instrument on a small group of
people from the population under study (Chawla & Sondhi 2011).
The sample for the pilot study consisted of 18 manufacturing units representing a cross-section
of the industry based in Northern, Southern and Western India, as we were interested in
collecting views from practitioners working in different regions. The data were collected
through personal contact and through the industry associations.
All items were checked for face validity and executives were asked whether the items
represented the variable. Items were either removed or reworded on the basis of the responses
given by executives. Exploratory Factor Analysis (EFA) was carried out and all items which
had factor loading of less than 0.50 or which cross loaded on another factor were removed.
For all the variables, the retained items loaded on one factor only; the removals improved the
variance explained and the reliability of the scales.
The structured questionnaire was prepared relying on inputs culled from similar questionnaires
available in the Literature survey. The Questionnaire was divided into 4 parts with Part I
covering details about the Company, Part II had questions relating to Logistics Effectiveness
& Performance, Part III focused on Competitive Capabilities, Part IV covered elements of
Organizational Performance. The Questionnaires were sent by e-mail, as electronic surveys
allow the transmission of more information and support a better interaction between the
researcher and the respondent. They contribute to better-quality information, faster response
cycles and reduced research costs (Klassen and Jacobs, 2001; Tse, 1998). Leading trade
associations were also requested to contact their members to help obtain responses.
3.5. Sampling
The population of the research is the Home Textiles Companies in India. The businesses that
are registered with the database of the Apex Registering Authority (The Cotton Textile Export
Promotion Council, Mumbai) recognized by the Ministry of Commerce, Government of India
and having a valid Import- Export Code No. (I.E. Code) have been determined as the population
frame.
A sample frame of 570 active respondents was identified in this research. The sample frame
was constructed primarily to target 1) relatively high-level managers and 2) operations and
logistics managers. The high-level managers were targeted in the belief that they are not only
intimately aware of the internal operations of their organizations but are also well aware of
management goals. The operations/logistics managers were targeted as they have direct
knowledge of the logistics functions in the organization. In total, 113 responses were received
from the 200 targeted companies.
Even though the unit of analysis is the manufacturing unit, the sample frame was constructed
primarily to target senior functionaries such as logistics managers, export executives and
CEOs. The senior managers were targeted on the assumption that they have sound knowledge
of the organization's supply chain function and strategy and of the logistics spread needed to
achieve company goals.
Accordingly, in this research 200 home textile companies were targeted out of a population
size of 570. Of these, 63 companies sent back the filled questionnaire after rigorous and
repeated follow-ups. The response rate of 31.5% is not unusual for survey research.
Regional Spread
The production of Home Textiles is spread across the country and visible clusters of
production can be identified. These clusters are mainly concentrated in the Northern,
Western, & Southern regions of India. There are very few units located in Eastern India.
Since people working in different geographical locations have different experiences,
respondents working in different regions were contacted.
Product Heterogeneity
A wide range of products is manufactured under Home Textiles, including bed linen, table
linen, toilet linen and kitchen linen.
Company Performance
Discussing the sample size necessary for structural equation modeling (SEM), Hair et al.
(2006) state, "SEM models containing five or fewer constructs, each with more than three
items (observed variables), and with high item communalities (0.6 or higher), can be
adequately estimated with samples as small as 100-150". Accordingly, the measurement model
illustrated in Figure 8 incorporates three constructs, each with three or more observed items,
all of which exhibit communalities greater than 0.60. The sample size of 113 is thus considered
satisfactory to support the structural equation analysis necessary to assess the logistics
performance model.
This satisfied the requirement provided by Rummel (1970) that there should be at least 4
respondents for each question in the survey. Since we had 27 questions in the survey, our
sample size of 113 was adequate. Further, our sample size of 113 also satisfied the requirements
of a minimum sample size of 100 respondents for conducting structural equation modelling
(SEM) analysis (Kline, 1998).
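The two sample-size rules of thumb cited above can be expressed as a simple check. This is a sketch; the function name is ours, and the figures are those reported in the text (113 responses, 27 questions).

```python
# Sample-size adequacy checks: Rummel's (1970) rule of at least 4 cases
# per item, and Kline's (1998) minimum of 100 cases for SEM.

def sample_size_adequate(n_respondents, n_items,
                         per_item_ratio=4, sem_minimum=100):
    """Return (rummel_ok, kline_ok) for the two rules of thumb."""
    rummel_ok = n_respondents >= per_item_ratio * n_items
    kline_ok = n_respondents >= sem_minimum
    return rummel_ok, kline_ok

# The study's figures: 113 responses, 27 survey questions.
print(sample_size_adequate(113, 27))  # -> (True, True)
```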
The sample size also satisfies the guidelines given by Milton (1986) for multiple regression
analysis. The formula is derived from the number of independent variables, the desired level
of statistical significance, and the variance anticipated on the basis of earlier studies.
The unit of analysis is the firm and 113 responses were obtained from 63 firms. However, in
this context it may be appropriate to consider the survey instrument administered and the
manner in which questionnaire was designed to obtain the responses.
As noted earlier, the questionnaire was divided into four parts: Part I covered details about
the company, Part II questions relating to Logistics Effectiveness & Performance, Part III
Competitive Capabilities, and Part IV elements of Organizational Performance.
While the observation that only one of multiple responses from a single firm can be considered
for analysis holds true for Part I, which covers company details, Parts II, III and IV (relating
to Logistics Effectiveness & Performance, Competitive Capabilities and Organizational
Performance) provided an adequate framework to achieve sufficient statistical power for the
SEM analysis.
As for the sample size, given that SEM is a large-sample technique (Mainul Islam and Faniran,
2005; Hair, Anderson, Tatham & Black, 1998), the 113 responses exceed the minimum of 100
cases recommended by Bagozzi and Yi (2012). This study is also in line with previously
published SEM studies that used sample sizes below the recommended minimum of 200
responses (Hox and Bechger, 1998), and is comparable, for example, with Mainul Islam and
Faniran (2005), who used 52 cases; Chen et al. (2011), who used 124 cases; Doloi et al., who
used 97 cases; and Xiong et al. (2014), who used 125 cases.
The sampling method employed in this study is simple random sampling. While the sample
was random, only companies with a turnover of Rs. 50 million or more were included, as they
represented a reasonable level of logistics activity for the purposes of the study.
As Chawla & Sondhi (2011) point out simple random sampling designs are useful when the
population size is very small and the preparation of a sampling frame does not create a problem.
Accordingly in this research 590 manufacturing companies comprised the population. Out of
these 20 companies were found to be inactive. A sample frame of 570 active respondents was
identified.
In total, electronic mails were sent to 200 companies from the drawn sample. A total of 113
respondents representing 63 enterprises responded to the survey questionnaire after vigorous
follow-up over e-mail and repeated telephone calls.
In some of the enterprises, 2 or 3 different functionaries, representing the logistics division,
the export division and the CEO, were the respondents.
Many of the respondents expressed reservations about revealing information citing business
confidentiality.
Data was thus eventually collected from 113 executives working in 63 diverse Home Textiles
enterprises across the country. 32% of the respondents were located in Western India, and an
equal number (32%) were located in Southern India. 31% were located in Northern India. A
small number (5%) were located in Eastern India.
The respondent profile in terms of annual sales revenue for the fiscal year 2012-2013 shows
that 37% of the respondents had sales above Rs 75 Crores. 19% of the respondents had a sales
revenue of less than Rs 75 Crores; 13% of the respondents had sales revenue of less than Rs
50 Crores and 31% of the respondents had sales revenue of less than Rs. 25 Crores.
Profiles of the 63 enterprises included in the sample and of the respondents are given in
Appendix N and Appendix O.
Throughout the data collection period, electronic mails not only served as a reminder, they
were also a two- way communication channel for respondents to seek clarity about the survey
questionnaire and for the researcher to understand the complexities of logistics operations and
their effectiveness in achieving the objectives of the firms.
In order to address the issue of common method bias, we applied procedural methods to
minimize its potential, since both independent and dependent measures were obtained from
the same source. There were no reverse-coded items and all the hypotheses were stated in a
positive direction (Swink & Song, 2007).
Pre-qualification calls were conducted to ensure that the respondents were mid- to senior-level
managers with high levels of relevant knowledge, which tends to mitigate single-source bias
(Mitchell, 1994).
We also reduced method biases by separating the predictor and criterion variable items over a
lengthy survey instrument and by assuring the survey participants that their responses would
be kept anonymous (Podsakoff et al., 2003).
Further, in order to support the study, and as an integral part of the research work, a post facto
validation of the linkage between logistics performance and organizational performance using
secondary data was also undertaken. Time-series data from 2010 to 2015, along with cross-
section data for fifteen listed Indian textile companies, were analysed to understand the
important organizational performance variables that impact logistics performance. Return on
assets was taken as a proxy for the organizational performance of the company, and five
variables, namely inventory in value terms, inventory cycle in days, gross profit margin, sales
growth and distribution expenses, were taken as independent variables representing logistics
effectiveness. A panel data approach was used to study fixed and random effects. The
estimated results indicate that gross profit margin and sales growth have a positive impact on
organizational performance, while inventory value, inventory cycle and distribution expenses
have a negative impact.
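The fixed-effects estimation described above can be sketched as a "within" estimator: demean each firm's observations to absorb firm-specific intercepts, then fit the pooled slope. This is a minimal single-regressor sketch; the firm names and numbers below are illustrative, not the study's actual data.

```python
# Minimal fixed-effects (within) estimator for panel data.

def within_slope(panel):
    """panel: dict firm -> list of (x, y) observations.
    Demeans x and y within each firm, then fits OLS through the origin."""
    sxy = sxx = 0.0
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            sxy += (x - mx) * (y - my)
            sxx += (x - mx) ** 2
    return sxy / sxx

# Two hypothetical firms; y = 2*x plus a firm-specific intercept.
panel = {
    "firm_a": [(1, 12), (2, 14), (3, 16)],   # intercept 10
    "firm_b": [(1, 52), (2, 54), (3, 56)],   # intercept 50
}
print(within_slope(panel))  # -> 2.0
```

Demeaning removes the firm intercepts (10 and 50) entirely, so the common slope is recovered exactly despite the very different firm levels.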
The data were analysed with the help of the SPSS package AMOS 4, and Structural Equation
Modeling (SEM) was employed to test the hypotheses proposed in the study. Factor analysis
was also carried out. The mediating effect in the case of all the hypotheses was also found to
be significant.
Factor analysis was used to address whether the correlations among the original variables can
be explained by the existence of a small number of hypothetical variables (Kim & Muller,
1978). Miller (1978) and Hair et al. (1995) have defined factor analysis as a data
reduction technique to summarise a large number of variables with smaller number of
underlying dimensions called factors. Individual variables that measure the same construct or
dimension will "load" on the same factor. These loadings can be interpreted as the correlation
between that individual variable and all the other variables on a particular factor.
In order to increase the interpretability of the factor structure, each factor is "rotated" so as to
minimize the distance of each individual variable from one of the factors. Principal
Component Analysis was used with VARIMAX rotation to assess the underlying dimensions.
VARIMAX is an orthogonal rotation, meaning there is no correlation among the factors
(Harman, 1976). Churchill (1991) and Cooper and Emory (1995) describe the purpose of
VARIMAX-rotated principal component analysis as the transformation of a set of interrelated
variables into a set of unrelated linear combinations of these variables.
Accordingly, factor analysis was carried out in this study to produce descriptive summaries of
the data matrix and to reveal meaningful patterns among the variables.
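The extraction-plus-rotation procedure just described can be sketched as follows. This is an illustrative implementation, not the SPSS routine used in the study; the data are synthetic, and the function names are ours.

```python
import numpy as np

def pca_loadings(X, n_components):
    """Principal-component loadings: eigenvectors of the correlation
    matrix scaled by the square roots of their eigenvalues."""
    R = np.corrcoef(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)              # ascending order
    order = np.argsort(eigval)[::-1][:n_components]
    return eigvec[:, order] * np.sqrt(eigval[order])

def varimax(L, n_iter=100, tol=1e-8):
    """Orthogonal varimax rotation; rotated factors stay uncorrelated."""
    p, k = L.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() - var_old < tol:
            break
        var_old = s.sum()
    return L @ R

# Synthetic data with two underlying factors (illustrative only).
rng = np.random.default_rng(0)
f1, f2 = rng.normal(size=200), rng.normal(size=200)
X = np.column_stack([f1 + 0.3 * rng.normal(size=200) for _ in range(3)]
                    + [f2 + 0.3 * rng.normal(size=200) for _ in range(3)])
loadings = varimax(pca_loadings(X, 2))
```

Because varimax is orthogonal, each item's communality (the row sum of squared loadings) is unchanged by the rotation; only the distribution of loading across factors is simplified.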
Validity concerns whether the method of measurement actually measures what it intends to measure.
In order to test the validity of the instrument, the Questionnaire was sent to two industry experts
before a Pilot Study was undertaken. The Industry Experts suggested a few changes which
were incorporated in the Questionnaire. Reliability concerns whether the method of
measurement gives stable and reliable result. For example, will the measure yield the same
result if it were measured again? It was assumed in the present research that information
gathered through government publications, official websites, company-published annual
reports and consultancy reports was reliable and trustworthy. To ensure high validity and
reliability, the answers from the survey were also verified against experts' views and other
similar surveys.
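The reliability coefficient used throughout this study, Cronbach's alpha, can be computed directly from item variances. A plain-Python sketch with illustrative (not actual) response data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

def cronbach_alpha(items):
    """items: list of per-item score lists (same respondents in each)."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three perfectly consistent items give alpha = 1.0.
identical = [[1, 2, 3, 4, 5]] * 3
print(round(cronbach_alpha(identical), 3))  # -> 1.0
```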
On the basis of an intensive and exhaustive survey of the literature, scale items and measures
were formulated as given below: -
3.9.1. Logistics Effectiveness
Logistics Effectiveness is defined as the extent to which the logistics function's goals are
accomplished.
3.9.1.1. Operationalizing the measure of Logistic Effectiveness: -
Table 18 below indicates the literature review which contributed to the development of scale
items for Logistics Effectiveness.
Table 18:

Literature Review | Insight from Literature Review | Scale Item
Coyle and Bardi (1980); Fugate, Mentzer & Stank (2010) | The primary goal of physical distribution is to move goods from the place of supply to the place of final sale "accomplished in such a manner as to contribute to the explicit goals of the organization". | Annual freight charges incurred in delivery of goods
Holcomb (1994); Gunasekaran, A. (2004) | When orders are filled completely and correctly, operating costs decline and customers are satisfied. | Fulfilling orders correctly and completely enhances performance
Novich (1990); Gunasekaran, A. et al. (2004) | Inventory needs to be kept low lest it block capital. | Inventory Turns
Fugate, Mentzer & Stank (2010) | The number of days stocks are held indicates supply chain effectiveness. | Inventory Days
Twelve items were developed to operationalize the variable of Logistics Effectiveness. The
twelve items are listed below: -
i. Home Textiles Sales [LEF1]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
ii. Annual Freight Charges [LEF2]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
iii. Order Fill Rate [LEF3]: How has your Company performed in terms of correctly and completely fulfilling orders from customers? (Gunasekaran et al., 2004)
iv. Total Logistics Cost on Home Textiles [LEF4]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
v. Home Textiles Inventory [LEF5]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
vi. Delivery to Customer [LEF6]: How has your Company performed in terms of order delivery schedules to the customer? (Tracey et al., 2006)
vii. Product Line Fill Rate [LEF7]: How has your Company performed in terms of product-line fill rate? (Gunasekaran et al., 2004)
viii. Inventory Turns [LEF8]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
ix. Delivery to Warehouse [LEF9]: How has your Company performed in terms of order delivery schedules to the warehouse? (Fugate, Mentzer & Stank, 2010)
x. Order Cycle Time [LEF10]: How has your Company performed in terms of order cycle time? (Gunasekaran et al., 2004)
xi. Inventory Days [LEF11]: How will you rate current performance to budgeted? (Fugate, Mentzer & Stank, 2010)
xii. Container Fill Rate [LEF12]: How has your Company performed in terms of container fill rate? (Gunasekaran et al., 2004)
Before carrying out Exploratory Factor Analysis (EFA), the Kaiser-Meyer-Olkin (KMO)
index was checked and was found to be 0.671, above the required value of 0.5. Bartlett's test
of sphericity was carried out and was found to be significant (p < 0.01). Using the study
responses, EFA was carried out. The factor loading values for all 12 initial items are shown in
Table 20 below:
Table 20: Rotated Component Matrixᵃ
Component
1 2
LEF1 .972 .024
LEF2 .861 .100
LEF3 .764 .283
LEF4 .735 .220
LEF5 .868 .208
LEF6 .869 .251
LEF7 .775 -.499
LEF8 .914 -.300
LEF9 -.013 .660
LEF10 .164 .679
LEF11 .311 .802
LEF12 -.038 .695
Extraction Method: Principal Component Analysis
Rotation Method: Varimax with Kaiser Normalization
a. Rotation converged in 3 iterations.
The rotated component matrix above indicates that the items split into two different factors.
While items LEF1 to LEF8 loaded on the first factor, items LEF9 to LEF12 loaded on a second
factor, with loadings below 0.5 on the first. Items LEF9 to LEF12 were therefore dropped from
further analysis.
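The screening rule applied above, retaining only items that load at least 0.5 on the intended factor without cross-loading elsewhere, can be sketched as follows. The loadings are copied from the rotated component matrix (Table 20); the function name is ours.

```python
def items_on_factor(loadings, factor=0, threshold=0.5):
    """Retain items loading >= threshold on the chosen factor and not
    loading >= threshold on any other factor."""
    keep = []
    for item, row in loadings.items():
        on_target = abs(row[factor]) >= threshold
        cross = any(abs(v) >= threshold
                    for i, v in enumerate(row) if i != factor)
        if on_target and not cross:
            keep.append(item)
    return keep

table20 = {
    "LEF1": (.972, .024), "LEF2": (.861, .100), "LEF3": (.764, .283),
    "LEF4": (.735, .220), "LEF5": (.868, .208), "LEF6": (.869, .251),
    "LEF7": (.775, -.499), "LEF8": (.914, -.300), "LEF9": (-.013, .660),
    "LEF10": (.164, .679), "LEF11": (.311, .802), "LEF12": (-.038, .695),
}
print(items_on_factor(table20))  # LEF1..LEF8 retained; LEF9..LEF12 dropped
```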
After dropping items LEF9 to LEF12, the factor analysis was run once again.
As we can see, the KMO index was more than 0.5 and Bartlett's test of sphericity was
significant at the p < 0.01 level.
All the eight items loaded on one component and had factor loadings more than 0.5.
Table 22: Component Matrixᵃ
Component
1
LEF1 .971
LEF2 .863
LEF3 .774
LEF4 .751
LEF5 .880
LEF6 .882
LEF7 .750
LEF8 .898
Extraction Method: Principal Component Analysis.
a. 1 component extracted.
The total variance explained by the eight items was 91.78% and the results are shown in Table
23 below:-
Table 23 : Total Variance Explained
Component Initial Eigenvalues Extraction Sums of Squared
Loadings
Total   % of Variance   Cumulative %   Total   % of Variance   Cumulative %
1 7.342 91.780 91.780 7.342 91.780 91.780
2 .322 4.031 95.811
3 .220 2.754 98.565
4 .079 .982 99.547
5 .024 .301 99.848
6 .010 .122 99.969
7 .002 .030 100.000
8   6.169E-006   7.711E-005   100.000
Extraction Method: Principal Component Analysis.
The Cronbach's alpha of 0.936 indicates good scale reliability.
3.9.1.3. Confirmatory Factor Analysis
Confirmatory Factor Analysis was carried out to assess the factorial and convergent validity
of the scale developed for Logistics Effectiveness. The measurement model for LEF is shown
in Figure 9 below:
Table 24:

χ2   df   χ2/df   p value   RMSEA   GFI   AGFI   NFI   IFI   CFI   TLI   SRMR
53.935   14   3.852   0.00   0.08   0.903   0.751   0.965   0.974   0.974   0.947   0.025
For the above model, p < 0.05, which raises questions about the model fit. Therefore, in order
to improve the fit, LEF1 and LEF8 were dropped from the measurement model. The
measurement model after dropping LEF1 and LEF8 is shown in Figure 10 below:
Figure 10: Measurement Model after dropping LEF1 & LEF8
After dropping LEF1 and LEF8, the fit measures of the model improved and are shown in
Table 25 below: -
Table 25:

χ2   df   χ2/df   p value   RMSEA   GFI   AGFI   NFI   IFI   CFI   TLI   SRMR
11.611   7   1.659   0.114   0.077   0.967   0.901   0.970   0.988   0.988   0.974   0.064
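As a consistency check, the two ratio-based fit statistics in Table 25 can be re-derived from the reported chi-square, degrees of freedom and sample size (N = 113):

```python
import math

def chi_sq_ratio(chi2, df):
    """Normed chi-square (chi-square divided by degrees of freedom)."""
    return chi2 / df

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation."""
    return math.sqrt(max((chi2 - df) / (df * (n - 1)), 0.0))

print(round(chi_sq_ratio(11.611, 7), 3))  # -> 1.659
print(round(rmsea(11.611, 7, 113), 3))    # -> 0.077
```

Both values match the table, which supports the internal consistency of the reported fit statistics.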
The factor loadings of the scale items and the error variances are shown below:
Table : 26
Scale Item Factor Loading Error Variance
(t value)
From the above results, the AVE, CR and Cronbach’s α for Logistic Effectiveness are
computed as shown below:-
| 83
Table : 27
Since all the factor loadings are above 0.5, the AVE is above 0.5 and the CR is above 0.6, the
scale developed for measuring LEF is adequate.
Convergent Validity
For the assessment of the convergent validity of a measure, three procedures are suggested by
Fornell and Larcker (1981), viz.:
The item reliability of a measure is assessed through its factor loading onto the underlying
construct.
The Average Variance Extracted (AVE) measures the amount of variance captured by the
construct relative to the amount of variance attributable to measurement error (Fornell &
Larcker, 1981).
Convergent validity is considered adequate when the average variance extracted is equal to or
higher than 0.50 (Segars, 1997).
These procedures have been followed in the present Research and duly reported for the latent
variable.
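The AVE and Composite Reliability (CR) computations follow directly from standardized factor loadings, per the Fornell and Larcker procedure described above. A sketch; the loadings below are illustrative, not the study's reported values.

```python
# AVE: mean of squared standardized loadings.
# CR: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances).

def ave(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    s = sum(loadings)
    e = sum(1 - l ** 2 for l in loadings)   # per-item error variances
    return s ** 2 / (s ** 2 + e)

lam = [0.88, 0.87, 0.77, 0.75, 0.75, 0.86]   # hypothetical loadings
print(ave(lam) >= 0.5, composite_reliability(lam) >= 0.6)  # -> True True
```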
Operationalising the measure of Competitive Capability (CC)
Three first order constructs from the extant literature were inferred for operationalizing the
variable of competitive capability viz. Cost Leadership, Customer Service and Organizational
flexibility as explained below: -
Cost Leadership is defined as an underlying cost structure low enough to allow an organization
to offer a price comparable to the competition, or to offer products higher in value than the
competition so that a premium price can be commanded (Tracey et al., 1999).
Table 28:

Literature Review | Insight from Literature Review | Scale Item
M. Tracey et al. (1999) | Competitive price depends on costs incurred and service rendered across the supply chain. | Ability to quote low prices.
Kim (2006, 2009); Oghazi (2009) | Reduction in production cost can enhance competitiveness. | Ability to produce at lower costs compared to competitors.
Skinner et al. (2008) | Reverse logistics recovers value in sold goods. | Taking back damaged goods/rejects by incurring costs.
Four items were developed to operationalize the variable cost leadership. The four items are
listed below: -
i. Product Price (CLE1): We are able to offer prices as low or lower than our competitors
(Tracey et al 1999)
ii. Premium Price (CLE2): We are able to command premium prices for our products
(Porter 1980)
iii. Production Cost (CLE3): Our cost of production is generally lower than our competitors
(Kim 2006, 2009 ; Oghazi 2009)
iv. Reverse Logistic Cost (CLE4): In case of damaged goods or rejects, we always take
them back by incurring cost (Skinner et al 2008)
Before carrying out Exploratory Factor Analysis (EFA), the Kaiser-Meyer-Olkin (KMO)
index was checked and was found to be 0.682, above the required value of 0.5. Bartlett's test
of sphericity was carried out and was found to be significant (p < 0.01).
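The KMO index reported before each EFA can be computed from the correlation matrix and its anti-image (partial) correlations. This sketch uses synthetic data, since the study's raw responses are not reproduced here.

```python
import numpy as np

def kmo_index(X):
    """Overall KMO: squared correlations relative to squared
    correlations plus squared anti-image (partial) correlations."""
    R = np.corrcoef(X, rowvar=False)
    inv_R = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv_R), np.diag(inv_R)))
    partial = -inv_R / d                        # partial correlations
    off = ~np.eye(R.shape[0], dtype=bool)       # off-diagonal entries
    r2 = (R[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

# Four synthetic items driven by one shared factor (illustrative only).
rng = np.random.default_rng(1)
f = rng.normal(size=500)
X = np.column_stack([f + 0.5 * rng.normal(size=500) for _ in range(4)])
print(round(kmo_index(X), 2))  # shared-factor items give a KMO well above 0.5
```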
Using the study responses, EFA was carried out. The factor loadings for all 4 cost leadership
items are shown in Table 30 below:
Table 30: Component Matrixᵃ

Component
1
CLE1 .960
CLE2 .859
CLE3 .941
CLE4 .065

Extraction Method: Principal Component Analysis.
a. 1 component extracted.
Since the factor loading of CLE4 is less than 0.5, it was dropped. The factor analysis results
after dropping CLE4 are shown in Table 31 below:
Table 31: Component Matrixᵃ

Component
1
CLE1 .928
CLE2 .820
CLE3 .939

Extraction Method: Principal Component Analysis.
a. 1 component extracted.
All the factor loadings are above 0.5 as required. The total variance explained is 80.505% and
the results are shown in Table 32 below:
Customer service is the provision of service to customers before, during and after a purchase.
It consists of a series of activities designed to enhance the customer's experience, i.e. to create
a feeling that the product or service has met their expectations.
Table 33:

Literature Review | Insight from Literature Review | Scale Item
Bottani and Rizzi (2006) | The more precisely a product fits a customer need, the more value the customer will assign to it. | Ability to modify/customize the product line in terms of pricing, mixing and packaging in order to serve customers.
Three items were developed to operationalize the variable customer service. The three items
are listed below:-
i. Quality control capability (CSE1): We are able to supply high quality products to our
customers (Kim 2006)
ii. Speedy & Reliable delivery (CSE2): We are able to “deliver our products quickly” to
our customers (Miller & Roth 1994, Vickery et al 2003)
iii. Customized Product Line (CSE3): We are able to modify/customize our product line in
terms of pricing, mixing and packaging in order to serve our customers (Christopher,
1992; Bottani and Rizzi, 2006).
Before carrying out Exploratory Factor Analysis (EFA), the Kaiser-Meyer-Olkin (KMO) index was checked and found to be 0.623, which is above the required value of 0.5. Bartlett's test of sphericity was carried out and found to be significant (p < 0.01).
Table 34: KMO and Bartlett's Test
Using the study responses, EFA was carried out. The factor loading values are shown below:
Table 35: Component Matrix (Customer Service items)

Item    Component 1
CSE1    .916
CSE2    .943
CSE3    .565

Extraction Method: Principal Component Analysis; one component extracted.
All the factor loadings are above 0.5 as required. The total variance explained is 68.262% and
the results are shown below:
3.9.2.3. Operationalising the Measure – Organizational Flexibility (OFL)
Organizational Flexibility is the ability to offer products or specific services that are perceived to be different from other products (Porter 1980).
Table 37:
Literature Review | Insight from Literature Review | Scale Item
i. Value added service (in order handling & packaging) (OFL1): We respond well to
changing customer preferences regarding ancillary services (Tracey et al 1999)
ii. Ability to deliver as per Customer requirements (OFL2): We are able to deliver goods
as per customized product delivery (Collins et al 2001, Tracey et al 2005)
iii. Deliver New Products (OFL3): We are able to develop new products and deliver them to our customers (Kim, 2006)
iv. Deliver broad range of products (OFL4): We are able to develop and deliver an
assortment of products to our customers (Christopher 1992, Tracey et al 2005).
Exploratory Factor Analysis
Before carrying out Exploratory Factor Analysis (EFA), the Kaiser-Meyer-Olkin (KMO) index was checked and found to be 0.728, which is above the required value of 0.5. Bartlett's test of sphericity was carried out and found to be significant (p < 0.01).
Table 39: Component Matrix (Organizational Flexibility items)

Item    Component 1
OFL1    .978
OFL2    .940
OFL3    .974
OFL4    -.260

Extraction Method: Principal Component Analysis; one component extracted.
Since OFL4 has a factor loading less than 0.5, it was dropped.
The factor analysis results after dropping OFL4 are shown below:
Table 41: Component Matrix (Organizational Flexibility items after dropping OFL4)

Item    Component 1
OFL1    .983
OFL2    .931
OFL3    .977

Extraction Method: Principal Component Analysis; one component extracted.
All the factor loadings are above 0.5 as required. The total variance explained is 92.928%.
Confirmatory Factor Analysis – Competitive Capability
Confirmatory Factor Analysis was carried out to assess the factorial and convergent validity of
the scale developed for Competitive Capabilities. The measurement model for CC is shown in Figure 11 below:
Model Fit
The model fit measures for the above measurement model are shown in Table 43 below:
Table 43:

χ²      df  χ²/df  p value  RMSEA  GFI    AGFI   NFI    IFI   CFI    TLI    SRMR
31.714  21  1.51   0.063    0.067  0.942  0.877  0.970  0.99  0.989  0.982  0.045
Factor Loadings
The factor loadings of the scale items of CLE and the error variances are shown in Table 44
below:
Table 44:
Scale Item | Factor Loading (t value) | Error Variance
CLE1 | 0.673 (6.611) | 0.543
The factor loadings of the scale items of CSE and the error variances are shown in Table 45 below:

Table 45:
Scale Item | Factor Loading (t value) | Error Variance

The factor loadings of the scale items of OFL and the error variances are shown in Table 46 below:

Table 46:
Scale Item | Factor Loading (t value) | Error Variance

The factor loadings of the second order construct CC and the error variances are shown in Table 47 below:

Table 47:
Scale Item | Factor Loading (t value) | Error Variance
From the above results, the AVE, CR and Cronbach's α for Competitive Capabilities are computed as shown in Table 48 below:

Table 48:
Construct | AVE | CR | Cronbach's α
Since all the factor loadings are above 0.5, with AVE > 0.5 and CR > 0.6, the scale developed for measuring CC shows good convergent validity.
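The convergent-validity statistics used here (AVE, CR) follow directly from the standardized loadings, and Cronbach's α follows from the item variances. A minimal sketch of the standard formulas is given below; the loadings in the example are hypothetical, not the thesis values.

```python
import numpy as np

def ave(loadings):
    """Average Variance Extracted: mean of the squared standardized loadings."""
    lam = np.asarray(loadings)
    return float(np.mean(lam ** 2))

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each error variance is 1 - loading^2."""
    lam = np.asarray(loadings)
    num = lam.sum() ** 2
    return float(num / (num + np.sum(1 - lam ** 2)))

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1 - item_var / total_var))

# Illustrative loadings for a three-item construct (hypothetical values)
lam = [0.83, 0.88, 0.86]
print(round(ave(lam), 3), round(composite_reliability(lam), 3))
```

Against these illustrative loadings, AVE comes out around 0.73 and CR around 0.89, both clearing the 0.5 and 0.6 thresholds used in the text.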
Three first order constructs from the extant literature for operationalising the variable of organizational performance, viz. financial performance, customer satisfaction and market performance, are explained below:
Table 49:
Literature Review | Insight from Literature Review | Scale Item
Gunasekaran et al (2004) | The size of orders received contributes to financial performance | Size of order received
Gunasekaran et al (2004) | Organizations are under constant pressure to maximize earnings by improving the productivity of capital deployed | Return on assets
Gunasekaran et al (2004) | Reduction in total logistics cost enhances financial position relative to main competitors | Better financial position
Gunasekaran et al (2004) | Designing an efficient and cost-effective distribution system in order to achieve proper trade-offs amongst cost and service parameters enhances financial performance | Total cost reduction
Gunasekaran et al (2004) | An enterprise's ability to meet its short-term debt requirements efficiently increases its financial performance | Cash to cash cycle time
Gunasekaran et al (2004); Stewart (1995) | An enterprise's ability to utilize capital to increase productivity and production | Working capital efficiency
Six items were developed to operationalize the variable Financial Performance. The six items are given below:
i. Size of orders received (FNP1): The size of orders received in the last 2 years has been better than that of our competitors (Gunasekaran et al 2004)
ii. Return on Asset (FNP2): Our productivity of capital has been better than our
competitors during the last 2 years, which has led to higher return on assets
(Gunasekaran 2004)
iii. Better Financial Position (FNP3): Our capability to reduce total costs has enhanced our
financial position (Flynn et al 2010)
iv. Total Cost Reduction (FNP4): The capability to manage our inventory effectively has
led to reduction in our total costs (Kim 2006, 2009; Oghazi 2009).
v. Cash to cash cycle time (FNP5): Our company has performed better than our competitors during the past 2 years in turning cash invested in raw material into cash collected from the customer (Farris II & Hutchison 2003).
vi. Working capital efficiency (FNP6): Our company has performed better than our competitors during the last 2 years by efficiently utilizing working capital (Gunasekaran et al 2004).
Using the study responses, EFA was carried out. The factor loading values are shown below:

Table: Factor loadings for EFA for the six Financial Performance items

Since the factor loadings for FNP1 and FNP6 were less than 0.5, they were dropped.
The factor analysis results after dropping FNP1 and FNP6 are shown in Table 52 below:

Table 52: KMO and Bartlett's Test
Bartlett's Test of Sphericity | df = 6 | Sig. = .000
Table 53: Component Matrix (Financial Performance items after dropping FNP1 and FNP6)

Item    Component 1
FNP2    .855
FNP3    .931
FNP4    .855
FNP5    .767

Extraction Method: Principal Component Analysis; one component extracted.
Table 54: Total Variance Explained

Component | Initial Eigenvalues (Total, % of Variance, Cumulative %) | Extraction Sums of Squared Loadings (Total, % of Variance, Cumulative %)
1 | 2.917, 72.931, 72.931 | 2.917, 72.931, 72.931
2 | .561, 14.030, 86.961 |
3 | .334, 8.338, 95.300 |
4 | .188, 4.700, 100.000 |

Extraction Method: Principal Component Analysis.
All the factor loadings are above 0.5 as required. The total variance explained is 72.93%.
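The "% of Variance" column above follows directly from the eigenvalues: with four items the total variance is 4, so each component's share is its eigenvalue divided by 4. A quick check using the eigenvalues reported in Table 54 (small differences in the third decimal place arise because the tabulated eigenvalues are themselves rounded):

```python
# Eigenvalues as reported in Table 54
eigenvalues = [2.917, 0.561, 0.334, 0.188]
total = sum(eigenvalues)                      # equals the number of items (4)
pct = [round(ev / total * 100, 3) for ev in eigenvalues]
cumulative = [round(sum(pct[:i + 1]), 3) for i in range(len(pct))]
print(pct)         # per-component % of variance, e.g. first entry ~72.93
print(cumulative)  # cumulative %, ending at 100.0
```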
Table 55:
Literature Review | Insight from Literature Review | Scale Item
Four items were developed to operationalize the variable customer satisfaction. The four items are given below:
i. After Sales Service (CSN1): Our customers appreciate our after sale service capability
leading to higher sales for the company (Kassinis and Soteriou 2003).
ii. Market Goodwill (CSN2): Our ability to customize products as per the customer’s
needs has enhanced our market goodwill and performance (Gunasekaran et al 2004)
iii. Value of order (CSN3): The value of orders received from our customers in the last 2 years is better than that of our competitors (Tracey et al 1999).
iv. Shipment size (CSN4): The size of our shipments to our customers in the last 2 years is better than that of our competitors (Tracey et al 1999).
Before carrying out Exploratory Factor Analysis (EFA), the Kaiser-Meyer-Olkin (KMO) index was checked and found to be 0.640, which is above the required value of 0.5. Bartlett's test of sphericity was carried out and found to be significant (p < 0.01).
Table 57: Component Matrix (Customer Satisfaction items)

Item    Component 1
CSN1    .694
CSN2    .809
CSN4    .701
CSN3    -.454

Extraction Method: Principal Component Analysis; one component extracted.
The factor loading of CSN3 is less than 0.5. Therefore, it was dropped.
Table 59: Component Matrix (Customer Satisfaction items after dropping CSN3)

Item    Component 1
CSN1    .848
CSN2    .876
CSN4    .860

Extraction Method: Principal Component Analysis; one component extracted.
All the factor loadings are above 0.5. The total variance explained is 74.22% and the
results are shown in Table 60 below.
Market Performance is seen as sales gains and market share growth (Venkatraman & Ramanujam 1986; Schramm-Klein and Morschett 2006).
Table 61:
Literature Review | Insight from Literature Review | Scale Items
Green, Kenneth et al. (2008) | Improving overall sales growth is an important function of the manager. | Overall Sales Growth
Gunasekaran (2004); Tracey et al (2005) | Need to keep customers informed on the progress of the orders placed by them. | Order/Shipment Information
Tracey et al (2005) | Offering an assortment of products helps in retaining customers. | Broadened Product Mix
Gunasekaran et al (2004) | Supplying products to customers on demand increases sales and market penetration. | Higher sales
Fugate, Mentzer et al (2010) | Management works towards enhancing market share by supplying to regular customers. | Market share growth
Five items were developed to operationalize the variable Market Performance.
i. Overall Sales Growth (MKP1): Our overall sales growth has been better than our main
competitors during the last 2 years (Green et al 2008)
ii. Demand driven Supply Network (MKP2): We are geared to meet the demands for
enhanced supplies to our customers (Gunasekaran et al 2004)
iv. Market Share Growth (MKP5): Our market share growth in our primary market has been better than our main competitor's during the last 2 years (Fugate, Mentzer et al 2010).
v. Broadened Product Mix (MKP6): Our overall market share has grown as we are able
to offer a broad range of products (Tracey et al 1999).
Using the study responses, EFA was carried out. The factor loading values are shown in Table 62 below:
Table 63: Rotated Component Matrix

Item    Component 1    Component 2
MKP3    .582           -.425
MKP6    .702           -.031
MKP1    .820           .083
MKP2    -.273          .664
MKP5    .234           .771

Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. Rotation converged in 3 iterations.
Since the factor loadings of MKP2 and MKP5 on the first component are less than 0.5, they were dropped. The factor analysis results after dropping MKP2 and MKP5 are shown in Table 64 below:
Table 65: Component Matrix (Market Performance items after dropping MKP2 and MKP5)

Item    Component 1
MKP1    .829
MKP3    .833
MKP6    .815

Extraction Method: Principal Component Analysis; one component extracted.
All the factor loadings are above 0.5. The total variance explained is 68.157% and the results
are shown in Table 66 below.
Confirmatory Factor Analysis – Organizational Performance
Confirmatory Factor Analysis was carried out to assess the factorial and convergent validity of the scale developed for Organizational Performance. The measurement model for OP is shown in Figure 12 below:
Fig 12: Measurement model for OP
Model Fit
The model fit measures for the above measurement model are shown in Table 67 below:
Table 67:

χ²     df  χ²/df  p value  RMSEA  GFI    AGFI  NFI    IFI    CFI    TLI    SRMR
33.77  23  1.468  0.069    0.065  0.941  0.86  0.964  0.988  0.988  0.977  0.04
Factor Loading
The factor loadings of the scale items of FNP and the error variances are shown in Table 68
below:
Table 68:
Scale Item | Factor Loading (t value) | Error Variance
FNP2 | 0.817 (12.561) | 0.330
The factor loadings of the scale items of CSN and the error variances are shown in Table 69
below:
Table 69:
Scale Item | Factor Loading (t value) | Error Variance
The factor loadings of the scale items of MKP and the error variances are shown in Table 70
below:
Table 70:
Scale Item | Factor Loading (t value) | Error Variance
The factor loadings of the second order construct OP and the error variances are shown in Table 71 below:

Table 71:
Scale Item | Factor Loading (t value) | Error Variance
FNP | 0.989 (7.410) | 0.015
CSN | 0.907 (7.352) | 0.099
MKP | 0.996 (8.398) | 0.004
From the above results, the AVE, CR and Cronbach’s α for Organizational Performance are
computed as shown in Table 72 below: -
Table 72:
Construct | AVE | CR | Cronbach's α
Since all the factor loadings are above 0.5, with AVE > 0.5 and CR > 0.6, the scale developed for measuring OP shows good convergent validity.
Discriminant Validity
In order to assess the discriminant validity between the constructs which are used in the
structural model of the study, the shared variance between LEF, OP and CC was estimated.
Discriminant validity allows us to assess whether the measures developed for the constructs allow these constructs to be empirically distinguished. If the shared variance between two constructs is less than the Average Variance Extracted of each construct, then the constructs are said to be empirically distinguishable from one another.
The table below shows the shared variance between the constructs used in the structural model
in the study:
Table 73: Shared variance between construct pairs
LEF CC 0.156
LEF CLE 0.194
LEF CSE 0.088
LEF OFL 0.206
LEF FNP 0.081
LEF CSN 0.013
LEF MKP 0.083
LEF OP 0.045
CLE CSE 0.324
CLE OFL 0.325
CLE FNP 0.236
CLE CSN 0.533
CLE MKP 0.082
CSE OFL 0.537
CSE FNP 0.430
CSE CSN 0.423
CSE MKP 0.518
OFL FNP 0.559
OFL CSN 0.483
OFL MKP 0.531
FNP CSN 0.352
FNP MKP 0.468
CSN MKP 0.270
CC OP 0.689
CC FNP 0.442
CC MKP 0.374
CC CSN 0.477
OP CLE 0.216
OP CSE 0.549
OP OFL 0.711
As pointed out earlier, discriminant validity occurs when the shared variance between two constructs in the model is less than the variance shared between a construct and its indicators (Fornell, Tellis & Zinkhan 1982). The assessment is carried out by comparing the
square root of the AVE for a construct with inter-construct correlations between the specific
construct and all other constructs. These have been reported in Table 74 below.
Table : 74
Construct CR AVE MSV ASV
Based on Table 74 above, the AVE for each construct is higher than the shared variance with other constructs, so the measures of each construct are distinguishable from the measures of the other constructs. Since the AVE of each construct is also higher than the corresponding MSV and ASV, discriminant validity has been established for all constructs.
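The Fornell-Larcker comparison described above amounts to checking, for every pair of constructs, that each construct's AVE exceeds the shared variance (squared inter-construct correlation) between them. A minimal sketch using the second-order values reported for LEF, CC and OP (AVEs from the diagonal of Table 75, shared variances from Table 73):

```python
# AVEs and pairwise shared variances for the second-order constructs
ave_values = {"LEF": 0.506, "CC": 0.70, "OP": 0.96}
shared_variance = {("LEF", "CC"): 0.156,
                   ("LEF", "OP"): 0.045,
                   ("CC", "OP"): 0.689}

def discriminant_valid(ave_values, shared_variance):
    """Fornell-Larcker criterion: the AVE of each construct in a pair
    must exceed the shared variance between the two constructs."""
    return all(ave_values[a] > sv and ave_values[b] > sv
               for (a, b), sv in shared_variance.items())

print(discriminant_valid(ave_values, shared_variance))
```

For these values the check passes: even the tightest comparison (CC-OP shared variance of 0.689 against CC's AVE of 0.70) satisfies the criterion.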
The results for discriminant validity for the second order constructs are shown in Table 75
below:
Table 75:

Construct | LEF   | CC    | OP
LEF       | 0.506 | 0.156 | 0.045
CC        | 0.395 | 0.70  | 0.689
OP        | 0.212 | 0.830 | 0.96
In the above Table 75, diagonal entries are average variances extracted. Entries above the
diagonal are shared variances. Entries below the diagonal represent square root of shared
variances.
Structural Equation Modeling via AMOS was employed for testing the hypotheses proposed in the study. Factor analysis was also carried out.
Goodness of fit statistics help us to identify whether the model is significant or not. The most commonly used fit index is χ². However, χ² is very sensitive to sample size; to reduce this sensitivity, it is divided by the degrees of freedom, i.e. χ²/df. Although there is no strict guideline for the value of χ²/df, values less than 5 are generally considered acceptable.
The Goodness of Fit Index (GFI) is a measure of the relative amount of variance and covariance explained, and a value close to 1 is considered a good fit (Byrne 2001). The Normed Fit Index (NFI), Comparative Fit Index (CFI) and Tucker-Lewis Index (TLI) are other indices commonly used in SEM. The value of NFI indicates the fit of the researcher's model relative to that of the null model. CFI is similar to NFI but unaffected by the sample size (Kline 1988). The values of NFI, CFI and TLI should be more than 0.90, preferably more than 0.95. The Standardized Root Mean Square Residual (SRMR) gives a standardized summary of the average covariance residuals.
SRMR equals zero when the fit of the model is perfect; however, a value less than 0.10 is acceptable. The Root Mean Square Error of Approximation (RMSEA) takes into account the error of approximation in the population, and a value less than 0.05 indicates good fit.
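The cut-offs discussed above can be collected into a simple screening function. The thresholds encoded below follow the text (χ²/df < 5; GFI, NFI, CFI, TLI > 0.90; SRMR < 0.10), except that RMSEA is screened against the looser 0.08 "acceptable fit" convention rather than the 0.05 "good fit" level; that substitution is an assumption of this sketch, and all of these values are conventions rather than hard rules.

```python
def check_fit(indices):
    """Screen SEM fit indices against the conventional cut-offs.

    `indices` maps index names to values; returns a dict of pass/fail flags
    for each recognized index.
    """
    thresholds = {
        "chi2_df": lambda v: v < 5,      # normed chi-square
        "GFI":     lambda v: v > 0.90,
        "NFI":     lambda v: v > 0.90,
        "CFI":     lambda v: v > 0.90,
        "TLI":     lambda v: v > 0.90,
        "SRMR":    lambda v: v < 0.10,
        "RMSEA":   lambda v: v < 0.08,   # < 0.05 good; < 0.08 acceptable
    }
    return {name: thresholds[name](value)
            for name, value in indices.items() if name in thresholds}

# Values reported for the CC measurement model (Table 43)
cc_fit = {"chi2_df": 1.51, "GFI": 0.942, "NFI": 0.970, "CFI": 0.989,
          "TLI": 0.982, "SRMR": 0.045, "RMSEA": 0.067}
result = check_fit(cc_fit)
print(result)
```

Applied to the CC model's reported indices, every screen passes under these thresholds, consistent with the model-fit conclusions drawn in the text.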
The representation of a latent variable through the scale items measuring it is known as the measurement model in SEM, while the representation of the relationships between the latent variables (or observed variables, as the case may be) constitutes the structural model.
SEM techniques seek to minimize the difference between the model-implied covariance matrix and the sample covariance matrix. In view of the above, in this research we use structural equation modeling to carry out multivariate regression analysis of the hypotheses so as to determine whether the empirical data fit the hypothesized model.
Confirmatory Factor Analysis (CFA) is used by researchers to assess and validate operational constructs (Gerbing and Anderson 1988). All items are allowed to correlate with each other, and those items which load weakly on the hypothesised factors are eliminated from the scale, thus resulting in a unidimensional scale.
The purpose of the analysis, as pointed out by Hinkin (1998), is twofold: first, to assess the goodness of fit of the measurement model, and second, to examine the fit of individual items within the specified model using the modification indices and t values.
Accordingly, the measurement model fit indices for Competitive Capabilities and Organisational Performance have been reported on pages 90 and 103 respectively in the draft thesis. The second part of Hinkin's (1998) formulation, that "the model can be further assessed by reporting the item 't' values and modification indices", has also been undertaken.
Thus, in line with Hinkin (1998), each model coefficient has been individually examined for degree of fit after the overall fit of the model was established. Accordingly, the 't' values, the factor loadings and the error variances have been reported at the sub-variable level, in line with the relevant literature in this area.
It has been stated that the sub-variable of customer service in competitive capability and
customer satisfaction in organizational performance can create collinearity.
The literature views customer service and customer satisfaction as distinct constructs. While collecting the data, every effort was made to clarify the concepts. A pilot study was also conducted in which the variables and operational definitions were validated.
3.10. Summary
The chapter has outlined procedures that were followed for scale development for all the
variables included in the study. The Exploratory Factor Analysis (EFA) and the Confirmatory
Factor Analysis (CFA) were carried out with excellent results for the validity of the scale. The
Cronbach’s alpha coefficient and Composite Reliability (CR) were used to examine the
reliabilities among the items (internal consistency) within each factor and were found to exceed
the widely recognized rule of thumb of 0.70 indicating adequate reliability. The convergent
validity of each measurement scale was evaluated and all construct indicators were found
statistically significant with factor loading greater than 0.50, which suggests the theoretical
constructs express convergent validity. Additionally, the average variance extracted (AVE) of each construct is close to or exceeds the recommended minimum value of 0.50, indicating strong convergent validity. Discriminant validity was tested with the approach suggested by
Fornell and Larcker (1981) comparing the Average Variance Extracted (AVE) of each
construct with the shared variances between each of the constructs.
All average variances extracted were in excess of the shared variances between constructs
demonstrating discriminant validity.
Data collected from 113 respondents representing 63 firms using the final questionnaire was
found to be adequate for carrying out multivariate analysis including Structural Equation
Modelling. The framework for the analysis of data has been described in this chapter, and the assumptions of such analysis have also been clarified.
****