Journal of Applied Research in Higher Education
Validating a scale for measuring teachers' expectations about generic competences in higher education: The Ecuadorian case
Rocio Serrano, Washington Macias, Katia Rodriguez, María Isabel Amor

Article information:
To cite this document:
Rocio Serrano, Washington Macias, Katia Rodriguez, María Isabel Amor, (2019) "Validating a scale for measuring teachers' expectations about generic competences in higher education: The Ecuadorian case", Journal of Applied Research in Higher Education, https://doi.org/10.1108/JARHE-09-2018-0192
Permanent link to this document: https://doi.org/10.1108/JARHE-09-2018-0192

Validating a scale for measuring teachers' expectations about generic competences in higher education: The Ecuadorian case

Received 12 September 2018
Revised 1 November 2018
Accepted 22 November 2018

Rocio Serrano
Department of Education, Universidad de Cordoba, Cordoba, Spain
Washington Macias and Katia Rodriguez
Faculty of Social Sciences and Humanities, Escuela Superior Politecnica del Litoral, Guayaquil, Ecuador, and
María Isabel Amor
Department of Education, Universidad de Cordoba, Cordoba, Spain

Abstract
Purpose – The purpose of this paper is to develop and validate a questionnaire to assess the expectations of
university teachers about the importance of generic competences in Higher Education Institutions of Ecuador
(E-DUC, acronym in Spanish), based on the competences typology from the Tuning Latin America Project.
Design/methodology/approach – A questionnaire with Likert scales was administered to 458 university
teachers from seven universities in Ecuador. Exploratory and confirmatory analyses were carried out to
validate the theoretical model.
Findings – After the validation process, four groups of generic competences were confirmed and the
measurement model showed high levels of reliability, as well as content and construct validity.
Research limitations/implications – Since the Tuning project has an international scope, the research could
be replicated in other Latin American countries for comparability purposes regarding teachers’ perceived
importance of generic competences in teaching activity. In addition, further research can relate teachers’
expectations with teaching performance and other constructs, based on a broad theoretical framework.
Practical implications – These technical characteristics allow the use of E-DUC as an instrument to
measure the expectations of teachers on the general competences that are worked on in higher education in
Ecuador. Data about these perceptions are useful for the design of teachers’ training programs, curriculum
reforms and other higher education policies.
Originality/value – This is the first study carried out in Ecuador and Latin America to validate a
scale for measuring the expectations of teachers about the importance of the generic competences proposed in
the Tuning Latin America Project.
Keywords Expectations, Validity, Latin America, Evaluation, Higher education,
Competence-based approach
Paper type Research paper

1. Introduction
Since its inception, the Bologna Process in higher education (Bologna Declaration, 1999) has
become a focus of special attention for researchers and institutions (Corbett, 2011; Chuo-Chun
and Huisman, 2017; Elken, 2017). According to Ravinet (2008), this process fostered great
pedagogical and organizational changes in higher education aimed at guaranteeing the quality
of institutions. At this juncture, new academic roles and functions are required from universities
and teachers (Bahia et al., 2017) which, as indicated by Vukasovic et al. (2015), are influenced by
the achievement of quality and excellence that ensure competitiveness on a global scale.
In this context of changes, a new model of teaching oriented to practice and a
competency-based learning approach emerges (Marcelo et al., 2014). This model replaces
educational systems based on teaching and learning objectives, which are considered
fragmentary within the teaching-learning process (Bergsmann et al., 2015). This new
approach, as indicated by López et al. (2016), involves the integration and mobilization of
different types of learning (knowledge, attitudes and skills), where teachers must improve
students’ cognitive, communicative and affective aspects (León and Latas, 2007; Frasquet
et al., 2012).
In the Latin American context, the reform of higher education has also become a demand (Organización
de Estados Iberoamericanos, 2016), given the urgent need to guarantee homogeneity among
its educational objectives by 2021 in terms of the design and organization of degrees.
The proposal of Tuning Latin America (Beneitone et al., 2007) focused its attention on the
need to increase the value of teaching competences as a process of modernization and
curricular reform. Currently, most of the universities participating in the Tuning Latin
America proposal are modifying their curricula and face the challenge of taking these
agreed competences as reference points to design their curriculums and the profiles of
undergraduate students (Beneitone et al., 2014). Consolidating a common space for Latin
American higher education (Montaño, 2013) means eliminating a strong fragmentation of
higher education characterized by privatization in institutional matters and the great
heterogeneity of quality levels (Fernández and Coppola, 2013). In short, a more coordinated
and interdisciplinary curricular development that allows a more comprehensive education is
needed, as well as the incorporation of active methodologies and tasks in which the
evaluation becomes a real educational practice (Méhaut and Winch, 2012).
The results of the survey of the International Association of Universities (Egron-Polak and
Hudson, 2010) showed that only 51 percent of institutions in Latin America assigned a high
level of importance to the process of internationalization of higher education (Didou-Aupetit,
2013). Malo (2005) and Maldonado-Maldonado (2012) pointed out how the differences between
HEIs in Latin American countries make the convergence process difficult. However, Taylor
(2010) argued that the educational model promoted by Bologna is perceived as an opportunity to
reform and internationalize the prevailing educational models in Latin America.
In its beginnings, the Tuning Latin America Project 2004–2007 (Beneitone et al., 2007)
focused on the collective and consensual definition of the set of generic and specific
competences that should guide the formulation, implementation and evaluation of existing
degrees in Latin America. Based on the competences established in the report Tuning
Educational Structures in Europe I and II (González and Wagenaar, 2003, 2006), several
priority competences were identified for this context, placing the emphasis on teacher
training in this region. As indicated by the results of the study developed by Lagoa-Varela
et al. (2018), 40 percent of teachers recognize having training needs (Qazi et al., 2014), despite
the changes that have taken place.
Studies on competence-based training in higher education in Europe are growing both in
terms of student training and in relation to teacher training (Hernández and Carrasco, 2012;
Colmenero et al., 2015; Romero et al., 2016). Recently, competency assessment has become a
recurring theme in Latin America (González, 2012; Tejada and Ruiz, 2016). Colombia is the
reference country in terms of a competency-based teaching approach. In this sense, its institutions
include competencies as a central element of the higher education curriculum and emphasize the
importance of ensuring coherence between them and the evaluation system (Núñez, 2016).
Results of PISA Latin America and the Caribbean (Bos et al., 2016) are very clear about
the need for a methodological update of teaching as a key to the renewal of education.
The results of evaluation tests, such as the Third Regional Comparative and Explanatory Study
(TERCE, UNESCO, 2016) conducted by the Latin American Laboratory for the Evaluation
of the Quality of Education (LLECE), report the achievements of students in the Latin
American region and the Caribbean. The subjects are language (reading and writing),
mathematics and natural sciences. The evaluation pointed out great educational challenges
and identified some key areas of intervention focused on the improvement of teaching skills
and abilities. In this regard, Schulz and Starnov (2010) mentioned that the evaluation
process of competency acquisition has focused mainly on the perceptions of students and
there is a shortage of studies from the teaching perspective.
Moreover, the studies by Instefjord and Munthe (2016), Tang et al. (2016) and Tynjälä et al. education
(2016) emphasize the need to address a classification and delimitation of the basic competences
for the correct exercise of the teaching profession. These scholars agree in highlighting digital
competence, social competence, academic competence, pedagogical competences and
organization and management competences. However, authors such as Keeley et al. (2012)
consider that, instead of referring to a repertoire of isolated competences, we must take into
account a set of teaching competencies in different specific and observable aspects of the
educational activity that are necessary for the development of the main teaching tasks.
Regarding the university curriculum, according to the Eurydice (2014) report, interpersonal
skills are identified as a priority. Oberst et al. (2009) define social competences as the ability
to establish interpersonal relationships and personal competences, such as role and teaching
profile (Pekrun et al., 2014; Muntaner et al., 2017). Conversely, in the European Tuning
Project I and II (González and Wagenaar, 2003, 2006), the following classification of general
competences is recommended for all university degrees:
(1) Instrumental competences (García, 2014): related to cognitive abilities, the ability to
learn and handle thoughts and ideas. Methodological skills to manage the
environment: organizing time, making decisions, solving problems and putting into
practice learning strategies. It also includes technological skills related to the use of
information and communication technologies as well as linguistic skills such as oral
and written communication and the acquisition of a foreign language.
(2) Interpersonal competences: these refer to interaction and social cooperation.
They include skills related to the ability to work in groups, express our own feelings
and have an ethical and social commitment (Bartram and Roe, 2005).
(3) Systemic competences: they require a prior acquisition of instrumental and
interpersonal skills and involve the skills and abilities related to autonomous
learning, creativity and adaptability to new contexts.
This classification of competencies is very much in line with the results obtained in the
research by Serrano et al. (2018), where the set of generic competences were grouped in four
dimensions: academic competences, social competences, interpersonal competences and
instrumental competences.
As Clemente-Ricolfe and Escribá-Pérez (2013) and Gómez et al. (2017) indicate, the
structural model followed for the classification of competencies depends on a specific
taxonomy and a particular nomenclature (Zabala and Arnau, 2008; González and López,
2010) mediated by the theoretical or ideological conceptions of each organization. Therefore,
it is not possible to affirm that there is a unique profile of competences (Sánchez, 2016).
Higher education in Latin America and, in particular, the training of university teaching
staff has been scarcely researched (Berry and Taylor, 2014). It is a fact that teachers create
concepts or beliefs about education and professional development that have a significant
influence on their teaching activity and on the development of professional competences
appropriate for the practice of the profession (Luft et al., 2003; Korthagen et al., 2006).
Therefore, as indicated by Pool et al. (2013), it is necessary to know the perception of teachers,
with the aim of adjusting their professional profile to one that allows them to respond
adequately to the new challenges and demands that are emerging in today's society. The
professional profile of the higher education teacher provokes a debate about the theoretical
and practical knowledge that these professionals must acquire and develop (Elken, 2017).
In addition, Segovia (2016) indicates that in Ecuador competency-based training is a
current and future challenge for education. Hence, in our study, we explore the expectations
of teachers regarding the importance of the set of generic competences defined by the
Tuning project for higher education institutions (HEI). The general objective of this research
is the psychometric validation of a scale of teachers’ expectations regarding the set of
general competences of higher education in Ecuador.

2. Materials and methodology


2.1 Participants
The sample consisted of 458 teachers (46.7 percent women and 53.3 percent men) from seven
institutions of higher education in Ecuador. The average age of the participants was
44.62 years (SD 10.0), with an average university teaching experience of 11.7 years (SD 8.2). Taking into
account the contributions of Morales (2012), the minimum number of participants per item
must be 10 to establish the factorial validity of the questionnaire; with 458 participants and 27 items (about 17 per item), this criterion is met. The distribution of
teachers by universities can be seen in Table I.
2.2 Instrument
To achieve the objectives proposed in this research, an ad hoc instrument called Scale of
University Teachers Expectations on Competencies (E-DUC, acronym in Spanish) was
designed to be included within a broader research project.
The questionnaire was developed taking into consideration the generic competences
proposed by the Tuning Latin America Project (Beneitone et al., 2014), which is considered a reference
in this context and, in turn, the model with the strongest evidence of validity
(Santiesteban, 1990; Martínez, 1996).
First, questions about personal and working information were included, such as age, sex,
university and department, years of teaching experience, employment status (tenure-track,
contract, invited), number of courses, level of the courses taught, type of instruction (online,
classroom, blended) and teaching training.
Then, E-DUC was introduced with the statement “Assess the degree of importance or
relevance that you assign to the following academic competences in your teaching
work,” followed by 27 Likert-type items with five response options (1 = not important at all;
5 = very important). The list of items includes the four groups of competences defined by
Serrano et al. (2018) (Table II).

2.3 Procedure and data analysis


The participating HEIs were selected through intentional sampling based on accessibility. In
the first phase of the process, the centers were contacted and invited to participate in the
study. Once the invitation was accepted, a project coordinator was appointed by the center.

Table I. Teachers' distribution by university

University                                                                             Frequency   Percentage
Escuela Superior Politécnica Agropecuaria de Manabí Manuel Félix López (ESPAM MFL)    35          7.6
Escuela Superior Politécnica del Litoral (ESPOL)                                       23          5.0
Universidad Laica “Eloy Alfaro” de Manabí (ULEAM)                                      140         30.6
Universidad Técnica de Manabí (UTM)                                                    186         40.6
Universidad Tecnológica ECOTEC                                                         15          3.3
Universidad Estatal del Sur de Manabí (UNESUM)                                         26          5.7
Universidad Casa Grande (Guayaquil)                                                    33          7.2
Total                                                                                  458         100.0

Table II. Rotated component matrix

F1: learning process
@1: capacity for abstraction, analysis and synthesis   0.663
@2: knowledge about the study area and the profession   0.621
@3: ability to identify, formulate and solve problems   0.696
@4: ability to learn and permanently update   0.676
@5: oral and written communication skills   0.590
@6: critical and self-critical skills   0.670
@7: research capacity   0.613
@8: skills to search, process and analyze   0.692
@9: capacity to apply knowledge in practice   0.563
@10: capacity to formulate and manage projects   0.565

F2: social values
@11: commitment to preserving the environment   0.718
@12: commitment to his/her sociocultural environment   0.792
@13: valuation and respect for diversity and multiculturalism   0.798
@14: social responsibility and commitment   0.745
@15: ethical commitment   0.666
@16: commitment to quality   (no loading above 0.50)

F3: interpersonal skills
@17: capacity to make decisions   0.705
@18: interpersonal skills   0.724
@19: capacity to motivate and lead toward common goals   0.696
@20: capacity for teamwork   0.711
@21: capacity to organize and plan time   0.736
@22: capacity to act in new situations   0.700
@23: capacity to work autonomously   0.644
@24: creative capacity   0.573

F4: context adaptability
@25: capacity to communicate in a second language   0.829
@26: ability to work in international contexts   0.815
@27: skills in using information and communication technology (ICT)   0.623

Note: Only factor loadings above 0.50 are reported

The questionnaire was distributed online, and each institutional coordinator
centralized the follow-up of the data collection.
For the validity of the questionnaire, we defined content validity as the degree to which a
scale adequately and completely represents the construct for whose measurement it was
designed (Thomas and Nelson, 2007). To reach optimal levels of content validity, the expert
technique was used. Accordingly, the experts were asked to assess different aspects of the initial
information, the measurement questionnaire and the items, and to give a global assessment of each of
them, taking into account the degree of understanding and the adequacy of the wording.
To verify the validity of comprehension, a pilot study was carried out in which, after
administering the questionnaire to 20 teachers, the degree of comprehension was analyzed
from a qualitative point of view, recording the questions, doubts and suggestions raised by the
participants in the face-to-face session.
Statistical methods applied in order to achieve this study’s purposes were exploratory
factor analysis (EFA) and confirmatory factor analysis (CFA). EFA is used to explore the
possible underlying factor structure of a set of observed variables (the 27 items for
academic competences). On the other hand, CFA is used to verify the hypothetical factorial
structure or measurement model. In this work, CFA was performed testing the
measurement model with structural equation models (SEM). Model fit was assessed with a
set of indicators. It is worth noting that, as sample size increases above 200, p-value
associated with the χ2 statistic has a tendency to reject the model (Hair et al., 2010).
Considering the large sample used in this study, additional measures were used to assess
model fit. CMIN/df is the χ2 standardized by its degrees of freedom. Low levels (but
above 1) imply a good fit, while values above 3 suggest improving the model; however,
some authors suggest a higher threshold of 5 (Hair et al., 2010). The goodness of fit index
(GFI) is similar to the R2 from a linear regression, ranging between 0 and 1 (perfect fit), and
AGFI is the GFI adjusted by its degrees of freedom, which rewards parsimonious models.
Recommended levels for these two measures are above 0.90 (Hair et al., 2010).
The comparative fit index should be greater than 0.95 as an indicator of a good fit (Blunch,
2008). Root mean square error of approximation (RMSEA) tries to correct the tendency of
χ2 of rejecting any model specified with a large enough sample (Hair et al., 2010). Values
between 0.05 (preferable) and 0.08 are considered acceptable (Hair et al., 2010), whereas
values above 0.10 mean the model should be rejected (Blunch, 2008). The PCLOSE is the
p-value of the test under the null hypothesis that the RMSEA is no greater than 0.05, and should
be greater than 0.05 to conclude that the model has a “close” fit. The software packages
used for these analyses were SPSS and AMOS.
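
The analyses described above were performed in SPSS and AMOS. Purely as an illustration (not the authors' script), the exploratory step can be reproduced with open-source tools: the sketch below uses Python's factor_analyzer package to run the KMO and Bartlett checks and a varimax-rotated four-factor extraction, assuming a hypothetical file educ_items.csv holding the 458 x 27 matrix of item responses.

    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

    # Hypothetical file with the 458 x 27 matrix of Likert responses (@1 ... @27)
    items = pd.read_csv("educ_items.csv")

    # Sampling adequacy (KMO) and Bartlett's sphericity test
    kmo_per_item, kmo_total = calculate_kmo(items)
    chi2, p_value = calculate_bartlett_sphericity(items)
    print(f"KMO = {kmo_total:.3f}; Bartlett chi2 = {chi2:.2f}, p = {p_value:.4f}")

    # Principal-component extraction of four factors with varimax rotation
    efa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
    efa.fit(items)

    # Report only loadings above 0.50, mirroring the rule used in Table II
    loadings = pd.DataFrame(efa.loadings_, index=items.columns,
                            columns=["F1", "F2", "F3", "F4"]).round(3)
    print(loadings.where(loadings.abs() > 0.50, ""))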


Constructs’ reliability, discriminant and convergent validity were also analyzed.
A composite reliability (CR) index was preferred instead of Cronbach’s α, since factor
loadings and error variances provide more useful information about construct reliability
(Sijtsma, 2009; Bagozzi and Yi, 2012). A CR above a threshold of 0.7 indicates the adequate
reliability of the measurement of each construct (Hair et al., 2010). Average variance
extracted (AVE) by any latent construct should exceed 0.50 for convergent validity and should
be greater than the average shared variance (ASV) and (more strictly) the maximum shared
variance (MSV) to reflect discriminant validity (Fornell and Larcker, 1981).
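
These reliability and validity indices are simple functions of the standardized loadings. As a minimal sketch (with made-up placeholder loadings, not estimates from this study), composite reliability and average variance extracted can be computed in Python following Fornell and Larcker (1981):

    from typing import Sequence

    def composite_reliability(loadings: Sequence[float]) -> float:
        """CR = (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances]."""
        total = sum(loadings)
        errors = sum(1.0 - l ** 2 for l in loadings)  # standardized error variance = 1 - loading^2
        return total ** 2 / (total ** 2 + errors)

    def average_variance_extracted(loadings: Sequence[float]) -> float:
        """AVE = mean of the squared standardized loadings."""
        return sum(l ** 2 for l in loadings) / len(loadings)

    # Placeholder loadings for a single latent construct (illustrative values only)
    example = [0.72, 0.75, 0.70, 0.68, 0.77]
    print(f"CR  = {composite_reliability(example):.3f}")       # should exceed 0.70
    print(f"AVE = {average_variance_extracted(example):.3f}")  # should exceed 0.50
    # Discriminant validity then requires AVE > MSV >= ASV for each construct.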

3. Results
3.1 Content validity and understanding of the instrument
As already indicated, the expert technique was used to guarantee the validity of the E-DUC
questionnaire (Cohen and Swerdlik, 2001). Following Landeta (2002) and García and
Fernández (2008), the group of experts consisted of five teachers from the education area
with an average of 10 years of experience in the field. For the selection of experts, the criteria
defined by French (2011) were taken into account: academic training, experience in the
subject and experience in the validation of scales.
In addition, evaluation criteria were established: relevance, pertinence, intensity, clarity
and completeness of the questions on the scale (Gable and Wolf, 1993). Through
the qualitative contribution of each of the experts in combination with the average
(quantitative) scores given to each item (values between 1 and 5), the 27 items that make up
the questionnaire did not undergo changes (all obtained values close to 5). The items that
underwent the most changes were those introduced in the initial section of the scale
referring to the participants' identification data (sex, level at which they teach, number of
courses, age, teaching experience, training, etc.).
Finally, the pilot questionnaire was applied to a sample of 78 teachers from the different
participating universities. No difficulties were detected in the piloting and all the data were
included as part of the final sample.

3.2 Reliability and validity measures


Exploratory factor analysis. The Kaiser–Meyer–Olkin measure showed a high level of sampling
adequacy (0.955) and Bartlett's test was significant (χ2 = 7,542.16; df = 351; p-value = 0.000),
so factor analysis was suitable. Four factors were extracted according to eigenvalues,
explaining 64.42 percent of total variance. The unrotated component matrix grouped most
of the items in one single factor, hindering its interpretation. Using a varimax rotation, the
new component matrix suggested four clearly interpretable factors (Table II). Item 16 did
not show an acceptable loading (above 0.5, according to Hair et al., 2010) on any factor.
Hereafter, the four factors are named learning process, social values, interpersonal skills
and context adaptability.
Confirmatory factor analysis. The assumption of multivariate normality when
performing SEM analysis was assessed through skewness, kurtosis and Mardia’s test.
All variables exhibited levels (either for skewness or kurtosis) significantly different from 0,
according to critical ratios (CR). Mardia’s coefficient showed severe multivariate kurtosis
(Mardia = 184.6; CR = 55.11; p < 0.01). A deviation from normality could inflate χ2 statistics
(Hair et al., 2010) and underestimate standard errors, so erroneous significant relations may
be found in a model. However, as Hair et al. (2010) explain, problems with non-normal data
can be minimized by increasing the ratio of respondents to parameters to near 15. With a
ratio of 12.4 in this study and a robust estimation technique such as maximum likelihood
estimation, the deviation from normality should not be a major concern.
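
Purely for illustration (this is not the authors' computation), the univariate skewness and kurtosis checks and Mardia's multivariate kurtosis with its critical ratio can be obtained as follows; the random responses below are placeholders standing in for the item-level data.

    import numpy as np
    from scipy import stats

    def mardia_kurtosis(X: np.ndarray) -> tuple[float, float]:
        """Mardia's multivariate kurtosis b2 and its critical ratio (normal approximation)."""
        n, p = X.shape
        centered = X - X.mean(axis=0)
        s_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))  # ML covariance estimate
        d2 = np.einsum("ij,jk,ik->i", centered, s_inv, centered)   # squared Mahalanobis distances
        b2 = float(np.mean(d2 ** 2))
        cr = (b2 - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)     # E[b2] = p(p + 2) under normality
        return b2, float(cr)

    rng = np.random.default_rng(0)
    X = rng.integers(1, 6, size=(458, 24)).astype(float)  # placeholder 5-point Likert responses

    print("univariate skewness (first 5 items):", np.round(stats.skew(X, axis=0)[:5], 2))
    print("univariate kurtosis (first 5 items):", np.round(stats.kurtosis(X, axis=0)[:5], 2))
    print("Mardia b2 and critical ratio:", mardia_kurtosis(X))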
A first CFA was run following the final factorial structure of EFA, contained in Table II,
without item 16. The results of the initial CFA showed low fit indexes (Table III).
According to the modification indices, covariances between the errors of some items belonging to the
same latent construct had to be included to improve model fit (items 7 and 8, 11 and 12,
20 and 21). Another issue was a lack of discriminant validity derived from a high
correlation between F1: learning process and F3: interpersonal skills. Results from EFA
showed that items 9 and 24 had the lowest loadings with factors F1 and F3, respectively.
Additionally, we found similar loadings (slightly below 0.5) with each of the other
factors (F3 and F1, respectively). When we removed items 9 and 24, the new measurement
model showed a good fit (Table III). Reliability and convergent validity of all
constructs were above their thresholds. Also, discriminant validity was adequate,
taking the ASV as a reference (Table IV). All factor loadings were greater than the
suggested level of 0.5 (Hair et al., 2010) (Figure 1). Taken together, these results suggest
that this instrument meets the different requirements to adequately measure expectations
about academic competences.

Table III. Fit assessment for measurement models

Measures                                              Threshold    Measurement model 1   Measurement model 2
χ2                                                    Low          823.06                544.31
df                                                    –            293                   243
χ2 probability (p-value)(a)                           ⩾ 0.05       0.000                 0.000
CMIN/df                                               ⩽ 3 or 5     2.809                 2.240
Goodness of fit index (GFI)                           ⩾ 0.90       0.862                 0.901
Adjusted goodness of fit index (AGFI)                 ⩾ 0.90       0.834                 0.878
Comparative fit index (CFI)                           ⩾ 0.95       0.924                 0.952
Root mean square error of approximation (RMSEA)      ⩽ 0.08       0.067                 0.055
RMSEA p-close                                         > 0.05       0.000                 0.088
Note: (a) Not appropriate for large sample sizes
Sources: Hair et al. (2010), Blunch (2008) and authors' own research

Table IV. Reliability and validity measures

Latent constructs      CR      AVE     ASV     MSV
Learning process       0.903   0.509   0.502   0.714
Interpersonal skills   0.923   0.632   0.498   0.714
Social values          0.895   0.634   0.414   0.545
Context adaptability   0.769   0.537   0.238   0.285
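
The confirmatory models were estimated in AMOS. As an illustrative alternative only (not the authors' script), the final four-factor specification, with items 9, 16 and 24 excluded and the three within-construct error covariances added, could be written in lavaan-style syntax and fitted with the open-source semopy package in Python; the column names and data file below are hypothetical.

    import pandas as pd
    import semopy

    data = pd.read_csv("educ_items.csv")  # hypothetical item-level data with columns Q1 ... Q27

    # The last three lines of the specification add the within-construct error
    # covariances suggested by the modification indices (items 7-8, 11-12 and 20-21).
    model_desc = """
    learning =~ Q1 + Q2 + Q3 + Q4 + Q5 + Q6 + Q7 + Q8 + Q10
    social =~ Q11 + Q12 + Q13 + Q14 + Q15
    interp =~ Q17 + Q18 + Q19 + Q20 + Q21 + Q22 + Q23
    context =~ Q25 + Q26 + Q27
    Q7 ~~ Q8
    Q11 ~~ Q12
    Q20 ~~ Q21
    """

    model = semopy.Model(model_desc)
    model.fit(data, obj="MLW")          # maximum likelihood (Wishart) estimation
    print(semopy.calc_stats(model).T)   # chi2, df, CFI, GFI, AGFI, RMSEA, among others
    print(model.inspect(std_est=True))  # standardized loadings and error covariances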

4. Discussion and conclusions


The results presented here show a satisfactory metric quality of the E-DUC. When
evaluated with confirmatory procedures, it showed an adequate fit of the
proposed model. Specifically, the results revealed a structure of four factors: learning
process, social values, interpersonal skills and context adaptability. These results are very
much in line with the taxonomies proposed by Pekrun et al. (2014), Instefjord and Munthe
(2016), Tang et al. (2016), Tynjälä et al. (2016) and Serrano et al. (2018). In addition, the factors
show adequate reliability in terms of internal consistency. These psychometric results of
factor structure and reliability complement content validity. Hence, the questionnaire has,
according to the empirical evidence analyzed to date, a good psychometric quality.
Moreover, in this questionnaire’s design, we have identified a set of priority competences
for the training of teachers in Ecuador. Therefore, the scale is reliable and the original
structural model is adjusted to the sample used.

Figure 1. Measurement model 2 (path diagram with standardized factor loadings)

Thus, we recommend the instrument to
ascertain the perception of the teaching staff about the set of generic competences defined for
HEI, not only in Ecuador, but also in other Latin American countries.
As noted by Taylor (2010), Fernández and Coppola (2013) and Méhaut and Winch
(2012), it is not about making a copy of what happened in Europe, but about benefiting
from the European experience to adapt what is relevant to the circumstances of Latin America
(Egron-Polak and Hudson, 2010; Maldonado-Maldonado, 2012; Didou-Aupetit, 2013).
The aim of Tuning Latin America is twofold: to address the need to modernize,
reformulate and make study programs more flexible in the face of new trends, societal needs and
changing realities of a globalized world; and to recognize the importance of transcending
the limits of learning by providing training that can be recognized beyond institutional
and international borders, on the basis of the European Tuning Project I and II (González
and Wagenaar, 2003, 2006).
In line with Marcelo et al. (2014) and Bergsmann et al. (2015), we think that it is
necessary to initiate reforms that move education systems based on learning objectives toward a
competence-based teaching model. We agree with Villa and Poblete's (2011) claim that
currently, people require the acquisition of academic, social and interpersonal
skills and abilities for their functioning and socio-labor integration. These competences
become a key element of university learning. In addition to mastering their academic
specialty, students develop a wide range of competences that enrich them as human
beings and future professionals (Bartram and Roe, 2005). Moreover, as suggested
by Lagoa-Varela et al. (2018) and Bos et al. (2016), teachers demand this change
in the form of more teacher training to improve unsatisfactory student outcomes
(UNESCO, 2016).
Faculty conceptions or beliefs (regarding what they think is important) influence their
decision making during the teaching task and, finally, influence the outcomes of the
teaching-learning process (Roelofs and Sanders, 2007). The collection of information with
the designed scale will allow researchers to analyze the possible impact of teachers' perceptions
of competences' relevance on other constructs, such as teacher training or teaching
performance. In this context, university teachers must develop skills that allow them to
assume new roles and build new and creative scenarios and their own learning itineraries.
The results can generate proposals to adapt higher education to the real needs and demands
of modern society. In addition, this information is essential for the quality assurance system
of the degrees offered by HEI. Finally, the opinions of our participants can contribute to the
continuous improvement of the teaching profession and the reformulation of new objectives
in future curriculum designs.

References
Bagozzi, R.P. and Yi, Y. (2012), “Specification, evaluation, and interpretation of structural equation
models”, Journal of the Academy of Marketing Science, Vol. 40 No. 1, pp. 8-34.
Bahia, S., Freire, I., Estrela, M.T., Amaral, A. and Espírito Santo, J.A. (2017), “The Bologna process and
the search for excellence: between rhetoric and reality, the emotional reactions of teachers”,
Teaching in Higher Education, Vol. 22 No. 4, pp. 467-482.
Bartram, D. and Roe, R. (2005), “Definition and assessment of competences in the context of the
European diploma in psychology”, European Psychologist, Vol. 10 No. 2, pp. 93-102.
Beneitone, P., González, J. and Wagenaar, R. (2014), Meta-perfiles y perfiles. Una nueva aproximación
para las titulaciones en América Latina, Universidad de Deusto, Bilbao.
Beneitone, P., Esquetini, C., González, J., Marty, M., Siufi, G. and Wagenaar, R. (2007), Reflexiones y
perspectivas de la Educación Superior en América Latina. Informe Final Proyecto
Tuning-America Latina 2004-2007, Universidad de Deusto, Bilbao.
Bergsmann, E., Schultes, M.T., Winter, P., Schober, B. and Spiel, C. (2015), “Evaluation of competence-
based teaching in higher education: from theory to practice”, Evaluation and Program Planning,
Vol. 52, October, pp. 1-9, available at: www.sciencedirect.com/science/article/abs/pii/S0
149718915000270?via%3Dihub
Berry, C. and Taylor, J. (2014), “Internationalisation in higher education in Latin America: policies and
practice in Colombia and Mexico”, Higher Education, Vol. 67 No. 5, pp. 585-601.
Blunch, N. (2008), Introduction to Structural Equation Modelling Using SPSS and AMOS, SAGE,
New Delhi.
Bologna Declaration (1999), “The Bologna Declaration of 19 June 1999: joint declaration of the
European ministers of education”, available at: www.eurashe.eu/library/bologna_1999_
bologna-declaration-pdf (accessed March 1, 2018).
Bos, M.S., Elías, A., Vegas, E. and Zoido, P. (2016), Latin America and the Caribbean in PISA 2015:
How Did the Region Perform?, Inter-American Development Bank, Washington, DC.
Chuo-Chun, H. and Huisman, J. (2017), “Higher education policy change in the European higher
education area: divergence of quality assurance systems in England and the Netherlands”,
Research Papers in Education, Vol. 32 No. 1, pp. 71-83.


Clemente-Ricolfe, J.S. and Escribá-Pérez, C. (2013), “Análisis de la percepción de las competencias
genéricas adquiridas en la universidad”, Revista de Educación, Vol. 362, September-December,
pp. 535-561, available at: http://dx.doi.org/10.4438/1988-592X-RE-2013-362-241
Cohen, R.J. and Swerdlik, M. (2001), Pruebas y evaluación psicológicas: Introducción a las pruebas y a la
medición, 4th ed., McGraw Hill, Mexico City.
Colmenero, M.J., Pantoja, A. and Pegalajar, M.C. (2015), “Percepciones del alumnado en formación
inicial del profesorado de Educación Secundaria”, Revista Complutense de Educación, Vol. 26
No. 1, pp. 101-120.
Corbett, A. (2011), “Ping pong: competing leadership for reform in EU higher education 1998-2006”,
European Journal of Education, Vol. 46 No. 1, pp. 36-53.
Didou-Aupetit, S. (2013), “Trends in student and academic mobility in Latin America: from ‘Brain
Drain’ to ‘Brain Gain’ ”, in Balan, J. (Ed.), Latin America’s New Knowledge Economy: Higher
Education, Government and International Collaboration, Institute of International Education,
New York, NY, pp. 71-81.
Egron-Polak, E. and Hudson, R. (Eds) (2010), “Internationalization of higher education: global trends,
regional perspectives. IAU 3rd global survey report”, International Association of Universities, Paris.
Elken, M. (2017), “Standardization of (higher) education in Europe - policy coordination 2.0?”, Policy and
Society, Vol. 36 No. 1, pp. 127-142.
Eurydice (2014), “Compulsory Education in Europe 2014/15. Eurydice facts and figures report”,
Education and Training, United Nations Publishing Office, Brussels, available at: http://eacea.ec.
europa.eu/education/eurydice/documents/key_data_series/166es.pdf
Fernández, N. and Coppola, N. (2013), “Desafíos para la construcción del Espacio Latinoamericano de
Educación Superior, en el marco de las Políticas Supranacionales”, Journal of Supranational
Policies of Education, Vol. 1 No. 1, pp. 67-82.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equation models with unobservable
variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Frasquet, M., Calderón, H. and Cervera, A. (2012), “University-industry collaboration from a
relationship marketing perspective: an empirical analysis in a Spanish university”, Higher
Education, Vol. 64 No. 1, pp. 85-98.
French, S. (2011), “Aggregating expert judgement”, Revista de la Real Academia de las Ciencias Exactas,
Físicas y Naturales, Vol. 105 No. 1, pp. 181-206.
Gable, R.K. and Wolf, M.B. (1993), Instrument Development in the Affective Domain: Measuring Attitudes
and Values in Corporate and School Settings, Kluwer Academic Publishers, Boston, MA.
García, L. and Fernández, S.J. (2008), “Procedimiento de aplicación del trabajo creativo en grupo de
expertos”, Ingeniería Energética, Vol. 29 No. 2, pp. 46-50.
García, M.P. (2014), “La evaluación de competencias en la Educación Superior mediante rúbricas:
un caso práctico”, Revista electrónica interuniversitaria de Formación del Profesorado, Vol. 17
No. 1, pp. 87-106.
Gómez, M., Aranda, E. and Santos, J. (2017), “A competency model for higher education: an assessment
based on placements”, Studies in Higher Education, Vol. 42 No. 12, pp. 2195-2215.
González, I. and López, A.B. (2010), “Sentando las bases para la construcción de un modelo de
evaluación a las competencias docentes del profesorado universitario”, Revista de Investigación
Educativa, Vol. 28 No. 2, pp. 403-423.
González, J. (2012), “La evaluación de la docencia en Iberoamérica”, Revista Iberoamericana de
Evaluación Educativa, Vol. 5 No. 1, pp. 339-348.
González, J. and Wagenaar, R. (2003), Tuning Educational Structures in Europe I. Informe Final,
Universidad de Deusto, Bilbao.
González, J. and Wagenaar, R. (2006), Tuning Educational Structures in Europe II. La contribución de las
universidades al Proceso de Bolonia, Universidad de Deusto, Bilbao.


Hair, J., Black, W., Babin, B. and Anderson, R. (2010), Multivariate Data Analysis, Pearson, New York, NY.
Hernández, M.J. and Carrasco, R. (2012), “Percepciones de los estudiantes del Máster de Formación del
Profesorado de Enseñanza Secundaria. Fortalezas y Debilidades del nuevo modelo formativo”,
Enseñanza and Teaching, Vol. 30 No. 2, pp. 127-152.
Instefjord, E. and Munthe, E. (2016), “Preparing pre-service teachers to integrate technology: an
analysis of the emphasis on digital competence in teacher education curricula”, European
Journal of Teacher Education, Vol. 39 No. 1, pp. 5-19.
Keeley, J., Christopher, A.N. and Buskist, W. (2012), Emerging Evidence for Excellent Teaching Across
Borders, SAGE Publications, Thousand Oaks, CA.
Korthagen, F., Loughran, J. and Russell, T. (2006), “Developing fundamental principles for teacher
education programs and practices”, Teaching and Teacher Education, Vol. 22 No. 8, pp. 1020-1041.
Lagoa-Varela, D., Alvarez-García, B. and Boedo, L. (2018), “Recent changes in the role of Spanish
lecturers in economics and business: an empirical analysis based on their own perspectives”,
Studies in Higher Education, Vol. 43 No. 8, pp. 1321-1333.
Landeta, J. (2002), El método Delphi: una técnica de previsión del futuro, Ariel, Barcelona.
León, B. and Latas, C. (2007), “La formación en técnicas de aprendizaje cooperativo del profesor
universitario en el contexto de la convergencia europea”, Revista Psicodidáctica, Vol. 12 No. 2,
pp. 269-277.
López, C., Benedito, V. and León, M. (2016), “El Enfoque en Competencias en la Formación Universitaria
y su impacto en la evalución”, Formación Universitaria, Vol. 9 No. 4, pp. 11-22.
Luft, J., Roehrig, G. and Patterson, N. (2003), “Contrasting landscapes: a comparison of the impact of
different induction programs on beginning secondary science teachers’ practices, beliefs and
experiences”, Journal of Research in Science Teaching, Vol. 40 No. 1, pp. 77-97.
Maldonado-Maldonado, A. (2012), “Latin American higher education hope in the struggle?”, in
Palfreyman, D. and Tapper, T. (Eds), Structuring Mass Higher Education: The Role of Elite
Institutions, Routledge, London, pp. 73-94.
Malo, S. (2005), “El Proceso de Bolonia y la educación superior en América Latina”, Foreign Affairs
Latinoamérica, Vol. 5 No. 2, pp. 21-33.
Marcelo, C., Yot, C., Mayor, C., Sánchez, M., Murillo, P., Rodríguez, J.M. and Pardo, A. (2014), “Las
actividades de aprendizaje en la enseñanza universitaria: ¿hacia un aprendizaje autónomo de los
alumnos?”, Revista de Educación, Vol. 363, enero-abril, pp. 334-359, available at: www.
revistaeducacion.educacion.es/doi/363_191.pdf
Martínez, M.R. (1996), Psicometría: teoría de los tests psicológicos y educativos, Síntesis, Madrid.
Méhaut, P. and Winch, C. (2012), “The European qualification framework: skills, competences or
knowledge”, European Educational Research Journal, Vol. 11 No. 3, pp. 369-381.
Montaño, A.M. (Ed.) (2013), Educación Superior en América Latina: reflexiones y perspectivas en
Educación, Universidad de Deusto, Bilbao.
Morales, V. (2012), Estadística aplicada a las Ciencias Sociales. Tamaño necesario de la muestra.
¿Cuántos sujetos necesitamos?, Universidad Pontificia de Comillas, Madrid.
Muntaner, A., Vidal, J., Sése, A. and Palau, P. (2017), “Teaching skills, students’ emotions, perceived
control and academic achievement in university students: a SEM approach”, Teaching and
Teacher Education, Vol. 67, October, pp. 1-8, available at: www.sciencedirect.com/science/article/
pii/S0742051X16304188
Núñez, J.A. (2016), “El modelo competencial y la competencia comunicativa en la educación superior en
América Latina”, Foro de Educación, Vol. 14 No. 20, pp. 467-488.
Oberst, U., Gallifa, J., Farriols, N. and Vilaregut, A. (2009), “Training emotional and social competences in
higher education: the seminar methodology”, Higher Education in Europe, Vol. 34 No. 4, pp. 523-553.
Organización de Estados Iberoamericanos (2016), Miradas sobre la Educación en Iberoamérica. Avances
sobre las Metas Educativas 2021, Instituto de evaluación (IESME), Madrid, available at: www.
oei.es/Educacion/Noticia/miradas-sobre-la-educacion-en-iberoamerica-2016
Pekrun, R., Cusack, A., Murayama, K., Elliot, A.J. and Thomas, K. (2014), “The power of anticipated
feedback: effects on students’ achievement goals and achievement emotions”, Learning and
Instruction, Vol. 29, February, pp. 115-124, available at: www.sciencedirect.com/science/article/
pii/S0959475213000637
Pool, J., Reitsma, G. and Mentz, E. (2013), “An evaluation of technology teacher training in South Africa:
shortcomings and recommendations”, International Journal of Technology and Design
Education, Vol. 23 No. 2, pp. 455-472.
Qazi, W., Ali, S. and Tehseen, S. (2014), “Higher education and growth performance of Pakistan:
evidence from multivariate framework”, Quality and Quantity, Vol. 48 No. 3, pp. 1651-1665.
Ravinet, P. (2008), “From voluntary participation to monitored coordination: why European countries
feel increasingly bound by their commitment to the Bologna process”, European Journal of
Education, Vol. 43 No. 3, pp. 353-367.
Roelofs, E. and Sanders, P. (2007), “Towards a framework for assessing teacher competence”, European
Journal of Vocational Training, Vol. 40 No. 1, pp. 123-139.
Romero, M.C., Gleason, M.A., Rubio, J.E. and Arriola, M.A. (2016), “Validación de un modelo de
competencias docentes en una universidad privada mexicana”, Revista digital de investigación en
docencia universitaria, Vol. 10 No. 1, pp. 1-15.
Sánchez, L. (2016), “Los marcos de competencias docentes. Contribución a su estudio desde la política
educativa europea”, Journal of Supranational Policies of Education, Vol. 5, pp. 44-67.
Santiesteban, C. (1990), Psicometría. Teoría y práctica en la construcción de tests, Ediciones Norma,
Madrid.
Schulz, M. and Starnov, C. (2010), “Informal workplace learning: an exploration of age differences in
learning competence”, Learning and Instruction, Vol. 20 No. 5, pp. 383-399.
Segovia, F. (2016), Aprendizaje por competencias: el reto actual y futuro del Ecuador, El Comercio, Quito.
Serrano, R., Amor, M.I., Guzman, A. and Guerrero, J. (2018), “Validation of an instrument to evaluate
the development of university teaching competences in Ecuador”, Journal of Hispanic Higher
Education, available at: https://doi.org/10.1177/1538192718765076.
Sijtsma, K. (2009), “Correcting fallacies in validity, reliability, and classification”, International Journal
of Testing, Vol. 9 No. 3, pp. 167-194.
Tang, S.Y., Wong, A.K. and Cheng, M.M. (2016), “Configuring the three-way relationship among
student teachers’ competence to work in schools, professional learning and teaching motivation
in initial teacher education”, Teaching and Teacher Education, Vol. 60, November, pp. 344-354,
available at: www.sciencedirect.com/science/article/pii/S0742051X16303250
Taylor, J. (2010), Globalisation and Internationalisation in Higher Education, Continuum, London.
Tejada, J. and Ruiz, C. (2016), “Evaluación de competencias profesionales en Educación Superior: Retos
e Implicaciones”, Educación XXI, Vol. 19 No. 1, pp. 17-38.
Thomas, J. and Nelson, J. (2007), Métodos de investigación en actividad física, Paidotribo, Barcelona.
Tynjälä, P., Virtanen, A., Klemola, U., Kostiainen, E. and Rasku-Puttonen, H. (2016), “Developing social
competence and other generic skills in teacher education: applying the model of integrative pedagogy”,
European Journal of Teacher Education, Vol. 39 No. 3, pp. 368-387.
UNESCO (2016), Informe de resultados TERCE. Tercer Estudio Regional Comparativo y Explicativo,
UNESCO, Santiago, available at: http://unesdoc.unesco.org/images/0024/002485/248526s.pdf
Villa, A. and Poblete, M. (2011), “Evaluación de competencias genéricas: principios, oportunidades y
limitaciones”, Revista Bordón, Vol. 63 No. 1, pp. 147-170.
Vukasovic, M., Jungblut, J. and Elken, E. (2015), “Still the main show in town? Assessing political
saliency of the bologna process across time and space”, Studies in Higher Education, Vol. 42
No. 8, pp. 1421-1436.
Zabala, A. and Arnau, L. (2008), 11 ideas clave. Cómo aprender y enseñar competencias, Graó, Barcelona.
Further reading
Fernandez-Sainz, A., García-Merino, J.D. and Urionabarrenetxea, S. (2016), “Has the Bologna process
been worthwhile? An analysis of the learning society-adapted outcome index through quantile
regression”, Studies in Higher Education, Vol. 41 No. 9, pp. 1579-1594.

Corresponding author
Washington Macias can be contacted at: wamacias@espol.edu.ec
