

Article
Human-Centered Design as an Approach to Create
Open Educational Resources
Carles Garcia-Lopez 1, * , Enric Mor 1 and Susanna Tesconi 1
Faculty of Computer Science, Multimedia and Telecommunications, Universitat Oberta de Catalunya,
08018 Barcelona, Spain; emor@uoc.edu (E.M.); stesconi@uoc.edu (S.T.)
* Correspondence: carlesgl@uoc.edu

Received: 1 August 2020; Accepted: 3 September 2020; Published: 9 September 2020

Abstract: Open educational resources (OER) play an important role in teaching and learning,
especially in lifelong learning. Educational resources should be created in a way that addresses
lifelong learners’ needs. Human-centered design (HCD) is a design perspective and an iterative
process that involves users in all phases of the process. Thus, an HCD approach can provide relevant
advantages when creating OER for lifelong learning. This work presents the Design Toolkit as a case
study of digital open educational contents for design education that has been created following an
HCD process. The orientation of the Design Toolkit is to provide users with OER in a tool format rather
than in a traditional manner. The main goal of this research is to contribute to the understanding
of how HCD impacts OER creation. The research focuses on teachers, assessing the Design Toolkit
content organization and analyzing teacher adoption and usage of the resources. The HCD approach
fosters teachers’ satisfaction, promotes OER adoption and provides new design requirements for a
future iteration of the HCD process. The results show that designing OER involving users through
an HCD approach sets the focus adequately on their needs and limitations. Teachers feel satisfied
with the Design Toolkit, fostering the adoption of OER in different educational contexts. Finally,
users’ involvement in the whole HCD process points out design and educational requirements for
future work.

Keywords: open educational resources; teaching resources; lifelong learning; design education; human-centered design

1. Introduction
User-centered design (UCD) is defined both as a process and as a philosophy. As a design
process, it is an approach to plan projects and a set of methods to be used in each phase. On the
other hand, as a design philosophy, it aims to involve users in all phases of the design process [1].
UCD is especially useful where design solutions involve interactive technology [2]. Human-centered
design (HCD) provides a broader mindset to UCD, emphasizing the humanistic approach and the
way in which technology is designed [3]. This design approach has a direct link with the human pillar
of sustainability, which focuses on the importance of involving users in the making of products or
services [4].
Sustainable development is an oriented and heuristic process [5], where education in all design disciplines plays an essential role in conveying human-centered values and skills to designers. In recent years, design has experienced a meaningful evolution that takes sustainable development into account. This evolution has caused a redefinition of the discipline and of the designer's role. This transformation has shifted the focus of design from products to ideas, people and experiences, and new design disciplines have arisen, such as design thinking, service design and open design [6].

Sustainability 2020, 12, 7397; doi:10.3390/su12187397 www.mdpi.com/journal/sustainability



In this context, design practitioners have become lifelong learners who need to be up to date with new design disciplines and their related processes, methods and principles. This new design context
highlights the need for easily accessible, up-to-date, educational design contents that address the new
challenges of the field. The main challenge is related to going beyond theory-based, one-dimensional
static learning resources and providing them in an action-oriented manner to both designers and
learners. In this scenario, new design contents with a “tool perspective” have been created mostly
by companies and practitioners [7]. These actionable contents are available through repositories that
follow a toolbox metaphor and are mainly aimed at practitioners. Therefore, there is an opportunity to
create open educational resources (OER) for both design practitioners and learners.
This work focuses on the Design Toolkit, an open educational repository that provides content
as tools. The platform allows learners to access and explore open educational design resources in
multiple ways (direct access, exploration) using a navigational system designed to empower teachers
and learners [8]. The Design Toolkit was designed and developed following an HCD approach through
an iterative process [9].
The general purpose of this research is to contribute to the understanding of how HCD impacts
OER design and development. Specifically, the work focuses on the Design Toolkit as a case study,
assessing content organization and analyzing teacher adoption and usage. The HCD approach fosters
teacher satisfaction in terms of user experience and promotes OER adoption.

2. Background

2.1. Lifelong Learning and OER


Lifelong learning is the process of learning that occurs during each person’s lifetime [10]. This way
of learning often implies that learners combine their educational and professional life. In this scenario,
distance learning programs enable students to balance and adapt the learning progress to their
availability and needs. Distance learning programs are usually delivered through technology, which
may enhance the learning process. A key attribute of lifelong learning is that learners take greater
responsibility for their own learning process [11,12]. It is commonly accepted that educational resources
play a key role in distance learning programs as components of knowledge assets [13–15]. Open
educational resources (OER) “are teaching, learning, and research resources that reside in the public
domain or have been released under an intellectual property license that permits their free use and
repurposing by others” [16].
OER are made possible by information and communication technologies [17], and they are released
under open intellectual property licenses that allow free use or repurposing [18]. Walker [15] states
that open educational resources need to be convenient, effective, affordable, sustainable and available
to every learner and teacher.
Research about OER focuses on both learners and teachers. Armellini and Nie [19] state that
teachers or course designers could make use of these OER through four types of practices: (1) “as-is”
as a planned enhancement during curriculum design, (2) “as-is” as a “just-in-time” resource during
course delivery, (3) adapted OER during curriculum design, and (4) adapted OER during course
delivery. Depending on how and when OER are used, these four practices can be grouped into two
scenarios: curriculum design, and course delivery. Additionally, Cox & Trotter present a pyramid with
six essential factors of OER teacher adoption [20]. These factors are: access, permission, awareness,
capacity, availability and volition.
Although the benefits of OER in education have been proved [21,22], the use of OER seems to be
low in higher education [23–25]. Nevertheless, it does not mean that the reuse of these resources is not
happening. As Glennie, Harley and Butcher [26] explain, it might be taking place “below the radar”,
and Wiley uses the term “dark reuse” to refer to unobserved behavior [27]. Thus, it is worth providing
evidence of the usage of these kinds of resources in higher education. From the designer’s perspective,

it is also crucial to indicate the process followed to design and develop those resources, seeing them as
a process rather than as a product [28].
Human-centered design processes and methods aim to create solutions by focusing on the needs,
wants and limitations of the users of the product [28]. Technology-enhanced learning (TEL) and
OER may take advantage of HCD when designing learner-centered educational environments and
contents [29,30].

2.2. OER in the Design Discipline


OER take a key role in learning processes in the design field. The design discipline is constantly
evolving, and new content is appearing, while previously existing content is changing. This fact
becomes even more notable in interaction design and human-computer interaction (HCI) disciplines,
where constant technological changes push design practitioners to keep knowledge updated. Although
some initiatives have flourished during the last few years to provide design-related content, most of
them are targeted at design practitioners. However, as models of design resources, it is worth doing an
in-depth analysis of them through a benchmark method. In Table 1 of [9], “Analysis of design toolkit’s
main features and contents”, we analyzed design content repositories focusing on the type of content
(actionable, theoretical, etc.) and the organization of content (labeling, navigation systems, explorability,
etc.). In this work, we enlarge the analysis by introducing the open content perspective, analyzing the licences used in each content repository. The results show that most of them are published under restrictive licenses that do not allow users to reuse those resources. The results are summarized in Table 1.

Table 1. Toolkits’ licenses analysis.

Toolkit                        Link                                        Licence
Data visualization catalog     https://datavizcatalogue.com/               Copyright
Design-led research toolkit    http://dlrtoolkit.com/                      Unspecified
DIY                            https://diytoolkit.org/                     CC BY-SA 4.0
D.P.D                          http://www.edu-design-principles.org        Unspecified
Dubberly Design Office         http://www.dubberly.com/                    Unspecified
IDEO's DesignKit               https://www.designkit.org/                  Unspecified
Hi Toolbox                     https://toolbox.hyperisland.com/            Copyright
Medialab Amsterdam             https://toolkits.dss.cloud/design/          Unspecified
Project of how                 https://projectofhow.com/                   Unspecified
Service Design Toolkit         https://www.servicedesigntoolkit.org/       CC BY-NC 4.0
Usability.gov                  https://www.usability.gov/                  CC0 (Public domain)

3. Design Toolkit
UOC’s Design Toolkit (design-toolkit.recursos.uoc.edu) is an open educational resource about
design. It was developed and is being used on design-related courses at the Open University of
Catalonia (UOC, www.uoc.edu). UOC is a fully online university based in Barcelona (Spain) with a
community of more than 60,000 students and 3000 teachers. The UOC educational model is based on
breaking space and time barriers through asynchronous distance education based on TEL. Teaching
and learning mainly take place in a virtual learning environment that integrates learning contents,
asynchronous communication, academic services and interaction with teachers and peers. Currently,
the Design Toolkit educational content is being used in fifteen courses across five design-related programs.
The design and development of the Design Toolkit followed an iterative UCD process [24] that involved users (learners and teachers) and in which a set of educational and design requirements was identified.
Educational requirements include (1) providing content as tools; (2) providing action-oriented resources; (3) allowing users to explore educational resources and (4) distributing contents with an open licence. On the other hand, design requirements are: (5) resources must be easy to access; (6) contents should be clear, organized and up to date; (7) contents must be shareable, manageable and reusable in different learning contexts; (8) design professionals (lifelong learners) need direct and action-oriented content; and (9) teachers and instructors need to be able to easily edit, organize and update content.
As a result, the Design Toolkit has four main characteristics: (1) its design followed a user-centered design process; (2) it provides design-related content in an actionable way; (3) it is focused on learner autonomy and lifelong learning; and (4) the educational resources are offered under an open license.
The contents of the Design Toolkit are organized into cards, which have a common structure, and each one presents content in an actionable way, providing step-by-step guidance and examples. Some cards also provide two levels of content depth through a guide. The guide offers in-depth guidance, allowing the adaptation of contents to different educational contexts and learners. Digital content navigation is strongly linked to user satisfaction since it allows users to understand where to find what they need. The navigation system of the Design Toolkit comprises six main categories, of which Design Methods is the most important and the one that contains the majority of contents. The Design Methods category provides a filtering system that allows learners to filter design methods based on the design phase, the type of method (quantitative or qualitative) and user involvement, as well as the difficulty, experience and duration needed. This navigation and filtering system (Figures 1 and 2) empowers both teachers and learners, allowing them to explore design methods in each learning scenario.

Figure 1. Design Toolkit’s filtering system.

Figure 2. User journey card.
This work mainly focuses on how the Design Toolkit open educational resource is used by teachers under a human-centered design perspective. The relationship between content organization and teachers’ needs is assessed, as well as diverse teacher adoption in educational settings.

4. Materials and Methods

In this research, we follow a human-centered design iterative process and aim to discover insights into content organization and teachers’ adoption and usage of the Design Toolkit project. In terms of the iterative process, this research constitutes a new iteration of the HCD process (Figure 3) based on the principles of ISO 9241-210 [29]. Previous work on the Design Toolkit can be found in [9].

Figure 3. ISO 9241-210:2010-Human-centred design for interactive systems.

From the educational point of view, the Design Toolkit was designed mainly for two types of users: learners and teachers. The first design iteration was primarily focused on understanding the needs, expectations and limitations of learners. Thus, learners were involved through research methods with the goal of collecting their user experience with the Design Toolkit. As shown in [9], the overall experience was satisfactory, and learners preferred to use the Design Toolkit rather than the traditional educational content. That research provided insights based on learners’ needs, wants and limitations, which were mostly related to the information architecture and content organization.
Due to the importance of the information architecture on a learning content platform, collecting more information was necessary from the teachers’ point of view in order to define design requirements. Thus, three actions were carried out:

• First, a workshop with information architecture experts to assess and improve the existing content organization and structure. Based on this, a card sorting was defined and carried out with teachers (see Section 4.1).
• Second, a set of focus groups with teachers to inquire about teachers’ adoption of the OER, the information architecture and how they use the Design Toolkit in teaching and learning processes (see Section 4.2).
• Third, two validated questionnaires were used to understand the user experience with the learning contents platform: the system usability scale (SUS) and the net promoter score (NPS) (see Section 4.3).

4.1. Card Sorting

Card sorting is a useful design method in which participants need to label and classify cards in groups according to their criteria [31,32]. This user-centered design method generates an overall classification of the information and provides insights into the user’s mental model, allowing researchers to know the way users group, sort and label the content of a website [32,33].

In this research, we performed an individual card sorting with teachers who know and have used the Design Toolkit for teaching. Regarding the number of participants, in the case of card sorting, Nielsen states that a 0.90 correlation of issue identification would be achieved by testing 15 users [34,35]. In our case, 15 teachers participated in the card sorting (n = 15), 60% of them male and 40% female. Their ages ranged from 32 to 63, with an average of 43.6. They had used the Toolkit from 2 to 6 semesters, with an average of 3.
A closed card sorting was carried out in which users were provided with cards already classified
into specific categories [32]. This type of card sorting is more suitable for cases in which researchers
aim to validate a current classification [32]. This predefined classification (Table 2) was based on the
results of the work done by information architecture experts at the previously mentioned workshop.
However, not only could participants arrange cards into categories, but they also could change the
name of each category, and add or delete categories.

Table 2. Card sorting predefined categories.

Category       Subcategory          Number of Cards
Methods        -                    36
Principles     Cognitives           7
Principles     Functionals          10
Principles     Perceptives          8
Models         -                    7
Interaction    -                    16
Ideas          -                    9
Tools          Media                8
Tools          Visual strategies    15

4.2. Focus Groups


A focus group is an effective qualitative method in which a broad range of opinions, feelings, attitudes and experiences arises [36–38]. Through this method, participants discuss a topic introduced by a moderator, who also drives the conversation so that important ideas emerge. We ran three different focus groups, as running two or more focus groups is thought to increase the chances of success [39]. Small groups of participants may yield too narrow a range of experiences [39], while large groups can make it difficult for participants to express their opinions. Thus, each group had
between 4 and 6 participants. To recruit participants, a survey was sent to 18 teachers from courses
where the toolkit had been used and a total of 15 participants were recruited (n = 15), 46.7% of them
male and 53.3% female. Their ages ranged from 30 to 55, with an average of 40.53. They had used
the Toolkit from 2 to 6 semesters with an average of 2.7. Once further information about those who
felt able to participate was collected through this survey, two groups of similar profiles were set up
to build a space for exchanging opinions and attitudes between similar participants [40,41], and a
convenient time was arranged to run three focus groups: one with course designers and two with
course instructors (Table 3).

Table 3. Focus groups’ participants.

FG     Num. Participants    Profiles                             Participants
FG1    5                    Course designers and instructors     P1.1, P1.2, P1.3, P1.4, P1.5
FG2    6                    Course instructors                   P2.1, P2.2, P2.3, P2.4, P2.5, P2.6
FG3    4                    Course instructors                   P3.1, P3.2, P3.3, P3.4

Each focus group was divided into two parts. In the first one, participants were asked to perform
a task to show when they started to use the Design Toolkit in their teaching process. To do so, they had
to point out the key timepoints on a predefined timeline (Figure 4). They were asked to explain, on top of the figure, where in this process the Design Toolkit was used.

Figure 4. Predefined timeline used in Focus Groups. From “course preparation” to “start of the course delivery”.

The second part was an open interview guided by a moderator and analyzed by one observer, who took notes of the experience. This stage lasted between 60 and 90 min and addressed specific questions and topics that needed to be addressed in each focus group, which are summarized in Table 4.

Table 4. Focus groups’ questions, topics and tasks.

Type         Question or Topic
Task         Timeline of the usage of the Design Toolkit
Questions    General satisfaction and user experience with the Design Toolkit
Questions    Adoption of the Design Toolkit to the course design
Questions    Adoption of the Design Toolkit to the course delivery
Questions    Open feature and external use of the Design Toolkit
Questions    Design features and opportunities to improve

4.3. User Experience Evaluation Methods

Usability and the perceived usefulness of a product or a service have an effect on the user experience and satisfaction [42], which has a prominent relevance in TEL in higher education [43]. SUS is a popular and validated method to evaluate the usability of web applications [44–46], even with a small sample group [47], and it is also used to quantify the user’s satisfaction in using e-learning platforms [48]. In addition, how likely users are to recommend a product might be an indicator of how satisfied they are with this product. NPS is also a validated questionnaire, introduced by Fred Reichheld in 2003 [49], that shows how likely participants are to recommend the company or product to a friend or a colleague and categorizes participants into three categories: detractors, passives and promoters.

5. Results and Discussion

5.1. User Experience and OER Adoption

One of the goals of this research was to understand teachers’ satisfaction with the Design Toolkit educational resources. This was carried out with the SUS and NPS questionnaires.
The SUS questionnaire allows participants to rate ten questions related to the user experience of a digital product using a 5-point Likert scale, from strongly disagree (1) to strongly agree (5). Each participant’s responses were scored with Equation (1) [45], and SUS results were considered positive when they were higher than 70. In the Design Toolkit’s case, the SUS scale had an average value of 75.58. Cronbach’s alpha was computed to examine the internal consistency of the results, and an α = 0.86 was obtained. It should be noted that values greater than α = 0.7 are considered reliable [50], and for SUS questionnaires (10 items) the threshold is set at α = 0.707.

SUS = 2.5 \times \sum_{n=1}^{5} \left[ \left( U_{2n-1} - 1 \right) + \left( 5 - U_{2n} \right) \right]    (1)
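As a concrete reading of Equation (1) and of the reliability check reported above, the following Python sketch scores SUS questionnaires and computes Cronbach's alpha over the scored item contributions. The answers, the function names and the choice of computing alpha on scored contributions are our own illustrative assumptions, not the authors' analysis scripts.

```python
from typing import List

def item_contributions(responses: List[int]) -> List[int]:
    """Per-item SUS contributions: (U - 1) for odd-numbered items, (5 - U) for even ones."""
    return [(u - 1) if i % 2 == 0 else (5 - u) for i, u in enumerate(responses)]

def sus_score(responses: List[int]) -> float:
    """Score one 10-item SUS questionnaire with Equation (1): 2.5 times the summed contributions."""
    return 2.5 * sum(item_contributions(responses))

def cronbach_alpha(matrix: List[List[int]]) -> float:
    """Cronbach's alpha for a participants-by-items matrix of scored contributions."""
    def variance(values: List[float]) -> float:
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    k = len(matrix[0])  # number of items (10 for SUS)
    item_vars = [variance([row[j] for row in matrix]) for j in range(k)]
    total_var = variance([sum(row) for row in matrix])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical answers (U1..U10, values 1-5) from three teachers; illustration only, not the study data.
answers = [
    [5, 1, 5, 2, 5, 1, 4, 1, 5, 1],
    [4, 2, 4, 2, 4, 1, 4, 2, 4, 2],
    [3, 3, 4, 3, 3, 3, 3, 3, 3, 2],
]
scores = [sus_score(a) for a in answers]
print(sum(scores) / len(scores))                                  # average SUS score
print(cronbach_alpha([item_contributions(a) for a in answers]))   # internal consistency
```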
For the NPS questionnaire, participants were asked to answer the question: “How likely are you to recommend [company or product name] to a friend or colleague?” using a 10-point Likert scale. The results were split into three categories: “detractors”, those who answered from 0 to 6, “passives”, from 7 to 8, and “promoters”, from 9 to 10 [51]. The results of the Design Toolkit’s questionnaire show that 30.77% of participants were “passives” and 69.70% were “promoters”, with scores of 9 or higher (Figure 5). Thus, the NPS result was calculated and scored 69.70, which is an excellent result, taking into account that results higher than 50 are considered excellent [52].
Figure 5. Responses to “How likely are you to recommend the Design Toolkit to a friend or colleague?”

Although the validity of NPS is sometimes questioned [53], it is widely used to assess satisfaction in education due to its simplicity [51]. Thus, it is worth taking the results into account to get an overall satisfaction indicator. However, it should be used together with other quantitative and qualitative methods.
Focus groups revealed more detailed and qualitative information about teacher satisfaction. When asked for an overall opinion on the Design Toolkit, the majority of the participants showed their high level of satisfaction with the platform and how they adopt it in their teaching activity.

“It is an agile tool, which helps you to structure the course, it is nice to use, and I really appreciate that it is an action-oriented tool [ . . . ] I also like the educational perspective, with which I agree, constructivism, which fosters the idea of creating knowledge through the interaction with the content . . . ”
participant P1.5

“I think it is a very useful tool, not only because the explanations that it includes of each technique, model and so on . . . but also because it encompasses all these things in a unique website . . . it has a lot of potential . . . ”
participant P3.2

As participant P1.5’s response shows, some of the feedback emphasizes the utility of the platform to design and prepare the course. In this regard, the result of the timeline task (Figure 4), in which participants had to indicate the key points in a timeline when they used the Design Toolkit before starting teaching the course, shows that they use the platform at the beginning of this process (Figure 6). It should be noted that while course designers start using it later, course instructors use the Design Toolkit as a tool to get an overview of the course (Figure 7).

Figure 6. Examples of some of the timelines drawn by teachers during the focus groups.

Figure 7. Examples of timelines drawn by teachers during the focus group.

This use of the platform fits in with one of the types of use that teachers can make of OER according to Armellini and Nie, in which teachers use it “as a planner enhancement during curriculum design” [19]. However, focus groups also reveal another type of use explained by Armellini and Nie once the teaching activity starts [19]. It is seen as “a resource during course delivery”, as participants P2.2 and P2.3 explained.

“[Marking a student’s activity], I highlight the error, and I provide them with a link to the Design Toolkit where it is explained [ . . . ] if I ever avoid referring to the toolkit in the activities’ feedback, sometimes students ask me for examples, and I send them the link to the toolkit.”
participant P2.3

“I usually use it [the Design Toolkit] to validate the marking of the activities [ . . . ] it is the resource that I use to check if students have done what they were asked to do”
participant P2.2

As presented in Section 3, both design and educational requirements aimed to provide open resources to the design community (practitioners and learners). In this regard, the majority of participants express satisfaction with the Design Toolkit being an open resource to the design community (participants P3.4 and P2.6).

“I like the fact that it is an open resource and that there is no private section.”
participant P3.4

“Publishing an open resource that has an academic background and a revision process, how this does [the Design Toolkit], has a great value to the community”
participant P2.6

In fact, due to the Design Toolkit being an open educational resource, several participants expressed that they already knew the Design Toolkit before being part of UOC’s faculty. Not only did they use the platform with an educational purpose but also in their professional jobs. Furthermore, some of them stated that the Design Toolkit is also a beneficial resource for the institution because it increases the reputation for reliability and excellence in the design community (participant P2.1).

“From my point of view, what you have (the Design Toolkit) is branded content; in addition, it is excellent branded content.”
participant P2.1

5.2. Information Architecture

Another primary goal of this research was to get a deeper understanding of the findings obtained in previous stages of the HCD process [9], where the need for an improvement of the information architecture was raised. Information architecture relates to four areas: an organization system, a labeling system, a navigational system, and a searching system [33]. The card sorting revealed valuable information about some of these areas.
The organization system relates to how content is categorized in the platform. The card sorting results shown in Figure 8 have been calculated with the unweighted pair grouping method with arithmetic-mean (UPGMA) technique, which is one of the most commonly used clustering algorithms [54]. The majority of participants sorted the cards into nine groups (red horizontal line near 1 on the y-axis in Figure 8), which was the proposal made by the experts in the workshop explained in Section 4. This means that there is no evidence of a problem with the content organization in the platform.

Figure 8. Cluster tree from card sorting result.
Results were also analyzed from the labeling perspective. Since dendrograms in card sorting take no notice of group labels, a qualitative analysis of the card sorting was also done. This analysis reveals no important issues regarding the labelling system, since only two participants proposed changing some group labels. Notably, one of them suggested changing two group labels, whereas the other one recommended only one.
Since the card sorting method does not reveal information about searching and navigation systems, questions related to them were discussed in the focus groups. Regarding the search system, it should be noted that the current Design Toolkit version does not provide a searching system. When participants were asked about what they think should be improved, some of them suggested incorporating a searching system in the platform.

“Regarding how to improve it [the Design Toolkit], I would add a searching system to find specific
resources directly.”
participant P1.1
“I would appreciate a search bar [ . . . ] when I need to find something, I have to use the shortcut command + F to find it.”
participant P3.1
Finally, and regarding the navigation system, the majority of focus group participants expressed
concern with the navigation system of the platform. Most of them referred to the difficulty of finding
content in the toolkit using the filtering system. During the focus group, the moderator showed a
screenshot of the current filtering system (Figure 1) to the participants and asked them for an opinion
on it.
Although there were no specific improvement proposals, there was a consensus in all of the focus groups about the need to improve the filtering system. Some of the participants were very critical
of the current filtering system (participant P3.4).
“The filters are not clear at all, and do not encourage users to explore the resources.”
participant P3.4
Since participants showed such strong concern about how to find the content quickly, the
moderator drove the conversation in this direction to obtain more detailed information. Even though
participants clarified that it was not a general issue in the platform, they also raised issues regarding
the access to the content. In this case, some of them showed the difficulty in finding the guides inside
cards (participant P2.1). Moreover, some of them proposed changes in the Design Toolkit interface to
make the guides’ buttons inside each card more visible.
“Related to the guides, this second level that some of the cards have, [ . . . ] it has sometimes been
difficult to find them, not least for me [ . . . ] it is sometimes easier to find them using a Google Search
than through the Design Toolkit navigation.”
participant P2.1
As a summary, the general satisfaction with the Design Toolkit in the teacher community is
positive and participants stated that they had a good user experience while exploring and using the
platform. Regarding the educational aim of the toolkit, all of the participants were satisfied with the
ease of use and the adaptability of the educational resources to their teaching activities. Additionally,
there was a consensus between all the teachers on the benefits of publishing educational resources
under open licenses. The Design Toolkit contributes to spreading design knowledge to the community,
and it also improves the reputation of the university.
The results of this research also contribute to the definition of new design requirements to improve
the Design Toolkit. These requirements are summarized in Table 5, and constitute the starting point for
a new iteration of the Design Toolkit HCD process.

Table 5. Design requirements.

Number    Area                    Requirement                                                                                                          Priority
1         Filtering system        The filtering system needs to be improved through a new visual design that facilitates filtering the content.        1
2         Searching system        A searching system needs to be implemented in the Design Toolkit.                                                    1
3         Visibility of guides    Guides need to be more visible inside each card. This needs to be addressed through a redesign of the interface.     1
4         Labeling                Consider adjusting some categories’ names.                                                                           3

6. Conclusions
In this paper, we presented the advantages of designing and creating open educational resources
as tools rather than in a traditional format through an iterative human-centered design process.
This action-oriented content enabled teachers to adapt educational resources for their teaching process,
both in the course design and in the course delivery phases [19]. Results reveal a high level of satisfaction with the Design Toolkit and its usability, as the SUS and NPS results show. This satisfaction
is linked to the ease of use of the OER in their teaching activity in different contexts. This concept is
aligned with the OER adoption factor presented by Cox and Trotter [20].
Open licensing the Design Toolkit and making it publicly accessible contributed to the design
community. The design field is mainly a professionalized area, where providing educational content
with an educational perspective may contribute to design practitioners’ lifelong learning. Additionally,
UOC, the higher education institution that promotes the Design Toolkit OER, takes advantage of it as
the repository is currently a well-known resource in the Spanish-speaking design community.
Additionally, we showed the advantages of creating OER through an iterative human-centered
design process where researchers, teachers and learners have worked collaboratively. This constitutes
an improvement on how OER can be created. Involving teachers in the design process facilitated the
discovery of areas to improve, especially those based on their experiences and use, that would be more
challenging to discover without their participation. Furthermore, the use of the card sorting method
was crucial to emphasize issues related to the information architecture. This brings to light that the
use of HCD methods in OER creation enables obtaining design insights and improving the design of
OER platforms.

Author Contributions: Conceptualization, C.G.-L., E.M. and S.T.; Data curation, C.G.-L., E.M. and S.T.; Formal
analysis, C.G.-L. and E.M.; Investigation, C.G.-L., E.M. and S.T.; Methodology, C.G.-L., E.M. and S.T.; Supervision,
E.M.; Visualization, C.G.-L. and S.T.; Writing–original draft, C.G.-L., E.M. and S.T.; Writing–review & editing,
C.G.-L., E.M. and S.T. All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Norman, D.A.; Draper, S.W. User Centered System Design: New Perspectives on Human-Computer Interaction;
CRC Press: Boca Raton, FL, USA, 1986.
2. Lowdermilk, T. User-Centered Design: A Developer’s Guide to Building User-Friendly Applications; O’Reilly
Media, Inc.: Sebastopol, CA, USA, 2013.
3. Gasson, S. Human-centered vs. user-centered approaches to information system design. J. Inf. Technol.
Theory Appl. 2003, 5, 5.
4. Benn, S.; Edwards, M.; Williams, T. Organizational Change for Corporate Sustainability; Routledge: Abingdon,
UK, 2014.
5. Blewitt, J. The Ecology of Learning: Sustainability, Lifelong Learning and Everyday Life; Earthscan Publications
Ltd.: London, UK, 2006.
6. Sanders, E.B.-N.; Stappers, P.J. Co-creation and the new landscapes of design. CoDesign 2008, 4, 5–18.
[CrossRef]
7. IDEO. The Field Guide to Human-Centered Design: Design Kit; IDEO: Palo Alto, CA, USA, 2015.
8. Kalantzis, M.; Cope, B. The Teacher as Designer: Pedagogy in the New Media Age. E Learn. Digit. Media
2010, 7, 200–222. [CrossRef]
9. Garcia-Lopez, C.; Tesconi, S.; Mor, E. Designing Design Resources: From Contents to Tools. In Human-
Computer Interaction. Perspectives on Design; Springer International Publishing: Orlando, FL, USA, 2019;
pp. 87–100.
10. Knapper, C.; Cropley, A.J. Lifelong Learning in Higher Education; Psychology Press: East Sussex, UK, 2000.
11. Pachler, N.; Daly, C.; Mor, Y.; Mellar, H. Formative e-assessment: Practitioner cases. Comput. Educ. 2010, 54,
715–721. [CrossRef]

12. Gikandi, J.W.; Morrow, D.; Davis, N.E. Online formative assessment in higher education: A review of the
literature. Comput. Educ. 2011, 57, 2333–2351. [CrossRef]
13. Carroll, J.M.; Rosson, M.B.; Dunlap, D.; Isenhour, P. Frameworks for Sharing Teaching Practices. J. Educ.
Technol. Soc. 2005, 8, 162–175.
14. Hsu, K.C.; Yang, F.-C.O. Toward an Open and Interoperable e-Learning Portal: OEPortal. J. Educ. Technol.
Soc. 2008, 11, 131–148.
15. Chen, I.Y.L.; Chen, N.-S.; Kinshuk. Examining the Factors Influencing Participants’ Knowledge Sharing
Behavior in Virtual Learning Communities. J. Educ. Technol. Soc. 2009, 12, 134–148.
16. Heath, W. Open Educational Resources: Breaking the Lockbox on Education; William and Flora Hewlett Foundation:
Menlo Park, CA, USA, 2013; Available online: https://hewlett.org/open-educational-resources-breaking-the-
lockbox-on-education/ (accessed on 10 July 2020).
17. United Nations Educational Scientific, and Cultural Organization (UNESCO). Forum on the Impact of Open
Courseware for Higher Education in Developing Countries; Final Report; UNESCO: Paris, France, 2002.
18. Atkinson, R.K.; Derry, S.J.; Renkl, A.; Wortham, D. Learning from Examples: Instructional Principles from
the Worked Examples Research. Rev. Educ. Res. 2000, 70, 181–214. [CrossRef]
19. Armellini, A.; Nie, M. Open educational practices for curriculum enhancement. Open Learn. 2013, 28, 7–20.
[CrossRef]
20. Cox, G.; Trotter, H. Factors Shaping Lecturers’ Adoption of OER at Three South African Universities; African
Minds, International Development Research Centre & Research on Open: Western Cape, South Africa, 2017.
21. Grewe, K.; Davis, W.P. The Impact of Enrollment in an OER Course on Student Learning Outcomes. Int. Rev.
Res. Open Distrib. Learn. 2017, 18. [CrossRef]
22. Delgado, H.; Delgado, M.; Hilton, J., III. On the efficacy of open educational resources: Parametric and
nonparametric analyses of a university calculus class. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 200.
23. Allen, I.E.; Seaman, J. Opening the Curriculum: Open Educational Resources in U.S. Higher Education; Babson
Survey Research Group: Babson Park, MA, USA, 2014; p. 52.
24. Schuwer, R.; Janssen, B. Open Educational Resources en MOOC’s in Het Nederlandse Hoger Onderwijs: Een onderzoek
Naar de Stand van Zaken Rond Productie en Hergebruik; Fontys Hogeschool ICT: Eindhoven, The Netherlands,
2016.
25. Beaven, T. ‘Dark reuse’: An empirical study of teachers’ OER engagement. Open Prax. 2018, 10, 377–391.
[CrossRef]
26. Glennie, J.; Harley, K.; Butcher, N. Introduction: Discourses in the development of OER practice and policy.
In Open Educational Resources and Change in Higher Education: Reflections from Practice; Commonwealth of
Learning: Vancouver, BC, Canada, 2012; pp. 1–12.
27. Wiley, D. Dark Matter, Dark Reuse, and the Irrational Zeal of a Believer. Available online: http://opencontent.
org/blog/archives/905 (accessed on 10 June 2009).
28. Friesen, N. Open educational resources: New possibilities for change and sustainability. Int. Rev. Res. Open
Distance Learn. 2009, 10, 1–13. [CrossRef]
29. ISO. Ergonomics of Human-System Interaction: Part 210: Human-Centred Design for Interactive Systems; ISO:
Geneva, Switzerland, 2010.
30. Ferran, N.; Guerrero-Roldán, A.-E.; Mor, E.; Minguillón, J. User Centered Design of a Learning Object
Repository. In Human Centered Design; Springer: Berlin/Heidelberg, Germany, 2009; pp. 679–688.
31. Sherwin, K. Card Sorting: Uncover Users’ Mental Models for Better Information Architecture. Available
online: https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Card+Sorting%3A+Uncover+Users%
E2%80%99+Mental+Models+for+Better+Information+Architecture&btnG= (accessed on 17 July 2020).
32. Spencer, D. Card Sorting: Designing Usable Categories; Rosenfeld Media: New York, NY, USA, 2009.
33. Morville, P.; Rosenfeld, L.; Arango, J. Information Architecture: For the Web and Beyond; O’Reilly Media:
Sebastopol, CA, USA, 2015.
34. Hepburn, P.; Lewis, K.M. What’s in a Name? Using Card Sorting to Evaluate Branding in an Academic
Library’s Web Site. Coll. Res. Libr. 2008, 69, 242–251. [CrossRef]
35. Nielsen, J. Card Sorting: How Many Users to Test. 2004. Available online: http://www.nngroup.com/articles/
card-sorting-how-many-users-to-test/ (accessed on 6 June 2015).
36. Lazar, J.; Feng, J.H.; Hochheiser, H. Research Methods in Human-Computer Interaction; Morgan Kaufmann:
Cambridge, MA, USA, 2017.

37. Hanington, B.; Martin, B. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop
Innovative Ideas, and Design Effective Solutions; Rockport Publishers: Beverly, MA, USA, 2012.
38. Plummer, P. Focus group methodology. Part 1: Design considerations. Int. J. Ther. Rehabil. 2017, 24, 297–301.
[CrossRef]
39. Krueger, R.A. Focus Groups: A Practical Guide for Applied Research; Sage Publications: Thousand Oaks, CA,
USA, 2014.
40. Asbury, J.-E. Overview of Focus Group Research. Qual. Health Res. 1995, 5, 414–420. [CrossRef]
41. Oates, B.J. Researching Information Systems and Computing; Sage: Thousand Oaks, CA, USA, 2005.
42. Calisir, F.; Calisir, F. The relation of interface usability characteristics, perceived usefulness, and perceived
ease of use to end-user satisfaction with enterprise resource planning (ERP) systems. Comput. Human Behav.
2004, 20, 505–515. [CrossRef]
43. Gomes, A. User Needs and User Centered Design of Teaching Support Environments for E-learning.
In Proceedings of the Society for Information Technology & Teacher Education International Conference,
Albuquerque, NM, USA, March 2003; pp. 3562–3565. Available online: https://www.learntechlib.org/primary/
p/17906/ (accessed on 3 September 2020).
44. Lewis, J.R. Measuring perceived usability: The CSUQ, SUS, and UMUX. Int. J. Human–Comput. Interact.
2018, 34, 1148–1156. [CrossRef]
45. Harrati, N.; Bouchrika, I.; Tari, A.; Ladjailia, A. Exploring user satisfaction for e-learning systems via
usage-based metrics and system usability scale analysis. Comput. Human Behav. 2016, 61, 463–471. [CrossRef]
46. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum. Comput. Interact. 2018, 34,
577–590. [CrossRef]
47. Kaewsaiha, P. Usability of the Learning Management System and Choices of Alternative. In The International
Conference on Education, Psychology, and Social Sciences (ICEPS); Tokyo University of Science: Tokyo, Japan,
2019; pp. 252–259. Available online: http://www.elic.ssru.ac.th/pongrapee_ka/pluginfile.php/18/mod_page/
content/13/Full%20Paper.pdf (accessed on 3 September 2020).
48. Brooke, J. SUS: A ‘quick and dirty’ usability scale. Usability Eval. Ind. 1996, 189, 4–7.
49. Reichheld, F.F. The one number you need to grow. Harv. Bus. Rev. 2003, 81, 46–55.
50. Sekaran, U.; Bougie, R. Research Methods for Business: A Skill Building Approach; John Wiley & Sons: Chichester,
West Sussex, UK, 2016.
51. Palmer, K.; Devers, C. An Evaluation of MOOC Success: Net Promoter Scores. In Proceedings of EdMedia +
Innovate Learning 2018; Association for the Advancement of Computing in Education (AACE): Amsterdam,
The Netherlands, 2018; pp. 1648–1653.
52. Yan, J. Net Promoter Score (NPS): What is a Good Net Promoter Score? 2017. Available online: https:
//www.questionpro.com/blog/nps-considered-good-net-promoter-score/ (accessed on 20 July 2020).
53. Keiningham, T.L.; Cooil, B.; Andreassen, T.W.; Aksoy, L. A Longitudinal Examination of Net Promoter and
Firm Revenue Growth. J. Mark. 2007, 71, 39–51. [CrossRef]
54. Gronau, I.; Moran, S. Optimal implementations of UPGMA and other common clustering algorithms. Inf.
Process. Lett. 2007, 104, 205–210. [CrossRef]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
