A R T I C L E I N F O

Keywords:
Design science
Design science research
Framework
Project studies
Delphi method
Project management

A B S T R A C T

This paper proposes a discussion of the evaluation of an artefact developed under the Design Science paradigm using the Delphi method. It evaluates the Ecosystem framework of University Research Centres in Project Studies, considering a set of criteria pre-established in the literature. The Delphi method is an evaluation implemented on an electronic platform involving twenty-one participants, among whom were academics, practitioners, and PhD candidates in the field of project management. It reached consensus and stability in two rounds: the results indicate a consensus among the participants on the applicability, novelty, simplicity, completeness, fidelity to modelled phenomena, consistency and internal coherence, scalability, flexibility, interest, elegance, and reusability criteria. Usability was the only criterion that did not attain the predefined percentage of consensus among the participants (70%). Given the framework's characteristics, Delphi participants indicated the need to produce complementary guidelines for its implementation.
* Corresponding author at: State University of Rio de Janeiro – UERJ, Rua São Francisco Xavier, 524, Maracanã, 20550–013 Rio de Janeiro, RJ, Brazil.
E-mail address: moutinho_pmp@yahoo.com.br (J.A. Moutinho).
https://doi.org/10.1016/j.evalprogplan.2023.102366
Received 5 October 2022; Received in revised form 10 July 2023; Accepted 28 August 2023
Available online 29 August 2023
0149-7189/© 2023 Elsevier Ltd. All rights reserved.
J.A. Moutinho et al. Evaluation and Program Planning 102 (2024) 102366
[Figure: The URC-PS Ecosystem framework, mapping partners (higher education institutions, research institutions, companies, governments, professional associations) and resources (staff, financial, facilities, knowledge assets) through collaborative activities (collaborative research, joint supervision of students, lifelong learning, professional and student mobility) to outputs (new theories and practices, technical and scientific publications, intellectual property, technological products and processes), outcomes (increased practical skills, knowledge and experience, practical application of research results, new research opportunities, commercialisation of research, increased employability, new ventures creation, network, curriculum update) and impacts (enhanced reputation, reinforced knowledge transfer, R&D roadmaps, increased knowledge breakthroughs), all situated in the Project Studies context (project design; level of analysis; environmental, organisational and individual factors).]
[Figure: Delphi questionnaire as implemented on the e-platform, listing the twelve quality attributes with their statements (applicability, novelty, simplicity, completeness, elegance, usability, fidelity to modelled phenomena, consistency and internal coherence, scalability, flexibility, interest, and reusability; see Table 3), each rated on a five-point Likert scale from "strongly disagree" to "strongly agree".]
Table 2
Participant characterization.

Roles: Academics 52%; Practitioners 24%; PhD Candidates 24%.
Home Organisation: Higher Education Institution 52%; Company 38%; Government 10%.
Years of Experience in Project Management: Over 15 years 52%; 5–15 years 43%; Under 5 years 5%.
Educational Level: PhD 71%; Master's degree 29%.

Table 3
Relevant quality attributes of the URC-PS framework.

Applicability: The URC-PS framework applies to knowledge co-creation in a collaborative context (Gill & Hevner, 2013).
Novelty: The URC-PS framework is a novel proposal for Project Studies and knowledge creation (Gill & Hevner, 2013).
Simplicity: The URC-PS framework comprises the necessary and sufficient number of elements and their relationships and is free of complex concepts (ISO/IEC/IEEE, 2010).
Completeness: The URC-PS framework includes all the essential elements and the relationships between the elements (Prat et al., 2015).
Elegance: The framework clearly represents the URC-PS (March & Smith, 1995).
Usability: The URC-PS framework is easy to use (Davis, 1989).
Fidelity to modelled phenomena: The URC-PS framework corresponds to the modelled reality (Prat et al., 2015).
Consistency and internal coherence: The URC-PS framework is uniform, standardised, and free of contradictions between its elements (ISO/IEC/IEEE, 2010).
Scalability: The URC-PS framework can support growing numbers of elements without losing the characteristics that add value to it (Bondi, 2000).
Flexibility: The URC-PS framework can be tailored or changed to meet the needs of specific contexts (Gill & Hevner, 2013).
Interest: The URC-PS framework is apt to be spread in the project studies field (Gill & Hevner, 2013).
Reusability: The URC-PS framework can lead to the creation of a new artefact (Prat et al., 2015).

homogeneous characteristics: indeed, a certain level of diversity in the demographic characteristics of the participants and in elements related to professional experience is beneficial (Förster & von der Gracht, 2014; Hussler et al., 2011; Powell, 2003). For participants who were practitioners, however, there was a requirement: a previous relationship with the PPPM.

Although the URC-PS framework has not yet been applied in the real world (for example, in the PPPM), its relevant quality attributes have been evaluated by PPPM academics, students, alumni, and practitioners. The evaluation took place in April 2022. Once the evaluation had begun, the participants first viewed an explanatory video of the URC-PS framework (9′30″ in duration), whereupon they evaluated the framework according to the twelve criteria defined above.

The method was designed to be carried out in no more than three rounds, stopping earlier once a consensus ≥ 70% was reached for each criterion, or once stability (variation ≤ 20%) was observed between two consecutive rounds together with the absence of new relevant contributions. In each round, participants indicated their level of agreement with the statement made for each criterion (Table 3) using a five-point Likert scale: "strongly disagree"; "disagree"; "neither disagree nor agree"; "agree"; and "strongly agree". Participants who chose one of the first three points on the scale were required to indicate the reasons for their disagreement; those who agreed with the statement were also allowed to include comments. Once the first round was over, the participants were presented with the statistical results and comments from the previous round; in the second round, they had the opportunity to maintain or revise their answers.

The reliability of responses among participants was verified using Cronbach's Alpha coefficient (Cronbach, 1951). The study was concerned with the heterogeneous composition of the sample of participants (including academics, students and practitioners) in relation to the concepts under evaluation, since any homogeneity could lead to a low
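The stopping rules described above (a maximum of three rounds, a 70% consensus threshold, and a 20% stability threshold between consecutive rounds) can be expressed compactly in code. This is a minimal sketch for illustration only; the function name is hypothetical and this is not the e-platform's actual implementation:

```python
# Sketch of the Delphi stopping rules described in the text (hypothetical
# helper, not the welphi platform's implementation): a criterion is settled
# when agreement reaches the consensus threshold, or when opinions are
# stable between two consecutive rounds; the study stops after three rounds.

CONSENSUS_THRESHOLD = 0.70   # degree of consensus >= 70%
STABILITY_THRESHOLD = 0.20   # variation <= 20% between consecutive rounds
MAX_ROUNDS = 3

def criterion_settled(agreement_now, agreement_previous=None):
    """Return True if a criterion needs no further rounds.

    agreement_now / agreement_previous: share of participants (0..1)
    choosing "agree" or "strongly agree" in the current / previous round.
    """
    if agreement_now >= CONSENSUS_THRESHOLD:
        return True                       # consensus reached
    if agreement_previous is not None:
        variation = abs(agreement_now - agreement_previous)
        if variation <= STABILITY_THRESHOLD:
            return True                   # stable: a new round adds little
    return False

print(criterion_settled(0.52))        # a single round at 52%: not settled
print(criterion_settled(0.52, 0.52))  # 52% in two consecutive rounds: settled
```

On this logic, usability in the study, at 52% agreement in both rounds, is closed by the stability rule rather than by consensus, which matches the researchers' decision not to run a third round.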
variance between the responses and, consequently, a low level of reliability of the results (Creswell, 2010; Hayes, 1995).

To evaluate the similarity between the academics, practitioners and students, the answers given to the twelve criteria, using the Likert scale, were organised in a contingency table to perform the χ² test (Rana & Singhal, 2015). The hypotheses were defined to test whether or not the participants come from the same population.

… relevant quality attributes compared to students and practitioners. This fact may occur due to the proximity of the academics to both the elements and macro-elements that make up the framework, since they come across them in their daily working lives.

4.2. First round analysis
Table 4
Frequency distribution of degrees of agreement: for each degree of agreement on the Likert scale, the absolute and relative frequencies per group (Academics, Students, Practitioners).
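The two checks described above can be sketched together: Cronbach's alpha over the participants-by-criteria rating matrix, and the Pearson χ² statistic over a contingency table of the kind shown in Table 4. The data below are made up for illustration and are not the study's responses:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(ratings):
    """Cronbach's alpha; ratings has one row per participant and one
    column per criterion (Likert scores 1-5)."""
    k = len(ratings[0])                              # number of criteria
    item_vars = sum(variance([row[j] for row in ratings]) for j in range(k))
    total_var = variance([sum(row) for row in ratings])
    return k / (k - 1) * (1 - item_vars / total_var)

def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table whose rows are
    degrees of agreement and whose columns are participant groups."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Made-up ratings: 4 participants x 12 criteria.
ratings = [
    [4, 5, 3, 4, 4, 2, 4, 4, 3, 4, 4, 5],
    [5, 5, 4, 4, 5, 3, 5, 4, 4, 5, 4, 5],
    [3, 4, 2, 3, 3, 2, 4, 3, 3, 3, 3, 4],
    [4, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 4],
]
print(round(cronbach_alpha(ratings), 2))  # 0.97 for this illustrative data

# Identical answer distributions across groups give a chi-square of zero,
# i.e. no evidence that the groups come from different populations.
print(chi_square_statistic([[2, 2, 2], [3, 3, 3], [5, 5, 5]]))  # 0.0
```

In practice, `scipy.stats.chi2_contingency` also returns the p-value needed to decide between the two hypotheses.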
detailing of the framework's elements was also highlighted as a restriction to its completeness, as was the scope of its solution, given the heterogeneity of the organisations involved.

• Elegance: In the opinion of 90% of the participants, the elegance criterion was achieved, since the framework represents the URC-PS Ecosystem clearly. In the opinion of the two participants who disagreed, the statement was couched in such a way that the excess of words and the lack of clarity of the processes and their flows clouded the issue.

• Fidelity to modelled phenomena: The criterion is present in the framework for 91% of respondents. For those who disagreed, functions such as changes in project scope, manager, and team were not represented.

• Internal consistency/coherence: 81% of the respondents agreed that the framework has internal consistency/coherence. In the opinion of those who disagreed, some points are presented generically and would need to be further explained, as would the interface between the macro-elements.

• Scalability: This criterion obtained 72% agreement. For the participants who disagreed, including new elements may make the framework complex and compromised, or even make its application unfeasible. Additionally, there is a group which neither agreed nor disagreed, either claiming a lack of grounds for a conclusive evaluation or because the framework has not been used in a real context.

• Flexibility: The criterion was accepted by 91% of the participants. The two participants who disagreed with the affirmation indicated the need to incorporate elements linked to change and configuration management.

• Interest: 77% of the participants agreed that the framework could engender interest among both academics and practitioners. The participants who disagreed stated that there needs to be a specific context with a minimum structure for implementation in order to arouse interest. Similarly, the use of frameworks representing research centres is not common practice, which may hinder people's interest. In any case, dissemination of the URC-PS framework depends on the form of communication and the predisposition of URCs to receive it.

• Reusability: The criterion indicating the framework's reusability potential was a consensus for 95% of the participants. Only one participant expressed doubt that the framework could create new artefacts.

• Simplicity: 67% of the participants felt the framework was simple. Those who neither agreed nor disagreed claimed that the macro-element Project Studies should be explored further, with particular reference to specific terminology addressing the research paradigm; that the purpose for which the framework was created (representing a collaborative environment) is naturally complex; that at some point the framework becomes complex given the number of elements; and that the number of relationships between elements and macro-elements is very large, which makes the framework complex.

• Usability: This criterion, according to the participants, had the lowest agreement level (52%), with 42% agreement and 10% total agreement. For 28% of the participants, the framework's use is still unclear; the number of variables and macro-elements may hinder its use; it may perhaps be only partially possible to apply it; and there is a need for additional explanatory material to guide its implementation.

Only two criteria, simplicity and usability, did not reach the minimum percentage (70%) of consensus during the first evaluation round among the twenty-one participants.

4.3. Second round analysis

As soon as the first round was completed, the welphi e-platform was customised for the second round. It initially took into account the results reached during the first round, notably the percentages of consensus and the participants' comments. It also considered the definition of the approval rules for the second round (degree of consensus ≥ 70% or variation ≤ 20% compared to the previous round's results). The second round focused on the two criteria that did not reach a consensus in the first round: simplicity and usability. The twenty-one participants received an email informing them of the start of the second round, with a deadline of seven days for its completion. During the second round, each participant had the opportunity to analyse the comments of the others and thus adhere to their degree of agreement or even change it.

The simplicity criterion attained 77% agreement. The four disagreeing participants argued that it is impossible to say that the framework is free of complex concepts, since some are too exhaustive and abstract; in addition, they could not indicate any missing elements. Even among those who agreed with the framework's simplicity, there is an indication that some items would require greater detailing due to the meaning they may assume in the framework's context. Some participants understood that the excess of elements might be particularly visually complex; however, considering the various macro-elements and the scope of elements, it would be difficult to produce a more straightforward framework. They stated that the concepts in the framework are quite intelligible even for people from other fields of expertise.

The usability criterion remained without consensus among the participants. Opinions remained divided: while 52% of the participants agreed that the framework is easy to use, 24% neither disagreed nor agreed, and another 24% disagreed. Those who agreed understood that the framework represents an environment whose core process leads to the impact of co-created knowledge; even so, they highlighted the need for an explanation of the framework's flow. Those who neither agreed nor disagreed pointed to the need to develop complementary material, such as a manual or even a guide for its implementation; for this group of participants, the way the framework should be used is still not completely clear. Those who disagreed claimed that, without the opportunity to use the framework in practice, they could not give their opinion with certainty. They also emphasised the need for a gradual implementation of the elements that make up the Governance and Management macro-elements.

Because participants reached a consensus on the simplicity criterion (77%) in the second round, and because of the stability of opinions on the usability criterion between the first and second rounds (52%), coupled with the lack of new relevant contributions, the researchers realised that a new round would not produce different results. This decision agrees with Gallego et al. (2008), given that after stability is achieved between subsequent rounds, new variations tend to be marginal and do not compensate for the mobilisation and efforts of participants.

5. Discussion and conclusions

The positive results of evaluating the URC-PS framework enhance our understanding of the societal impact of university research (Dotti & Walczyk, 2022). Due to their specific objectives, research conducted by URCs often leads to even more societal impacts (Noe & Alrøe, 2023).

The evaluation of the URC-PS framework is aligned with Tremblay et al. (2010) in its manifestation of evidence that the artefact is endowed with quality attributes that solve real problems. Moreover, as seen in the literature, the Delphi method can be used at different moments of evaluation in the context of artefact development in research using the DSR method, both ex-ante (Henriques et al., 2021; Smits & Van Hillegersberg, 2014) and ex-post (Coetzee, 2019; Ebel et al., 2022), as in this study.

In the present methodological paper, the Delphi method aimed at consensus among participants, as it does in most studies using this method (Diamond et al., 2014). However, some studies use variations of the method, such as Policy Delphi, where dissensus prevails in searching for a wide range of opinions (Steinert, 2009). This study previously
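The per-criterion agreement percentages reported in both rounds reduce to a simple aggregation over the Likert answers. A sketch with illustrative data (not the study's actual responses), where a participant counts as agreeing when choosing "agree" or "strongly agree" on the five-point scale:

```python
# Sketch of how a reported agreement percentage is derived from raw Likert
# answers (illustrative data only, not the study's responses).

LIKERT = ["strongly disagree", "disagree", "neither disagree nor agree",
          "agree", "strongly agree"]

def agreement_share(answers):
    """Share of participants choosing "agree" or "strongly agree";
    answers is a list of Likert labels, one per participant."""
    agreeing = sum(1 for a in answers if a in ("agree", "strongly agree"))
    return agreeing / len(answers)

# Hypothetical answers for one criterion from twenty-one participants:
answers = ["agree"] * 12 + ["strongly agree"] * 4 + \
          ["neither disagree nor agree"] * 3 + ["disagree"] * 2
share = agreement_share(answers)
print(f"{share:.0%}")  # 16/21 -> 76%, above the 70% consensus threshold
```

The same aggregation, run per criterion and per round, yields the figures against which the 70% consensus rule and the 20% stability rule are checked.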
restricted the number of rounds to a maximum of three, as Linstone and Turoff (2002) suggested, given that additional rounds tend to reveal little change and the number of participant dropouts could increase considerably. However, this study gives additional empirical evidence that the recommended number of rounds for a Delphi study is between two and three (Gallego et al., 2008).

The agreement objective in each of the evaluated criteria was defined, a priori, as 70%. This percentage is within the range indicated by the literature, which is between 50% and 97% (Diamond et al., 2014). It is important to emphasise that the alleged arbitrariness of the established percentage does not interfere with the research result. Unlike other studies that may exclude a criterion for not reaching the specified limit, this study merely does not recognise the criterion as consensual in that particular round, taking it on to a new round of evaluation. The convergence rate at the end of the first evaluation round of the URC-PS framework can be attributed to the fact that the relevance cycle was effectively incorporated from the early stages of the study. The generation of the URC-PS framework took into account feedback from several key stakeholders, specifically collected during a focus group on problem identification and motivation and another during the definition of objectives for a solution, while, later, in the artefact development step, the conduct of twenty-eight interviews with academics, practitioners and PhD candidates guaranteed appropriate levels of feedback.

Even though there is no consensus on how to give feedback to participants in the Delphi method, variations range between argumentative comments and statistical summaries (Rowe et al., 2005). In this study, both forms of feedback were used. While Rowe and Wright (1996) identified that statistical summaries elicit fewer changes in opinions among participants compared to argumentative comments, in a post-Delphi questionnaire study, Turnbull et al. (2018) identified that both argumentative comments and statistical summaries similarly influence changes in opinions.

Stability between rounds also needs to be analysed (Dajani et al., 1979). In this study, the variation between the first and second rounds was considered acceptable, since only 3 out of 21 participants (14%) changed their opinions, a percentage below the 20% recommended by Novakowski and Wellar (2008). If, on the one hand, the anonymity of the participants is a central feature of the method, thus reducing the effect of dominant individuals in the group, on the other hand, majority opinions also tend to influence changes of views (Makkonen et al., 2016; Meijering & Tobi, 2018), although such influence also depends on the composition of the group and the perceived importance of the participants (Turnbull et al., 2018).

Applying scientific methods in the evaluation step of artefacts developed under the Design Science paradigm is imperative to legitimising the process of producing an artefact using DSR (Venable et al., 2016). However, even though the literature has explored this crucial component, it still provides limited guidance on its implementation (Peffers et al., 2012). Thus, this study aimed to evaluate the URC-PS Ecosystem framework using the Delphi method in light of pre-defined criteria. In addition to the specific results achieved, the study brings methodological contributions and lessons learned, and provides opportunities for further research.

There are two main methodological contributions: the first to DSR and the second to the Delphi method. In the case of DSR, the study supports using an anonymous qualitative method (Delphi) to conduct artificial summative evaluation, in addition to already existing methods. The methodological rigour adhered to in the evaluation demonstrated the quality of the knowledge produced by Design Science. For the Delphi method, using an e-platform such as welphi broadens its application possibilities by transposing barriers such as displacement restrictions and the incompatibility of participants' schedules, thus increasing the method's efficiency.

5.1. Lessons learned

The URC-PS framework was evaluated by a set of relevant quality attributes based on the requirements of a context for its future implementation. As a lesson learned, this study demonstrates that stakeholder engagement during all phases of DSR, from problem identification to evaluating the generated artefact, is critical. Its contribution to practice is proven in terms of its relevance to the Postgraduate Programme in Project Management when the latter acknowledges the artefact's importance as a tool for managing the collaborative environment formed by academics and practitioners to co-create knowledge in Project Studies.

5.2. Limitations and suggestions for further research

The study presents limitations, such as the range of the Likert scale used, which may have influenced the change of opinion and how consensus was reached (Makkonen et al., 2016). The research also does not provide details as to the reasons that motivated participants to change their opinions, although comments made during the second round may give some indication (Turnbull et al., 2018).

With this study, the artefact (URC-PS Ecosystem framework) is considered evaluated (Venable et al., 2016), given that the application of the Delphi method was conducted as expected and achieved its objective (Brady, 2015). As a result, it is pertinent to consider further research, such as conducting a naturalistic ex-post evaluation, exploring the artefact in the environment of its practical use, and using other research strategies (leading to a possible methodological triangulation), which would further increase its external validity.

The suggestion to develop an implementation guide also deserves special attention, since the framework's implementation tends to involve different actors depending on the context in which the URC is inserted. Developing a user guide would make perfect sense, since the framework is composed of many elements and there are multiple relationships between them.

After the framework's implementation, a longitudinal study is suggested in order to analyse the results arising from the application of the artefact, as well as its impact on the individuals and organisations involved.

In the same vein, as the societal impact assessment of R&D collaborations between universities and industries is still an underexplored topic (Cohen et al., 2022), the evaluation and measurement of the societal impact of collaborative research conducted within the URC-PS context also warrant special attention.

Further investigations may also test the framework in URCs for fields of knowledge other than Project Studies, as well as in numerous other contexts, which will legitimise its generalisation to the "collaborative academic environments" class of problems.

CRediT authorship contribution statement

José da Assuncao Moutinho: Conceptualization, Methodology, Formal analysis, Investigation, Writing – original draft, Visualization. Gabriela Fernandes: Resources, Data curation, Writing – review & editing, Supervision, Funding acquisition. Roque Rabechini Junior: Validation, Resources, Data curation, Writing – review & editing, Supervision, Project administration, Funding acquisition.

Acknowledgments

This research is sponsored by national funds through FCT – Fundação para a Ciência e a Tecnologia, under the projects UIDB/00285/2020 and LA/P/0112/2020, and was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior – Brasil (CAPES).
References

Akoka, J., Comyn-Wattiau, I., Prat, N., & Storey, V. C. (2023). Knowledge contributions in design science research: Paths of knowledge types. Decision Support Systems, 166, Article 113898.

Albats, E., Fiegenbaum, I., & Cunningham, J. A. (2018). A micro level study of university industry collaborative lifecycle key performance indicators. Journal of Technology Transfer, 43(2), 389–431.

Baskerville, R., Baiyere, A., Gregor, S., Hevner, A., & Rossi, M. (2018). Design science research contributions: Finding a balance between artifact and theory. Journal of the Association for Information Systems, 19(5), 358–376.

Baskerville, R., Kaul, M., & Storey, V. C. (2015). Genres of inquiry in design-science research: Justification and evaluation of knowledge production. MIS Quarterly, 39(3), 541–564.

Berggren, C., & Söderlund, J. (2011). Management education for practicing managers: Combining academic rigor with personal change and organizational action. Journal of Management Education, 35(3), 377–405.

Bondi, A. B. (2000). Characteristics of scalability and their impact on performance. Proceedings of the Second International Workshop on Software and Performance (WOSP 2000) (pp. 195–203). Ottawa: ACM.

Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology, 64(2), 217–233.

Brady, S. R. (2015). Utilizing and adapting the Delphi method for use in qualitative research. International Journal of Qualitative Methods, 1–6.

Brunet, M. (2021). On the relevance of theory and practice in project studies. International Journal of Project Management.

Carton, G., & Mouricou, P. (2017). Is management research relevant? A systematic analysis of the rigor-relevance debate in top-tier journals (1994–2013). Management, 20(2), 166–203.

Clegg, S., Killen, C. P., Biesenthal, C., & Sankaran, S. (2018). Practices, projects and portfolios: Current research trends and new directions. International Journal of Project Management, 36(5), 762–772.

Coetzee, R. (2019). Towards designing an artefact evaluation strategy for human factors engineering: A lean implementation model case study. South African Journal of Industrial Engineering, 30(3), 289–303.

Cohen, M., Fernandes, G., & Godinho, P. (2022). Measuring the societal impacts of university-industry R&D collaborations. Procedia Computer Science, 2022.

Costa, C. A. B., Vieira, A. C. L., Nóbrega, M., Quintino, A., Oliveira, M. D., & Costa, J. B. (2019). Collaborative Value Modelling in corporate contexts with MACBETH. Procedia Computer Science, 162, 786–794.

Creaton, J., & Anderson, V. (2021). The impact of the professional doctorate on managers' professional practice. The International Journal of Management Education, 19(1), Article 100461.

Creswell, J. W. (2010). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). New Jersey: Pearson Merrill Prentice Hall.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334.

Dajani, J. S., Sincoff, M. Z., & Talley, W. K. (1979). Stability and agreement criteria for the termination of Delphi studies. Technological Forecasting and Social Change, 13(1), 83–90.

Dalkey, N., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9(3), 458–467.

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.

Delbecq, A. L., Van de Ven, A. H., & Gustafson, D. H. (1975). Group techniques for program planning. Glenview, IL: Scott, Foresman, and Co.

Denzin, N. K., & Lincoln, Y. S. (2018). The Sage handbook of qualitative research (5th ed.). London: Sage.

Diamond, I. R., Grant, R. C., Feldman, B. M., Pencharz, P. B., Ling, S. C., Moore, A. M., & Wales, P. W. (2014). Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies. Journal of Clinical Epidemiology, 67(4), 401–409.

Dotti, N. F., & Walczyk, J. (2022). What is the societal impact of university research? A policy-oriented review to map approaches, identify monitoring methods and success factors. Evaluation and Program Planning, 95.

Grisham, T. (2009). The Delphi technique: A method for testing complex and multifaceted topics. International Journal of Managing Projects in Business, 2(1), 112–130.

Hayes, B. E. (1995). Medindo a satisfação do cliente [Measuring customer satisfaction] (p. 228). Rio de Janeiro: Editora Qualitymark.

Henriques, D., Pereira, R., Bianchi, I. S., Almeida, R., & Silva, M. M. da (2021). How IT governance can assist IoT project implementation. International Journal of Information Systems and Project Management, 8(3), 25–45.

Hevner, A. R., March, S. T., & Park, J. (2004). Design science in information systems research. MIS Quarterly, 28(1), 75–105.

Hofmann, P., Jöhnk, J., Protschky, D., & Urbach, N. (2020). Developing purposeful AI use cases: A structured method and its application in project management. Proceedings of the 15th International Conference on Business Information Systems 2020 "Developments, Opportunities and Challenges of Digitization".

Hsu, C.-C., & Sandford, B. A. (2007). The Delphi technique: Making sense of consensus. Practical Assessment, Research, and Evaluation, 12, Article 10.

Hussler, C., Muller, P., & Rondé, P. (2011). Is diversity in Delphi panelist groups useful? Evidence from a French forecasting exercise on the future of nuclear energy. Technological Forecasting and Social Change, 78(9), 1642–1653.

ISO/IEC/IEEE. (2010). Systems and software engineering: Vocabulary. ISO/IEC/IEEE 24765:2010(E), December 2010, 1–418.

Jones, M. (2018). Contemporary trends in professional doctorates. Studies in Higher Education, 43(5), 814–825.

Kayo, E. K., & Securato, J. R. (1997). Método Delphi: fundamentos, críticas e vieses [The Delphi method: Foundations, criticisms and biases]. Cadernos de Pesquisa em Administração, 1, 51–61.

Kieser, A., Nicolai, A., & Seidl, D. (2015). The practical relevance of management research: Turning the debate on relevance into a rigorous scientific research program. Academy of Management Annals, 9(1), 143–233.

Konstantinou, E. (2015). Professionalism in project management: Redefining the role of the project practitioner. Project Management Journal, 46(2), 21–35.

Kuechler, B., & Vaishnavi, V. (2008). Theory development in design science research: Anatomy of a research project. In Proceedings of the Third International Conference on Design Science Research in Information Systems and Technology, Atlanta, Georgia.

Landeta, J. (2006). Current validity of the Delphi method in social sciences. Technological Forecasting & Social Change, 73(5), 467–482.

Larsen, K. R., Lukyanenko, R., Muller, R., Storey, V. C., Vander Meer, D., Parsons, J., & Hovorka, D. S. (2020). Validity in design science research. International Conference on Design Science Research in Information Systems and Technology (DESRIST 2020), 1–15.

Linstone, H. A., & Turoff, M. (2002). The Delphi method: Techniques and applications. Newark, NJ: New Jersey Institute of Technology.

Makkonen, M., Hujala, T., & Uusivuori, J. (2016). Policy experts' propensity to change their opinion along Delphi rounds. Technological Forecasting and Social Change, 109, 61–68.

Manson, N. J. (2006). Is operations research really research? ORiON, 22(2), 155–180.

March, S. T., & Smith, G. F. (1995). Design and natural science research on information technology. Decision Support Systems, 15(4), 251–266.

Meijering, J. V., & Tobi, H. (2018). The effects of feeding back experts' own initial ratings in Delphi studies: A randomized trial. International Journal of Forecasting, 34(2), 216–224.

Moutinho, J. A. (2022). Ecosystem of a University Research Centre in Project Studies [Doctoral dissertation]. Universidade Nove de Julho.

Moutinho, J. A., Fernandes, G., & Rabechini, R., Jr. (2023). Knowledge co-creation in project studies: The research context. Project Leadership and Society, 4, 100090.

Moutinho, J. A., & Rabechini, R., Jr. (2021). Centro de Pesquisa Universitária: Caracterização do Ambiente de Pesquisa [University research centre: Characterisation of the research environment]. Cadernos EBAPE.BR, 19(4), 887–900.

Moutinho, J. A., Rabechini, R., Jr., & Fernandes, G. (2023). Ecossistema de Centro de Pesquisa Colaborativa em Project Studies: Um Framework Conceitual [Ecosystem of a collaborative research centre in Project Studies: A conceptual framework]. Revista de Administração Mackenzie, 24(5), 1–31.

Nagle, T., Doyle, C., Alhassan, I. M., & Sammon, D. (2022). The research method we need or deserve? A literature review of the design science research landscape. Communications of the Association for Information Systems, 50.

Narazaki, R. S., Chaves, M. S., & Pedron, C. D. (2020). A project knowledge management framework grounded in design science research. Knowledge and Process Management, January, 1–14.

Noe, E. B., & Alrøe, H. F. (2023). University Research Centres, scientific freedom, and the Jester's paradox. Systemic Practice and Action Research. https://doi.org/10.1007/s11213-023-09655-
x
Ebel, M., Jaspert, D., & Poeppelbuss, J. (2022). Smart already at design time – Pattern-
Nonaka, I. (1991). The knowledge-creating company. Harvard Business Review, 69,
based smart service innovation in manufacturing. Computers in Industry, 138, Article
96–104.
103625.
Nonaka, I., & Toyama, R. (2007). Why do firms differ? The theory of the knowledge
Fini, R., Rasmussen, E., Siegel, D., & Wiklund, J. (2018). Rethinking the
creating firm. In K. Ichijo, & I. Nonaka (Eds.), Knowledge creation and management.
commercialization of public science: From entrepreneurial outcomes to societal
New challenges for managers (pp. 13–31). Oxford: Oxford University Press.
impacts. Academy of Management Perspectives, 32(1), 4–20.
Novakowski, N., & Wellar, B. (2008). Using the Delphi technique in normative planning
Förster, B., & von der Gracht, H. (2014). Assessing Delphi panel composition for strategic
research: methodological design considerations. Environment and Planning A, 40(6),
foresight - A comparison of panels based on company-internal and external
1485–1500.
participants. Technological Forecasting and Social Change, 84, 215–229.
Osborne, J., Collins, S., Ratcliffe, M., Millar, R., & Duschl, R. (2003). What "Ideas-about-
Galan-Muros, V., & Davey, T. (2019). The UBC ecosystem: putting together a
Science" should be taught in school science? A Delphi study of the expert
comprehensive framework for university-business cooperation. The Journal of
community. Journal of Research in Science Teaching, 40(7), 692–720.
Technology Transfer, 44(4), 1311–1346.
Pallant, J. (2001). SPSS survival manual - a step by step guide to data analysis using SPSS for
Gallego, M. D., Luna, P., & Bueno, S. (2008). Designing a forecasting analysis to
windows (version 10). Buckingham Open University Press,.
understand the diffusion of open-soURCe software in the year 2010. Technological
Peffers, K., Rothenberger, M., Tuunanen, T., & Vaezi, R. (2012). Design science research
Forecasting and Social Change, 75(5), 672–686.
evaluation. Lecture Notes in Computer Science (including Subseries Lecture Notes in
Gill, T. G., & Hevner, A. R. (2013). A fitness-utility model for design science research.
Artificial Intelligence and Lecture Notes in Bioinformatics), 398–410.
ACM Trans Management Information System, 4(2), 1–24.
9
J.A. Moutinho et al. Evaluation and Program Planning 102 (2024) 102366
Peffers, K., Tuunanen, T., Rothenberger, M. A., & Chatterjee, S. (2007). A design science Venable, J., Pries-Heje, J., & Baskerville, R. A. (2012). Comprehensive framework for
research methodology for information systems research. Journal of Management evaluation in design science research. In K. Peffers, M. Rothenberger, & B. Kuechler
Information Systems, 24(3), 45–77. (Eds.), Proceedings of the Seventh International Conference on Design Science Research in
Powell, C. (2003). The Delphi technique: myths and realities. Journal of Advanced Information Systems and Technology (DESRIST 2012) (pp. 423–438). Las Vegas:
Nursing, 41(4), 376–382. Springer Verlag.
Prat, N., Comyn-Wattiau, I., & Akoka, J. (2015). A taxonomy of evaluation methods for Venable, J., Pries-Heje, J., & Baskerville, R. A. (2016). FEDS: A Framework for evaluation
information systems artifacts. Journal of Management Information Systems, 32(3), in design science research. European Journal of Information Systems, 25(1), 77–89.
229–267. Vom Brocke, J., Winter, R., Hevner, A., & Maedche, A. (2020). Special issue editorial
Rana, R., & Singhal, R. (2015). Chi‑square test and its application in hypothesis testing. –accumulation and evolution of design knowledge in design science research: a
Journal of the Practice of Cardiovascular Sciences, 1(1), 69–71. Journey through time and space. Journal of the Association for Information Systems, 21
Reining, S., Ahlemann, F., Mueller, B., & Thakurta, R. (2022). Knowledge accumulation (3), 520–544.
in design science research: Ways to foster scientific progress. SIGMIS Database, 53(1), Von der Gracht, H. A. (2012). Consensus measurement in Delphi studies. Review and
10–24. implications for future quality assurance. Technological Forecasting and Social Change,
Romme, A. G. L. (2003). Making a difference: organization as design. Organization 79(8), 1525–1536.
Science, 14(5), 558–573. Vries, M. D., Gerber, A., & van der Merwe, A. (2013). A framework for the identification
Rowe, G., & Wright, G. (1996). The impact of task characteristics on the performance of of reusable processes. Enterprise Information Systems, 7(4), 424–469.
structured group forecasting techniques. Technological Forecasting and Social Change, Walker, D. H. T., Anbari, F. T., Bredillet, C., Söderlund, J., Cimil, S., & Thomas, J. (2008).
12, 73–89. Collaborative academic/practitioner research in project management: Examples and
Rowe, G., Wright, G., & McColl, A. (2005). Judgment change during Delphi-like applications. International Journal of Managing Project in Business, 1, 168–192.
procedures: The role of majority influence, expertise, and confidence. Technological Walker, D. H. T., & Lloyd-Walker, B. (2016). Rethinking project management.
Forecasting and Social Change, 72(4), 377–399. International Journal of Managing Projects in Business, 9(4), 716–743.
Saunders, M., Lewis, P., & Thornhill, A. (2019). Research Methods for Business Students Watermeyer, R., & Chubb, J. (2018). Evaluating ‘impact’ in the UK’s research excellence
(eighth ed.). Essex: Pearson. framework (REF): Liminality, looseness, and new modalities of scholarly distinction.
Secundo, G., Elia, G., Margherita, A., & Leitner, K.-H. (2022). Strategic decision making Studies in Higher Education, 44(9), 1554–1566.
in project management: a knowledge visualization framework. Management Decision, Wright, J. T. C., & Giovinazzo, R. (2006). O país no futuro: aspectos metodológicos e
60(4), 1159–1181. cenários. Estudos Avançados, 20(56), 13–28.
Siemieniako, D., Kubacki, K., & Mitręga, M. (2021). Inter-organisational relationships for Young, S. J., & Jamieson, L. M. (2001). Delivery methodology of the Delphi: A
social impact: A systematic literature review. Journal of Business Research, 132, comparison of two approaches. Journal of Park and Recreation Administration, 19(1),
453–469. 42–58.
Silvius, G., & Schipper, R. (2019). Planning project stakeholder engagement from a Yousuf, M. I. (2007). Using experts’ opinions through Delphi technique. Practical
sustainable development perspective. Administrative Sciences, 9(2), 46. Assessment, Research & Evaluation, 12(4), 1–9.
Simon, H. A. (1996). The science of the artificial (third ed.). Cambridge: MIT Press.
Smits, D., & Van Hillegersberg, J. (2014). The development of an IT governance maturity
José da Assunção Moutinho holds a PhD in project management from Nove de Julho University (2022). He is a project manager at the State University of Rio de Janeiro and a researcher in the Graduate Program in Project Management at Nove de Julho University. His research interests include collaborative R&D research, the implementation of project management practices in public environments, and knowledge management in projects. He has worked as a business consultant and project manager for over 20 years.

Gabriela Fernandes holds a PhD in management from the University of Southampton (2014). She is an assistant professor at the University of Coimbra and a researcher at the CEMMPRE and ALGORITMI Research Centres. Her research interests are in Organizational Project Management and Innovation Management, particularly in the context of University-Industry R&D Collaborations. She spent ten years coordinating and managing projects in different industries. She has developed and taught several project management training courses and, as a consultant, coordinated the implementation of project management systems and Project Management Office structures.

Roque Rabechini Junior holds a PhD in Production Engineering from the Polytechnic School of the University of São Paulo (2003). He is a professor in the Graduate Program in Project Management at Nove de Julho University. He is a researcher in technology, project management, and business strategy, and works as a consultant on projects involving Project Management Office implementation, the structuring of new product launch processes, and the creation of project management methodologies, among others.