
Journal of MultiDisciplinary Evaluation
Volume 15, Issue 33, 2019
ISSN 1556-8180
http://www.jmde.com

Using Program Theory to Evaluate a Graduate Student Development Program

Shannon Guss, University of Denver
Chelsea E. Bullard, Oklahoma State University
Ashley Keener, Oklahoma State University Center for Health Sciences
Sarah Gordon, Arkansas Tech University
Background: The “3 Minute Presentation” is a graduate student competition based on the more popular “3 Minute Thesis” competition. The program aims to help graduate students learn to inform others of their research in a quick and accessible manner. Programs to engage graduate students more deeply in their education require evaluation to determine if they are useful and effective at meeting their intended goals. Evaluation literature in graduate educational programs is currently limited, but increasingly needed for both the field and the students served.

Purpose: Development and testing of a program-theory evaluation to understand participation, recruitment, preparation, training, skills, and confidence of graduate students engaging in a “3 Minute Presentation” competition at a state university.

Setting: Institution of higher education.

Intervention: 3 Minute Presentation competition.

Research Design: Mixed-method program-theory evaluation.

Data Collection and Analysis: Direct observations and closed-ended survey analyzed through qualitative coding, descriptive statistics, group comparisons, and correlation analysis.

Findings: Overall, the program evaluation found, with a possible lack of diversity in participants, that the program components of recruitment, preparation, and skill development work as expected. Additionally, engagement in preparation was associated with competition scores, and the perceived helpfulness of preparation was related to students’ confidence in their presentation skills. This evaluation was deemed useful for program improvement and capacity building in the program’s continuation at the university.

Keywords: graduate student engagement; self-determination theory; 3 Minute Thesis; program-theory evaluation
Introduction

Efforts to engage college students more deeply in their education generate programs focused on student needs. These programs can benefit from program evaluation through testing of assumptions, clarification of design, and questioning of practices. Assessment and evaluation efforts for programs on college campuses are important and timely, especially as accountability and improvement drive decision-making for funding, policy, research, staffing, and student learning opportunities.

In this study, a program theory-driven approach was applied to gain clarity about the stakeholder perceptions of the operations of a program and to tailor an evaluation to the unique aspects of the specific university context.

The program evaluated in this study was the Three Minute Presentation (3MP), which is a program run by the Graduate College at our institution (a large doctoral-granting land-grant university in the Midwest). The 3MP is based on the Three Minute Thesis (3MT) competition that originated at the University of Queensland in 2008. The 3MT competition “cultivates students’ academic, presentation, and research communication skills” by increasing student capacity to effectively and briefly explain their research “in a language appropriate to a non-specialist audience” (University of Queensland, 2017, para. 1). We approached the evaluation of the 3MP, an adaptation of the 3MT, with program theory-driven evaluation.

The goals of this paper are to 1) provide an example of a theory-based evaluation in a student-focused higher education program, using our Graduate College’s Three Minute Presentation (3MP) program as a case study, and 2) disseminate results of an evaluation of a graduate student program to help build knowledge about the program’s efficacy. While the study sample size is small, the method outlined here proved useful for elucidating the program theory to stakeholders and for evaluating the program in the early years of its development. This method of evaluation could prove useful in future evaluations of the 3MP or its more widely utilized counterpart, the 3MT competition. Following a description of the program, we discuss the program evaluation plan, provide visual diagrams of the program theory, discuss evaluation results, consider lessons learned, and provide recommendations for future efforts in higher education that may use this program evaluation approach.

Literature Review

Program Description

The Three Minute Presentation (3MP) is based on the Three Minute Thesis (3MT) competition originating at the University of Queensland in 2008. The 3MT allows graduate-level students to present their completed or ongoing research projects in three minutes with a single static PowerPoint slide. The 3MT competition “cultivates students’ academic, presentation, and research communication skills” by increasing student capacity to effectively and briefly explain their research “in a language appropriate to a non-specialist audience” (University of Queensland, 2017, para. 1). Benefits of participating in the 3MT program include publicizing the research and the researcher, prize money, increased potential employment opportunities, media attention, and opportunities to write about research for the media (3MT, 2015).

The 3MP was adapted from the 3MT model (changing the ‘T’, thesis, to ‘P’, presentation) to be more inclusive of students who did not pursue thesis options for their graduate degree but still engaged in research and/or creative components; it also allowed for doctoral students who wrote dissertations to participate. Like the 3MT program, the 3MP program is also structured as a competition amongst graduate students. Students must create a three-minute presentation to discuss their research with a single static slide as their only visual. Those eligible to participate in the 3MP include currently enrolled students seeking a non-thesis master’s degree, a specialist degree, or a graduate certificate. Prior institutional winners of the 3MP and the related 3MT competition are not eligible to participate. The 3MP program is coordinated by an associate dean of the Graduate College. The program is advertised to students, graduate coordinators, and advisors via email.
Students in certain degree programs are required to participate in the 3MP program (as determined by their academic program faculty), while students from other degree programs may elect to participate. Additionally, students may elect to attend an optional training/preparation meeting offered by the coordinator of the 3MP.

3MP presentations are judged by non-faculty community members and staff of the university. The presentations are scored based on the ability of a lay audience to comprehend the topic covered and the ability of the speaker to create a compelling and engaging presentation. Various preliminary rounds determine who will compete at the additional levels, with cash prizes at each of the three levels: 1. Preliminary, 2. University Finals, and 3. President’s Fellows. The 3MP is a new program at the University, beginning in 2015. The program stakeholders requested that the new program be evaluated for effectiveness and potential improvements.

Evaluation Method

While evaluation has always been a part of higher education practice at some level, it was not until the 1980s and 1990s that it came to the forefront as an essential element of practice for higher education professionals (Schuh, 2009). Theory-based evaluation, or theory-driven evaluation, can be especially useful because it necessitates a discussion about the components of a program that are critical to achieving the desired outcomes (Donaldson, 2007; Fitz-Gibbon & Morris, 1975). Program-theory evaluations seek to know not just whether a program works, but how and why a program works, by developing theories that identify the relationships between the problems a program aims to solve, the conditions program components/processes operate within, and how the program plans to solve those problems (Bickman, 1987; Donaldson, 2007). Program-theory based evaluation was chosen to investigate the 3MP program due to its ability to address system-level assumptions and goals articulated by stakeholders and how program elements are thought to impact outcomes.

Methods

The following steps outline the process undertaken to develop the program theory and to develop and then test grounded evaluation questions.

Program Theory Identification

The evaluators first met with the key stakeholder to better understand the program from the stakeholder’s perspective (Donaldson, 2007). The stakeholder explained the program to the evaluators, and the evaluators reflected back their understanding of the program to ensure consensus and clarity. Importantly, the evaluators were able to understand the stakeholder’s view of the program and how it met intended goals. The implicit assumptions underlying the program (Weiss, 2000) were: 1) the program was created in part as a way to provide graduate students an opportunity to be engaged (as supported by research on student involvement/engagement); 2) the program would help them in their academic and research endeavors (as promoted by the 3MT program the 3MP is modeled after); and 3) the components of the program worked together to help students build their skills.

Specifically, the process of engaging and developing students’ skills occurred through choice of participation, preparation, and presentation in the competition. Some students were recruited through email, while others were in programs which mandate participation. The program offered an opportunity for students to engage in their area of study by preparing and presenting their research. While training was not mandatory for all participants, the training was expected to assist students in preparing a concise and engaging presentation. Students could also elect to prepare for the 3MP independently. Regardless of how students chose to prepare, the preparation process was seen as part of the student engagement opportunity that the competition provided. Stakeholders believed that students gained skills through preparation, both through deepening engagement in their area of study and by pursuing a professional development activity that would be useful to students as they enter their chosen careers.
Following interviews with stakeholders, ethnographic observation of training lectures, and examination of documents related to the program, evaluators then reviewed literature related to student development, involvement, and engagement. These concepts served as the foundation for the stakeholder’s conception of the primary program aims. Evaluators then incorporated literature from student involvement theory and self-determination theory. These literatures informed the tools that were chosen to measure program components and the expected relationships among them.

Student Involvement Theory

One of the most widely supported and accepted theories in college student development literature is Student Involvement Theory (Renn & Reason, 2013). Involvement is defined as “the amount of physical and psychological energy that the student devotes to the academic experience” (Astin, 1984, p. 297). Also referred to as “engagement”, the theory proposes that meaningful student involvement leads to cognitive complexity, learning, and development (Renn & Reason, 2013). While there is extensive research on the effect of involvement and engagement among undergraduates (Hartnett, 1965; Kuh et al., 1991; Pascarella & Terenzini, 2005; Terenzini & Pascarella, 1991), less is known about graduate student involvement. While there were approximately 1.78 million students enrolled in graduate programs in the United States as of fall 2015, research on graduate student involvement and its potential to support student outcomes is sparse (Okahana, Feaster, & Allum, 2016). While some research points to similarities between undergraduate and graduate student involvement and persistence as related to time spent in clubs, organizations, and other campus-related activities (Gardner & Barnes, 2007; Thomas, Clewell, & Pearson, 1992; Tinto, 1993), research on graduate student involvement was found to be particularly important for intellectual development, development of skills needed for thesis/doctoral completion (Tinto, 1993), and socialization into the professional academic
community (Boyle & Boice, 1998; Gardner & Barnes, 2007; Lovitts, 2001; Weidman, Twale, & Stein, 2001). Ultimately, graduate student development programs appear to receive less attention and evaluation than programs directed at the development and retention of undergraduate students.

Self-Determination Theory

Self-determination theory comprises a set of theories explaining human motivation. The most applicable sub-theory to this evaluation is basic psychological needs theory (Deci & Ryan, 2000). The theory posits that when three basic psychological needs—relatedness, competence, and autonomy—are met, people become more motivated to accomplish difficult tasks. The authors applied it specifically in this study because autonomy—a feeling of control or belief that one’s choices can effect intended change—was thought to potentially differ between students who chose to participate and those who were mandated to participate by their program. This study seeks to understand how the program components of participation, preparation, and presentation fit together to support the development of graduate students.
Generating and Prioritizing Evaluation Questions

A major advantage of theory-driven evaluation approaches is the ability to gain deeper insight regarding how to evaluate a program by working closely with stakeholders to understand the entirety of the program (including its goals and values) and what the stakeholders want to learn from the evaluation (Donaldson, 2007; Fitzpatrick, Sanders, & Worthen, 2012). Theory-based evaluation uses “the construction of a plausible and sensible model of how a program is supposed to work” (Bickman, 1987, p. 5) to guide the evaluation. Thus, after an initial conversation with the stakeholder and a review of program documents and literature, the evaluators returned to the stakeholder with a model of program components and expected processes (see Figure 1), as is typical of program theory-based evaluation (Donaldson, 2007).

As depicted in Figure 1 (and discussed previously as a part of the program theory), the components of the program to be evaluated include 1) the recruitment or requirement of students to participate in the program, 2) the training or other preparation that students undergo before competition, and 3) the competition itself. The student presentations, at any stage of the competition, show the skills of participants—the identified outcome of participating in the 3MP. Because the key stakeholder identified student engagement as a process through which students benefit from the program, the links of perceived choice, perceived helpfulness of preparing, and student engagement were also included in the program theory. Additionally, the program required some students to participate while others elected to be involved, which raised questions about students’ motivation and the effect it might have on program outcomes—this is also included in the model. Finally, confidence was included as an additional check of the benefits of the program.

Figure 1. Depiction of program theory.
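To make the hypothesized model concrete, the sketch below encodes the components, links, and outcome just described as plain Python data. This is our illustration only, not an artifact of the original evaluation; the component and link names are paraphrased from the description of Figure 1.

```python
# A minimal sketch, assuming paraphrased names for the components and links
# described in Figure 1; the encoding is ours, not the authors'.

PROGRAM_THEORY = {
    "components": [
        "recruitment/requirement to participate",
        "training and other preparation",
        "competition (presentation)",
    ],
    "outcome": "presentation skills (students' confidence, judges' scores)",
    # Hypothesized links connecting program components to the outcome.
    "links": [
        ("program mandate", "perceived choice"),
        ("program mandate", "engagement in preparation"),
        ("perceived choice", "engagement in preparation"),
        ("perceived helpfulness of preparation", "confidence in skills"),
        ("engagement in preparation", "competition scores"),
    ],
}

def linking_questions(theory):
    """Each hypothesized link becomes a question the evaluation can test."""
    return [f"What relationship, if any, exists between {a} and {b}?"
            for a, b in theory["links"]]

for question in linking_questions(PROGRAM_THEORY):
    print(question)
```

Written this way, each entry in the link list corresponds to one testable relationship, which is how the linking questions below were generated.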
evaluation. Thus, after an initial conversation questions include: 1) Do differences in
with the stakeholder and a review of program requirements relate to difference in
documents and literature, the evaluators perceived choice or student engagement? 2)
returned to the stakeholder with a model of What relationship, if any, exists between
program components and expected processes perceived choice, perceived helpfulness of the
(see Figure 1), as is typical of program theory- preparation process, and student
based evaluation (Donaldson, 2007). engagement? 3) Do differences in confidence
or competition scores exist between students
or competition scores exist between students who did and did not attend the provided trainings? 4) What relationship, if any, exists between perceived helpfulness, student engagement, confidence levels, and competition scores?

After identifying the theory and determining evaluation questions, evaluators and stakeholders worked together to prioritize questions (Donaldson, 2007). One way to prioritize questions, as well as facilitate evaluation use, is to explore decision situations—points at which more information is needed to determine next steps. Evaluators and stakeholders can link each question to a decision situation to consider: Does the answer to this question lead me to make a better decision in the given situation? All evaluation questions, linking questions, and decision situation questions are listed in Table 1.

Table 1
Evaluation Questions

Primary Evaluation Questions:
1. Who participates in 3MP, and why?
2. How did students prepare to confidently communicate their discipline’s contribution to the public good?
3. What was students’ confidence level and competition score after presenting?

Linking Questions:
1. Do differences in program requirements relate to differences in perceived choice or engagement?
2. What relationship, if any, exists among links and outcomes in the model?
3. Do differences in confidence or competition scores exist between students who did and did not attend the provided trainings?

Questions that Facilitate a Decision:
1. Are changes needed to the recruitment process?
2. Should students be motivated to participate differently?
3. Are changes needed to training or other preparation resources?
4. Do students need different/more engagement opportunities?
5. Is further evaluation needed to understand how the program could continue to support student outcomes?

Data Collection

IRB approval was obtained prior to the recruitment of participants. Data for this evaluation came from program judges and student participants. To answer all the evaluation questions, quantitative and qualitative data were collected concurrently. Quantitative survey methods were used to understand participation, perceived choice, training components, engagement, and student skills. In order to collect these data, an email survey was sent to students inviting them to participate in the evaluation. Email addresses were provided by the 3MP program coordinator. The surveys were accessible over a one-week period in the same semester in which a 3MP competition was being held. The competition judges’ quantitative rubrics were also collected after the 3MP finals and were provided to the evaluation team by the 3MP program coordinator. Qualitative methods, including participant observation of training and a semi-structured interview with open-ended questions with the 3MP program coordinator, were used to understand the students’ training and, to a lesser extent, other preparation processes.

Instruments

Participation was assessed by the student demographics questions and the Perceived Choice Scale. The Perceived Choice Scale was adapted from the Self-Determination Scale (Sheldon, Ryan, & Reis, 1996). All other instruments were created by the evaluation team, based on program objectives described by the key stakeholder and a review of the previous competition’s judges’ rubric. Preparation was examined through a preparation helpfulness scale and a student engagement questionnaire. Additionally, a semi-structured trainer interview was conducted, as well as participant observation of the training seminar. To examine the presentation piece, students were asked to complete a confidence assessment, and judges provided competition scores. Table 2 provides a description of each data collection instrument/measure and the associated evaluation questions/model component.

Table 2
Evaluation Questions and Instruments

Participation questions: Which students participate? How autonomous are students’ choices to participate?

Student Demographic Questionnaire
Purpose: Assessed the diversity of the student population participating in the 3MP and their reasons for participating.
Description of items: Age, gender, ethnicity, disability status, program of study, first language, and student status.

Perceived Choice Scale
Purpose: Assessed the extent to which students perceived their participation as an autonomous decision.
Description of items: 5-item Likert scale [range: 1 (Only A feels true) to 5 (Only B feels true)]. Example item: A. I participate in the 3MP because I choose to. B. I participate in the 3MP because I must.

Training and preparation questions: What is the perceived helpfulness of the preparation process? How engaged are participants in preparation? What supports were provided for student development?

Student Engagement in Preparation Questionnaire
Purpose: Assessed the degree to which students prepared/engaged in scholastic activities during the 3MP process.
Description of items: List of 14 activities students may have engaged in during the 3MP process [Yes/No responses]. Example items: What did you do to prepare for the 3MP? (Prepared a script, studied training materials, etc.)

Preparation Helpfulness Scale
Purpose: Assessed the extent to which students perceived that the process of preparing for the 3MP helped them meet program objectives.
Description of items: 9-item Likert scale [range: 1 (Strongly disagree) to 7 (Strongly agree)]. Example item: Preparing for the 3MP… helped me consider the needs of a non-specialist audience.

Semi-structured interview and participant observation of training
Purpose: Gathered information about the training to inform the context of Evaluation Question 2.
Description: Observation of the in-person training seminar and a face-to-face, semi-structured interview with the training coordinator. Questions were formatted in an open-ended manner to elicit the coordinator’s perspective on the training, its core messages, and how well the training helps to prepare students for competition objectives.

Presentation questions: How do students perceive their presentation skills after participating in the 3MP? How do others perceive students’ presentation skills after participating in the 3MP?

Competition Scores
Purpose: Rated students on their 3MP performance, as noted by 3MP judges.
Description of items: Continuous scale from 20 to 80 based on a rubric of items, within two components: comprehension/content and engagement/communication.

Confidence Scale
Purpose: Assessed whether students felt participation in the 3MP helped them develop skills that align with program objectives.
Description of items: 11-item Likert scale [range: 1 (Not at all confident) to 7 (Exceedingly confident)]. Example item: How confident are you that you can… explain how your discipline is beneficial to the public.
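As one example of how these instruments translate into data, the sketch below scores responses to a five-item bipolar scale like the Perceived Choice Scale in Table 2. The paper does not publish its scoring rules, so this is a hedged illustration: following the convention of the original Self-Determination Scale, we assume items whose “B” statement is the non-autonomous pole are reverse-scored so that higher scores always mean greater perceived choice; the reversed-item indices here are hypothetical.

```python
# Illustrative scoring of a 5-item, 1-5 bipolar scale (assumptions noted above).

REVERSED_ITEMS = {1, 3}  # hypothetical: items whose "B" pole means "I must"

def score_perceived_choice(responses: list[int]) -> float:
    """Mean item score on a 1-5 scale, with reversed items flipped so that
    higher always means a more autonomous (chosen) participation."""
    if len(responses) != 5 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected five responses on the 1-5 scale")
    adjusted = [6 - r if i in REVERSED_ITEMS else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / len(adjusted)

print(score_perceived_choice([5, 1, 4, 2, 5]))  # -> 4.6, toward "I choose to"
```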

Analysis

Data analysis plans were generated relative to the evaluation questions and utilized a convergent mixed-method analysis procedure (Creswell, 2014). Quantitative data analyses consisted of descriptive statistics (e.g., percentages, frequency counts, means, and standard deviations) and group comparisons (e.g., t-test, Mann-Whitney U). Correlations were used to assess consistency among responses (e.g., judges and students) and the relationship between links and the program outcome in the model. Multiple regression was not used due to sample size. Qualitative data were coded using a grounded theory methodology (Strauss & Corbin, 1994). Using the constant comparison method (Strauss & Corbin, 1990), important topics arising from the data were pinpointed and grouped together into relevant themes and categories. Data were coded and organized until saturation was met and no new topics arose.

Results

Surveys were initially emailed to 210 graduate students. Twenty-seven graduate students who participated in the 3MP during the fall 2016 semester provided data for the evaluation, resulting in a 13% response rate. The 27 evaluation participants ranged in age from 21 to 47, with a mean age of 25 (SD = 3.06). The sample contained 48% males and 37% females, with 15% who chose not to provide their gender identification. In regard to race/ethnicity, 22% identified as White, 67% identified as Asian, and 11% chose not to provide their race/ethnicity. A total of 18 participants (66%) were international students, and 13 participants (48%) indicated that English was not their first language.

Participation Evaluation Questions: Which Students Participate? How Autonomous are Students’ Choices to Participate?

Of the student respondents, 56% belonged to graduate programs that required students to participate in the 3MP. Despite this, only 22% said the mandate was a primary reason they participated. All respondents were asked why they participated in the program; 48% indicated it was due to an interest in skill improvement, 22% indicated it was a program mandate, 11% indicated it was for the cash reward, 4% indicated it was encouraged by their program, 4% indicated it was due to entertainment value, and 4% selected other (7% did not provide a reason for participating). There were no significant differences by gender, race, or ethnicity in students’ reasons for participating.

Preparation Evaluation Questions: What is the Perceived Helpfulness of the Preparation Process? How Engaged are Participants in Preparation? What Supports were Provided for Student Development?

A total of 13 students indicated that they attended training, whereas 10 indicated they did not attend training. On average, students agreed that the training process helped to increase their presentation skills (M = 6.16, SD = .75 on a 7-point scale). Among the items on the Training Experience Survey, students indicated that the training was most helpful at developing their ability to “effectively manage their time” (M = 6.44) and least helpful at “improving their understanding of their topic” (M = 5.84). Another highly rated item was “pushed me to think of stories or evidence that illustrated my points” (M = 6.36).

Approximately 80 students attended the training program, and only one participant from the two programs mandating attendance did not attend the training session. Additionally, there were two non-mandatory training sessions for non-thesis master’s, specialist, and graduate certificate students who volunteered to compete in the 3MP competition. Twenty-four students signed up to attend one of the two non-mandatory training sessions, and fewer than 20 students actually attended the non-mandated training program.

Evaluation Question 3: How do Students Perceive their Presentation Skills After Participating in the 3MP? How do Others Perceive Students’ Presentation Skills After Participating in the 3MP?

The dataset revealed a negatively skewed distribution in relation to confidence in presentation skills among 3MP participants. Most students reported that they were “pretty confident” to “exceedingly confident” that they had obtained the necessary skills to present their research to an audience. Means and standard deviations associated with each skill are presented in Table 3. Students felt particularly confident about communicating clearly and holding an audience’s attention, as evidenced by high means (6.04 and 6.00, respectively) and small standard deviations (.90 and .76).

Table 3
Means and Standard Deviations Associated with Level of Confidence in Presentation Skill

Presentation Skill                                           M     SD
Give a good explanation of topic to someone you just met.   6.04  0.76
Explain your topic thoroughly without boring someone.       5.78  0.93
Communicate clearly about your discipline.                  6.04  0.90
Communicate to a non-specialist audience effectively.       5.74  1.06
Avoid "jargon" when communicating your ideas.               5.78  1.12
Illustrate your points with stories or evidence.            5.96  1.02
Effectively manage your time when presenting.               5.81  1.11
Explain how your discipline is beneficial to the public.    6.07  1.04
Prepare a presentation in a concise format.                 6.00  0.96
Prepare a visually effective presentation.                  6.00  1.28
Hold your audience's attention with your topic.             6.00  0.76
Total                                                       5.90  0.78

Note. N = 27; responses are based on a 7-point scale, where 1 = Not at all confident, 2 = A little bit confident, 3 = Somewhat confident, 4 = Moderately confident, 5 = Confident, 6 = Very confident, and 7 = Exceedingly confident.
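The descriptive statistics in Table 3 and the negative skew reported above are straightforward to reproduce in code. The sketch below does so on made-up 7-point confidence ratings; the item-level responses are not published, so all values here are illustrative only.

```python
# A sketch of the descriptive statistics behind Table 3 and the skew check
# mentioned in the text, on synthetic 7-point confidence ratings.
import numpy as np
from scipy import stats

ratings = np.array([7, 6, 6, 7, 5, 6, 7, 4, 6, 5, 7, 6, 3, 6, 7])  # made up

# Per-item mean and standard deviation, as reported in Table 3.
print(f"M = {ratings.mean():.2f}, SD = {ratings.std(ddof=1):.2f}")

# Negative skewness: most ratings sit at the high end of the scale,
# leaving a tail toward low ratings.
print(f"skewness = {stats.skew(ratings):.2f}")  # < 0 for these data
```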

Linking Questions: Do Differences in Program Requirements Relate to Differences in Perceived Choice or Engagement in Training/Preparation?

Two constructs that potentially linked program components were identified. First, the mandate of some programs to require students to participate was examined. As shown in Table 4, comparing students in programs that do mandate participation versus students in programs that do not showed no significant difference by program mandate in perceived choice, engagement in training, or hours spent preparing. There was a trending difference in engagement tasks, with higher engagement among those not mandated to participate [t(23) = 1.85, p = .08]. Finally, the correlation between perceived choice and engagement tasks (r = .34, p = .102) was not statistically significant, but could be considered practically significant given the evaluation’s small sample size.

Table 4
Comparisons Between Students Mandated to Participate and Those Not Mandated to Participate in the 3MP on Perceived Choice, Engagement in Training, Hours Spent Preparing, and Engagement Tasks

                          t     df    p
Perceived Choice          .94   24   .94
Engagement in Training   -.63    8   .54
Hours Spent Preparing     .28   20   .78
Engagement Tasks         1.85   23   .08
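The comparisons in Table 4 are independent-samples t-tests (with the Mann-Whitney U named in the Analysis section as the nonparametric fallback). The raw data are not public, so the sketch below only illustrates the mechanics on synthetic groups; the group sizes are chosen to reproduce the df = 23 of the engagement-tasks comparison, and the values are invented.

```python
# Illustrative group comparison (mandated vs. not), as in Table 4; data invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=4)
mandated = rng.integers(2, 12, size=14)      # engagement tasks, mandated group
not_mandated = rng.integers(4, 15, size=11)  # engagement tasks, elective group

t, p = stats.ttest_ind(not_mandated, mandated)
df = len(mandated) + len(not_mandated) - 2   # 23, as in the reported t(23)
print(f"t({df}) = {t:.2f}, p = {p:.3f}")

# Nonparametric counterpart for small or skewed samples.
u, p_u = stats.mannwhitneyu(not_mandated, mandated)
print(f"Mann-Whitney U = {u:.1f}, p = {p_u:.3f}")
```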

What Relationship Exists, if any, Among Links and Outcomes in the Model?

This linking question examined the connection between engagement/preparation and presentation skills. Engagement was measured in preparation tasks, behavior during training, and hours spent preparing. Skills were measured by the Confidence in Skills Questionnaire and the three rounds of judging across the semester. Correlations were analyzed between engagement and skill variables (see Table 5), and we found that the number of reported hours of engagement significantly correlated to confidence in skills, and the number of reported tasks completed in preparation (i.e., engagement in preparation) had a significant, strong correlation to competition scores. While the other engagement scores did not correlate to the judges’ scores, we believe that data might have been more informative if scores (which were provided to the evaluation team by the 3MP program coordinator) were disaggregated by the judging content areas.

Table 5
Correlations Among Key Variables

                          Perceived  Preparation  Engagement  Preparation  Skills      Competition
                          Choice     Engagement   Hours       Helpfulness  Confidence  Scores
Perceived Choice          —
Preparation Engagement    .34        —
Engagement Hours          .31        .41          —
Preparation Helpfulness   .18        .20          .42*        —
Skills Confidence         .05        .22          .42         .63**        —
Competition Scores        .07        .55*         .15         .28          .29         —

Note. *p < .05; **p < .01
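A matrix like Table 5 can be produced by computing pairwise Pearson correlations and flagging significance. The sketch below shows the mechanics on synthetic columns named after three of the table’s variables; the data are, again, illustrative only.

```python
# Sketch: building a starred correlation matrix like Table 5 (synthetic data).
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(seed=2)
df = pd.DataFrame(rng.normal(size=(27, 3)),
                  columns=["engagement_hours", "skills_confidence",
                           "competition_score"])

def starred_corr(data: pd.DataFrame) -> pd.DataFrame:
    """Pairwise Pearson r with '*' (p < .05) and '**' (p < .01) flags."""
    cols = data.columns
    out = pd.DataFrame("", index=cols, columns=cols)
    for i, a in enumerate(cols):
        for b in cols[:i]:  # lower triangle only, as in Table 5
            r, p = stats.pearsonr(data[a], data[b])
            stars = "**" if p < .01 else "*" if p < .05 else ""
            out.loc[a, b] = f"{r:.2f}{stars}"
    return out

print(starred_corr(df))
```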

Do Differences in Confidence or Competition Scores Exist Between Students Who Did and Did Not Attend the Provided Trainings?

To answer this linking question, an independent samples t-test was conducted to determine if those that attended training demonstrated more confidence in skills than those that did not attend training. The results revealed no statistically significant differences between the groups [t(21) = .36, p = .73], suggesting that students who attended the training did not obtain more confidence in their skills than those who did not attend the training.

Figure 2. Program evaluation: Correlations between the program components.
Note. *p < .05; **p < .01
Program Evaluation

Our final assessment of the program considers the program components first, then assesses how they work together (Donaldson, 2007), as outlined in the program theory. The program evaluation highlights that the sample of participants is not diverse, but their participation in the program is perceived to be due to their own choice (rather than required). The preparation process is seen as helpful, and the training is convenient for those who chose to attend. There was a high level of engagement in the training and a widely varying level of engagement in preparation tasks as a whole. Confidence in presentation skills was high. Overall, with a possible lack of diversity in the participants, the program components of recruitment, preparation, and skill development seem to be working as expected. The components of perceived choice and engagement are particularly important to link program components together, and will therefore be discussed as part of the program theory evaluation.

The program evaluation tested whether the decision to mandate participation in the 3MP would affect students’ perceptions, experiences, or engagement in the process. Engagement in the process of preparation for the competition was also expected to facilitate the path from program support for student preparedness to the acquisition of skills. In the theory evaluation, no evidence supported the stakeholders’ concern that the program mandate negatively affected students’ perceptions, experiences, or engagement in the 3MP. Correlations (shown in Figure 2) between the program requirement and perception of choice and perception of helpfulness of the preparation process were both very low. However, student ratings of the helpfulness of the preparation process were significantly correlated with students’ confidence in their skills. Further, engagement in the preparation process correlated with competition scores.
Strengths, Limitations, and Conclusions

The strengths and limitations of the program can be organized by its component parts—participation and recruitment, preparation and training, and skills and confidence. Strengths of participation and recruitment exist in the perceptions of students: students who participated generally felt autonomous in their decision to engage in the 3MP. Strengths of the training and preparation processes were many, with weaknesses being harder to find. Participants found the preparation processes very helpful, and their rating of how helpful the process was related to the confidence they had in their skills.

None of the aspects of training or preparation were rated low by the students, and intended goals were met. Further, while most participants said the training was convenient and the preparation process was very helpful, open-ended comments collected from student surveys suggested that online training (asynchronous or streamed) or evening trainings be made available, or that online materials (e.g., slides and videos) be made easier to access. A small number had difficulty finding the videos or were not aware of the online resources until the training. These anecdotes could indicate the need for additional opportunities to access training.

Once students decided to participate and prepared, their skills were tested in the various levels of competition. Within the competition, students’ report of their engagement in preparation tasks was correlated with their scores. However, this relationship diminished after the first round. One interpretation is that the later rounds were more difficult for participants to compete in due to competing against a more diverse group – particularly more students whose first language is English and fewer students and judges who are accustomed to their interdepartmental jargon. Thus, we recommend that stakeholders consider the influence of language in the judging of the competition, potentially providing additional supports for bilingual students. An alternative explanation for this finding, however, is that the smaller sample size reduced the variability of participants, and therefore limited the range of scores from which correlations could be observed.

This brings us to note that this evaluation was limited by the scope and quantity of its sample, which could mean the evaluation captured the perspective of those most interested in the 3MP. An additional limitation is the aggregated judges’ rubric data. We may have seen more correlations between engagement and skill if we could look at components of skill individually, but this was not possible with the existing data with which we worked. We address these limitations by acknowledging that, while these data represent an important portion of participants, additional participants and additional data may have provided a wider and more accurate account.

Finally, with regard to student outcomes, the strengths and limitations of the skills developed for the competitions were the most difficult to evaluate, as self-reported ratings of confidence did not correlate to the judges’ scores at the competitions. We believe this issue is more of a limitation of the evaluation itself than of the program. However, there is evidence that confidence was high amongst most students and that a relationship exists between how participants engaged in preparation tasks and how well participants did in the competition. These two pieces of evidence show that the program is working as expected – giving students an opportunity to practice an important professional skill. The theory evaluation, along with overall conclusions and strengths and limitations, provides stakeholders with the data and evidence they need to understand how the program works and to answer the decision-situation questions posed in the evaluation planning phase.

Lessons Learned

For future evaluations of this and similar programs, we recommend targeting a more diverse pool of participants by reaching out to a wider variety of programs and student types. Particularly in light of the program’s aim to highlight scholarship’s contribution to the public good, diverse participation in the
program may facilitate the clarity and skill with which participants take their scholarship into their various communities. However, though the number of participants is relatively small, we feel it important to note that the program that was evaluated is one of the most visible programs provided by the Graduate College for graduate students, and the Graduate College used this evaluation data to: 1) determine that the program was worthwhile and will continue at our institution, and 2) make changes for upcoming iterations of the program. In addition to this practical use of evaluation results, the findings of this evaluation also contribute to the knowledge base about programs for graduate students, which is an important part of the focus and use of evaluation (Owen, 2006) that should not be overlooked, no matter how seemingly small the sample. This is also important because little is known about graduate student involvement and outcomes (Okahana et al., 2016).

While graduate student development programs are finally recognized as a vital component of graduate student graduation and satisfaction (Keeling, 2004), program evaluations are not always conducted on these programs to determine their success. To our knowledge, no program evaluations have focused on the 3MT or 3MP programs (at our institution or elsewhere). This evaluation provided evidence that the 3MP was a successful professional development program for graduate students.

The theory-driven approach adopted for the present evaluation was beneficial given that the 3MP program had not been evaluated before and the evaluators were working with a key stakeholder who was not familiar with the evaluation process. Therefore, it became pertinent throughout the evaluation to foster the key stakeholder’s understanding of the evaluation process via encouraging inquiry, building rapport, and cultivating an environment where formative evaluation can be an ongoing process. This evaluation could serve as a framework for evaluations of similar and more broadly implemented 3MT competitions at graduate schools and universities.

Through the course of conducting this evaluation, we found that using a program theory evaluation approach helped the stakeholders conceptualize what they thought their program should do (student outcomes) and why they thought it should work. In situations such as the one in this study, where evaluation was not considered as a part of the program development and/or when program outcomes are not clearly stated from the start, theory-based evaluation approaches can be helpful. The central work of theory-based evaluation lies in developing a theory for why a program should achieve its desired outcomes through engagement with key stakeholders, evaluator expertise, and/or social science research to create linkages between program actions, goals, and outcomes (Chen, 1990; Weiss, 1997, p. 78). We found this approach to be absolutely essential in facilitating the evaluation process, and we recommend this approach continue to be used in similar situations in higher education.

Further, the use of the theory-based evaluation approach also helped facilitate evaluation capacity building for the department (i.e., the Graduate College) that oversaw the program. Fitzpatrick et al. (2012) described how evaluations can be used to empower by “(1) providing stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program and (2) mainstreaming evaluation as part of the planning and management of the program/organization” (p. 209). This seemed to be the case in this evaluation, because this process helped stakeholders think through all phases of the program, articulate its purpose, use information to make changes, and begin incorporating pathways for data collection, and thus program evaluation, into this and other programs they oversaw. Clearly, the perspective gained from this evaluation served as a guide for program improvement and capacity building. Capacity building has been described as “a context-dependent, intentional action system of guided processes and practices for bringing about and sustaining a state of affairs in which quality program evaluation and its appropriate uses are ordinary and ongoing practices within and/or between one or more organizations/programs/sites” (Stockdill, Baizerman, & Compton, 2002, p. 8).
In this evaluation, to encourage the key stakeholder to see the importance of continuing ongoing evaluation, emphasis was placed on presenting the evaluation findings in an understandable and non-technical format. For example, path diagrams were presented to articulate the program-theory evaluation in a concise, easy-to-follow layout, which maximized the utility of the evaluation for the key stakeholder. Additionally, by identifying the primary audience (i.e., a key stakeholder with limited knowledge of program evaluation), we were able to tailor the evaluation report to their information needs and, hence, increase the probability of the findings being used. In this case, we contend that both the evaluation approach (theory-based evaluation) and the transmission of information could be seen as intimately linked to capacity building, especially considering this was the first time the stakeholders had been involved in program evaluation. We recommend future evaluations on this program (and others similar to it in higher education) follow the same approach to help facilitate utility of findings and to build evaluation capacity.

Finally, the theory-based evaluation approach used in this evaluation allowed for a discussion about the components of a program that are critical to achieving the desired outcomes (Donaldson, 2007; Fitz-Gibbon & Morris, 1975). It also contributed to building evaluation capacity by cultivating an environment where evaluation was understood and valued. Thus, because this evaluation was frequently limited by its sample size, we recommend that evaluations of this program and others like it continue, and future evaluations should consider incorporating data collection as part of the normal “paperwork” to be completed by participating students. This would allow for a broader and more accurate understanding of the program’s influence across participants and would allow for parts of the evaluation to be “embedded” into the program itself, thus continuing to ensure that information is collected to allow stakeholders to make evidence-based decisions regarding program improvements.

References

3MT (Producer). (2015, September 9). Promoting yourself with 3MT [video file]. Retrieved from https://vimeo.com/138709200

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25(4), 297-308. Retrieved from https://www.researchgate.net/profile/Alexander_Astin/publication/220017441_Student_Involvement_A_Development_Theory_for_Higher_Education/links/00b7d52d094bf5957e000000/Student-Involvement-A-Development-Theory-for-Higher-Education.pdf

Bickman, L. (1987). The functions of program theory. New Directions for Program Evaluation, 33, 5-18. doi: 10.1002/ev.1443

Boyle, P., & Boice, B. (1998). Systematic mentoring for new faculty teachers and graduate teaching assistants. Innovative Higher Education, 22(3), 157-179. https://doi.org/10.1023/A:1025183225886

Brousselle, A., & Champagne, F. (2011). Program theory: Logic analysis. Evaluation and Program Planning, 34(1), 63-78. http://doi.org/10.1016/j.evalprogplan.2010.04.001

Chen, H. T. (1990). Theory-driven evaluations. Newbury Park, CA: Sage.

Chen, H. T. (1997). Applying mixed methods under the framework of theory-driven evaluations. New Directions for Evaluation, 1997(74), 61-72.

Creswell, J. W. (2014). A concise introduction to mixed methods research. Thousand Oaks, CA: Sage Publications.

Deci, E. L., & Ryan, R. M. (2000). The ‘what’ and ‘why’ of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11, 227-268.

Donaldson, S. I. (2007). Program theory-driven evaluation science: Strategies and applications. New York, NY: Erlbaum.

Fitz-Gibbon, C. T., & Morris, L. L. (1975). Theory-based evaluation. The Journal of Educational Evaluation, 5(1). Retrieved from http://www.cem.org/attachments/publications/CEMWeb022%20Theory%20Based%20Education.pdf
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2012). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson.

Gardner, S. K., & Barnes, B. J. (2007). Graduate student involvement: Socialization for the professional role. Journal of College Student Development, 48(4), 369-387. doi: 10.1353/csd.2007.0036

Hartnett, R. T. (1965). Involvement in extra-curricular activities as a factor in academic performance. Journal of College Student Development, 6(5), 272-274. Retrieved from https://muse.jhu.edu/journal/238

Kuh, G. D., Schuh, J. H., Whitt, E., Andreas, R. E., Lyons, J. W., Strange, C. C., ... & MacKay, K. A. (1991). Involving colleges. San Francisco, CA: Jossey-Bass.

Lovitts, B. E. (2001). Leaving the ivory tower: The causes and consequences of departure from doctoral study. Lanham, MD: Rowman & Littlefield.

Okahana, H., Feaster, K., & Allum, J. (2016). Graduate enrollment and degrees: 2005 to 2015. Washington, DC: Council of Graduate Schools. Retrieved from http://cgsnet.org/ckfinder/userfiles/files/Graduate%20Enrollment%20%20Degrees%20Fall%202015%20Final.pdf

Owen, J. M. (2006). Program evaluation: Forms and approaches (3rd ed.). New York, NY: The Guilford Press.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students (Vol. 2, K. A. Feldman, Ed.). San Francisco, CA: Jossey-Bass.

Renn, K. A., & Reason, R. D. (2013). College students in the United States: Characteristics, experiences, and outcomes. San Francisco, CA: John Wiley & Sons.

Schuh, J. H. (2009). Assessment methods for student affairs. San Francisco, CA: Jossey-Bass.

Sheldon, K., Ryan, R., & Reis, H. T. (1996). What makes for a good day? Competence and autonomy in the day and in the person. Personality and Social Psychology Bulletin, 22, 1270-1279. https://doi.org/10.1177/01461672962212007

Stockdill, S. H., Baizerman, M., & Compton, D. W. (2002). Toward a definition of the ECB process: A conversation with the ECB literature. New Directions for Evaluation, 93, 7-26. doi: 10.1002/ev.39

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.

Strauss, A., & Corbin, J. (1994). Grounded theory methodology. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 217-285). Thousand Oaks, CA: Sage.

Terenzini, P. T., & Pascarella, E. T. (1991). Twenty years of research on college students: Lessons for future research. Research in Higher Education, 32(1), 83-92. https://doi.org/10.1007/BF00992835

Thomas, G., Clewell, B., & Pearson, W., Jr. (1992). The role and activities of American graduate schools in recruiting, enrolling, and retaining United States’ Black and Hispanic students (Report No. 87-08). Princeton, NJ: Educational Testing Service, Graduate Record Examinations Board.

Tinto, V. (1993). Building community. Liberal Education, 79(4), 16-21. Retrieved from https://www.aacu.org/liberaleducation

University of Queensland. (2017). Three Minute Thesis. Retrieved from https://threeminutethesis.uq.edu.au/about

Weidman, J. C., Twale, D. J., & Stein, E. L. (2001). Socialization of graduate and professional students in higher education: A perilous passage? ASHE-ERIC Higher Education Report (Vol. 28, No. 3). San Francisco, CA: Jossey-Bass.

Weiss, C. H. (1997). Theory-based evaluation: Past, present, and future. New Directions for Evaluation, 76, 41-55. doi: 10.1002/ev.1086

Weiss, C. H. (2000). Which links in which theories shall we evaluate? New Directions for Evaluation, 87, 35-45. doi: 10.1002/ev.1180
