
Thriving in Academe
REFLECTIONS ON HELPING STUDENTS LEARN

Thriving in Academe is a joint project of NEA and the Professional and Organizational Development Network in Higher Education (www.podnetwork.org). For more information, contact the editor, Douglas Robertson (drobert@fiu.edu) at Florida International University or Mary Ellen Flannery (mflannery@nea.org) at NEA.

Making SLO Assessment Productive and Fun

Faculty, Assessment, Productive, and Fun. These four words are not usually said in the same breath. However, they can be. We can prove it.
BY BETH LASKY, ANU THAKUR, NINA GOLDEN, MINTESNOT WOLDEAMANUEL, ASHLEY SAMSON, AND GIGI HESSAMIAN
California State University, Northridge

Like most colleges and universities, our campus is mandated to conduct program assessments of our Student Learning Outcomes (SLOs). Often this mandate is implemented by a top-down approach in which faculty are directed to use this tool to assess your classes and/or programs! If you heard the voice of Charlton Heston or Morgan Freeman, you get the point. This directive is often met with rolled eyes, questions about academic freedom, or just plain refusal. Some faculty feel students' grades on assignments and overall course grades reflect how well their students have met the SLOs. Other faculty are not even aware there are SLOs they need to meet.

In this article we will describe the process we went through to assess the specific learning objectives for one of our General Education (GE) sections: Comparative Cultural Studies (CCS). Not only did we learn a lot about working together as a committee, including part-time and full-time faculty, we all had such a good time that we kept meeting even after we finished our task! We feel that sharing our process will allow other campuses to benefit from our experiences. Mandated directives about assessing SLOs do not have to be a burden. We found that bringing together faculty from various disciplines is not only an asset but can produce tools that can be used across campus.



Meet the Authors

Beth Lasky is a professor of special education, and former director of general education. beth.lasky@csun.edu

Anu Thakur is an associate professor of interior design, and coordinator of academic assessment. anubuhti.thakur@csun.edu

Nina Golden is a professor of business law. nina.golden@csun.edu

Mintesnot Woldeamanuel is an associate professor of urban studies and planning. mintesnot.woldeamanuel@csun.edu

Ashley Samson is an associate professor whose specialty is sport and exercise psychology. ashley.samson@csun.edu

GiGi Hessamian is a lecturer who teaches General Education and rhetoric. gigi.k.hessamian@csun.edu

Participation Can Be Painless

We have all been at the place, say a department meeting perhaps, where we are told not only what to assess but what tools to use for the assessment process. How many times have we said, "That won't work in my course, or in my discipline"?

As the director of General Education and the coordinator of Academic Assessment at our university, we were part of a group that helped establish GE assessment on our campus. During the strategic planning process, we agreed that simply telling faculty what tools to use for assessment would not work. We needed to create a plan that placed decision-making in the hands of the faculty. Our campus GE program has seven sections, including natural sciences, arts and humanities, and more. Each section has its own goals and SLOs. We decided to begin with the GE section of Comparative Cultural Studies/Gender, Race, Class, Ethnicity Studies and Foreign Languages (CCS), to follow with our GE course recertification process. Our first task was to put together a committee to design a process to assess the SLOs for CCS.

Establishing the Committee

To begin, a questionnaire to gain information about who was teaching CCS courses was distributed, via department chairs, to faculty. After respondents emailed back, the GE program director went through the responses and formed a committee, ensuring a diverse group. (See Best Practices.) Selected faculty were invited and informed they would receive $100 for every meeting.

TALES FROM REAL LIFE > HOW RUBRICS CAN WORK

At our first CCS committee meeting, I remember thinking that we were a diverse group with little in common. Once we began discussing our courses and pedagogy, we found that in fact we had one thing in common: we all required students to reflect on topics such as cultural diversity in written assignments. Over the course of many meetings we developed a tool to assess how successful students were in reaching the set learning outcomes. Along the way we fine-tuned the tool through discussions and norming sessions. So long as home-baked pumpkin bread and multi-colored post-its were provided, we were content.

Most of us with years of teaching experience develop a sense of what makes a good paper. But having a rubric allows me to quantify that sense, makes grading faster, and most importantly, tells students what I expect of them. For every paper that I assign in every class, I post a rubric before the paper is due, and return a completed rubric with their grade. Creating and implementing the CCS rubric has made me a more effective instructor and I know that my colleagues from the committee feel the same way.

BY NINA GOLDEN


Getting Together

At first bringing together instructors from five departments seemed like a crazy idea. How would five instructors who teach five different courses ever come to a consensus on how to assess the five CCS SLOs? The purpose of our first meeting was simply to get to know each other and to discuss the goals and expectations of the committee. Everyone briefly shared the structure of the CCS class they taught, while committee coordinators discussed potential assessment strategies to initiate the thought process. The meeting concluded with a discussion of our goals and expectations: to design a way to assess the CCS SLOs across the five courses. As we left, members promised to look at their own course to determine if and how it met the SLOs, and to return with samples of assignments and thoughts about the assessment strategy and process.

Deciding on a Tool

Our second committee meeting lasted a bit longer. Each member provided information about his or her course, the assignments used, and how they addressed the CCS SLOs. Although the courses included ranged from Cities of the Third World to Women in Sports, as Mintesnot remembers, "The pivotal moment in the process was when we discovered that the varying sample assignments had one thing in common: all were reflection essay assignments." The assignments asked students to reflect on certain cultural issues, and how the course content changed their perspective. This made the entire process focus on developing an assessment tool for reflection assignments. To do that, we shared our individual experiences on how we assess assignments.

Finally, the committee discussed tools to assess reflection papers. One member used a scale based on timeliness, quality of writing, and whether the student followed directions. Rubric samples were shared, and their types and merits were discussed. The meeting ended with everyone agreeing to try writing benchmarks for the reflection assignment that met the SLOs.

Getting into Rubrics

Most of the benchmarks brought to the third meeting focused on the content and form of the papers. All brought checklists or holistic rubrics. After sharing some examples of analytical rubrics, a fruitful discussion around grading scales occurred. We began to draft a rubric and then revised it via email. Before the calibration meeting to determine its effectiveness, each member submitted a reflection paper from a previous semester. At the calibration meeting, we practiced using the rubric until everyone was able to score the same paper with a similar rating. By the end of this long meeting everyone felt they had a good understanding of how to use the CCS rubric.

A full day with lots of highlighters and food was set aside to use the finalized rubric. Before the meeting the members were asked to provide 15 student papers from previous semesters, representing a good mix of high to low quality. Eight papers were randomly selected from the 15 papers brought by each faculty member, for a total of 40 samples. They were duplicated and numbered. Then, each paper was rated by two committee members, with each member scoring 16 papers. The coordinator for Assessment shared the results at a final meeting.
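To make the scoring numbers concrete, here is a minimal sketch of how 40 numbered papers can be distributed so that every paper is read by two independent raters and each of the five members ends up with exactly 16 papers (40 papers x 2 readings = 80 readings; 80 / 5 members = 16 each). This is an illustration only, not the committee's actual procedure; the rater labels and the round-robin pairing are our own assumptions.

    import random

    # Hypothetical illustration: 8 papers from each of 5 courses = 40 samples,
    # each scored by 2 raters, so 80 ratings / 5 members = 16 papers per member.
    members = ["rater_1", "rater_2", "rater_3", "rater_4", "rater_5"]
    papers = list(range(1, 41))    # papers are duplicated and numbered, not named
    random.shuffle(papers)         # decouple pairings from the order papers were collected

    workload = {m: [] for m in members}
    for i, paper in enumerate(papers):
        first = members[i % len(members)]           # round-robin keeps the load balanced
        second = members[(i + 1) % len(members)]    # a second, different rater for each paper
        workload[first].append(paper)
        workload[second].append(paper)

    for member, batch in workload.items():
        print(member, len(batch))  # each member is assigned 16 papers to score

Any scheme works as long as every paper is read twice and the workload stays even; the shuffle simply keeps the pairings from mirroring the order in which papers were submitted.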

BEST PRACTICES > GETTING AND KEEPING FACULTY INVOLVED

We found that our success hinged on a few easy strategies. First, we remained focused. We started with one area, and treated it like a pilot program that would provide insights on ways to scale up. Second, we prioritized variety among our members, including length of time teaching a GE course, the college/department the instructor was from, the course they taught, their rank, etc. Including part-time faculty was not only beneficial to our committee, but also to the instructors. This best practice ensured more buy-in across campus when we shared our rubric. Third, make the meetings short and fun! Except for the calibration and scoring meetings, none went longer than 90 minutes. In addition, at least one home-baked yummy was served, and we gave out other treats like colored pens. Although participants received $100 per meeting during the first year, they continued to attend after the stipends ended. Paying for members to present and attend conferences was a wonderful perk. Because we tied the rubric to the SLOs, two different courses went through our campus recertification process with ease. Finally, letters of appreciation from the director of GE were placed in participants' professional files to use for promotion and tenure. Ashley said it best, "I looked forward to coming to these meetings. My colleagues couldn't believe it!"



Sharing the Rubric

At the beginning of the 2013-14 school year the committee reconvened (without the $100 meeting stipend) and decided they wanted to share the rubric as a grading/assessment tool with CCS faculty across the campus. If the rubric could be widely adopted, we could likely collect large samples of data from various departments. The director of GE emailed all chairs and asked for 5-10 minutes at upcoming meetings. Twenty-one of the 28 departments welcomed a discussion, provided by one or two committee members, about the new rubric and how to use it. Additional presentations were also made at meetings of associate deans, college/department assessment liaisons, and various curriculum committees. In addition, the committee presented their process, rubric, and results at two assessment retreats.

Surveying the Students

With the CCS rubric being used in many courses, the committee decided in fall 2013 to add an indirect assessment component. They designed a student survey to be distributed at the beginning and end of courses. The purpose was to determine why students take their course and then to find out their opinion of the course after completion.

ISSUES TO CONSIDER > HOW TO MAKE A RUBRIC

How was the rubric designed?
First we identified key SLO terms that should be included in the assessment. These included self-awareness, knowledge, and empathy. We also discussed the criteria for grading. Drafts were shared electronically until everyone felt we had a rubric they could use effectively. (A sketch of one way to represent such a rubric follows this sidebar.)

What did the calibration meeting look like?
Each member was given a copy of the CCS rubric, five different colored highlighters, and randomly selected student reflection papers. To begin, they were told to use the yellow highlighter where the author addressed the component of reflection. Then we discussed our reasoning. After agreement was reached around reflection, the same process was done using a blue highlighter for the area of empathy, green for observations, pink for application of knowledge, and finally, purple for writing and organization. In case of discrepancies, we discussed our reasons for scoring until a consensus was reached. If the rubric's wording was found too ambiguous, it was changed to everyone's liking.

How was the rubric validated?
To ensure the rubric could be used with a number of assignments, we spent one day grading multiple papers. Before the meeting, each of the five members was asked to provide 15 student reflection papers, submitted in previous semesters, of varying quality. After a random selection, we had 40 papers and each member received 16 to score. Every paper was scored independently by two readers; after all papers were scored, the results were tabulated and discussed.

How was the rubric disseminated?
A request was sent to all departments offering CCS courses asking for 5-10 minutes at upcoming department meetings. Twenty-one of 28 departments welcomed a discussion about the new rubric and how to use it. One or two committee members presented at each meeting. Additional presentations were made to associate deans, college and department assessment liaisons, and various curriculum committees. In addition, the committee made presentations at two campus assessment retreats and one regional conference.
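As a companion to the sidebar above, here is a minimal sketch of how the rubric's criteria and score levels might be represented and totaled. The criterion names mirror the highlighter categories described in the sidebar; the 1-4 scale, the wording of the descriptors, and the simple summed total are our own assumptions, not the committee's actual rubric.

    # Hypothetical sketch of a CCS-style analytic rubric. Criteria follow the
    # sidebar's highlighter categories; the 1-4 scale is an assumed example.
    RUBRIC = {
        "reflection": "reflects on how course content changed the author's perspective",
        "empathy": "demonstrates empathy toward other cultural perspectives",
        "observations": "supports claims with concrete observations",
        "application of knowledge": "applies course concepts to the issue discussed",
        "writing and organization": "clearly written and logically organized",
    }
    SCALE = {1: "beginning", 2: "developing", 3: "proficient", 4: "accomplished"}

    def score_paper(ratings):
        """Total one completed rubric, checking every criterion was rated on the scale."""
        missing = set(RUBRIC) - set(ratings)
        if missing:
            raise ValueError(f"unrated criteria: {missing}")
        if any(r not in SCALE for r in ratings.values()):
            raise ValueError("ratings must use the 1-4 scale")
        return sum(ratings.values())

    # Example: one rater's completed rubric for a single numbered paper.
    print(score_paper({
        "reflection": 4, "empathy": 3, "observations": 3,
        "application of knowledge": 2, "writing and organization": 4,
    }))  # -> 16 out of a possible 20

Keeping each criterion scored separately, rather than assigning one holistic mark, is what lets two readers compare where their ratings diverge during calibration.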
Final Opportunities and Lessons Learned

Four of the five committee members attended the Association of American Colleges and Universities (AAC&U) Conference on General Education and Assessment in Portland, Ore. In addition, they presented at the Western Association of Schools and Colleges Academic Resource Conference, in Los Angeles, on this process and what was learned.

At the last meeting, everyone shared what they learned from the process. You have already heard from Nina. GiGi said this: "From discussions with my committee and participation in two assessment conferences attended largely by full-time faculty, I learned about assessment issues at national and state levels. But why was this information mostly new to me? Because part-time faculty are by-and-large absent from policy-related, decision-making processes, their involvement relegated to times of enforcement, which can create a disconnect or, worse, breed resistance."

Finally, Ashley: "I am more efficient, structured, and thorough in all my course requirements and am able to make sure that they line up more congruently with the SLOs." The students have also benefited from having a structured rubric in place from the start, so that they know what they will be evaluated on. None of these members had used rubrics before; now they all use them!

REFERENCES & RESOURCES

Allen, M. J. (2004). Assessing Academic Programs in Higher Education. Anker Publishing Company, Boston. (Practical and easy-to-use information on using rubrics.)

Banta, T. W., Griffin, M., Flateby, T., & Kahn, S. (2009). Three Promising Alternatives for Assessing College Students' Knowledge and Skills. National Institute for Learning Outcomes Assessment. Occasional Paper #2. (Examines the reality of involving faculty and students in higher education.) http://learningoutcomesassessment.org/OccasionalPapers.htm

CSUN documents: please contact Beth Lasky at beth.lasky@csun.edu

Furman, T. (2013). Assessment of General Education. The Journal of General Education, 62(2-3), 129-136.

Hersh, R., & Keeling, R. (2013). Changing Institutional Culture to Promote Assessment of Higher Learning. National Institute for Learning Outcomes Assessment. Occasional Paper #17. (Discussion on how to build a culture of assessment.) http://learningoutcomesassessment.org/OccasionalPapers.htm

Hutchings, P. (2010). Opening Doors to Faculty Involvement in Assessment. National Institute for Learning Outcomes Assessment. Occasional Paper #4. (Using rubrics and online assessment to assess student knowledge and skills.) http://learningoutcomesassessment.org/OccasionalPapers.htm

McLawhon, R., & Phillips, L. H. (2013). General Education Assessment Plan: A Four-Tiered Approach. The Journal of General Education, 62(2-3), 204-221.

Wood, S. (2013). All about Rubrics. Presentation at the retreat on assessment of learning, February 14, 2013. Western Association of Schools and Colleges, San Jose, CA.
