External Pressures for Assessment

• Perceived federal interest: the Spellings Report (2006)
• Increased attention to higher education in Ohio: the University System of Ohio (Voluntary System of Accountability, emphasizing the Collegiate Learning Assessment test; state subsidy tied to metrics such as student retention)
• North Central Association's Higher Learning Commission: emphasizing student learning outcomes

"Dual Pilot" study at the University of Cincinnati:
To consider the applicability of learning portfolios (and especially e-portfolios) alongside the CLA among a group of first-year Honors students: http://tinyurl.com/uceportfolioproject

The Voluntary System of Accountability (VSA)
• The result of a 2007 partnership between the American Association of State Colleges and Universities (AASCU) and the National Association of State Universities and Land-Grant Colleges (NASULGC)
• Includes:
  – Student and Family Information
  – Student Experiences and Perceptions
  – Student Learning Outcomes, measured by one of:
    · Measure of Academic Proficiency and Progress (MAPP)
    · Collegiate Assessment of Academic Proficiency (CAAP)
    · Collegiate Learning Assessment (CLA)
    · Institution-specific measures
• http://www.voluntarysystem.org

The Collegiate Learning Assessment
• Developed by the Council for Aid to Education (CAE) in 2004
  – www.cae.org/cla
• A standardized and nationally normed test that holistically measures institutional contributions (value added) to specific learning gains made by students
• A direct assessment of student learning (claims "face validity") in:
  – critical thinking
  – analytical reasoning / problem solving
  – written communication
• The most frequently used of the VSA tests

AAC&U VALUE Metarubrics
• An AAC&U collaboration with faculty and student affairs professionals
• Developed rubrics for 15 essential learning outcomes
• Institutions are encouraged to revise the rubrics to accommodate individual needs
• Development continues to evolve among institutions
• UC employed the Critical Thinking and Written Communication metarubrics

Methodology
• Study cohort of 111 first-year Honors students
  – Average ACT score of 31
  – Representing a range of UC colleges and programs
  – Briefed on the experiment
  – Cohort followed over a four-year time frame
• Administered the CLA as a course requirement
  – Half completed the "make an argument" and "critique an argument" tasks; the other half completed the problem-solving "performance task"
  – A student post-test survey was also administered at the testing site
• During the same month, students completed assignments designed to measure critical thinking and written communication
  – Graded by faculty using a variation of the AAC&U metarubrics


Primary Research Questions
How do the CLA results compare to results produced via VALUE-rubric scoring?
– As measures of individual student learning
– As measures of institutional-level value added
– As a tool for guiding program improvement
– As a response to accountability demands

Some Findings from the Student Survey
– Usefulness of the test: waste of time or only mildly useful = 50%; useful = 40%; very useful = 10%
– Description of the test experience: unpleasant = 37%; indifferent = 48%; pleasant = 15%
– Effort exerted: little or none = 8%; some = 40%; a great deal = 52%
– Incentives to take the test seriously: would do it for free = 8%; self-improvement = 39%; financial incentive required = 43%
– Experiences that prepared students to do well on the test: prep courses for the ACT and SAT

Side-By-Side Comparison
                         CLA                          VALUE Metarubrics
Grading                  Human and computer           Human
Inter-rater reliability  Systematically achieved      Achieved high IRR through norming
Test method              Timed, open-ended response   Open-ended, over time
Test incentive           Voluntary                    Homework
Relevance                "Real world" application     Academically connected

Side-By-Side Comparison
                 CLA         VALUE Metarubrics
Scoring          Holistic    Trait-by-trait
Test time        90 minutes  10 weeks
Turnaround time  8 months    As needed
Motivation       Add-on      Embedded in course context

Institutional CLA Data
                       Mean CLA Score  Unadjusted % Rank  Deviation Score  Adjusted % Rank
Total CLA Score        1352            98                 1.4              95
Performance Task       1338            98                 1.7              96
Analytic Writing Task  1366            98                 0.8              79
Make-an-Argument       1348            98                 0.3              63
Critique-an-Argument   1383            98                 1.4              92
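
The deviation scores and adjusted percentile ranks above reflect value-added modeling: comparing actual performance with what entering academic ability would predict. A minimal sketch of that general idea, assuming hypothetical institution-level data and a simple linear fit (the CLA's actual adjustment model is more elaborate than this):

```python
# A hedged sketch of the value-added idea behind "deviation score" and
# "adjusted % rank": regress mean CLA scores on mean entering ability
# (e.g., ACT), then express each institution's residual in standard
# deviation units. All data below is hypothetical; the CLA's actual
# model is more elaborate than this simple linear fit.
import numpy as np

act = np.array([21, 23, 25, 27, 29, 31])              # mean entering ACT
cla = np.array([1050, 1110, 1170, 1240, 1290, 1352])  # mean CLA score

slope, intercept = np.polyfit(act, cla, 1)     # degree-1 least-squares fit
expected = intercept + slope * act             # score predicted from ACT
residuals = cla - expected                     # actual minus expected
deviation = residuals / residuals.std(ddof=1)  # residuals in SD units

print(f"deviation score for the last institution: {deviation[-1]:+.1f}")
```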

Metarubric Assignments
                       N    Avg    Std Dev  Avg Team Rater Correlation  Overall Correlation
Written Communication  109  22.70  3.47     0.717                       0.696
Critical Thinking      55   18.98  3.91     0.772                       0.824
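
The rater-correlation columns above report inter-rater reliability. A minimal sketch of how such a figure can be computed as a Pearson correlation between two raters' scores, using hypothetical rubric data (the actual UC scoring data is not reproduced here):

```python
# A minimal sketch of computing inter-rater reliability as a Pearson
# correlation between two raters' rubric scores. The score lists below
# are hypothetical; the actual UC rater data is not reproduced here.
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical rubric scores from two raters on the same ten artifacts
rater_a = [22, 25, 19, 27, 21, 24, 18, 26, 23, 20]
rater_b = [21, 26, 18, 25, 22, 23, 19, 27, 22, 21]

print(f"inter-rater correlation: {pearson(rater_a, rater_b):.3f}")
```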

Interpretation and Explanation
• The lack of a statistically significant correlation between the CLA task scores and the e-portfolio assignment scores indicates a difference in how student work is evaluated.
• Although both processes measure the application of intellectual and practical skills, each evaluates student outcomes at a different level of context.
• The two instruments provide very different levels of aggregation: the VALUE rubrics yield trait-by-trait analysis of individual students, while the CLA yields a single holistic score of student performance.
• Differences in the administration of the two instruments must also be considered (context, incentives, timing, the ability to draft).
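
For reference, a minimal sketch of how such a correlation could be checked for statistical significance, using hypothetical paired student scores and SciPy's pearsonr, which returns both the coefficient and a two-sided p-value:

```python
# A hedged sketch of testing whether a CLA-vs-rubric correlation is
# statistically significant. The paired scores are hypothetical; the
# actual matched student-level data is not reproduced here.
from scipy.stats import pearsonr

# Hypothetical pairs: each student's CLA task score and the rubric
# score on the matching e-portfolio assignment
cla_scores = [1210, 1340, 1190, 1420, 1280, 1355, 1230, 1390, 1310, 1265]
rubric_scores = [21, 24, 19, 22, 25, 20, 23, 18, 26, 22]

r, p = pearsonr(cla_scores, rubric_scores)
print(f"r = {r:.3f}, p = {p:.3f}")
if p >= 0.05:
    print("No statistically significant correlation at the 0.05 level")
```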

Conclusions
• The CLA is useful for INSTITUTIONAL assessment
  – National benchmarking and comparisons based on the CLA are likely to be more valid than the more nuanced results of VALUE
  – The CLA does not appear to provide data that is particularly useful for STUDENT diagnostic feedback
  – Program assessment will require significantly greater sampling than the VSA requires

Conclusions
• The VALUE rubrics are useful for providing a consistent framework for the evaluation of student artifacts; they can help transform "grading" into the assessment of specific student learning outcomes
  – Well received by faculty; appropriate for accreditation and program-improvement assessment
  – Useful in providing some comparability across faculty, but of limited value for institutional-level reporting because the rubrics are designed to be modified to meet specific needs
  – Provide more timely and detailed feedback

Bottom Line
• Institutions should employ a complex and varied assessment strategy in order to:
  – Understand achievement in relation to student outcomes
  – Demonstrate the value of the educational experience
  – Assure that students are well prepared for lifelong learning and intellectual inquiry
  – Provide a framework for program feedback and improvement

Recent Analyses
• Trudy Banta, "The Search for a Perfect Test Continues," Assessment Update (November-December 2007)
• Joan Hawthorne, "Accountability & Comparability: What's Wrong with the VSA Approach?" Liberal Education (Spring 2008)
• Paul Basken, "Electronic Portfolios May Answer Calls for More Accountability," The Chronicle of Higher Education (April 17, 2008)
• Trudy W. Banta, "Assessment for Accountability: A Cautionary Tale," presented at The Ohio State University, June 19, 2008
• Klein, Liu, and Sconing, Test Validity Study (TVS) Report (2009), http://www.voluntarysystem.org/docs/reports/TVSReport_Final.pdf
• Schulenberger, D., and Keller, C., Interpretation of Findings from the TVS for the VSA (2009), http://www.voluntarysystem.org/docs/reports/VSAabstract_TVS.pdf