
Quick Guide to Interpreting your OMET Teaching Survey Report for Instructors and Administrators – University Center for Teaching and Learning, 12/09/23 12:05 PM

Student Opinion of Teaching Surveys



Quick Guide to Interpreting your OMET Teaching Survey Report for Instructors and Administrators

Report Content
A summary section of Likert-scaled questions, including response count, mean, and standard deviation.
Additional details include enrollment total, response ratio, median, and mode.
Student responses to open-ended questions.
Responses to additional questions added by the instructor.
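The summary statistics listed above can be reproduced with Python's standard statistics module. The responses, enrollment figure, and choice of the sample standard deviation below are illustrative assumptions, not values from an actual report:

```python
from statistics import mean, median, mode, stdev

# Hypothetical Likert responses for one item
# (1 = Strongly disagree ... 5 = Strongly agree).
responses = [5, 4, 4, 5, 3, 4, 2, 5, 4, 4]
enrollment = 15  # assumed total enrollment for the course

print("Response count:", len(responses))               # 10
print("Response ratio:", len(responses) / enrollment)  # ≈ 0.67
print("Mean:", mean(responses))                        # 4.0
print("Std dev:", round(stdev(responses), 2))          # sample std dev
print("Median:", median(responses))                    # 4.0
print("Mode:", mode(responses))                        # 4
```

Whether a report uses the sample or the population standard deviation is a detail worth checking against the report's own documentation.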

Statistical Analysis
When using numerical values assigned to Likert categories such as Strongly disagree (1), Disagree (2), Neutral
(3), Agree (4), and Strongly agree (5), be aware that the numbers convey “greater than” or “less than”
relationships, but the differences between values are not necessarily constant. The difference in value between
Strongly agree and Agree, or between Agree and Neutral, for example, is not clear, nor is there a shared
understanding of these values among raters.
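A quick sketch of why this matters: two codings of the same ordered labels, both preserving the “greater than / less than” relationships, can produce different means. The labels, codings, and responses below are invented for illustration:

```python
from statistics import mean

# The same ordered responses under two order-preserving numeric codings.
# Both respect the ranking of categories, but they imply different
# spacings between categories, and so give different means.
labels = ["Strongly agree", "Agree", "Agree", "Neutral", "Strongly agree"]

standard = {"Neutral": 3, "Agree": 4, "Strongly agree": 5}
stretched = {"Neutral": 3, "Agree": 4, "Strongly agree": 7}  # wider top gap

print(mean(standard[l] for l in labels))   # 4.2
print(mean(stretched[l] for l in labels))  # 5.0
```

The ranking of students' answers never changed, yet the mean did, which is why means of Likert data should be read cautiously.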

For schools using an “overall effectiveness” question:

It may also be helpful to compare the overall average of the first set of items to the score on the
instructor’s overall teaching effectiveness question. The first set of questions asks about specific
behaviors, whereas the overall question is general and allows students to take into account other
factors they consider relevant to teaching effectiveness.

Understanding Student Comments


Student comments can provide insight into what worked and what didn’t work well in the class. The challenge
in deciphering qualitative data is that the comments are presented randomly, with no order or structure (Lewis, 2002).
Comments often appear unconnected and frequently do not line up with the numerical data included in the report.
Here are some ways to make sense of the data and extract meaningful information:

Classify comments – use a matrix and assign each comment to a category. Classify comments into
strengths and challenges, for instance.
Or use our coding template to organize student comments into “keep,” “stop,” and “suggestions” categories.
Students may have a hard time verbalizing what they find difficult and may provide feedback that’s vague or
confusing. See our support article on strategies for decoding and responding to common student feedback.
Look for patterns – once you’ve classified the comments, examine whether patterns exist. Have all or
most of the comments fallen into one or two categories?
Don’t place emphasis on the outliers – unfortunately, students can sometimes be harsh critics. Reading
negative or cruel comments is difficult, but don’t dwell on one or two comments that are disrespectful or
hurtful.
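The classify-then-look-for-patterns steps above can be sketched as a toy script. The comments and the keyword rules are invented; the category names follow the “keep / stop / suggestions” template, but real coding is done by reading each comment, not by keyword matching:

```python
# Illustrative "keep / stop / suggestions" coding matrix for student
# comments. Comments and keyword rules are made up for demonstration.
comments = [
    "Keep the weekly practice quizzes, they really helped.",
    "Stop assigning readings the night before class.",
    "Maybe add more worked examples before each problem set.",
]

categories = {"keep": [], "stop": [], "suggestions": []}
for comment in comments:
    text = comment.lower()
    if "keep" in text:
        categories["keep"].append(comment)
    elif "stop" in text:
        categories["stop"].append(comment)
    else:
        categories["suggestions"].append(comment)

# Counting comments per category makes patterns easy to spot.
for name, items in categories.items():
    print(f"{name}: {len(items)} comment(s)")
```

Once comments are tallied this way, a category holding most of the comments is the pattern worth acting on.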

For administrators: Consider trends in results over time.


OMETs should be one of several methods of assessing teaching. Consider additional
measures such as peer observations, review of teaching materials, and faculty self-assessments. For more
information, visit the Teaching Center’s Assessment of Teaching site.
Examine trends in OMET results across time. Avoid relying on a single course or mean semester rating,
examining small variations in ratings too closely, or focusing on anomalies.
Small variations in ratings (<0.4 points) are common and should not be overinterpreted. A variety of
factors outside of a faculty member’s teaching, including chance, could lead to a slight dip in ratings.
In addition to mean ratings, consider distribution of ratings across the scale. Student ratings rarely have
a normal distribution, which makes the mean a less than ideal measure of central tendency without
examining the entire distribution of ratings.
Do not compare ratings of one faculty member to ratings of other faculty or to a unit average rating.

Comparing ratings between faculty can lead to overinterpretation of small variations in ratings and can place
minoritized instructors (who receive more ratings influenced by students’ biases) at a disadvantage.
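The caution above about reading the full distribution rather than the mean alone can be illustrated with a small sketch; the response data are made up:

```python
from collections import Counter
from statistics import mean

# Two hypothetical items with identical means but very different shapes.
polarized = [1, 1, 1, 1, 1, 5, 5, 5, 5, 5]  # students split at the extremes
uniform_neutral = [3] * 10                  # everyone answered Neutral

print(mean(polarized), mean(uniform_neutral))  # 3.0 3.0 — identical means
print(Counter(polarized))        # Counter({1: 5, 5: 5})
print(Counter(uniform_neutral))  # Counter({3: 10})
```

Both items average 3.0, but the first reflects a sharply divided class, information that only the distribution reveals.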

For faculty: Develop a plan using your student opinion of teaching results.
Meet with a Teaching Consultant who can help you interpret your results and develop a course of action
if necessary. Email teaching@pitt.edu to set up a consultation.
Plan on collecting student feedback during the semester the next time you teach. OMET offers a
midterm course survey option, and there are additional ways to collect student feedback throughout the
term. For more information, go to the OMET Midterm Course Survey section of the Teaching Center
website.
Going forward, discuss, teach, and model giving meaningful feedback with your students. Give them
multiple opportunities to practice giving feedback. We have several resources that can help guide the
discussion, as well as options for gathering student feedback throughout the term.

https://teaching.pitt.edu/resources/quick-guide-interpreting-omet-teaching-survey-reports/
