
Evaluations. The good, the bad and the ugly.


Earlier this week, I received our midterm evaluation survey results. While discussing them with a co-worker (we'll call her Lisa from now on), it came up that an instructor she works with was having a difficult time handling the news that the instructor's class had received bad reviews. My initial response was, "What did the students report as not going well?" Lisa said that students reported the discussions were not helping them learn. In online courses, this is a typical response from students, so I tried to dig deeper. I asked Lisa what the other students reported about the discussion prompts, and she replied that the other students enjoyed the prompts, found all aspects of the course helpful, enjoyed the weekly discussions, and liked the variety in the class discussions. After some more back and forth, I discovered that the evaluations actually reported positive experiences and feelings from the students - minus one comment regarding the discussion prompt. Therefore, the focus of my blog post this week is on evaluations - and how I (as an instructional designer and online instructor) use the data.



Review Objectives Prior to Reviewing Evaluation Data - When reviewing the data for your class, make sure to familiarize yourself with the learning objectives you gave the students. As much as possible, look at the course from a student perspective so you can understand where the students are coming from when they review it. If your learning objectives do not align with the activities you are asking the students to complete, then the issues may run deeper than what a course evaluation will show you. I'd also make sure you have a course map readily available when reviewing the data (see The Importance of Course Mapping blog post - linked below).

Outliers - It never fails: you always seem to have one or two comments at the extremes. Therefore, when I review evaluation data, one of the first things I do is mark the top two (good) and the bottom two (bad) reviews. Once these are identified, I look for a trend or connection between the comments. Are both students reporting on the same issue? Are the complaints valid, or are they about items that cannot be controlled, such as the LMS? If these reviews are not discussing the same topic, consider whether the student is simply being overly positive or negative.

Themes - If possible, review evaluations over multiple runs of the course. This will reveal recurring themes or issues with a course that may run deeper than the "band-aid" fixes some instructors use to address problems. For example, if students keep stating that discussions are not aiding their learning of the course material, it may be that the discussions are not asking the right questions, the content is difficult to discuss, or students do not yet know enough to feel able to explain the material to other students. Some students will explain in detail why they feel a problem occurs, while others will just state they 'do not like it.'

Make Notes - While reviewing your evaluation data, highlight and make notes of areas of concern. For example, if a student states, "Offer online face-to-face video chats or peer-to-peer group meetings," brainstorm ideas for how this need could be met. Are you currently holding office hours? Are there any group projects in the course? Do you have an introduction discussion where students are able to talk to each other and meet their classmates? You don't necessarily need to take the evaluation at face value - there could be another solution to the problem. In this example, it could be that an online sense of community was never developed, and therefore students do not feel connected to one another.

 

If you have your evaluation data and still feel you are not getting the feedback you need, it may be time to restart.



Restart, you say? What do you mean? It could be that your survey questions are not asking or measuring the right things. For those who feel they fall into this category, I'd consider the following steps:

  1. Work with your 'teaching and learning' or 'evaluation' department or staff. These are typically individuals who have experience working with faculty or instructors to create and develop successful student-centered learning. They may have a set of evaluation questions you can use, or they may even be willing to set up surveys on your behalf.

  2. Consider what you want to know - make sure your survey questions are not simple yes-or-no questions. Present the learners with a topic, but allow them to openly discuss their concerns about the course.

  3. Use a question bank for specific question ideas. (Berkeley Question Bank, Macquarie University Question Bank)

  4. If you have a smaller course, consider reaching out to the students individually and asking what they would do to improve the course. Keep an open mind about the feedback you receive.

  5. If you do not have a department or person who can help you with evaluation, try to read up on tips and tricks from others in the field (eLearning Industry).
