Recently we were approached by a department looking to gather feedback from students for a program review. As the feedback that they were looking to gather was in the form of multiple-choice responses, we decided to use student response systems in the classroom to gather the survey responses.

The TurningPoint Clicker system is a set of basic handheld wireless voting devices. Each kit contains 40 handheld devices and one receiver for gathering the responses. The receiver plugs into a laptop, which runs the software and can be used to display the questions. The handheld devices can be distributed to students, allowing them to provide anonymous responses to questions. The results of polling are commonly then displayed back to students so that they can see how their vote compares to the rest of the class. This can be a useful way to assess students’ understanding of a topic or issue, provide an opportunity for intervention, or initiate further discussion.


As we were gathering feedback from the students for a survey, we did not display the results of each question during the survey. Questions and choices were set up in a PowerPoint slide deck and students were asked to respond by pressing the corresponding letter on their clicker while the question was displayed on screen.

We ran the multiple-choice feedback surveys with a population of nearly 150 students spread across 7 different classrooms. It took 5-10 minutes at the start of each class to get the system set up, the clickers distributed, and the students oriented. We ran through the surveys, one containing 14 questions and the other 23, in roughly 10 minutes.

The process went quite well: students had minimal problems with the clickers, and data was captured immediately. The system displays in real time how many students have responded, and since we knew how many were in the room, it was possible to get a sense of how many had responded and whether further clarification of a question was required. We gave the students a few minutes for each question, during which they had an opportunity to ask for clarification. As the questions were not mandatory, we moved along even if not all students had responded, letting them know they had a last chance to respond before we moved to the next question.

Following each survey, I saved the TurningPoint session file (.tpzx) to preserve the data, then reset the survey to be run in the next venue. The 7 session files were then combined into a spreadsheet for analysis. Within a few hours we were able to provide the results of the survey back to the department, along with some visualizations of the data.
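The merge step itself is simple once each session has been exported to a spreadsheet-friendly format. As a rough sketch only (the column names `question` and `response` are assumptions for illustration; the actual columns in a TurningPoint export will differ), combining per-venue CSV exports into one file while tagging each row with its source session might look like:

```python
import csv
import os


def combine_session_csvs(csv_paths, out_path):
    """Merge per-venue response exports into a single CSV for analysis.

    Each output row gains a 'session' column (derived from the source
    filename) so responses can still be broken down by venue.
    """
    with open(out_path, "w", newline="") as out:
        writer = None
        for path in csv_paths:
            # Use the filename (without extension) as the session label.
            session = os.path.splitext(os.path.basename(path))[0]
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    if writer is None:
                        # Build the header from the first row we see,
                        # prefixed with the session column.
                        writer = csv.DictWriter(
                            out, fieldnames=["session"] + list(row.keys())
                        )
                        writer.writeheader()
                    writer.writerow({"session": session, **row})
```

With the 7 exports in one folder, a call like `combine_session_csvs(sorted(glob.glob("sessions/*.csv")), "combined.csv")` would produce the single spreadsheet used for analysis.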

All in all, a successful use of classroom response systems to gather anonymous student feedback during an in-class survey. I would suggest this method for anyone seeking feedback via multiple-choice surveys conducted in class.