For several years, The University of Alabama had been considering a transition from paper course evaluations disseminated in the classroom to an on-line (anytime, anywhere) course evaluation system. A task force was formed to review products available from third-party vendors. Between fall 2008 and fall 2009, pilot tests were conducted with eXplorance Inc.’s Blue/Evaluation. In spring 2010, all courses previously evaluated with the paper forms will be converted to fully on-line course evaluations.
UA wanted to apply the best technological solutions available to encourage students to share their opinions of instruction. The paper forms created tremendous volumes of paper to be handled by the academic departments, all of which had to be manually scanned (over 40,000 forms each semester in Arts & Sciences alone). Stray marks or punctured paper could render a submission invalid because the scanner could not read it. Students also had to complete the evaluation within a limited amount of time at the end of a class session, which could lead to less thoughtful responses. Additionally, because of the manual nature of the paper forms, reports to faculty often took months to deliver, so faculty could not act on student feedback in a timely manner. The new on-line system eliminates paper and invalid submissions, allows students to complete the evaluation anytime, anywhere, enabling more thoughtful responses, and delivers reports to faculty within days instead of months.
The fall 2008 pilot project invited over 5,500 students to complete on-line evaluations in all fully on-line courses taught in Continuing Studies and in 20 sections of Management 395 in the College of Commerce and Business Administration. In that pilot, participation rates of Continuing Studies students more than doubled compared with past response rates using a different on-line mechanism. For the CBA students, the response rate matched that of paper forms distributed in the classroom: about 66 percent completed the evaluations. Reports were available to faculty and academic administrators within two weeks after the beginning of the following semester.
The spring 2009 extended pilot involved over 16,000 students and over 550 instructors in 1,200 sections. The overall response rate was 54%, a positive result in light of the rapid implementation and limited opportunity for communication. The extended pilot focused strictly on implementing the new on-line system, identifying any problems through the pilot groups, and correcting them before the larger fall 2009 implementation. Other aspects of student opinions of teaching continue to be reviewed and discussed as well, but this project focused on improving the system of data collection.
For the spring pilot, participating students received an invitation via their Bama email to complete the evaluations, and reminder emails were sent to those who had not responded. Students could also access the on-line course evaluation system through a link in myBama (after login, they clicked the Academics tab >> Banner Self Service >> Student Opinions of Teaching/Course Evaluation). They were prompted to log in to the system using their myBama username and password.
Faculty and administrators received access to the final report(s) on June 19, 2009, later than anticipated because of the pilot. They were notified via their Bama email accounts and also had the option to log in through myBama.
Follow-up surveys were conducted with both student and faculty participants to obtain feedback on the new system.
During the fall 2009 survey period, several improvements were implemented based on lessons learned during the two pilot tests. Single sign-on through myBama streamlined the login and communication process, especially for students. A theme, “Your Opinion Matters!”, was created to generate more awareness and was used on yard signs in high-traffic areas, on flyers, and in myBama. Nine out of ten colleges fully participated in the on-line course evaluation process, eliminating the paper process altogether. The overall response rate was 52%.
Typically, when course evaluations transition from paper to on-line form, participation rates can drop because students no longer receive forms directly in the classroom but are instead contacted electronically. Responses, however, tend to be more thoughtful, as respondents can take more time completing them on-line. Email reminders are critical to the response rate and were used effectively in both the fall 2008 and spring 2009 pilots at UA. In fall 2009, a more comprehensive communication plan was employed to raise awareness and response rates, and the communication campaign will continue to be enhanced.
As with the paper-form system, confidentiality of responses is critical and is a key feature of the system. The data are compiled so that no single response is tied to a particular student, and the results are provided in summary form to the faculty member and administrators.
A common set of University-approved evaluative questions is provided for all courses. This consistency in the evaluation instrument allows for cross-comparisons within a college or campus-wide as part of the reports generated. In addition to the campus-wide common data collected, individual schools and colleges can add specific questions or information helpful to their ongoing efforts to improve teaching and learning.
The Office of Institutional Research and Assessment (OIRA) has been designated as the institutional administrator of the system, beginning with the spring 2009 extended pilot. Dr. Ivon Foster, assistant to the provost, has coordinated the project for the Office for Academic Affairs. OIRA contacts include Executive Director Lorne Kuffel and Jon Acker, Coordinator for Student Assessment. For more information, email SOI@ua.edu.