Improving the Quality of Student Learning
Rio Salado College
Based on available data about learning outcomes from the course pilot, what were the impacts of re-design on learning and student development?
For students who completed the course and received a final grade (Rio Salado offers multiple start dates, so students finish at varying times), the distribution of grades mirrored the traditional "bell curve" in both the spring 2000 pilot and the fall 2000 Phase II implementation. There was no substantial impact on learning, positive or negative. It is important to note that because the focus of the re-design project is structural (to increase substantially the number of students in an online course) rather than instructional, the absence of an impact is a good sign: student outcomes were not adversely affected.
With regard to impact on certain types of students, the spring pilot data pointed to trends in two areas: students who communicated more frequently, or who spent more time on the computer, tended to receive lower grades. At first this appears to contradict what we anticipated. However, given the anxiety students often express about math, and given that the re-design project includes a developmental course, students who are more "at-risk" to begin with may need more time on task as well as more guidance and reassurance, and are unlikely to earn the highest scores.
Retention data in both semesters for the four redesigned courses were consistent with district-wide retention. Retention (both re-design and district-wide) for Mathematical Concepts/Applications was significantly higher than for the other three courses. However, this course differs from the other three in that it serves as a prerequisite for nursing students and does not apply toward general degree requirements.
Post-drop data (based on students who did not drop before or during the first week of class) showed an increased retention rate of 68% for the spring 2000 semester, compared to 59% for spring 1999. For fall 2000, the post-drop semester retention rate was 60%, a marginal increase over the 59% post-drop retention for fall 1999.
The outcome most enhanced by the pilot re-design was the level of communication. Tracking data suggest that communication improved by at least 50%. Nevertheless, student comments indicated dissatisfaction with faculty-to-student communication: students reported delayed responses, or no response at all, from the instructor, even though tracking reports revealed "same day" and "next day" communication. This disparity between student perception and reality has prompted the re-design team to consider ways to communicate the faculty role to students better and earlier.
Student satisfaction in both the spring and fall implementations averaged 8.4 on a scale of 1 to 10 with 10 being the highest.
In the spring pilot, students took an average of 14 weeks to complete their courses. In terms of hours, they spent an average of 53 hours online (slightly longer than the time spent in a Carnegie-based classroom setting), or an average of 3.5 hours per week.
In the fall implementation, students took an average of 12.4 weeks to complete their course and spent an average of 42 hours online (less than during the pilot, and equivalent to a Carnegie-based classroom setting), again averaging 3.5 hours per week, identical to the spring pilot.