Improving the Quality of Student Learning
California State Polytechnic University, Pomona (Cal Poly Pomona)
Based on available data about learning outcomes from the course pilot, what were the impacts of re-design on learning and student development?
The pilot phase of the redesigned course during the spring 2001 quarter included 132 full-time on-campus students. A version of the redesigned asynchronous course had been developed over a number of years and consisted of 18 hours of streaming video and a web site. In its first redesigned offering, the course included the original 18 streamed videos, a WebCT-based web site with a random test generator, a daily/monthly calendar, an alternate server-based web site, a class mailbox, and an interactive animated Learning CD. Three new teaching modules covering new topics in psychology were also completed and will be available in fall 2001.
The assessment data will come from pre/post general psychology questionnaires, final exams and grades, a Technology Attitudes Survey, and student focus groups. Since one of our main tasks this summer is to analyze our pilot data, our comments for this section are based on a limited examination of our data.
During the pilot, the most notable outcome was improved student retention (drop-out and withdrawal rates totaled 6 percent) and a reduction in the number of students receiving grades of D and F. We believe this was due to a number of factors, including the extra-credit assignments, the CD, the WebCT structure, and online testing.
December 2001 Update: During summer 2001, we thoroughly analyzed the data collected in the pilot course. A pre/post psychology test was administered in both a traditional course and the redesigned course. The test was aimed at measuring knowledge gained as a result of taking the course. It covered eight conceptual areas: experimental design, learning, development, gender, social psychology (only in the traditional class), abnormal psychology, clinical psychology, and memory development. Although both instructors thought the instrument had face validity, problems were discovered: (1) it included several items covering material taught by neither professor, and (2) a few items referred to a text that was not in use (each professor used a different text). An ideal measure would have been a test of material covered by both instructors. These issues will be addressed in future offerings.
An external consultant analyzing the data recommended that all items be used even though the instrument was not perfect. One-tailed t-tests were used on subtests in order to gain statistical power. If a class scored lower on the post-test than on the pre-test, statistical tests of significance were not run. In addition, since our hypothesis was that students in the redesigned course would score higher than those in the traditional course, comparisons were not calculated when the traditional class scored higher.
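The testing protocol described above can be sketched in a few lines. This is an illustration only, using made-up scores rather than the pilot data; the paired one-tailed t statistic is computed directly from the pre/post differences, and the test is skipped (returns None) whenever the class shows no gain, mirroring the decision rule above.

```python
from statistics import mean, stdev
from math import sqrt

def paired_t_one_tailed(pre, post):
    """Paired t statistic for H1: post > pre.

    Returns (t, degrees_of_freedom), or None when the post-test
    mean is not higher than the pre-test mean -- in that case,
    per the protocol, no significance test is run.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    if mean(diffs) <= 0:
        return None  # class scored lower (or no gain): skip the test
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

# Hypothetical subtest scores for eight students (NOT the pilot data)
pre  = [4, 5, 3, 6, 4, 5, 4, 3]
post = [6, 6, 5, 7, 5, 7, 5, 5]
result = paired_t_one_tailed(pre, post)
```

The resulting t value would then be compared against a one-tailed critical value for the given degrees of freedom; using one tail rather than two is what buys the extra statistical power mentioned above.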
The traditional class demonstrated increased knowledge in comparison-wise tests in experimental design, memory, development, gender, social, and abnormal. The students in the redesigned course showed improvement in all areas except social psychology—a concept not covered in their course. The traditional course reached statistical significance in experimental design and gender. The students in the redesigned course reached statistical significance in learning, memory, development, and clinical. The Learning CD created for the redesigned course contained two lessons on learning: classical and operant conditioning. It appears to have had a significant impact on the students' understanding of these topics.
Our intent is to use the same type of instrument during the implementation phase of the redesigned course. An item analysis of individual items by overall scores will be used for decisions of inclusion and exclusion of items. Sections not covered by both instructors will not be included in the instrument.
A technology survey was administered pre- and post-class. In the pre-test, most students reported having home computer access and using computers for academic purposes as well as for enjoyment; however, 26 percent described themselves as tech-phobic. For nearly half of the students (49 percent), this was their first online course; more than 20 percent had taken one other online course, and 11 percent had taken two. In the post-test, 80 percent said they would recommend the course to their friends; 87 percent said they liked psychology, would recommend the course to their friends, and felt they had learned a lot in the class; 92 percent felt that the professor was responsive to their needs; 77 percent believed they learned good study skills; 81 percent indicated they increased their ability to manage time; 90 percent enjoyed learning the technology associated with the course; and 91 percent believed they increased their informal education. The positive response of the students is exceptional given the many threats present: late completion of the CD, trouble accessing the videos from home computers, and the professor's illness prior to finals.
The placement of materials on the web seemed to be useful to the students. The mean number of visits was 44, with a standard deviation of 32. Twenty-three of the 132 students visited the site 70 or more times. Chi-square tests (two-tailed) were significant for web visits by final grade (p = .007). Pearson product-moment correlations (two-tailed) were also significant for web visits by final exam score (p < .001); web visits by final grade (p = .006); items read by final exam score (p = .06); and items read by final grade (p = .02). The response of the students to the course materials on the web was also positive: more than 90 percent found them useful, and 97 percent of the students liked the use of the calendar in WebCT.
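The Pearson correlations reported above relate each student's web activity to their exam performance. As a minimal sketch of that computation, assuming hypothetical (visits, score) pairs rather than the actual pilot records, the coefficient can be computed directly from its definition:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical (web visits, final exam score) pairs -- NOT the pilot data
visits = [12, 30, 44, 51, 70, 85, 20, 62]
scores = [61, 70, 74, 78, 88, 91, 65, 82]
r = pearson_r(visits, scores)
```

A strongly positive r, as in the pilot findings, indicates that students who visited the site more often tended to score higher; the associated p-value would then come from a two-tailed test of r against zero.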
Students who were actively involved in the course and regularly checked the web site for calendar items and messages did better in the course. Students in subsequent quarters will be informed of the findings, so that they may practice the behaviors that have been found to lead to better grades.
The use of the WebCT test item generator ensured that each student could have a unique test. Ninety-five percent of the students said they liked the online testing; they were better able to focus and found the labs more conducive to a testing situation. However, the first test of the quarter contained many errors traceable to publisher-supplied materials; we were able to make corrections before subsequent tests. Students in the redesigned course completed tests in 30 to 50 minutes, compared with 40 to 90 minutes in the traditional courses. Students also appreciated the immediate feedback of the test results and the ability to review the test items afterward.
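The idea behind a random test generator of this kind can be sketched simply. The code below is an illustration, not WebCT's actual algorithm (which is not documented here): each student's test is a random draw from a shared item bank, seeded on the student's id so the same student sees the same items on a reload, while different students almost certainly receive different tests.

```python
import random

def generate_test(item_pool, student_id, n_items=20):
    """Draw a per-student random sample of items from a shared pool.

    Seeding the generator on the student id makes the draw
    reproducible for that student while varying across students.
    (Illustrative sketch only; not WebCT's documented behavior.)
    """
    rng = random.Random(student_id)
    return rng.sample(item_pool, n_items)

# Hypothetical 100-item bank
pool = [f"item-{i}" for i in range(1, 101)]
test_a = generate_test(pool, student_id=1001)
test_b = generate_test(pool, student_id=1002)
```

Because each draw is a sample without replacement from a large pool, two students' tests overlap only partially, which is what makes per-student unique tests practical in a lab testing situation.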