Improving the Quality of Student Learning

University of Southern Maine

Based on available data about learning outcomes from the course pilot, what were the impacts of redesign on learning and student development?

Most striking and important is that students in the redesigned course did better on an assessment of basic concepts in psychology than those taught in a more traditional mode.

Students in 7 sections of the course in the fall 2000 semester participated in this assessment. One section (Group 1, computer-based) used computers for quizzes, exams, lecture notes, and interaction with the instructor via e-mail and online discussions. Two sections (Group 2, computer-enhanced) used computers for quizzes, exams, and lecture notes online; e-mail and online discussions were not available to these students. Four sections (Group 3, traditional) received traditional classroom lectures and paper-and-pencil exams. The second and third groups were both taught by the same instructor.

A 35-item measure of basic psychological terminology and concepts was used to assess student understanding of the material covered. These items, five from each of seven chapters covered by both faculty members involved in the study, were not text or course specific. They were designed to assess retention of information covered in the lectures given by both instructors that was also available in the texts they had selected. The pretest, administered in the first week of the semester, included this 35-item measure along with questions about previous computer use, access to computers, and level of comfort using a computer. Demographic information, including age, sex, and prior experience with computer-based courses, was also obtained. The same 35 items were used on a post-test administered during the last week of the semester, along with questions about students' experience in the course.

Students in the computer-enhanced sections retained more than students in the traditionally taught sections. We found a significant difference between pre- and post-test scores (F(1,171) = 165.65, p < .001); thus it appears that students learned something over the course of the semester. We found no main effect of teaching method (Computer-Based, Computer-Enhanced, or Traditional) (F(2,171) = 8.08, p > .01). Importantly, there was a significant interaction between teaching method and time of testing (F(2,171) = 8.08, p < .001). Though there were differences between groups at the start of the semester, students in the Computer-Enhanced and Computer-Based courses did better on our measure of important psychological concepts at the end of the semester than those in the traditionally taught sections.
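
To make this analysis concrete, here is a minimal sketch of the interaction test in Python. With only two time points, the method-by-time interaction is equivalent to a one-way ANOVA on gain scores (post-test minus pretest). Everything below is illustrative: the data are simulated placeholders, the group sizes (25, 50, and 99) are assumptions chosen only so the degrees of freedom match the reported F(2,171), and SciPy's f_oneway stands in for whatever statistical package the study actually used.

```python
# A sketch of the method-by-time interaction test, assuming simulated data.
# With two time points, the interaction reduces to a one-way ANOVA on
# gain scores (post minus pre) across the three teaching methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder pre/post scores on the 35-item measure; group sizes are
# hypothetical, chosen so N - k = 174 - 3 = 171 matches the reported df.
pre = {
    "computer_based":    rng.integers(10, 21, size=25),
    "computer_enhanced": rng.integers(10, 21, size=50),
    "traditional":       rng.integers(10, 21, size=99),
}
post = {
    "computer_based":    pre["computer_based"] + rng.integers(5, 12, size=25),
    "computer_enhanced": pre["computer_enhanced"] + rng.integers(5, 12, size=50),
    "traditional":       pre["traditional"] + rng.integers(2, 8, size=99),
}

# Gain scores: the interaction asks whether the groups improved by
# different amounts over the semester.
gains = [post[k] - pre[k] for k in pre]

f_stat, p_value = stats.f_oneway(*gains)
n_total = sum(len(g) for g in gains)
print(f"F(2, {n_total - 3}) = {f_stat:.2f}, p = {p_value:.4f}")
```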

Students in the Computer-Based section of the course enjoyed the course more than students in the Computer-Enhanced and traditionally taught sections. In response to the question "What is your overall rating of the course?", administered as part of a standard end-of-semester course evaluation, students in the technology-based section (mean = 1.69, where 1 is excellent and 5 is poor; SD = .79) rated the course more highly than those in the technology-enhanced sections (mean = 2.29, SD = .81) and those in the non-technology sections (mean = 2.38, SD = .83) (t(78) = 2.7, p < .01 and t(165) = 3.02, p < .01, respectively).
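
The two rating comparisons can be sketched the same way, as independent-samples t-tests. Again the data are simulated from the means and standard deviations reported above, and the group sizes (25, 55, and 142) are assumptions chosen only so the degrees of freedom match the reported t(78) and t(165).

```python
# A sketch of the two course-rating comparisons, assuming simulated data.
# Ratings run from 1 (excellent) to 5 (poor).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Placeholder ratings drawn from the reported means/SDs; group sizes are
# hypothetical, chosen to match the reported degrees of freedom.
computer_based    = np.clip(rng.normal(1.69, 0.79, size=25), 1, 5)
computer_enhanced = np.clip(rng.normal(2.29, 0.81, size=55), 1, 5)
traditional       = np.clip(rng.normal(2.38, 0.83, size=142), 1, 5)

for label, other in [("computer-enhanced", computer_enhanced),
                     ("traditional", traditional)]:
    t_stat, p_value = stats.ttest_ind(computer_based, other)
    print(f"computer-based vs. {label}: t = {t_stat:.2f}, p = {p_value:.4f}")
```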

Other findings included:

  • All students surveyed indicated that they liked receiving immediate feedback about their performance on tests and quizzes. This makes sense given the importance of immediate feedback to the learning process and the reduced duration of anxiety associated with grading.
  • 96% indicated that taking quizzes on the web before class was helpful. This may be an acknowledgement that they spent more time studying for the class than they might have otherwise.
  • 87% felt that the required use of a computer in the class enhanced their understanding of course content.
  • 75% of the students reported feeling more confident that they would reach their academic goals in this course with the use of computers.
  • 52% preferred computer-based testing to traditional paper and pencil testing. This may be in part due to the immediacy of feedback that computer-based testing makes possible.
  • 49% appreciated having access to Web-based materials (quizzes, lecture notes, and other instructional materials) and reading materials for the class.

The difference in scores was so reliable as to create a problem for us: Should we continue to teach traditional sections of the class, knowing that we need a control group but that students in traditionally taught sections will not understand the concepts covered in the course as well as those in the computer-enhanced course? This ethical concern has prompted us to stop teaching the class in the traditional mode. The result has been a noted improvement in scores on our measure of basic psychological concepts and terminology.

From the beginning, our course redesign has worked very well for motivated students--those students who take education seriously--but probably not as well for those who are not excited by the process of education. Availability of immediate feedback and information about where individuals stand in the class works well for those who are interested in achieving academic success. Unfortunately, many students at the University of Southern Maine (USM) are not especially interested in academics, especially in the first year of their college career.

Some of this is our own fault. We have been hampered by a lack of the fun, focused activities that we had hoped to integrate into the class. Quizzes are quicker and easier to create than more exciting activities and demonstrations, so we found it expedient to develop course-related quizzes rather than demonstrations that involve the student in the outcome. This has limited the appeal of our student-centered activities to date. While many students report that they appreciate the opportunity for repeated quizzing, the appeal of repeatedly finding out that you do not know something is very limited.

We began to incorporate engaging, participatory, student-centered exercises into the course in fall 2000. We are working with John Wiley Publishing on a software product that works within WebCT to mimic what we had hoped to have built by now. We are also working with Prentice Hall to create something similar for their texts, and Houghton-Mifflin has expressed interest in creating text-based activities that engage students more than simply reading the text. Once these and other similar activities have been incorporated into the course, we expect the redesign to appeal to more students.

There has been a significant change in the grade distribution in classes taking advantage of technology: a 10-20% reduction in the percentage of students earning less than a C, with a commensurate increase in the numbers of A's, B's, and C's. Apparently the immediate feedback and the ability to assess one's grades in relation to the class as a whole have been helpful to many marginal students.

After the initial pilot, however, we were concerned that one consequence of incorporating class notes into a Web-based format might be that many students believe they need not attend class because they have the notes. Class attendance, often quite low after the first two weeks in the traditional class, had not improved as a result of the redesign, though students who did attend gave us rave reviews for our effort. Students were telling us that they were getting what they needed from the text and that there was little value in attending class.

In the second phase, class attendance has improved as a result of adding surprise quizzes. Designed to reward class attendance, these were sufficient to keep students coming to class. Students who complete the required quizzes also do better on the exams and in the course, though there is clearly a self-selection bias that may also account for these results. Those doing the required quizzes report that they are an effective teaching tool, and more than a dozen students have asked why this is not standard practice in other classes.

We now know that spacing quizzes throughout the semester improves retention of concepts and terminology more than allowing students to complete a set of quizzes just before an exam. This is not unanticipated: few people would be surprised that, if allowed to do so, students put off their studying until shortly before exams, and that such cramming does not lead to long-term retention of information. It is, however, nice to have data documenting the efficacy of what is termed spaced over massed practice.

There are several plausible psychological explanations for the efficacy of our reinvented course: an increase in time spent studying, increased familiarity with the testing process, increased familiarity with the tested material, the impact of immediate feedback, the ability to see the effect of effort on grades, and increased comfort with testing generally. Any and all of these would make good subjects for research.
