Improving the Quality of Student Learning

Brigham Young University

Based on available data about learning outcomes from the course pilot, what were the impacts of redesign on learning and student development?

During the winter 2002 semester, the pilot redesign of freshman composition was implemented in three online sections enrolling 57 students and taught by second-semester instructors. Throughout the semester, students prepared four papers: (1) personal narrative, (2) textual analysis, (3) research paper, and (4) argumentative synthesis group paper. These papers were graded by the instructor and placed into a student portfolio. Student portfolios from the redesign sections were compared with those from three randomly selected traditional sections taught during winter 2002, also by second-semester instructors and enrolling 58 students. The portfolio analysis was based on a detailed rubric that gave raters a common reference for judging the quality of student writing.

Based on our initial pilot, we are encouraged that overall paper quality was higher in the online version of the course than in the traditional version. We are also encouraged that the redesign received significantly higher ratings on introductions, conclusions, focus, and organization, since these are global writing skills that were a focus of the online modules we developed.

Two generalizability studies were conducted. The first was a nested study in which each student portfolio was read by three raters; sixteen readers participated, each rating a total of 30 papers. The second study used a fully crossed design in which five readers read the same 16 portfolios and we compared their ratings. Sixteen students were randomly selected for this study: eight from the traditional sections and eight from the online sections. Unlike in the nested study, all readers were required to read all portfolios in the sample, so each of the five readers read the same 48 papers.

Prior to receiving the student portfolios, all readers were required to attend a training session. During the session, the study designers explained the scoring rubric and how to assign points in each category. The rubric was divided into five categories: (1) introductions and conclusions; (2) focus and organization; (3) development and detail; (4) sentence structure and word choice; and (5) adherence to convention. Readers were instructed to rate each category on its own merits against the rubric. To retain the holistic evaluation typical of English composition courses, the readers were given the flexibility of assigning points within a range. The rubric gave the readers the scaffolding needed to increase the likelihood of quantitative consistency between papers while still allowing them to follow their own instincts about the quality of a paper.
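
The report does not publish the point values attached to each rubric category, but the scoring procedure can be sketched as a simple data structure. In the Python sketch below, the category names come from the rubric itself; the point ranges, the score_paper function, and the sample ratings are hypothetical and serve only to illustrate how a reader could assign points within a range and arrive at a paper total.

    # Hypothetical point ranges: the five categories are from the rubric,
    # but the report does not publish the weights, so 0-20 per category
    # is an illustrative assumption.
    RUBRIC = {
        "introductions and conclusions": (0, 20),
        "focus and organization": (0, 20),
        "development and detail": (0, 20),
        "sentence structure and word choice": (0, 20),
        "adherence to convention": (0, 20),
    }

    def score_paper(ratings):
        """Check that each category score falls inside its allowed range,
        then return the paper's total score."""
        total = 0
        for category, (low, high) in RUBRIC.items():
            points = ratings[category]
            if not low <= points <= high:
                raise ValueError(f"{category}: {points} is outside {low}-{high}")
            total += points
        return total

    # Example: one reader's ratings for a single paper.
    print(score_paper({
        "introductions and conclusions": 15,
        "focus and organization": 17,
        "development and detail": 12,
        "sentence structure and word choice": 16,
        "adherence to convention": 18,
    }))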

During the training session, all readers received the same paper and were given 30 minutes to grade it using the rubric. The grade sheets were returned and the results compiled. The readers were shown how their ratings compared with those of the other readers and were given the opportunity to evaluate their own rating consistency and determine what corrective action, if any, was needed to bring their ratings in line with the group. The objectives of this exercise were to familiarize the readers with the rubric and to give them an opportunity to see whether they were easy or hard graders.

The readers then received packets containing either ten or sixteen student portfolios. Readers shuffled all papers in the packet so that papers were graded in random order and two papers from the same student would not be graded back to back. Because the papers were not identified with a student and were shuffled, the chance of a halo effect was reduced.

The preliminary results showed that 3% of the variance was attributable to the paper effect, the student effect accounted for almost 50% of the total variance, and readers accounted for approximately 15% of the variance. The interpretation is that readers were not consistent with one another: some readers were easy graders, while others were hard graders. One concern is that, because readers were not consistent with one another, a student's relative ranking could change depending on which reader read the paper. Each reader was, however, fairly consistent with himself or herself; in other words, easy raters tended to be easy with all students and difficult raters tended to be difficult with all students.
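
The raw ratings and the exact generalizability model behind these percentages are not included in the report, so the following is only a minimal sketch of how variance shares like these can be estimated. It assumes a fully crossed students x readers design with one score per cell and uses the standard expected-mean-squares decomposition from a two-way random-effects ANOVA; the simulated scores are illustrative, not the study's data.

    import numpy as np

    def variance_components(scores):
        """Estimate the share of variance due to students, readers, and
        residual error for a fully crossed students x readers score matrix
        (rows = students, columns = readers, one score per cell)."""
        n_students, n_readers = scores.shape
        grand_mean = scores.mean()

        # Sums of squares for the two main effects and the residual.
        ss_students = n_readers * ((scores.mean(axis=1) - grand_mean) ** 2).sum()
        ss_readers = n_students * ((scores.mean(axis=0) - grand_mean) ** 2).sum()
        ss_total = ((scores - grand_mean) ** 2).sum()
        ss_residual = ss_total - ss_students - ss_readers

        # Mean squares.
        ms_students = ss_students / (n_students - 1)
        ms_readers = ss_readers / (n_readers - 1)
        ms_residual = ss_residual / ((n_students - 1) * (n_readers - 1))

        # Expected-mean-squares estimates of the variance components.
        var_residual = ms_residual
        var_students = max((ms_students - ms_residual) / n_readers, 0.0)
        var_readers = max((ms_readers - ms_residual) / n_students, 0.0)

        total = var_students + var_readers + var_residual
        return {name: round(v / total, 2) for name, v in
                [("students", var_students), ("readers", var_readers),
                 ("residual", var_residual)]}

    # Simulated example: 16 students rated by 5 readers, with a large
    # student effect and a smaller reader (leniency/severity) effect.
    rng = np.random.default_rng(0)
    scores = (70
              + rng.normal(0, 3.0, size=(16, 1))   # student effect
              + rng.normal(0, 1.5, size=(1, 5))    # reader effect
              + rng.normal(0, 2.0, size=(16, 5)))  # residual noise
    print(variance_components(scores))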

Overall, the papers in the online sections had a significantly higher average score than those in the traditional sections, with a p value of .055; in other words, at roughly the 94% confidence level, the scores from the online course were higher than those from the traditional course. Scores from the traditional sections showed the greatest variability, while scores from the online sections were more consistent and uniform. One objective for redesigning the freshman composition course was to standardize instruction, which we hoped would result in less variability in student writing quality.
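
The report does not specify which statistical test produced the p value of .055, so a two-sample comparison of group means is only one plausible reading. The sketch below uses Welch's t-test (which does not assume equal variances, a reasonable choice given the greater spread in the traditional sections) on hypothetical scores; the numbers are illustrative, not the study's data.

    import numpy as np
    from scipy import stats

    # Hypothetical average portfolio scores for the two groups; the
    # study's raw scores are not published.
    online = np.array([82, 79, 85, 88, 77, 84, 90, 81])
    traditional = np.array([75, 88, 70, 92, 68, 83, 65, 79])

    # Welch's t-test: compares the group means without assuming equal
    # variances across the online and traditional sections.
    t_stat, p_value = stats.ttest_ind(online, traditional, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # The report also emphasizes variability; compare the spreads.
    print(f"online sd = {online.std(ddof=1):.1f}, "
          f"traditional sd = {traditional.std(ddof=1):.1f}")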

We also compared student writing in the five rubric categories. In the fully crossed study, we found that the papers in the online course were rated significantly higher in (1) the quality of the introductions and conclusions (99% confidence level) and (2) focus and organization (94% confidence level). There were no significant differences in the other three categories. In the nested design, we did not find significant differences in any of the rubric categories; the nested study is less precise and especially open to error from the variance introduced by inconsistency among the raters.

To further assess writing quality, we will conduct a similar analysis during the fall 2002 implementation. During that analysis, we will provide more extensive training for our raters to increase inter-rater reliability.

December 2002 Update: During the 2002 fall semester, the redesigned course was implemented in 30 sections of English Composition, comprising all of the sections taught by new instructors. Of these sections, eight participated in our in-depth study. Following the winter 2003 semester, we will compare the student papers from the fall 2002 and winter 2003 redesigned sections with those from the winter 2002 traditional sections. Because we will conduct the portfolio evaluation in May 2003, no data is yet available on student papers for this midterm report.

We will need to make adjustments to the portfolio analysis design because we found significant variability in rater reliability. Since we encountered some problems with the rubric used in the winter 2002 portfolio analysis, we will also rescore all the essays from that analysis using the new rubric. This comprehensive portfolio analysis should give us a good idea of how students in the traditional sections taught in winter 2002 compared with students in the online sections taught in winter 2002 and winter 2003. In addition, comparing the essays from the fall 2002 online classes with those from the winter 2003 online classes should give us an idea of what role instructor experience may play in student performance.

Student Satisfaction

The winter 2002 pilot was successful in that students in both the traditional and redesigned sections were equally satisfied with the overall effectiveness of the instruction and the course quality. Among our findings:

  • Online students liked the flexibility and convenience of the course (73%), the technology (39%), the quality of the course material (34%), and felt it improved their writing skills (26%).
  • Online students pointed to technology problems (45%), lack of interaction (15%) and poor organization (12%) as weaknesses.
  • While both the quality of the assignments and the clarity of the course objectives were rated lower in the online version, only the difference in assignment quality was statistically significant.
  • Traditional students rated the course higher than redesign students did on whether it increased their interest in the subject matter. Typically, a passion for writing has been communicated by instructors in their face-to-face interaction with students. We need to carefully consider ways to help instructors communicate the importance of writing and increase students' interest in writing.
  • It may be that students who self-enrolled in the online course were less interested in writing and felt it was an easier way to receive credit. For example, in a course survey, 33% of online students reported that they were confident in their writing versus 51% of traditional students.
  • Those with extensive or minimal computer experience rated the course quality, assignment quality, and instructor quality higher than those with average experience.
  • When asked whether it was necessary to be in a classroom to learn, most of those with extensive computer experience said no (extensive 6.67, minimal 4.25).
  • When asked whether the course would have been better if offered in a traditional environment, most people with extensive experience said no, while many of those with minimal experience said yes (extensive 2.83, minimal 4.50).
  • Most students reported that they would take another online course based on their experience.

We learned that we need to help instructors clarify course and assignment objectives. We are concerned that students may not be using the course modules to complete their assignments and are considering ways to increase their use of these modules. We are also designing training for the instructors that will help them use their class time and student/teacher conferences in order to increase students' interest in writing.

December 2002 Update: During fall 2002, we measured student satisfaction in only the online course. Student pre-surveys and post-surveys were administered to all online English composition sections. In comparison with online students during winter 2002, students this semester rated the overall quality of the course nearly identically.

In addition to completing pre- and post-surveys, students from eight online sections participated in voluntary focus groups. When asked how they thought the online class differed from the traditional one, students noted that the course provided more flexibility (meeting only once a week), that they had more interaction with instructors, and that the traditional course seemed to have more assignments. Although they met only once a week, most students felt that the time was used effectively; however, this feedback varied from section to section. Students in seven of the eight sections felt that class time was effective and that everything was covered in the time they met together as a class. In one section, students felt that the classroom time was rushed and wished they could have had two days of face-to-face instruction; they also felt that most of the time was spent trying to sort out problems with technology or answering student questions.

Most students were satisfied with the overall level of instructor/student interaction. A few students felt that they had even more interaction than in a traditional course. They felt that it was easy to set up meetings with their instructor if they had questions, and that their instructor was very flexible and easy to contact via email. Whether the students were satisfied with peer interaction in the course also varied by section. No student said that they wanted more peer interaction.

Analyzing the data from the pre- and post-surveys in terms of gender revealed that males and females rated the overall quality of the course and the instructor's effectiveness nearly identically. In addition, both groups felt that the course improved their writing skills. Both males and females were generally satisfied with the speed and quality of instructor feedback, but males were more likely to want more instructor interaction. Males also felt much more strongly that the course would have been better if taught in a traditional format.
