
Improving Learning and Reducing Costs:
Lessons Learned from Round III of the Pew Grant Program in Course Redesign

By Carol A. Twigg


Since April 1999, the Center for Academic Transformation at Rensselaer Polytechnic Institute has conducted a Program in Course Redesign with support from the Pew Charitable Trusts. The purpose of this institutional grant program is to encourage colleges and universities to redesign their instructional approaches using technology to achieve quality enhancements as well as cost savings. Redesign projects focus on large-enrollment, introductory courses, which have the potential to reach large numbers of students and to generate substantial cost savings. The Center has awarded $6 million in grants to 30 projects in three rounds of ten projects each.

The third round of redesign projects began in July 2001 and concluded in July 2003. Detailed descriptions of the ten redesigns, including the institutions involved, the courses they redesigned, and the outcomes each achieved, can be found at http://www.thencat.org/PCR.htm.

What follows is an analysis of the results of the Round III projects, with a focus on the most important quality improvement and cost reduction techniques used in the redesigns, the implementation issues they encountered, and the projected sustainability of the course redesigns. The Center has produced similar analyses for Rounds I and II.

Quality Improvement Strategies and Successes

Eight of the ten Round III projects reported improved learning outcomes; two reported no significant difference. Among the findings were the following:

  • At Drexel, there was a statistically significant improvement in learning under the redesign, as measured by the students' final grades (p = 0.0130). Students in the combined pilot and redesigned versions of the course had considerably more A and fewer F grades when compared to the traditional course. In addition, because midterm scores for the redesigned course were significantly higher than those for the traditional version, instructors created a more difficult final examination to assess the students in subsequent offerings of the redesigned course.
  • FGCU redesign students succeeded at a much higher level than traditional students on module exam objective questions, which tested content knowledge (85 percent vs. 72 percent), and on module exam short essays, which assessed critical thinking skills; on the essays, the percentage of Ds and Fs dropped from 21 percent to 7 percent.
  • Redesign students at Iowa State performed better than traditional students at statistically significant levels on comparable examinations.
  • OSU students had greater success on common exams than traditional daytime students and about the same scores as students in the evening class, which had smaller class sizes and older students and had previously outperformed the daytime class.
  • At PSU, the redesign of the first-year Spanish sequence focused on de-emphasizing rote grammar and improving oral proficiency. End-of-year oral exam scores showed improvement: redesign = 87.3 percent, traditional course = 85.8 percent.
  • TCC students in the redesigned course scored significantly higher (p=0.04) on final essays, with an average score of 8.34 compared to 7.33 for traditional students. Success rates of redesign students in the second-level English course increased (79.3 percent for redesign compared to 76.1 percent for traditional), indicating that the redesign students were better prepared.
  • The percentage of redesign students at UNM who received a grade of C or higher was 77 percent for fall 2002 and 74 percent for spring 2003, versus an average of 61 percent for the traditional course. In addition, there were more grades of A (fall 2002 = 34 percent; spring 2003 = 31 percent) than in traditionally taught sections (18 percent).
  • At USM, in the area of reading comprehension, the number of students scoring C or better climbed from 68 percent in the traditional course to 88 percent in the redesign. In the area of writing skills, the number of students scoring C or better increased from 61 percent in the traditional course to 77 percent in the redesign. The latter gain was particularly significant because of the emphasis placed on writing in the redesigned course, which accounted for 40 percent of the total grade.

Seven of the ten projects measured changes in course completion rates; five showed improvement; one reported no change; and one experienced problems with students dropping or withdrawing from the course. Among the findings were the following:

  • DFW (drop-failure-withdrawal) rates at Drexel were consistently reduced by 10-12 percent in the redesigned course.
  • At OSU, withdrawals were reduced by 3 percent, failures by 4 percent and incompletes by 1 percent. As a result, 248 more students successfully completed the course compared to the traditional course.
  • At TCC, students in redesigned sections had a 68.4 percent success rate compared to 60.7 percent for traditional sections. Success rates were higher for all groups of students regardless of ethnicity, gender, disability, or original placement. The overall success rate for all composition students was 62 percent for the 2002-2003 year compared to 56 percent for the 1999-2000 year prior to redesign.
  • At UNM, 41 percent of traditional students received a C– or below, including drops, withdrawals and incompletes. This percentage was reduced in the redesigned course to 23 percent in fall 2002 and 26 percent in spring 2003.
  • In the traditional course at USM, faculty-taught sections typically retained about 75 percent of students while adjunct- and TA-taught sections retained 85 percent. In the redesign, the retention rate was 87 percent. The rate of D and F grades dropped from 37 percent in the traditional course to 27 percent in the redesigned course. DFW rates dropped from 26 percent in the traditional course to 22 percent in the redesign.

All ten projects have effected significant shifts in the teaching-learning enterprise, making it more active and learner-centered. The primary goal is to move students from a passive, note-taking role to an active, learning orientation. Lectures are replaced with a wide variety of learning resources, all of which involve more active forms of student learning or more individualized assistance. In moving from an entirely lecture-based to a student-engagement approach, learning is less dependent on words uttered by instructors and more dependent on reading, exploring, and problem-solving undertaken actively by students.

Among their most important quality improvement techniques, the Round III projects identify the same four cited by the Round I and Round II projects: continuous assessment and feedback, increased interaction among students, online tutorials, and undergraduate learning assistants (ULAs). The Round III projects also cite two additional techniques identified by the Round II projects that contribute to improved student learning: individualized, on-demand support and structural supports that ensure engagement and progress. The following is a list of the most effective quality improvement techniques used by the Round III projects.

  • Continuous Assessment and Feedback. Shifting the traditional assessment approach in large introductory courses, which typically relies only on midterm and final examinations, toward continuous assessment is an essential pedagogical strategy in these redesigns. Most of the ten projects included automated (computer-based) assessment and feedback in their redesigns, in fields as diverse as psychology, mathematics, Spanish, English, statistics and fine arts. Automating assessment and feedback enables both repetition (student practice) and frequent feedback, pedagogical techniques that research has consistently shown to enhance learning.

The Round III projects used quizzes from commercial sources as well as quizzes they created themselves. Students were regularly tested on assigned readings and homework; quizzes probed their preparedness and conceptual understanding. These low-stakes quizzes motivated students to keep up with the course material, structured how they studied, and encouraged them to spend more time on task. Online quizzing encouraged a "do it till you get it right" approach: students were allowed to take quizzes as many times as they wanted until they mastered the material. At PSU, Spanish grammar presentations, grammar drills, listening comprehension and reading comprehension exercises were delivered online, allowing class interaction to focus on student-student oral communication. The electronic activities provided consistent, automated grading across sections and instant feedback at the moment when students were concentrating on the task.

In mathematics, student learning is directly related to the amount of time students spend working problems. Although homework is assigned in most courses, instructors are usually unable to grade more than a small portion of it, and students do not take it seriously. At Iowa State, frequent homework assignments replaced lectures and formed an important part of the students' final grade. Computer grading of all exercises ensured that every assignment counted and that students received immediate feedback.

Both FGCU and UNM discovered that requiring quizzes was essential to improved student performance. To determine whether mandatory quizzes (required for course credit) and voluntary quizzes (no course credit) would differentially affect exam and grade performance, UNM faculty conducted an experiment. Students in one section received course points for completing weekly online mastery quizzes; students in the other section were encouraged to take the mastery quizzes but received no course points for doing so. On in-class exams, students who were required to complete quizzes for credit always outperformed students for whom taking quizzes was voluntary. The former also received more As, Bs, and Cs and fewer grades of C- or below. Students took more quizzes, scored higher, and spent longer on quizzes when course credit was at stake than students in the section where quizzes were not linked to credit. When no credit was at stake, relatively few students successfully completed quizzes, and some chose not to take quizzes at all. FGCU had similar findings.

  • Increased Interaction among Students. Many of the projects restructured their courses explicitly to increase discussion among students. Students in large lecture classes tend to be passive recipients of information, and student-to-student interaction is often inhibited by class size. Through smaller discussion forums established online, students can participate actively. At FGCU, students completed Web Board discussions in which they analyzed two sample short essays, one strong and one weak, in preparation for writing their own. Working in peer learning teams of six students each, they had to determine which essay was strong and which was weak and explain why. The Web Board discussions increased interaction among students, created an atmosphere of active learning, and developed students' critical thinking skills.

At PSU, different forms of computer-mediated communication (CMC) were used according to the strengths research attributes to each: synchronous CMC (chat) resembles interpersonal oral discussion, while asynchronous CMC (message boards) resembles presentational, formal written discourse. Students were required to work in chat groups to learn about each other and to report this information on message boards. The amount and quality of information exchanged (communicative use of Spanish) exceeded that of most face-to-face discussions. The depth and extent of the communication strengthened both student-student and student-teacher relations.

At Drexel, a dedicated computer laboratory was built to facilitate group work, allowing groups to project shared work and annotations onto white board "wallpaper." Groups were deliberately composed to mix students with more and less prior programming and computing experience, giving less experienced students help over the initial obstacles of learning to program. The groups also kept the class more unified in its learning stages: small computing issues did not become major roadblocks, and students could focus on important concepts. The more experienced students could quickly answer questions and demonstrate the use of the computer and/or software tools to the less experienced in their groups, preventing the latter from falling behind.

  • Online Tutorials. Nearly every one of the ten Round III projects relied heavily on instructional software, some of which was created at the institution and some of which was available from commercial sources. Like the three mathematics projects in Round II, NAU built its redesign around a commercial instructional software package called MyMathLab. Students found the software easy to use and became comfortable with it quickly. They especially liked the instant feedback they received when working problems and the "Guided Solutions" available when their answers were incorrect. PSU's online grammar and comprehension exercises, described above, played a similar role in its Spanish redesign.

At TCC, easy online access to materials and resources increased learner time on task in English composition. Grammar review sites and quizzes, including the support site for the New Century Handbook, CLAST online textbook, Cttc.comnet.edu/grammar, Academic.com and the Texas Information Literacy Tutorial (TILT), provided individualized remediation based on diagnostic information. Students also had access to textbook companion web site materials that assisted with writing principles, mechanics and reading comprehension. Students could access information 24x7 as often as they needed to do so. By conducting some instruction online instead of in class, faculty increased the amount of class time spent on the writing process. Drexel also found that having course materials online opened class time to other activities such as a broad overview of concepts to be undertaken that week, answering questions, and large group exercises such as "pair and share," creating a more active learning environment in lectures.

  • Undergraduate Learning Assistants (ULAs). The Round III projects did not make as heavy use of ULAs as prior rounds. Only UNM incorporated ULAs in its redesign, recruiting them from students who had earned A's in the course the previous semester. The ULAs worked with students who scored 75% or less on the first exam, administered at the end of the third week, in weekly 50-minute studio sessions for the remainder of the semester. During studios, students worked on multimedia course material, took quizzes, learned a memorization strategy, and discussed their course performance with the ULAs. The more studios students attended, the better their course performance.

  • Individualized, On-Demand Support. The emporium model used by NAU eliminated all class meetings and replaced them with a learning resource center featuring online materials and on-demand personalized assistance. All activities were designed to move students from passive to active learning, with each student controlling the experience according to individual needs. If a group of students was struggling with a particular concept, instructors called "time out" to give a short explanation of the material. Many instructors also arranged exam review sessions. The net effect was greater instructor-student interaction than in the traditional format. OSU established a Help Room where students could work collaboratively on problems or concepts that presented difficulty. The Help Room was staffed with TAs, adjuncts and full-time faculty who held their office hours there, making help available to students throughout the day.

At Drexel, students were less inhibited about asking questions and stating opinions through online mechanisms such as email, chat and threaded discussion than in person. (In fact, some students sent email from down the hall even when faculty members were in their offices holding office hours.) Students found that the availability of "virtual office hours" allowed them to get questions answered from home while they were actively working on homework. This decreased students' frustration and helped them overcome minor but persistent programming errors. TCC English composition students were able to submit mid-stage drafts to tutors at SMARTHINKING, a commercial, online tutoring service, and/or to TCC e-responders. These 24x7 services provided students with prompt, constructive feedback on writing assignments. The fast feedback and online assistance allowed students to make appropriate changes in their drafts, improving the quality of student writing.

  • Structural Supports that Ensure Student Engagement and Progress. Each redesign model added greater flexibility in the times and places of student engagement with the course. Although some projects initially thought of their designs as self-paced, open-entry/open-exit, they quickly discovered that students need structure (especially first-year students, and especially in courses that are required rather than chosen) and that most students simply will not succeed in a totally self-paced environment. Students need a concrete learning plan with specific mastery components and milestones of achievement, especially in more flexible learning environments.

Like the three Round II mathematics projects, NAU originally envisaged a program in which students taught themselves using the software, with the computer laboratory staffed simply to get students past any roadblocks. During the proposal review process, Center staff talked with the NAU team about the need to "beware of self-pacing" and stressed the importance of providing sufficient structure for students within a well-articulated set of requirements. Despite these admonitions, NAU students were required to attend class in the computer lab only for the first three weeks of the semester. Afterwards, attendance was not required, and students were on their own. The result was increased drops and withdrawals. The team eventually decided to require attendance throughout the semester for any student not earning a grade of A. Doing so helped weaker or less-motivated students successfully complete the course. After some initial experience, Iowa State also added mandatory attendance at computer lab sessions, which counted for a small part of the grade.

Another approach was to establish some form of early alert intervention system--a kind of "class management by exception" process, whereby baseline performance standards were set and students who were falling too far behind were contacted. At UNM, for example, students who scored 75% or less on the first exam at the end of the third week were required to attend a weekly 50-minute studio for the remainder of the semester, as described above. The more studios students attended, the better their course performance. Iowa State also forwarded the names of failing students to their advisors in the College of Business for additional counseling.

OSU's innovative buffet model produced an additional pedagogical improvement technique. Informed by an assessment of their learning styles and preferences, all students were able to select from a variety of learning modes, meeting the course's common learning objectives through different pathways. To support this individualized learning model, OSU established a taxonomy of 90 learning objectives linked to all course components. Every problem in the text, every quiz and exam question, every lab and problem in the lab manual was linked to a learning objective. Students knew precisely what they were expected to know. Students perceived that they had more control over their learning experiences because they could choose activities that suited them. This perception established a positive learning environment in the course and increased both student satisfaction and learning.

People who are knowledgeable about proven pedagogies that improve student learning will find nothing surprising in the above list. Among the well-accepted Seven Principles for Good Practice in Undergraduate Education developed by Arthur W. Chickering and Zelda F. Gamson in 1987 are such items as "encourage active learning," "give prompt feedback," "encourage cooperation among students," and "emphasize time on task." Good pedagogy in itself has nothing to do with technology. What is significant about the faculty involved in these redesigns is that they were able to incorporate good pedagogical practice into courses with very large numbers of students—a task that would have been impossible without technology.

Cost Reduction Strategies and Successes

There are a variety of ways to reduce costs, and consequently a variety of strategies for pursuing instructional redesign, depending upon institutional circumstances. The approach most favored by the Round III projects was to maintain constant enrollments while reducing the total amount of resources devoted to the course. By using technology for those aspects of the course where it is more effective, and by engaging faculty only in tasks that require faculty expertise while transferring less academically demanding tasks to less costly personnel, an institution can decrease costs per student even though the number of students enrolled in the course remains unchanged. Eight of the ten projects employed this approach, which makes sense when student demand for the course is relatively stable.

But if an institution is in a growth mode or has more demand than it can meet through existing course delivery, it may seek to increase enrollments while maintaining the same level of investment. Many institutions have escalating demand for particular subjects like Spanish or information technology that they cannot meet because they cannot hire enough faculty members. By using redesign techniques, they can increase the number of students they enroll in such courses and relieve these academic bottlenecks without changing associated costs.

Both PSU and FGCU planned to increase the number of students who could enroll in their courses. PSU was able to increase enrollment in an introductory Spanish course by keeping section size relatively small and increasing the number of sections offered. Because of seat-time reduction, the number of sections could be doubled in the same physical space with only a small increase in personnel. In the full implementation of the redesign, the number of students served will increase from 690 in the traditional format to about 1,270.

Anticipating continued enrollment growth, FGCU created a model that will scale, permitting costs to increase but at a much slower pace than under the traditional model. As course enrollment grows, the only additional cost will be an increase in the number of preceptors. Faculty oversight will be maintained through a course coordinator and ongoing faculty curricular review, the cost of which remains unchanged. Thus, the cost-per-student will continue to go down. The projected cost-per-student for 2,400 students (the projected enrollment five years from now) is $50, compared to $132 for enrolling 800 students in the traditional course.
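To make the scaling claim concrete, the short calculation below multiplies out only the per-student figures reported above; it is a back-of-the-envelope sketch, and the resulting totals are illustrative rather than actual FGCU budget lines.

    # Back-of-the-envelope check using only the per-student figures reported above.
    traditional_students, traditional_cost_per_student = 800, 132
    projected_students, projected_cost_per_student = 2400, 50

    traditional_total = traditional_students * traditional_cost_per_student  # $105,600
    projected_total = projected_students * projected_cost_per_student        # $120,000

    print(f"enrollment grows {projected_students / traditional_students:.1f}x")   # 3.0x
    print(f"total cost grows {projected_total / traditional_total:.2f}x")         # about 1.14x
    print(f"cost per student falls {1 - projected_cost_per_student / traditional_cost_per_student:.0%}")  # about 62%

In other words, enrollment triples while total instructional cost rises by roughly 14 percent, which is what allows the cost-per-student to keep falling as the course grows.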

Another way to reduce costs is to decrease the number of course repetitions due to failure or withdrawal, so that the overall number of students enrolled each term is lowered and the required number of sections (and thus the faculty members needed to teach them) is reduced. At many community colleges, for example, it takes students about two-and-a-half attempts to pass introductory math courses. If an institution can move students through more expeditiously by enabling them to pass key courses in fewer attempts, it will generate considerable savings--both in institutional resources and in student time and tuition. Five of the ten projects showed a decrease in drop-failure-withdrawal (DFW) rates. While the cost impact of these reductions could be calculated, such calculations are not included in the cost-per-student savings that we report.

What are the most effective cost-reduction techniques used by the redesign projects? Since the major cost item in instruction is personnel, reducing the time that faculty members and other instructional personnel invest in the course and transferring some of their tasks to technology-assisted activities are the key strategies. Some of the most common cost-reduction techniques used by the Round III projects included:

  • Online Course Management Systems. Course management systems played an important role in all ten of the Round III redesigns. Drexel, FGCU, Iowa State, PSU, UNM and USM used WebCT; BYU and TCC used Blackboard; Ohio State used a homegrown system created specifically for the redesigned course; and NAU used instructional software that includes an integrated management system. Sophisticated course-management software packages enable faculty members to monitor student progress and performance, track their time on task, and intervene on an individualized basis when necessary. Course management systems can automatically generate many different kinds of tailored messages that provide needed information to students. They can also communicate automatically with students to suggest additional activities based on homework and quiz performance, or to encourage greater participation in online discussions.

Using course-management systems radically reduces the amount of time that faculty members typically spend on nonacademic tasks like calculating and recording grades, photocopying course materials, posting changes in schedules and course syllabi, and sending out special announcements to students--as well as on documenting course materials like syllabi, assignments, and examinations so that they can be reused in multiple terms. The Drexel team created and continues to refine a number of software tools to facilitate online course management by interfacing WebCT with other third-party applications such as plagiarism detection software. These tools provide a more versatile interface and eliminate some of the more time-consuming aspects of using WebCT, such as the multiple mouse clicks required to retrieve or return each student's information.

OSU's buffet redesign model required flexible tracking of the multiple approaches available to students to ensure that all students were meeting all learning outcomes. To enable this tracking, the team created a customized MySQL database that linked all course components to the 90 course learning objectives. The learning objectives provided linkages among all aspects of the course, from individualized assessment of student learning styles, to appropriate kinds of learning materials and activities, to assessments of outcomes. Depending on students' learning preferences and the choices they made, the system displayed the appropriate course components.
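The report does not describe OSU's actual schema, so the sketch below is purely illustrative: it uses SQLite in place of MySQL and entirely hypothetical table and column names to show the underlying idea, a many-to-many mapping between course components and learning objectives that can be filtered by a student's preferred kinds of activities.

    # Illustrative sketch only: OSU's actual MySQL schema is not described in the report,
    # so the table and column names here are hypothetical (SQLite used for convenience).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE objective (
        id          INTEGER PRIMARY KEY,   -- one of the ~90 course learning objectives
        description TEXT NOT NULL
    );
    CREATE TABLE component (
        id    INTEGER PRIMARY KEY,
        kind  TEXT NOT NULL,               -- e.g. 'text problem', 'quiz question', 'lab'
        title TEXT NOT NULL
    );
    CREATE TABLE component_objective (     -- many-to-many link between the two
        component_id INTEGER REFERENCES component(id),
        objective_id INTEGER REFERENCES objective(id),
        PRIMARY KEY (component_id, objective_id)
    );
    """)

    def components_for(objective_id, preferred_kinds):
        """Return the components covering an objective, filtered by a student's preferred kinds."""
        placeholders = ",".join("?" * len(preferred_kinds))
        query = f"""
            SELECT c.kind, c.title
            FROM component c
            JOIN component_objective co ON co.component_id = c.id
            WHERE co.objective_id = ? AND c.kind IN ({placeholders})
        """
        return conn.execute(query, [objective_id, *preferred_kinds]).fetchall()

A lookup of this kind lets the system display only the materials that match a student's preferences, while the link table guarantees that every component remains tied to an objective the course must cover.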

  • Online Automated Assessment of Exercises, Quizzes, and Tests. As noted above, most of the ten projects used automated grading of exercises, quizzes or tests for subjects that can be assessed through standardized formats, not only increasing the level of student feedback but also offloading these rote activities from faculty members and other instructional personnel. Some used the quizzing features built into commercial software products like MyMathLab; others used the quizzing features of WebCT; and OSU used a homegrown system created specifically for the course.

Online quizzing sharply reduces the amount of time faculty members or GTAs need to spend on the laborious process of preparing quizzes, grading them, and recording and posting the results. Automated testing systems that contain large numbers of questions in a database format enable individualized tests to be easily generated, then quickly graded and returned. At PSU, automation relieved graduate teaching assistants of menial, repetitive and unsatisfying labor while increasing the number of students they could facilitate and monitor. At Iowa State, teaching assistants no longer had to grade exams, so they could be assigned more hours to interact with students in the computer lab or in office hours.

Perhaps the most innovative assessment application was FGCU's use of a software program called the Intelligent Essay Assessor (IEA) to grade student essays. The Intelligent Essay Assessor, once programmed, assesses short (between 100 and 500 words), well-structured student essays based on their content, grammar and mechanics. The software required careful preparation, but once fine-tuned it reliably scored the short essays, producing a final IEA-human inter-rater reliability of 81 percent and saving faculty considerable grading time.
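The report gives only the 81 percent figure and does not say how the inter-rater reliability was computed; if it was simple percent agreement between the IEA score and a human score for the same essays, the calculation amounts to the following (the scores below are invented for illustration).

    # Hypothetical illustration: the essay scores below are invented, and simple percent
    # agreement is assumed as the reliability measure, which the report does not specify.
    iea_scores   = [4, 3, 5, 2, 4, 3, 4, 5, 3, 2]   # automated scores for ten essays
    human_scores = [4, 3, 4, 2, 4, 3, 5, 5, 3, 3]   # instructor scores for the same essays

    matches = sum(a == h for a, h in zip(iea_scores, human_scores))
    agreement = matches / len(iea_scores)
    print(f"IEA-human agreement: {agreement:.0%}")   # 70% for this made-up sample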

  • Online Tutorials. The use of instructional software allows much of the time previously spent on instruction to be transferred to the technology and eliminates lecture time previously used to introduce content and review homework. Access to web-based resources reduced labor costs at TCC by decreasing the amount of time faculty spent in diagnostics, preparation of lectures, grammar instruction, monitoring progress, grading and making class announcements. Overall, faculty logs kept during the spring 2003 semester indicated a 33 percent decrease in time spent on course activities associated with the preceding tasks. At Iowa State, salary savings in the redesigned course were directly attributable to online delivery and online testing. Since instructors did not have to meet students in the classroom and did not need to design several exams per term, each instructor could handle between 500 and 600 students in comparison to 150 in the traditional format.

  • Shared Resources. When the whole course is redesigned, the substantial amount of time that faculty members spend developing and revising course materials and preparing for classes can be considerably reduced by eliminating duplication of effort. All ten of the Round III projects benefited from using shared resources, leading to a significant reduction in preparation time. Since responsibility for improving and updating the materials was shared among instructors, each faculty member's workload was reduced. At USM, for example, four full-time faculty shared the teaching (each was responsible for one-quarter of the presentations), allowing each to devote more time to other teaching, research, and service duties during the rest of the term.

In addition, students often use online lecture notes, review questions, and activities associated with the textbook, CD, and Web sites as learning resources instead of relying on faculty office hours. Both TCC and PSU found that using computer-based resources allows more learning to take place within the classroom, thereby reducing the amount of time faculty need to spend in office hours and extra student appointments.

Another benefit of creating shared course resources is the opportunity for continuous improvement of those resources. During each phase of implementation, redesign teams are able to modify, update and revise learning activities based on what works well and what does not. Student feedback on the clarity and number of assignments, as well as students' expressed need for fuller explanations and models, provides multiple indicators of areas needing change. The online environment permits flexibility in design and expansion where needed, and timely changes can be made. In addition, many teams have found that once the course resources have been developed, only a minimal amount of additional labor is needed to improve the course content and keep it current. Shared course materials not only save preparation and maintenance time for the instructors involved in the original redesign but also can be used by new faculty members who would otherwise have to prepare the course from scratch during their first semester teaching it.

  • Staffing Substitutions. By constructing a support system that comprises various kinds of instructional personnel, institutions can apply the right level of human intervention to particular kinds of student problems. Highly trained (and expensive) faculty members are not needed for all of the many tasks associated with delivering a course. At Drexel, for example, the increased use of online materials greatly stabilized the course, providing much more uniformity from term to term. Because of the greater structure of the redesigned course, using part-time (hourly) undergraduate and graduate students in the labs became more effective, as did using auxiliary rather than tenured or tenure-track faculty to run the course. Since these instructional staff cost less per hour than full-time teaching assistants and tenured/tenure-track faculty, Drexel realized additional cost savings.

FGCU's redesigned course was taught entirely by full-time faculty supported by a new position called the preceptor. Preceptors, most of whom had a B.A. in English, were responsible for interacting with students via email, monitoring student progress, leading Web Board discussions and grading critical analysis essays. Each preceptor worked with 10 peer learning teams, a total of 60 students. Replacing adjuncts independently teaching small sections ($2,200 per 30-student section) with preceptors assigned a small set of specific responsibilities ($1,800 per 60-student cohort), within a consistent, faculty-designed course structure, will allow FGCU to accommodate ongoing enrollment growth while steadily reducing its cost-per-student.

TCC reduced the number of full-time faculty involved in teaching the course from 32 to 8 and substituted less expensive adjunct faculty without sacrificing quality and consistency. In the traditional course, full-time faculty taught 70% of the course, and adjuncts taught 30%. In the redesigned course, full-time faculty teach 33% of the course, and adjuncts teach 67%. Full-time faculty were freed to teach second-level courses where finding adjuncts is much more difficult. By making these changes, TCC reduced the cost-per-student by 43% and produced an annual dollar savings of $321,000, the highest dollar savings in Round III.

The preceding five cost reduction techniques were also cited by the Round I and Round II projects. An additional cost reduction technique identified by the Round II projects was also cited by the Round III projects:

  • Consolidation of Sections and Courses. Unlike participants in Round I, the Round II and III institutions were required to redesign the whole course. As a result, many were able to realize cost savings by consolidating the number of sections or the number of courses offered. Iowa State consolidated 12 large lecture sections of about 150 students each (8 in fall and 4 in spring) into one section each term (1,200 students in fall, 600 in spring). Previously, 12 full-time faculty each taught one section of the course, supported by 15 TAs who taught weekly recitation sections of about 35 students. In the redesign, 3 full-time faculty members manage the online course, supported by 12 TAs who teach optional recitations, hold office hours, and conduct online help sessions.

USM reduced the number of sections from 30 sections of 65 students each to 2 sections of 1,000 students each. These changes enabled the university to reduce the number of faculty teaching the course from 16 (8 full-time faculty and 8 adjuncts) to the equivalent of 2 full-time faculty and 4 GTAs. Prior to the redesign, 50% of the course was taught by full-time faculty and 50% by adjuncts. USM eliminated adjuncts completely: the redesigned course was taught 100% by full-time faculty, supported by GTAs who graded writing assignments. These changes freed six full-time faculty to teach other courses and made the funds previously used to hire adjuncts available for a variety of academic enhancements in the department. Consolidation was possible only through technology, since no auditorium on campus was large enough to hold all the students enrolled in the course. USM reduced the cost-per-student by 56%, the highest percentage reduction in Round III.

FGCU reduced the number of sections from 31 to 2 and increased the number of students served in the first year of the redesign from 800 to 950. Full-time faculty taught 20% of the traditional course, and adjuncts taught 80%. FGCU eliminated adjuncts completely; the course is now taught entirely by full-time faculty supported by preceptors, the new position described above.

Finally, one additional cost savings technique shared by the Round III projects was a reduction in space requirements. Delivering portions of the PSU Spanish course via the Web as a substitute for face-to-face classroom instruction brought significant space savings to this urban university with rapidly increasing enrollments. Online chat allowed communicative use and practice of Spanish to extend beyond the limits of the classroom while maintaining student-student contact and instructor supervision. FGCU's redesign helped the university deal with a space crisis caused by enrollment that is growing faster than its buildings. Because the course was entirely online, the redesigned course no longer required any classroom space. Of the 30 redesign projects, UCF in Round I was the only one that detailed the cost savings resulting from reduced space costs, but any of the projects that reduced contact hours could calculate those savings as well.

With regard to cost savings, the redesign methodology was an unqualified success. All ten of the Round III projects reduced their costs. Some saved more than they had planned; others saved less. The Round III projects planned to reduce costs by about 41 percent on average, with a range of 28 to 56 percent. They actually reduced costs by 39 percent on average, with a range of 15 to 56 percent. Final results from Round III show a collective savings of $999,214 for ten courses, compared with the original projection of $1,195,028. (For a detailed comparison of planned versus actual savings, please see www.thencat.org/PCR/R3Savings.html.)

Why is there such a large range in cost savings across the projects? With the exception of BYU (which planned to reduce costs by 40 percent but actually reduced them by 15 percent, simply because the team failed to carry out its plan), the differences are directly attributable to the different design decisions made by the project teams, especially with respect to how expensive faculty time is allocated. Redesigns with lower savings tend to re-direct, rather than reallocate, saved faculty time: they keep the total amount of faculty time devoted to the course constant but change the way faculty members actually spend that time (for example, lecturing versus interacting with students). Others substantially reduce the amount of time devoted to the course by non-faculty personnel like GTAs but keep the amount of regular faculty time constant. Decisions like these reduce total cost savings.

Higher education has traditionally assumed that high quality requires low student-faculty ratios and that large lecture-presentation techniques are the only low-cost alternative. By using technology-based approaches and learner-centered principles in redesigning their courses, these ten institutions, like the twenty involved in Rounds I and II, are showing a way out of higher education's historical trade-off between cost and quality. Each project carefully considered how best to use all available resources--including faculty time and technology--to achieve the desired learning objectives. Moving away from the current credit-for-contact paradigm of instruction and thinking systematically about how to produce more effective and efficient learning are fundamental conditions for success.

Implementation Issues

As part of the grant application process, the Center required institutions to assess and demonstrate their readiness to engage in large-scale redesign by responding to a set of institutional-readiness criteria and to a set of course-readiness criteria, both developed by Center staff. (For a full description of the program's readiness criteria, please see http://www.thencat.org/PlanRes/Readiness.htm.) Our experience in the program has taught us that some institutions, because of their prior investments and experiences, better understand what is required to create these new learning environments and are more ready to engage in redesign efforts. In addition, just as some institutions are more ready than others to engage in large-scale redesign, some faculty members and some courses are more ready than others to be the focus of that redesign effort. Prior experiences with technology-mediated teaching and learning and numerous attitudinal factors give them a head start on the process.

The ten institutions involved in Round III exhibited a high degree of readiness, and all successfully completed their redesigns. The experiences of the Round III projects, like those in Rounds I and II, corroborate the importance of readiness in completing a successful redesign project. The Round III teams were clearly better prepared to undertake their redesigns than those in previous rounds because they could draw on the accumulated lessons learned, and there were fewer implementation problems in the third round. When project teams did encounter implementation problems, however, in almost every instance the problem was directly related to a lack of readiness. The description of implementation problems that follows is organized in relation to the program's readiness criteria; the italicized portions are taken from commentary about each criterion included in the grant program guidelines.

  • Course Readiness Criteria #3: Decisions about curriculum in the department, program, or school must be made collectively.

Decisions to engage in large-scale course redesign cannot be left to an individual faculty member. An institution's best chance of long-term success involves not a single individual but rather a group of people who, working together, are committed to the objectives of the project. Indicators that the faculty in a particular unit are ready to collaborate include the following: they may have talked among themselves about the need for change; they may have decided to establish common learning objectives and processes for the course in question; and they may have instituted pieces of a common approach, such as a shared final examination.

The biggest implementation issue for several of the Round I projects was achieving consensus among all faculty teaching the course on a variety of issues. In contrast, five of the ten Round II projects cite collective decision-making and departmental buy-in as key factors in the success of their redesigns, reinforcing the importance of this readiness criterion. In Round III, only TCC encountered difficulty in achieving faculty consensus. While the English faculty initially agreed to the redesign, several faculty members voiced opposition once it was implemented. In retrospect, the team believes it needed to do a better job of communication and inclusion in order to actively involve the other 16 full-time faculty in improving the redesign components and guiding the course's evolution.

  • Institutional Readiness Criteria #3: The institution's goal must be to integrate computing throughout the campus culture.

Unlike institutions that have established "initiatives" without specific milestones, computing-intensive campuses know the numbers. They know the level of availability of network access and the level of personal computer ownership (or availability) for students and faculty on their campuses because their goal is saturation, and the numbers tell them how close they are to achieving that goal. Ubiquitous networked computing is a prerequisite to achieving a return on institutional investment. Until all members of the campus community have full access to IT resources, it is difficult to implement significant redesign projects.

Like the projects in Rounds I and II, Round III projects encountered a number of problems related to the technology, although the problems were fewer and more easily solved. Two of the projects, Iowa State and TCC, encountered problems in providing adequate laboratory classroom space and equipment to offer the course in the redesigned format as the course moved from pilot to full implementation. These problems were eventually resolved.

Both TCC and PSU commented on the issue of student access to technology in their final reports. At TCC, learners had sufficient home access to online materials, and most had previous computer experience. In addition, all students continue to have extensive lab access on campus. There was little or no evidence to support the fears of socio-economic exclusion. TCC will continue to monitor student access as all college composition courses move to an online environment. In contrast, PSU encountered problems when students insisted on performing all online activities from their home computers, where the university does not provide technical assistance. Although all students were strongly encouraged to use university computer labs, about 90% completed their activities from home, with about 10% of them experiencing chronic frustration.

  • Institutional Readiness Criteria #7: The institution must have established ways to assess and provide for learners' readiness to engage in IT-based courses.

Learner readiness involves more than access to computers and to the network. It also involves access to technical support for using navigation tools and course-management systems. Students also need to be aware of what is required to be successful in technology-intensive courses. Making the change from face-to-face instruction to online learning involves far more than learning to use a computer. Many students are set in their ways after a lifetime (albeit brief) of passive instruction. They need preparation in making the transition to more active learning environments.

Preparing students (and their parents) for changes in the way a course is offered turned out to be an important factor at USM. Initial stories in the campus and local press emphasized the technology of the course, especially its online dimensions, and pitched it as making life easier because students could 'come to class without leaving home.' The stories frightened many students, angered faculty, and confused administrators as parents phoned them to ask for details about an "instructorless" course that was still in the design stage.

In hindsight, a better approach would have been to emphasize that 1) students could still attend live presentations and participate in discussions; 2) WebCT was already being used in hundreds of other campus courses; and 3) there would be more in-person help and office hours available than ever before with a nine-person team collectively offering the redesigned course rather than a single instructor. It would have been better to insist that the press stress educational ends rather than technological means from the outset. Although improved reading and writing skills will always seem less newsworthy than stories about streaming video, it's nevertheless crucial to keep a clear focus on why the technology has been called into play in the first place.

Additional Implementation Problems

  • Software Problems: Like the Round I projects, several of the Round III projects experienced problems and delays due to factors beyond their control, chiefly the relatively immature state of the commercial software marketplace. Course management software, for example, is continuously changed and updated. An ongoing issue for faculty at Drexel was adapting to new versions of software, particularly since such a large portion of the course delivery depended on sophisticated software systems. Project software needed to be constantly upgraded to adapt to changes in interfacing software such as plagiarism detection tools. The learning curve was never fully mastered, as various routine activities were done slightly differently with each new release of the course management software.

Two projects encountered problems with commercial test banks. The FGCU team expected to use the test banks that came with the course textbook. Unfortunately, the test banks contained many errors, so the team ended up writing its own. At Iowa State, the creation of question banks for homework and exams took considerably more time than expected. All the assignments were administered using MapleTA (formerly called EDU), a program specifically created for administering mathematical questions with written-out solutions. The syntax for creating MapleTA's algorithmic questions was peculiar, error messages were often meaningless or misleading, and documentation was sparse. The final result was well worth the effort, since the team now has question banks for homework assignments and exams that can be reused term after term.

  • Instructor, GTA, and undergraduate tutor training. Like the Round II projects, several of the Round III projects experienced problems because they underestimated the degree of instructor, GTA, and undergraduate tutor training that was required in order to implement their redesigns successfully. The desire to go back to old ways of doing things had to be overcome. As new faculty and teaching assistants at Drexel were brought into the course over time, it was important to help them go through the steps of accepting a different learning model. Laboratory assistants needed to be coached in how to facilitate and engage students in problem-solving rather than resorting to lectures or providing answers to students. Thus a formal training system with follow-up monitoring was needed for new faculty, teaching assistants, and laboratory assistants so they could fully adapt to the course redesign.

The BYU team also underestimated the need for rigorous instructor training. Virtually all of its instructors were English graduate students, who had traditionally been afforded a great deal of autonomy and latitude in their teaching. Even though instructor pre-service and in-service training focused intensively on the redesign, changing this instructor culture proved more difficult than anticipated. Because the department has only an MA program, there was continual turnover in the instructor pool, which required constant training. Achieving instructor "buy-in" was a recurring battle. The team is confident that these issues have now been adequately addressed, but until they were resolved, they made full implementation of the redesigned course difficult.

  • Changes in the original course redesign. Six of the ten Round III projects made changes to their original course redesigns; these changes were relatively minor and had little effect on overall project outcomes. FGCU originally planned to build its redesign on a buffet model. During the course pilot, the FGCU team made a variety of learning activities available to students: lectures, videotapes, labs, online tests, etc. Students did not attend any of the lectures or labs and did not view any of the videotapes, yet they were still very successful. The team concluded that these activities did not contribute to student success, so they were eliminated. In addition, FGCU initially planned to link an assessment of student learning styles to course activities. Again, because students were so successful in the course, the team concluded that this aspect of the redesign was not needed.

UNM originally planned to reduce the number of lectures per week to one. During the pilot implementation, there was considerable and sustained student protest against the announcement that there would be only one lecture per week, and in subsequent implementations two lectures were offered. Portland State originally planned to increase section size in the redesign in order to increase the number of students served. Based on experience during the 2002-03 academic year, the team decided instead to maintain the smaller section size and to increase the number of sections as well as the number of students; as noted earlier, seat-time reduction allowed the number of sections to be doubled in the same physical space with only a small increase in personnel.

Two of the projects experienced backsliding from their original project goals, bringing to mind the importance of Institutional Readiness Criteria #1: The institution must want to reduce costs and increase academic productivity. It is questionable whether some of the ten institutions involved in Round III really wanted to increase academic productivity. In one case, the original academic problem was that the traditional course suffered from inconsistency and inefficiency. The instructors, almost all of them master's-degree students, tried to achieve course objectives in a multitude of ways, and their inexperience led them to spend a significant amount of time preparing for classes, duplicating the efforts of others. The redesign plan was to develop a series of interactive multimedia lessons to standardize the curriculum across all sections, provide students with a more consistent experience, and reduce the time graduate instructors spent preparing and presenting in the classroom.

After the initial implementation, the project leadership team discovered that instructors were not using the course materials they had developed. For the next implementation, the team built new instructor training around the course modules and provided additional training throughout the term. Despite this extra emphasis on training, the team again discovered at semester's end that instructors were picking and choosing among the online materials that had been developed to bring greater consistency to the course. These outcomes suggest a lack of departmental and institutional commitment to increasing student success. In order for the successes achieved in a redesign to have a sustained impact, administrative leadership needs to play an active and continuing role.

In the second case, the department was split into two independent departments in different colleges in the middle of the project. The importance of having strong support from departmental (and university) leadership became increasingly clear after the split. Team members ended up in both departments, which created conflicting priorities that affected the pace of redesign. Unlike the original joint department head, the new computer science department head was not a member of the redesign team, and a decision about how the target courses would be used resulted in a change in project scope. The episode made evident how fragile major pedagogical change can be when leadership, and with it priorities, changes. Redesign features already in place at the time of the split have been sustained and more fully developed, but aspects of the redesign that were not yet in place have been difficult to initiate because of changing interests and changing personnel. The project team is still working to achieve all of the redesign goals; however, the pace of implementation has slowed.

Sustainability

One way to judge the success of a grant-funded project is to assess its potential to be sustained once the grant funding runs out. All ten Round III projects are firmly committed to sustaining their redesigns. (Even the two that have backslid are committed to moving forward to complete their redesigns.) Comments include "the redesigned course reflects a permanent change in the way introductory psychology is taught," "the university is committed to maintaining the course in online form," and "the redesigned course will continue to be offered because it saves resources and is demonstrably a success in the area of student learning." As the FGCU team puts it, "The redesign will be sustained for two reasons. First, the redesigned course has been an incredible success at improving quality: with the dramatic increase in student learning and with the success of some of the design elements such as the use of the Intelligent Essay Assessor and the alternative staffing model, faculty who were at first skeptical but willing participants have become true supporters. Second, the redesigned course has been an incredible success at reducing costs. Without the cost savings associated with the course, the university may have been unable to continue the course because they could not have afforded to pay the adjuncts or find the rooms for all the sections needed."

A second way to evaluate the success of a grant-funded project is to consider its impact on other courses within the department and within the institution. Again, most of the projects report that the original redesign is having an impact on other courses. The Drexel math faculty are actively pursuing ways to implement course redesign in both calculus and statistics; both are large-enrollment courses with highly diverse students and substantial DFW rates that would benefit from improved instruction and from the cost savings technology makes possible. At UNM, two NSF grants have been submitted based in part on results obtained from the redesign: one proposes the redesign of introductory math and science courses (biology, chemistry, and physics); the other examines ways to demonstrate enhanced concept-formation abilities in introductory physics and psychology. Iowa State plans to apply its experience with discrete mathematics to two other web-based courses, algebra and trigonometry.

PSU has already committed monetary and personnel resources to other redesign projects focused on reduced seat time, infusion of technology into instruction and enhanced student learning. Second-year Spanish, introductory computer science, and introductory statistics are among the academic areas involved; budget permitting, additional proposals will be entertained in subsequent years. The redesign at TCC serves as a model for other courses. The team is now planning the redesign of second-level English courses, and many of the redesign features are already being piloted in those courses. In addition, dual-enrollment courses taught in local high schools now use the redesigned version of college composition. At OSU, the buffet strategies are being used in other statistics classes.

To what do we attribute the high level of success achieved by the Round III projects? The Program in Course Redesign provided leadership in choosing the right participants, teaching them the planning methodology, actively supporting them as they developed their design plans, closely monitoring the implementation process, and insisting on ongoing and final progress reports that include measurable outcomes. The program followed a unique three-stage proposal process that required applicants to assess their readiness to participate in the program, develop a plan for improved learning outcomes, and analyze the cost of traditional methods of instruction as well as of new methods utilizing technology. (A description of the Center's Course Planning Tool, which facilitates this analysis, is available on the Center's website.)

Perhaps the most significant aspect of this process has been the need for the Center to teach the redesign methodology, especially in regard to cost savings, since neither faculty nor administrators traditionally employ this approach to restructuring courses using IT. Prospective grant recipients were supported throughout by a series of invitational workshops that taught these assessment and planning methodologies and by individual consultations with Center staff. Both faculty and administrators have repeatedly indicated that learning the methodology is key to the effectiveness of the process. Once learned, however, the methodology is easily transferable to other courses and disciplines.

 


© 2004 Center for Academic Transformation
Sponsored by a grant from the Pew Charitable Trusts.
Center for Academic Transformation
Rensselaer Polytechnic Institute
4010 Walker Lab
110 8th Street, Troy, NY 12180
518-276-6519 (voice) / 518-695-5633 (fax)
http://www.center.rpi.edu
