Increasing Success for Underserved Students:
Redesigning Introductory Courses

A report examining the impact of the redesign techniques developed by the Program in Course Redesign on the success of adult students, students of color and low-income students.

By Carol A. Twigg




Preface
Introduction
The Program In Course Redesign
Assessing the Impact of Course Redesign on Underserved Students
Results: Improved Learning and Increased Retention
Quality Improvement Techniques
The Achievement Gap
Cost Reduction Techniques
Technology Access and Use among Underserved Students
Conclusion
Appendix A: Identification of Target Institutions
Appendix B: PCR Project Leader Interview Protocol
Appendix C: Site Visit Protocol
Case Study: Florida Gulf Coast University
Case Study: Indiana University–Purdue University Indianapolis
Case Study: The Ohio State University
Case Study: Portland State University
Case Study: Rio Salado College
Case Study: Riverside Community College
Case Study: Tallahassee Community College
Case Study: The University of Alabama
Case Study: University at Buffalo–SUNY
Case Study: University of Central Florida
Case Study: University of Idaho
Case Study: The University of New Mexico
Case Study: University of Southern Maine
Case Study: The University of Southern Mississippi
Case Study: University of Tennessee, Knoxville
Notes

Preface
By Peter Ewell, Vice President, National Center for Higher Education Management Systems

Raising educational attainment levels in the United States among the country's most underserved citizens—those of low income and those of color—requires that higher education do three things simultaneously. First, more such individuals need to progress, especially through the critical first year of college. Achieving that objective is hard because failure rates among underserved students in so-called gatekeeper courses are alarming. Second, underserved students must not only complete such courses but also effectively master the skills and knowledge that the courses encompass, because most of them are prerequisites for the rest of the undergraduate curriculum. Third, both of these things must occur on a very large scale and at a price that society can afford. Accomplishing these tasks demands radically new ways of thinking about undergraduate education.

This monograph describes one of them. The National Center for Academic Transformation's (NCAT's) project on course redesign is the most extensive demonstration to date of the effectiveness of fusing instructional technology and reconceptualized instructional practices. Its success in raising achievement levels while lowering costs has been documented in many places already. This monograph addresses whether those positive findings can be extended to underserved students.

The course redesign project was not originally configured to answer such questions, but enough participating institutions (15 of 30) enrolled large enough numbers of low-income students and students of color to allow for a second look at the impacts of course redesign on those specific populations. The NCAT research team reexamined statistical results and, when possible, tried to disaggregate them. The team also conducted interviews and made site visits to participating campuses to learn more about these effects. A retrospective look of this kind, by its very nature, cannot establish cause and effect at the individual student level. If the institutional team did not originally collect data on retention and learning outcomes disaggregated by income or race/ethnicity, such data could not be re-created. But the size of the general student impacts attributable to the redesigns, combined with the large numbers of underserved students attending classes at these institutions, provides reasonable evidence that these methods benefit low-income students and students of color in at least equal measure.

The research team also found that certain redesign features were especially effective for particular kinds of students, because not all underserved students are alike. One large component of the at-risk population consists of recent high school graduates who are frequently challenged by skills deficiencies in reading, writing, or mathematics during their first year of college. Often, these students are first-generation college goers not familiar with the mechanics or culture of higher education. The redesign elements that seem to especially benefit such students include high expectations, a requirement that students participate in specific experiences or exercises, and on-demand support services. Another large underserved population consists of working adults returning to higher education or entering it for the first time. In addition to cultural barriers, such students typically commute to class and are unusually challenged by the competing time commitments of work and family. Redesign elements that allow flexibility, around-the-clock access to academic and support services, and the ability to work on one's own appear especially suited to, and valued by, these populations.

Results from all 30 course redesign project institutions prove decisively that these changes can generate significant cost savings over time. A major drawback of typical approaches to addressing the low success rates among underserved students is that the approaches are add-ons to existing programs. Special-purpose academic programming is put in place alongside the regular curriculum, and additional services are tailored to address the problems of these populations. Evidence about the effectiveness of such approaches is mixed, and unquestionably the approaches cost more money. Redesigning introductory courses along the lines described in this monograph offers an alternate route that costs less over time and generates savings that may be reinvested elsewhere to support student success.

Higher rates of course completion—with better or equivalent learning—achieved through affordable approaches are key success factors. Previous work by NCAT demonstrates that these redesigns work for general student populations. The evidence presented here supports the conclusion that they work just as well for underserved students. That evidence also suggests that some redesign elements, if implemented carefully and intentionally, may begin to redress historical gaps in achievement as well.1

INTRODUCTION

Many students who begin postsecondary education drop out before completing a degree. An estimated 60 percent of students at public institutions fail to complete degrees within five years, and half of these students leave during the freshman year. As shown by research by the Policy Center on the First Year of College at Brevard College in North Carolina and others, the first year of college is the most critical to a college student's success and to degree completion. According to the National Center for Education Statistics (NCES), almost half of first-time students who leave their initial institutions by the end of the first year do not return to higher education.

Graduation rates among African-American, Hispanic, Native American, and low-income students are lower than the rates overall. NCES data indicate that one-quarter of freshmen are from low-income backgrounds, almost one-third are nonwhite, and 40 percent are the first in their families to attend college. Such students—often not as academically or socially prepared as others for higher education—are more prone to drop out. Indeed, 45 percent of African-American students and 39 percent of Hispanic students, on average, leave four-year institutions within six years without earning degrees, compared with 33 percent of white students and 26 percent of Asian-American students. Similar gaps exist by income: students from lower-income backgrounds are significantly less likely than students from higher-income backgrounds to go on to earn bachelor's degrees. Although many students who do not complete degrees may have met other personal goals, both educators and policy makers consider these rates to be too low.2

Vincent Tinto's student integration model focuses on what can be done to reverse these trends. Tinto's model is the dominant theoretical perspective on retention and completion and influences most of the current thinking in the higher education community about persistence and graduation. In his 1993 book Leaving College: Rethinking the Causes and Cures of Student Attrition, Tinto emphasizes the importance of the full integration of the student in the social and intellectual life of the institution.3 He differentiates between social integration, measured by such factors as interaction with faculty and participation in extracurricular activities, and academic integration, measured by grades or other indicators of academic achievement, and recommends that institutions foster both types of integration among college students. Tinto's model is designed to help colleges understand why students leave so that institutions can design activities to better serve students' needs and thereby increase retention and graduation rates.

The student integration or engagement model was developed primarily from studies of four-year colleges, with particular emphasis on full-time, traditional-age, residential students. In their Lumina-funded report Pathways to Persistence: An Analysis of Research on Program Effectiveness at Community Colleges, Tom Bailey and Mariana Alfonso suggest that the one place where the engagement model may be most relevant for nontraditional students is in the classroom. This, after all, is where even commuter students interact with faculty and potentially with other students. Designing the classroom experience to promote more-meaningful interaction among students and teachers is a promising strategy.

Successful completion of introductory courses is critical for first-year students, but typical failure rates in these courses contribute heavily to overall institutional dropout rates between the first and second year. Although success rates vary by institutional type and by subject matter, Research I universities commonly cite a 15 percent DFW (drops, failures, and withdrawals) rate in introductory courses. Comprehensive universities have DFW rates ranging from 22 percent to 45 percent in these courses. Community colleges frequently experience DFW rates of 40 percent to 50 percent or more.

Undergraduate enrollments in the United States are concentrated heavily in large-enrollment introductory courses. In fact, just 25 courses generate about half of all student enrollments in community colleges and about a third of enrollments in four-year institutions. The topics of these courses are not surprising and include introductory studies in such disciplines as English, mathematics, psychology, sociology, economics, accounting, biology, and chemistry. In addition to having high rates of academic failure, these courses affect virtually every student who goes to college.

Clearly, making significant improvements in first-year courses can have a major impact on student success and retention. As Bailey and Alfonso have commented: "No program, however well designed, can work in isolation. An excellent developmental or counseling program in a college with generally ineffective teaching may ultimately have no effect on student completion rates."4

THE PROGRAM IN COURSE REDESIGN

Supported by an $8.8-million grant from the Pew Charitable Trusts, the Program in Course Redesign (PCR) was created in April 1999 by the National Center for Academic Transformation (NCAT), formerly housed at Rensselaer Polytechnic Institute, to address these issues. Its purpose was to demonstrate how colleges and universities can redesign their instructional approaches by using technology to achieve quality enhancements as well as cost savings. Selected from hundreds of program applicants in a national competition, 30 institutions each redesigned one of their top 25 large-enrollment, introductory courses. The 30 institutions include research universities, comprehensive universities, private colleges, and community colleges in all regions of the United States.5

The PCR followed a unique three-stage proposal process that required applicants to assess their readiness to participate in the program, develop a plan for improved learning outcomes, and analyze the cost of traditional methods of instruction as well as new methods of instruction utilizing technology. Prospective grant recipients were supported in this process through a series of invitational workshops that taught institutional teams these assessment and planning methodologies and through individual consultations with NCAT staff.

NCAT required each institution to conduct a rigorous evaluation focused on learning outcomes as measured by student performance and achievement. National experts provided consultation and oversight regarding the assessment of learning outcomes to ensure that the results were reliable and valid. The results were astounding. Twenty-five institutions showed significant increases in student learning, with the remaining five showing learning equivalent to traditional formats. Of the 24 that measured retention, 18 showed noticeable increases. Other qualitative outcomes included better student attitudes toward the subject matter and increased student satisfaction with the mode of instruction.

The basic assessment question associated with the PCR was the degree to which improved learning was achieved at lowered cost. Answering this question required comparisons between the learning outcomes of a given course delivered in its traditional and in its redesigned format. This comparison was accomplished by running parallel sections in traditional and redesigned formats or by comparing baseline information from a traditional course with a later offering of the course in redesigned format, looking at whether there were any differences in costs and outcomes.

The degree to which students actually mastered course content was the bottom line. Techniques used to assess student learning included comparing the results of common final examinations; comparing the results of embedded common questions or items in examinations or assignments; collecting samples of student work (papers, lab assignments, problems) and comparing their outcomes according to agreed-upon common faculty standards for scoring or grading; and tracking student records after students completed redesigned courses, looking at (a) proportions satisfactorily completing a downstream course, (b) proportions going on to a second course in the discipline, and (c) grade performances in postrequisite courses.
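To illustrate the last of these techniques, the short Python sketch below computes the proportion of students from traditional versus redesigned sections who satisfactorily completed a downstream course. The grade records and the passing threshold are invented for illustration; they are not PCR data.

    # Hypothetical sketch: tracking downstream success for students who took the
    # traditional versus the redesigned version of a prerequisite course.
    # All records below are invented for illustration; they are not PCR results.

    records = [
        # (format of the introductory course, grade earned in the follow-on course)
        ("traditional", "C"), ("traditional", "F"), ("traditional", "B"),
        ("redesign", "A"), ("redesign", "C"), ("redesign", "B"),
    ]

    PASSING = {"A", "B", "C"}  # assumed definition of "satisfactory completion"

    def success_rate(fmt):
        grades = [g for f, g in records if f == fmt]
        return sum(g in PASSING for g in grades) / len(grades)

    for fmt in ("traditional", "redesign"):
        print(f"{fmt}: {100 * success_rate(fmt):.0f} percent satisfactory completion")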

Before-and-after course costs were analyzed and documented using activity-based costing. NCAT developed a spreadsheet-based course planning tool (CPT) that supported institutions in this process, which involved the following steps: (1) determine all personnel (faculty, adjuncts, teaching assistants, peer tutors, professional staff) costs expressed as an hourly rate; (2) identify the tasks associated with preparing and offering the course in a traditional format and the personnel involved; (3) determine how much time each person involved in preparing and offering the course in a traditional format spends on each of the tasks; (4) repeat steps 1 through 3 for the redesigned format; and (5) enter the data in the CPT. The CPT then automatically calculates the cost of both formats and converts the data to a comparable cost-per-student measure. At the beginning of each project, baseline cost data (traditional course costs and projected redesigned course costs) were collected, and actual redesigned course costs were collected at the end.
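As a rough illustration of the arithmetic the CPT automates, the following Python sketch applies activity-based costing to a traditional and a redesigned format and converts each total to a cost per student. Every rate, hour figure, and enrollment below is a made-up example, not data from any PCR institution.

    # Minimal sketch of the activity-based costing the CPT automates.
    # All rates, hours, and enrollments are hypothetical examples.

    def course_cost(personnel, enrollment):
        """personnel: list of (role, hourly_rate, hours_on_course) tuples."""
        total = sum(rate * hours for _, rate, hours in personnel)
        return total, total / enrollment

    traditional = [              # (role, hourly rate, hours per term)
        ("faculty", 60.0, 450),
        ("GTA", 20.0, 600),
    ]
    redesigned = [
        ("faculty", 60.0, 250),    # shared development reduces duplicated effort
        ("GTA", 20.0, 300),
        ("peer tutor", 9.0, 500),  # less expensive labor where appropriate
    ]

    enrollment = 1000
    trad_total, trad_per_student = course_cost(traditional, enrollment)
    re_total, re_per_student = course_cost(redesigned, enrollment)

    print(f"Traditional: ${trad_total:,.0f} total, ${trad_per_student:.2f} per student")
    print(f"Redesigned:  ${re_total:,.0f} total, ${re_per_student:.2f} per student")
    print(f"Savings: {100 * (1 - re_total / trad_total):.0f} percent")

Comparing the two per-student figures in this way is what allows baseline (traditional and projected) costs collected at the start of a project to be checked against actual redesigned costs at the end.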

All 30 institutions reduced costs by 37 percent on average, ranging from 15 percent to 77 percent. Collectively, the 30 redesigned courses affect more than 50,000 students nationwide and produce a savings of $3.1 million in operating expenses each year.

A Variety of Models

The PCR has created many different models of how to restructure large-enrollment, introductory courses so as to improve learning as well as to effect cost savings. In contrast to the contention that only certain kinds of institutions can accomplish these goals—and in only one way—the program has demonstrated that many approaches can achieve positive results. And to counter the belief that only courses in a restricted subset of disciplines—science or math, for instance—can be redesigned effectively, the program has produced successful examples in many disciplines. Here is a breakdown of the 30 participating institutions by curricular area:

HUMANITIES (6)

  • English composition: Brigham Young University, Tallahassee Community College
  • Spanish: Portland State University, The University of Tennessee, Knoxville
  • Fine arts: Florida Gulf Coast University
  • World literature: The University of Southern Mississippi


QUANTITATIVE (13)

  • Mathematics: Iowa State University, Northern Arizona University, Rio Salado College, Riverside Community College, The University of Alabama, University of Idaho, Virginia Polytechnic Institute and State University
  • Statistics: Carnegie Mellon University, The Ohio State University, Pennsylvania State University, University of Illinois at Urbana–Champaign
  • Computer programming: Drexel University, University at Buffalo–SUNY

SOCIAL SCIENCE (6)

  • Psychology: California State Polytechnic University, Pomona; University of Dayton; The University of New Mexico; University of Southern Maine
  • Sociology: Indiana University–Purdue University Indianapolis
  • American government: University of Central Florida

SCIENCE (5)

  • Biology: Fairfield University, University of Massachusetts Amherst
  • Chemistry: University of Iowa, University of Wisconsin–Madison
  • Astronomy: University of Colorado at Boulder

What did these projects have in common? To one degree or another, all 30 projects shared the following six characteristics:

1. Whole-course redesign. In each case, the whole course—rather than a single class or section—was the target of redesign. Faculty began the design process by analyzing the amount of time that each person involved in the course spent on each kind of activity—a process that often revealed duplication of effort among faculty. By sharing responsibility for both course development and course delivery, faculty members saved substantial amounts of time while achieving greater course consistency.

2. Active learning. All of the redesign projects made the teaching-learning enterprise significantly more active and more learner centered. Lectures were replaced with a variety of learning resources that move students from a passive, note-taking role to an active, learning orientation. As one math professor put it, "Students learn math by doing math, not by listening to someone talk about doing math."

3. Computer-based learning resources. Instructional software and other Web-based learning resources assumed important roles in engaging students with course content. Resources included tutorials, exercises, and low-stakes quizzes that provided frequent practice, feedback, and reinforcement of course concepts.

4. Mastery learning. The redesign projects added greater flexibility for when students could engage with a course, but the redesigned courses were not self-paced. Rather than depending on class meetings, the redesigns organized student pacing and progress around mastery of specific learning objectives, frequently in modular format, according to scheduled milestones for completion.

5. On-demand help. An expanded support system enabled students to receive assistance from a variety of different people. Helping students feel that they are part of a learning community is critical to persistence, learning, and satisfaction. Many projects replaced lecture time with individual and small-group activities that took place either in computer labs—staffed by faculty, graduate teaching assistants (GTAs), and/or peer tutors—or online, enabling students to have more one-on-one assistance.

6. Alternative staffing. By constructing support systems consisting of various kinds of instructional personnel, the projects applied the right level of human intervention to particular student problems. Not all tasks associated with a course require highly trained, expert faculty. By replacing expensive labor (faculty and graduate students) with relatively inexpensive labor (undergraduate peer mentors and course assistants) where appropriate, the projects increased the person-hours devoted to the course and freed faculty to concentrate on academic rather than logistical tasks.

ASSESSING THE IMPACT OF COURSE REDESIGN ON UNDERSERVED STUDENTS

Although the PCR was directed at a broad first-year student population at all types of institutions, NCAT knows that its redesign techniques have been particularly effective with underserved students: low-income students, students of color, and adults. For example,

  • The University of New Mexico (UNM) leads U.S. research universities in student diversity with an undergraduate minority student population of approximately 47 percent (33 percent Hispanic, 5 percent Native American, and 9 percent other). UNM students are primarily commuters who also work 30 or more hours per week. By redesigning its introductory psychology course, UNM reduced its DFW rate from 42 percent to 18 percent. The percentage of students who received a C or higher rose from 60 percent to 76.5 percent, and there were more A (34 percent) and B (31 percent) grades than were recorded in previous semesters.
  • Rio Salado College, one of the 10 community colleges in the Maricopa County Community College District, has been delivering distance education for the past 20 years with a focus on adult learners. By redesigning four of its online introductory math courses, Rio Salado increased course completion rates from 59 percent to 65 percent.
  • The University of Idaho increased overall student success rates in mathematics. Success rates in intermediate algebra for Hispanic students who are part of the College Assistance Migrant Program increased from 70 percent to 80 percent and surpassed the success rate for the algebra population as a whole.
  • Two urban universities that serve a high percentage of adult learners—Florida Gulf Coast University and Indiana University–Purdue University Indianapolis (IUPUI)—respectively reduced their DFW rates from 45 percent to 11 percent in fine arts and from 39 percent to 25 percent in introductory sociology.

Supported by a grant from Lumina Foundation for Education, NCAT conducted an in-depth study to determine what specific techniques among those used by the PCR projects had led to increased success rates among underserved students. This report builds on our initial research and documents the impact of course redesign on the target student groups. Since the redesign projects varied considerably in their approaches, we have identified those techniques that can be adopted by other institutions.

Among the questions we sought to answer were: Are these techniques similar to or different from those used with traditional-age, better-prepared students? What works best with the target population? How can other institutions with a focus on at-risk students use what we have learned? How can what we have learned contribute to the public discourse on higher education access and success among underserved populations?

The initial targets of the study were 25 of the original 30 PCR institutions. We eliminated the five institutions whose redesigns showed no significant difference in student learning or retention: Brigham Young, Northern Arizona, UC Boulder, UIUC, and UW–Madison.

We then examined enrollment patterns among the remaining 25 institutions with regard to numbers of low-income students, African-American and Hispanic students, and adult students. That process eliminated six additional institutions from consideration because of insufficient representation of the target student population compared with the national average: Iowa State, Penn State, Dayton, Iowa, Massachusetts Amherst, and Virginia Tech. Appendix A contains a breakdown of the 19 remaining institutions and their respective student populations. After initial data mining, four other institutions were eliminated: Cal Poly Pomona, Carnegie Mellon, Drexel, and Fairfield.

The remaining 15 institutions were the subjects of in-depth follow-up and focused interviews. Each course redesign project enrolled a high percentage of one or more of the target student groups; each of them increased student learning and/or student success and reduced instructional costs. The institutions and the courses they redesigned are:

  • Florida Gulf Coast University (FGCU): Fine Arts
  • Indiana University–Purdue University Indianapolis (IUPUI): Introduction to Sociology
  • The Ohio State University (OSU): Statistics
  • Portland State University (PSU): Introductory Spanish
  • Rio Salado College: Introductory Algebra
  • Riverside Community College (RCC): Elementary Algebra
  • Tallahassee Community College (TCC): College Composition
  • The University of Alabama (UA): Intermediate Algebra
  • University at Buffalo–SUNY (UB): Computer Literacy
  • University of Central Florida (UCF): American National Government
  • University of Idaho (UI): Pre-Calculus Mathematics
  • The University of New Mexico (UNM): General Psychology
  • University of Southern Maine (USM): Introductory Psychology
  • The University of Southern Mississippi: World Literature
  • The University of Tennessee, Knoxville (UTK): Intermediate Spanish Transition

Case studies of each redesign project are included at the end of this report.

We then established an institutional profile for each of the 15 institutions that described and analyzed patterns of individual approaches and results against the backdrop of projectwide trends. The profiles included a set of categories that illustrated both similarities and differences among the different redesign techniques used by the 15 institutions. Through mining of the PCR's existing data, we were able to document the techniques that contributed to increased success among underserved students and to identify areas needing further investigation.

After the institutional profiles were created, we established an interview protocol and conducted telephone interviews with project staff at the 15 institutions. The interview protocol is included in Appendix B. The faculty and administrators who were interviewed provided additional data related to the success of the underserved groups of students. In some cases, institutions provided additional quantitative data; in other cases, the faculty and staff provided qualitative observations about the techniques that were particularly supportive and successful with the target group of students. During these interviews, NCAT staff also reviewed the data collected from the national PCR in comparison with those at that particular institution and discussed possible discrepancies to ascertain whether or not particular techniques not originally cited were relevant.

Based on the information gathered via the institutional profiles and the telephone interviews, a subset of six institutions was selected for site visits by NCAT staff. The primary criterion for selecting an institution for a site visit was that it had demonstrated particularly successful learning outcomes—as a result of the redesign—with a high percentage of at least one of the three target groups of underserved students. The goal of each site visit was twofold: (1) to meet with students who had participated in the redesign in order to gain insight regarding the students' perception of their experience and (2) to meet with a variety of faculty, professional staff, and administrators to follow up on what had been identified during initial data mining and telephone interviews.

Site visits were made to Florida Gulf Coast University, Indiana University–Purdue University Indianapolis, Rio Salado College, Riverside Community College, Tallahassee Community College, and the University of Southern Maine. A copy of the site visit interview protocol is included in Appendix C. At each institution, NCAT staff met with faculty members (both full-time and part-time), technology and student service personnel if appropriate, and students from the underserved groups. During the interviews and visits, the institution's profile provided the basis for discussion with faculty and administrators and for gaining further insight into the effectiveness of their particular approaches and the relation of those approaches to student learning outcomes.

RESULTS: IMPROVED LEARNING AND INCREASED RETENTION

Fourteen of the 15 projects reported improved learning outcomes as measured by pre- and postassessments that examined key course concepts. Data analysis and interviews with institutional representatives confirmed that the overall institutional breakdown of underserved students was reflected in the redesigned course—with a few minor variations. As several project directors commented, these large-enrollment, introductory courses are generally required and basically mirror the enrollment of the institution. Among the findings were the following:

  • FGCU redesign students in fine arts succeeded at a much higher level than traditional students on module exam objective questions, which tested content knowledge (85 percent versus 72 percent), and on module exam short essays, which assessed critical thinking skills: the percentage of Ds and Fs dropped from 21 percent to 7 percent. Thirty-seven percent of FGCU students are older than the age of 25, and 33 percent are part-time.

  • At IUPUI, students in redesigned sociology sections had significantly higher (0.10-level) grades. IUPUI students roughly reflect the national average among African-American students and exceed the national average among adult students: 37 percent versus 20 percent for public four-year institutions.
  • At OSU, students in the redesigned statistics course had greater success on common exams than traditional daytime students and about the same scores as students in the evening class, which had smaller class sizes and older students and had previously outperformed the daytime class. Ten percent of OSU freshmen are African-American, which reflects the national average for four-year public institutions.
  • At PSU, the redesign of the first-year Spanish sequence focused on de-emphasizing rote grammar and improving oral proficiency. End-of-year oral exam scores showed improvement: redesign = 87.3 percent, traditional course = 85.8 percent. PSU has a high percentage of students older than the age of 25 (40 percent) and of low-income students (30 percent).
  • RCC redesign students in math had significantly higher scores than traditional students in four of six content areas on a common final exam. At RCC, 32 percent of the freshmen are Hispanic and 13 percent are African-American.
  • TCC students in the redesigned composition course scored significantly higher (P = 0.04) on final essays, with an average score of 8.34 compared with 7.33 for traditional students. Success rates of redesign students in the second-level English course increased (79.3 percent for redesign compared with 76.1 percent for traditional), indicating that the redesign students were better prepared. At TCC, 41 percent of the students are low income and 34 percent of the freshmen are African-American.
  • At UA, where 14 percent of the undergraduates are African-American, the sum of A and B grades based on comparable examinations and assignments was significantly higher for the redesigned math course than for the traditional course. In subsequent math courses, redesign students outperformed traditional students.
  • At UB, where 35 percent of the students are low income, the redesign of computer literacy resulted in an increase in the percentage of students earning a grade of A– or higher, moving from 37 percent to 56 percent. The mean grade earned in the course increased by one-third of a letter grade, from a C+ to a B–.
  • At UCF, American government students in the traditional format posted a 1.6-point improvement on a content examination, whereas students in the redesigned course posted a mean improvement of 2.9 points, almost double that amount. African-American, Hispanic, and adult students at UCF roughly reflect national averages for each group.
  • The percentage of students at UI earning A and B grades in math based on comparable examinations and assignments was higher in the redesigned course; the percentage of Ds and Fs was lower. Thirty-eight percent of the students at UI are low income.
  • The percentage of psychology redesign students at UNM who received a grade of C or higher was 77 percent for fall 2002 and 74 percent for spring 2003 versus an average of 61 percent for the traditional course. In addition, there were more grades of A (fall 2002 = 34 percent, spring 2003 = 31 percent) than were found in traditionally taught sections (18 percent). Thirty-four percent of the freshmen at UNM are Hispanic; nationally, only 8 percent of freshmen at public four-year institutions are Hispanic.
  • At USM, where 37 percent of the students are older than the age of 25 and 50 percent of the students are low income, the psychology redesign resulted in significant improvements in overall understanding of course content as measured by pre- and postcourse assessment of important concepts.
  • At Southern Miss, in the area of reading comprehension, the percentage of students scoring C or better climbed from 68 percent in the traditional literature course to 88 percent in the redesign. In the area of writing skills, the percentage of students scoring C or better increased from 61 percent in the traditional course to 77 percent in the redesign. The latter gain was particularly significant because of the emphasis placed on writing in the redesigned course, which accounted for 40 percent of the total grade.
  • At UTK, where 41 percent of the students are low income, oral skills among students in the redesigned Spanish course were significantly better than among traditional students.

Eleven of the 15 projects showed improvement in course completion/retention rates. Among the findings were the following:

  • IUPUI reduced the rate of DFWs from 38.9 percent to 24.8 percent.
  • At OSU, withdrawals were reduced by 3 percent, failures by 4 percent, and incompletes by 1 percent. As a result, 248 more students successfully completed the redesigned course compared with the traditional course.
  • Rio Salado increased retention rates from 59 percent to 64.8 percent. Rio Salado's mission focuses on serving working adult students; 94 percent of students are part-time, and 46 percent are older than the age of 25.
  • At TCC, students in redesigned sections had a 68.4 percent success rate compared with 60.7 percent for traditional sections. Success rates were higher for all groups of students regardless of ethnicity, gender, disability, or original placement. The overall success rate for all composition students was 62 percent for the 2002/03 year compared with 56 percent for the 1999/2000 year prior to redesign.
  • At UA, the average success rate (grades of C– or better) for the redesigned course during fall semesters went from about 44 percent prior to the redesign to 80 percent in 2003. Females were consistently more successful than males in the redesigned course, as were African-Americans when compared with white students.
  • At UB, the percentage of students receiving a C or better increased from 74 percent to 78 percent.
  • UCF increased its course completion rate by 7 percent.
  • At UI, the percentage of students earning a D or failing was cut by more than half. Hispanic students, who have historically been unsuccessful in math courses, had an 80 percent pass rate in algebra.
  • At UNM, 41 percent of traditional students received a C– or below, including drops, withdrawals, and incompletes. This percentage was reduced in the redesigned course to 23 percent in fall 2002 and to 26 percent in spring 2003.
  • At USM, a smaller percentage of students received failing grades, moving from 28 percent in traditional sections to 19 percent in the redesigned course.
  • In the traditional course at Southern Miss, faculty-taught sections typically retained about 75 percent of students while adjunct- and teaching-assistant-taught sections retained 85 percent. In the redesign, the retention rate was 87 percent. The rate of D and F grades dropped from 37 percent in the traditional course to 27 percent in the redesigned course. DFW rates dropped from 26 percent in the traditional course to 22 percent in the redesign.

QUALITY IMPROVEMENT TECHNIQUES

What pedagogical techniques were most effective in improving learning and in increasing success for all students and for underserved groups in particular? Did a particular strategy work better with African-American students, for example, than with the class in general? Data analysis and interviews with institutional representatives pointed to the same conclusion: good pedagogy worked equally well with all student groups. As one project leader commented, "all boats rose." The most-prominent techniques for the 15 institutions—indeed, for all 30 in the PCR—were the following:

Online tutorials. In the redesigned courses, instructional software and other Web-based resources that support greater student engagement with the material replace standard presentation formats. Such resources may include interactive tutorials and exercises that give students needed practice; computerized or digitally recorded presentations and demonstrations; reading materials developed by instructors or in assigned textbooks; examples and exercises in the student's field of interest; links to other relevant online materials; and individual and group laboratory assignments.

At PSU and UTK, Spanish listening comprehension and reading comprehension exercises and grammar drills were delivered online, allowing class interaction to focus on student-student oral communication. At TCC, easy online access to materials and resources increased learner time on task in English composition. Grammar review sites and quizzes provided individualized remediation based on diagnostic information. Students had access to textbook companion Web site materials that assisted with writing principles, mechanics, and reading comprehension. They could access information 24-7 as often as they needed. By conducting some instruction online instead of in class, faculty increased the amount of time spent in class on the writing process.

RCC, UA, and UI based their redesigned mathematics courses on MyMathLab, a commercial software package. The availability of this software allowed each institution to avoid spending funds on software development and to direct all resources toward supporting student learning. Using instructional software allows much of the time previously spent on instruction about math concepts to be transferred to the technology and eliminates lecture time previously used for reviewing homework. The software supports verbal, visual, and discovery-based learning styles and can be accessed anytime at home or in a lab. MyMathLab allows instructors to see the work that students are actually doing and to easily monitor their progress. Students found the software easy to use and achieved a comfort level in a short amount of time. Students especially liked the instant feedback they received when working problems and the Guided Solutions available when their answers were incorrect.

Continuous assessment and feedback. Shifting the traditional assessment approach in large-enrollment, introductory courses, which typically utilize only midterm and final examinations, toward continuous assessment is an essential pedagogical strategy. Most of the projects included automated (computer-based) assessment and feedback in their redesigns in fields as diverse as psychology, mathematics, Spanish, English, statistics, and fine arts. Automating assessment and feedback facilitates repetition (student practice) and increases both the frequency and specificity of feedback to students—pedagogical techniques that research has consistently proved enhance learning.

Students were tested regularly on assigned readings and homework via quizzes that probed preparedness and conceptual understanding. The projects used quizzes from commercial sources as well as those they created themselves. These low-stakes quizzes motivated students to keep on top of course material and to structure how they studied. Online quizzing encouraged them to spend more time on task with a do-it-till-you-get-it-right approach: Students were allowed to take quizzes as many times as they wanted until they mastered the material. In mathematics, student learning is related directly to the amount of time students spend working problems. Although homework is assigned in most courses, usually instructors are not able to grade more than a small part of it, and students do not take it seriously. At UA and UI, frequent homework assignments replaced lectures and formed an important part of students' final grade. Computer grading of all exercises ensured that every assignment had been counted and that students received immediate feedback.

Both FGCU and UNM discovered that requiring quizzes was essential to increased student performance. To determine whether quizzes that were mandatory—that is, required for course credit—or voluntary—no course credit—would differentially affect exam and grade performance, UNM faculty conducted an experiment. Students in one psychology section received course points for completion of weekly online mastery quizzes; students in the other section were encouraged to take the mastery quizzes but received no course points for doing so. On in-class exams, students who were required to complete quizzes for credit always outperformed students for whom taking quizzes was voluntary. Students took more quizzes, scored higher, and spent longer on quizzes when course credit was at stake than did students in the section whose quizzes were not linked to credit. In contrast, when credit was not a consequence, relatively few students successfully completed quizzes, and some students chose not to take quizzes at all. FGCU had similar findings in its fine arts redesign.

Quizzes also provide powerful formative feedback for faculty members. Faculty can quickly detect areas where students are not grasping key concepts, thereby enabling timely corrective intervention. Since students are required to complete quizzes before class, they are better prepared for higher-level activities once they get there. Consequently, the role of the instructor shifts from one of introducing basic material to reviewing and expanding what students have already been doing.

Increased interaction among students. Many redesign projects took advantage of the Internet's ability to support useful and convenient opportunities for discussion among students. Students in large lecture classes tend to be passive recipients of information, and student-to-student interaction is inhibited by class size. Through smaller discussion forums established online, students can participate actively. UCF and IUPUI created small online discussion groups whereby students could easily contact one another. Students benefited from participating in the informal learning communities that were created in this manner. Software allowed instructors to monitor the frequency and quality of student contributions to these discussions more readily and carefully than would be the case in a crowded classroom.

At FGCU, fine arts students completed online discussions in which they analyzed sample short essays in preparation for writing their own short essays. Working in peer learning teams of six students each, students had to determine which essays were strong and which were weak and explain why. The online discussions increased interaction among students and developed students' critical thinking skills. At PSU, different forms of computer-mediated communication (CMC) were used according to their capacities as revealed by research: synchronous CMC (chat) resembles interpersonal oral discussion, and asynchronous CMC (message boards) resembles presentational, written discourse. Students were required to work in chat groups to learn about each other and to report this information on message boards. The amount and quality of information exchanged (communicative use of Spanish) exceeded those of most face-to-face discussions. The depth and extension of communication strengthened both student-student relations and student-teacher relations.

Individualized, on-demand support. A support system, available around the clock, enables students to receive help from a variety of sources. Helping students feel they are part of a learning community is critical to persistence, learning, and satisfaction. Active mentorship of this kind can come from a variety of sources, allowing students to interact with the person who can provide the best help for their specific problem.

TCC English composition students were able to submit midstage drafts to tutors at SMARTHINKING—a commercial, online tutoring service—and/or to TCC e-responders. These 24-7 services provided students with prompt, constructive feedback on writing assignments. The fast feedback and online assistance allowed students to make appropriate changes in their drafts, thereby improving the quality of student writing. OSU established a help room that allowed students in statistics to work collaboratively on problems or concepts that presented difficulty. The help room was staffed with faculty, GTAs, and adjuncts who held their office hours there, thus making help available to students throughout the day.

Rather than supplementing class time with help, many of the redesign projects replaced lecture time with individual and small-group activities that took place in computer labs staffed by faculty, GTAs, and/or peer tutors. In several instances, increasing lab hours enabled students to get access to more one-on-one assistance. UA and UI have moved away from the three-contact-hours-per-week norm and significantly expanded the amount of instructional assistance available to students: UA's Math Technology Learning Center (MTLC) is open 71 hours per week, and UI's Polya Math Center is open 86 hours per week.

Most students commented that they had taken advantage of the assistance provided by the college or university and that they liked the ability to get help when they needed it. RCC students said they welcomed the individualized assistance that was available and recognized that they would have had much more difficulty learning math without the combination of personal and software assistance. Even the few students who objected to having to come to the lab to work readily indicated that they liked being able to get help when they needed it.

Undergraduate learning assistants (ULAs). Several of the universities employed ULAs in lieu of GTAs and found that ULAs were better than GTAs at assisting their peers because of their understanding of the course content, their superior communication skills, and their awareness of the many misconceptions that undergraduate students often hold. UA and UI found that ULAs did an excellent job of assisting their peers. UA's initial plan was to staff the MTLC primarily with instructors and to use graduate students and upper-level, undergraduate students for tutorial support. It soon became apparent that the undergraduate students were as effective as the graduate students in providing tutorial support, thus eliminating the need for graduate students. At UI, during a weekly, one-hour mandatory tutor training session, undergraduate tutors were given an overview of the upcoming week's material and the homework exercises that typically give students problems. At that session, tutors relayed to the course coordinator important information about student difficulties so that the information could be properly relayed to leaders of student focus groups. These training sessions helped maintain consistency in instruction, and the undergraduate tutors played an important role.

In its redesign, UNM incorporated ULAs recruited from students who had received As in the previous semester. The role of the ULAs was to work with students who scored 75 percent or less on the first exam—administered at the end of the third week—in weekly 50-minute studio sessions for the remainder of the semester. During studios, students worked on multimedia course material, took quizzes, learned memorization strategies, and discussed their course performance with the ULAs. The more studios students attended, the better their course performance.

Structural supports that ensure student engagement and progress. Each redesign model added greater flexibility in the times and places of student engagement with the course. This did not mean, however, that the redesign projects were self-paced. Rather than depending on class meetings, the redesigns ensured student pacing and progress by requiring students to master specific learning objectives, frequently in modular format, according to scheduled milestones. Projects quickly discovered that students need structure—especially first-year students and especially in disciplines that may be required rather than chosen—and that most students simply will not make it in a totally self-paced format. Students need a concrete learning plan with specific mastery components and milestones of achievement, especially in more-flexible learning environments.

RCC, UA, and UI required students to spend a minimum amount of time in learning labs and to attend group meetings to ensure that students spent sufficient time on task. In spite of such attendance requirements, some students did not spend enough time in the lab to meet learning objectives and needed further intervention. At UA, student hours in the lab were tabulated weekly to ensure that students invested adequate time in the course. An automated e-mail system was used to reward students who were meeting requirements and to encourage those who were falling behind. In response to student requests for more structure, the UI team created a weekly task list—a breakdown of the week's assignment that showed precisely where to find the information that pertained to each specific problem. Instructors were able to use the task list to help each student devise a detailed study plan for the upcoming week. The task lists were Web-based and had links to all of the necessary online lessons and to hints and other supplemental material providing more instruction.

Another approach was to establish some form of early alert intervention system—a kind of class-management-by-exception process—whereby baseline performance standards were set and those who were falling too far behind were contacted. At UNM, for example, students who scored 75 percent or less on the first exam were required to attend a weekly studio for the remainder of the semester as described earlier.
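A minimal Python sketch of such an early-alert check is shown below. The thresholds echo the examples above (a 75 percent first-exam cutoff and a weekly lab-hours minimum), but the specific numbers, the student records, and the send_message helper are illustrative assumptions rather than any institution's actual system.

    # Hypothetical early-alert sketch: flag students who are falling behind and
    # send them a message. Thresholds, data, and send_message are assumptions.

    EXAM_CUTOFF = 75      # percent on the first exam (cf. the UNM studio trigger)
    MIN_LAB_HOURS = 3.0   # required lab hours per week (cf. the UA weekly tabulation)

    students = [
        {"name": "Student A", "first_exam": 82, "lab_hours_this_week": 3.5},
        {"name": "Student B", "first_exam": 68, "lab_hours_this_week": 1.0},
    ]

    def send_message(name, text):
        # Stand-in for a real e-mail call; just prints for illustration.
        print(f"To {name}: {text}")

    for s in students:
        on_track = True
        if s["first_exam"] <= EXAM_CUTOFF:
            send_message(s["name"], "Please attend the required weekly studio session.")
            on_track = False
        if s["lab_hours_this_week"] < MIN_LAB_HOURS:
            send_message(s["name"], "You are below this week's required lab hours.")
            on_track = False
        if on_track:
            send_message(s["name"], "Nice work: you are meeting this week's requirements.")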

People who are knowledgeable about proven pedagogies that improve student learning will find nothing surprising in the aforementioned list. Among the well-accepted Seven Principles for Good Practice in Undergraduate Education developed by Arthur W. Chickering and Zelda F. Gamson in 1987 are such items as "encourage active learning," "give prompt feedback," "encourage cooperation among students," and "emphasize time on task." Good pedagogy in itself has nothing to do with technology, and we've known about good pedagogy for years. What is significant about the faculty involved in these redesigns is that they were able to incorporate good pedagogical practice into courses with very large numbers of students—a task that would have been impossible without technology.

To illustrate, in the traditional general chemistry course at the University of Iowa, one of the 15 PCR institutions not included in this study, four GTAs used to be responsible for grading more than 16,000 homework assignments each term. Because of the large number of assignments, GTAs could only spot-grade and return a composite score to students. By automating the homework process through redesign, every problem is graded and students receive specific feedback on their performance. This in turn leads to more time on task and higher levels of learning and releases the GTAs to perform other duties. Applying technology is not beneficial without good pedagogy. But technology is essential to move good pedagogical practice to scale, where it can affect large numbers of students.


THE ACHIEVEMENT GAP

For those concerned about increasing the success levels of underserved students, the good news is that these 15 institutions, which enroll large numbers of the target student populations, increased learning and success. The bad news is that while "all boats rose," the achievement gap among some groups of students remained. At OSU, for example, the grades of African-American students improved about the same amount as those of other students under the redesigned model. Because the grades of African-American students were lower than those of white students before the redesign, that gap (about 5 percent) continued after the redesign. At Southern Miss, African-American and white students alike generally earned grades in the redesigned course that were about one letter grade lower than their grade point averages (GPAs). African-American students had an average GPA of B and a World Literature average of C; white students had an average GPA of B+ and a World Literature average of C+. The DFW rate was about 13 percent—virtually identical in the two populations. This phenomenon generally occurred across all projects, with the exception of The University of Alabama.

It is important to remember that these redesigns were aimed at students in the course in general rather than at underserved students in particular. What have we learned about closing the achievement gap? We know that student behavior in the course not only matters but also can eliminate differences among groups. At IUPUI, for example, nonwhite students had lower grades than white students on biweekly quizzes and papers. However, when participation in online forums, as measured by the number of log-ins and forum grades, was considered, there was no difference. Thus, participation in forums was especially important in eliminating minority-status disparities. Both the number of log-ins into the online system and the forum grade were positively associated with better grades overall.

Clearly, a key to increasing student success is to increase the amount of time students spend studying for the course. Faculty and students alike involved in the redesign projects recognize the importance of time on task and acknowledge that students are spending more time on task in the redesigned courses when compared with traditional formats. At USM, for example, students in redesigned sections reported spending more time studying for Introductory Psychology than they did for other introductory classes and for traditionally taught sections (typically three to five hours per week in contrast to one to three hours). This difference was highly significant (.001 level).

One USM student commented that while she found the ability to take mastery quizzes multiple times a very useful study technique, some of the other students did not like the quizzes because they found them time-consuming. She believes the technique really works, but she understands that it is necessary to put in the time. Rio Salado students also said that they generally found the redesigned environment more demanding than the traditional face-to-face format. They believe they worked harder than they would have if they had been taking the course in a classroom, but they also believe they were more engaged with the subject matter and, consequently, learning more. These observations were echoed by TCC students, who said that while it does seem to take more effort to learn in the redesigned format, there is no reason to fail if one tries and does the work.

What lessons can be drawn from the redesign projects for those who wish to target students who may be first-generation college students and/or less prepared to engage in college study? Since we know that spending adequate time on task closes the achievement gap, the key is to find ways of ensuring that students are engaged in study. At UA, making sure that students spent sufficient time on task was a high priority. The combination of required participation in the MTLC, where students received help on demand; required weekly class meetings; and an early intervention system that identified students who were having difficulty led to increased levels of student success. In fact, the success rate (grades of C– or better) for African-American freshmen was substantially higher than for white freshmen. In fall 2000, 71.4 percent of African-American freshmen were successful versus 51.8 percent of white freshmen; in fall 2001, it was 70 percent versus 65.3 percent. At the same time, African-American freshmen were less prepared when they entered the course. In fall 2001, for example, the average ACT score in math was 20 for white freshmen and 18.7 for African-Americans. The mean score on a math placement exam was 230 for white freshmen and 208 for African-Americans; 20 percent of white freshmen scored less than 200 versus 41 percent of African-Americans.

In addition to UA, many of the projects required participation or attendance, as described earlier in the discussion of the structural support technique. While effective with many students, requiring participation has the drawback that some students simply ignore the requirement. Another effective way to engage students in spending time on task is to create student learning teams within the larger course structure. FGCU, for example, placed students into cohort groups of 60 and, within these groups, into peer learning teams of six students each. Learning teams engaged in Web Board discussions that required students to analyze two short essays in preparation for producing their own short essays. The Web Board discussions increased interaction among students, created an atmosphere of active learning, and developed students' critical thinking skills. Students reported that they felt like they were "in a class of six."

At IUPUI, students reported that the online learning teams "made the course seem smaller." Instructors noticed that when one or two students wrote more or raised a controversial issue in a posting, it often had a synergistic effect on the team. It prompted students to invest more time in exploring the issues in their forums and resulted in an overall better performance by the team on the in-class activities. Anecdotal evidence suggested that the learning teams and associated online homework assignments contributed significantly to higher levels of student engagement in the class and in the course material. The faculty believe that online communication among small groups of students within large classes fostered more-rapid development of social cohesiveness and more-frequent substantive course-related interactions than in the traditional large lectures or even in in-class collaborative learning activities.

At Fairfield University, one of the 15 PCR institutions not included in this study, students worked in teams six times over the course of the term. The use of computer-based exercises during class meetings forced students to work in teams of two or three. According to the Fairfield faculty, Hispanic students previously had not integrated well with others in the class. Because the redesign forced students to work together, Hispanic students seemed to be opening up—meeting more students and widening their circle of study partners. They seemed to be developing more connections, more friends, and more contacts in their major.

At PSU, interaction among the students online was perceived as extremely valuable. Students learned quickly whom they could depend on and whom they could not. They self-selected into groups and kept these through the final project. TCC English composition students liked the opportunity to work online and to work with others in the class. They indicated that sending their writing via e-mail in a small group did not seem as public as talking in a larger class. Being able to discuss their writing with others helped increase their confidence as well as their actual ability to write well.

Although there is plenty of literature showing that collaborative learning can be very effective, it does not follow that students will engage in the practice automatically. A few will, but many students need prodding to overcome their ingrained habit of studying alone—perhaps because they fear displaying their lack of understanding to their classmates. One of the 15 PCR institutions not included in this study—the University of Colorado at Boulder—divided its large, 220-student introductory astronomy class into small learning teams of 10 to 15 students. To ensure that members of the learning teams actually worked together, 40 percent of a student's score in the course was based not on the student's individual performance but on the team's performance. (The remaining 60 percent was based on the student's performance on quizzes and examinations.) Thus, every student on a team had an incentive to help every other student prepare good written and oral answers to discussion questions and to complete collaborative homework projects.

Members of the learning teams were permitted to divide the cumulative team score among themselves as they saw fit. Each team member rated his or her teammates online—not on ability but on performance. Each student could see his or her average performance rating by the rest of the team (but not the ratings by individuals) and could compare that rating with the average rating of all members of the team. Team scores were divided among the members according to a simple algorithm based on these ratings. The system worked remarkably well. Before posting the team ratings, the instructor asked team coaches whether the students had rated each other fairly, and 90 percent of the time the coaches said that the students' mutual ratings conformed almost exactly to their own perceptions of the students' performance. The students perceived the system as fair. Since the students within a learning team knew each other personally, they could and did exert powerful peer pressure to perform.
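The report describes the Colorado score-splitting rule only as "a simple algorithm based on these ratings," so its details are unspecified. The following is a minimal sketch, assuming one plausible rule: dividing the cumulative team score in proportion to each member's average peer rating. The function name and data are illustrative, not Colorado's actual system.

```python
# Hypothetical sketch of a rating-weighted split of a team score.
# Not the University of Colorado's actual algorithm, which the report does not specify.

def split_team_score(team_score: float, peer_ratings: dict[str, list[float]]) -> dict[str, float]:
    """Divide a cumulative team score among members in proportion to each
    member's average rating from teammates (assumed weighting rule)."""
    averages = {name: sum(r) / len(r) for name, r in peer_ratings.items()}
    total = sum(averages.values())
    return {name: round(team_score * avg / total, 1) for name, avg in averages.items()}

# Example: a 400-point cumulative team score split among four members,
# each rated by the other three teammates.
ratings = {
    "ana":   [9, 8, 10],
    "ben":   [7, 7, 8],
    "carla": [10, 9, 9],
    "dev":   [6, 7, 6],
}
print(split_team_score(400, ratings))
```

Under a rule like this, a member rated well below the team average receives a smaller share of the team score, which is consistent with the peer-pressure effect described above.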


COST REDUCTION TECHNIQUES

In order to be adopted by large numbers of institutions, good ideas must be affordable. Far too many good ideas that can increase student success and retention are viewed by the higher education community as simply not possible to implement given budget constraints. The PCR has shown how to make significant gains in student success while reducing the cost of doing so—something sorely needed by all institutions.

There are a variety of ways to reduce instructional costs, and therefore a variety of strategies for redesign, depending upon institutional circumstances. For instance, an institution may want to maintain constant enrollments while reducing the total amount of resources devoted to the course. By using technology for those aspects of the course where it is more effective, by engaging faculty only in tasks that require faculty expertise, and by transferring other, less academically demanding tasks to personnel with lower levels of education, an institution can decrease the cost-per-student even though the number of students enrolled in the course remains unchanged. This approach makes sense when student demand for the course is relatively stable. Twelve of the 15 projects followed this approach to cost reduction and were able to reallocate the resources saved to other institutional needs.

An institution that is in a growth mode or that has more demand than it can meet through existing course delivery may seek to increase enrollments while maintaining the same level of investment. Many institutions experience escalating demand for particular subjects like Spanish or information technology that they cannot meet because they cannot hire enough faculty members. By using redesign techniques, they can increase the number of students they enroll in such courses and relieve these academic bottlenecks without changing associated costs. FGCU, PSU, and UTK followed this approach to reducing the cost-per-student. UTK, for example, has been able to increase by one-third the number of students served by the same instructional staff in introductory Spanish.

Another way to reduce costs is to decrease the number of course repetitions due to failure or withdrawal, so that the overall number of students enrolled each term is lowered and the required number of sections and number of faculty members to teach them are reduced. At many community colleges, it takes students about two and a half tries to pass introductory math courses. If an institution can move students through in a more-expeditious fashion by enabling them to pass key courses in fewer attempts, this will generate considerable savings both in terms of institutional resources and in terms of student time and tuition.

As noted earlier, 11 of the 15 projects reported a noticeable increase in retention rates. Two institutions—UCF, included in this study, and the University of Iowa, one of the other 15 PCR institutions—calculated the savings that resulted from improved retention. UCF increased retention in its American government course by 7 percent, which meant that one fewer section needed to be offered, a cost savings of $28,064 each time the course is offered. Iowa's reduction in its DFW rate from 24.6 percent to 13.1 percent means that 90 students each semester do not need to repeat the course. These students would have filled three discussion sections and four laboratory sections, whose coverage would have required 1.5 GTAs—personnel no longer necessary, representing a cost savings of $7,022. Not surprisingly, most of the redesign projects tried to reduce course repetitions while also producing savings through one of the other approaches.
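The underlying arithmetic is straightforward: multiply enrollment by the reduction in the DFW rate to estimate how many students no longer need to repeat the course, convert that number into sections, and multiply by the cost of staffing a section. The sketch below illustrates the logic with hypothetical figures only; it does not reproduce the UCF or Iowa calculations, whose underlying enrollments and section costs are not reported here.

```python
# Hypothetical illustration of the repetition-savings logic described above.

def repetition_savings(enrollment: int, old_dfw: float, new_dfw: float,
                       section_size: int, cost_per_section: float) -> float:
    """Estimate the savings from students who no longer repeat a course."""
    avoided_repeats = enrollment * (old_dfw - new_dfw)   # students who now pass
    avoided_sections = avoided_repeats / section_size    # sections no longer needed
    return avoided_sections * cost_per_section

# Assumed values: 800 students per term, DFW rate cut from 25% to 13%,
# 30-student sections costing $8,000 each to staff.
print(round(repetition_savings(800, 0.25, 0.13, 30, 8_000)))  # about $25,600
```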

What were the most effective cost-reduction techniques used by the redesign projects? Since the major cost item in instruction is personnel, reducing the time that faculty and other instructional personnel invest in the course and transferring some of these tasks to technology-assisted activities is the key strategy. Some of the more-predominant cost-reduction techniques for these 15 institutions—indeed, for all 30 in the PCR—were the following:

Online tutorials. Modular tutorials lead a student through a particular topic presented through interactive Web- or CD-ROM-based materials. Once students have completed the tutorial, they are presented with questions that test whether they have mastered the content of the module. Interactive tutorials can replace part—and in some cases, all—of the "teaching" portions of the course. UA's use of online course delivery techniques enabled reductions in teaching staff: individual faculty members no longer needed to present the same content through duplicative effort or to replicate exercises and quizzes for each section. Similarly, at RCC lecture time was reduced from four to two hours per week. Class meetings were reorganized to target topics that students find particularly difficult, and faculty members spent more time interacting with students about questions and problems rather than repetitively presenting math concepts.

Access to Web-based resources reduced labor costs at TCC by decreasing the amount of time faculty spent in diagnostics, preparation of lectures, grammar instruction, monitoring progress, grading and making class announcements. Faculty logs kept during the spring 2003 semester indicate a 33 percent decrease in time spent on course activities associated with the aforementioned tasks.

Automated assessment of exercises, quizzes, and tests. Automated grading of homework exercises and problems, of low-stakes quizzes, and of examinations for subjects that can be assessed through standardized formats not only increases the level of student feedback but also offloads these rote activities from faculty members and other instructional personnel. Some of the projects used the quizzing features of commercial products like WebCT. Others used specially developed grading systems like Mallard at the University of Illinois. Still others took advantage of the online test banks that are available from textbook publishers.

Online quizzing sharply reduces the amount of time instructors need to spend on the laborious process of preparing quizzes, grading them, and recording the results. Automated testing systems that contain large numbers of questions in a database format enable individualized tests to be easily generated, then quickly graded and returned.
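As a rough illustration of how such systems work, the sketch below draws an individualized quiz from a small question bank and grades it automatically. It is a generic, hypothetical example; it does not depict WebCT, Mallard, or any publisher's test bank.

```python
# Generic sketch of individualized quiz generation and automated grading.
import random

QUESTION_BANK = [  # hypothetical items: (prompt, correct answer)
    ("2 + 3 * 4 = ?", "14"),
    ("Solve x + 5 = 9. x = ?", "4"),
    ("What is 10% of 250?", "25"),
    ("Simplify 6/8 to lowest terms.", "3/4"),
    ("What is the square of 7?", "49"),
]

def generate_quiz(n_items: int, seed: int) -> list[tuple[str, str]]:
    """Draw an individualized quiz; a per-student seed yields a different form for each student."""
    rng = random.Random(seed)
    return rng.sample(QUESTION_BANK, n_items)

def grade(quiz: list[tuple[str, str]], answers: list[str]) -> float:
    """Automated grading: the fraction of submitted answers that match the key."""
    correct = sum(submitted.strip() == key for (_, key), submitted in zip(quiz, answers))
    return correct / len(quiz)

quiz = generate_quiz(3, seed=12345)              # the seed could be derived from a student ID
print(grade(quiz, [key for _, key in quiz]))     # a perfect set of answers scores 1.0
```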

Staffing substitutions. By constructing a support system that comprises various kinds of instructional personnel, institutions can apply the right level of human intervention to particular kinds of student problems. Employing ULAs in lieu of GTAs, for example, not only improves the quality of assistance available to students, as noted earlier, but also serves as a key cost-saving device. By replacing expensive faculty members and graduate students with relatively inexpensive labor, an institution can increase the person-hours devoted to the course and, at the same time, cut costs.

At UA, as noted earlier, the plan to use graduate students and upper-level undergraduate students for tutorial support was changed after the first semester of implementation, when it became apparent that the lower-cost undergraduate students were as effective as the graduate students in providing tutorial support. In addition, data on student use of instructional staff were collected during the first semester of operation and refined on a semester-by-semester basis. Based on that usage data, UA was able to reduce the number of instructors and undergraduate tutors assigned to the MTLC by matching staffing levels to trends in student use.

Another solution, implemented by Rio Salado College, was to employ a course assistant to address the many nonacademic questions that arise as any course is delivered—questions that can characterize up to 90 percent of staff interactions with students. This freed the instructor to teach more students and to concentrate on academic interactions rather than logistics.

FGCU's redesigned course was taught exclusively by full-time faculty supported by a new position called the preceptor. Preceptors were responsible for interacting with students via e-mail, monitoring student progress, leading Web Board discussions, and grading critical analysis essays. Each preceptor worked with 10 peer learning teams, or a total of 60 students. Replacing adjuncts independently teaching small sections ($2,200 per 30-student section) with preceptors assigned a small set of specific responsibilities ($1,800 per 60-student cohort) in the context of a consistent, faculty-designed course structure allowed FGCU to accommodate ongoing enrollment growth at a reduced cost-per-student.
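Using only the staffing figures cited above, the per-student saving on this line item alone is evident:

\[
\frac{\$2{,}200}{30\ \text{students}} \approx \$73\ \text{per student (adjunct-taught section)}
\qquad
\frac{\$1{,}800}{60\ \text{students}} = \$30\ \text{per student (preceptor-supported cohort)}
\]

These figures cover only the adjunct-versus-preceptor component; FGCU's full cost-per-student appears in its case study below.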

Shared resources. When an entire course (or more than one section) is redesigned, faculty members begin by analyzing the amount of time that each person involved in the course spends doing each activity. This highly specific task analysis often uncovers instances of duplicated effort and can lead to more-efficient approaches to course development. The often substantial amounts of time that individual faculty members spend developing and revising course materials and preparing for classes can be reduced considerably by eliminating such duplications.

For example, most projects constructed easy-to-navigate Web sites that contained not only material on managing the course but also a large number of student aids and resources such as solutions to problems, study guides, supplemental reading materials for topics not treated in the text, and student self-assessment activities. Putting assignments, quizzes, exams, and other course materials on a community Web site for the course can save a considerable amount of instructional time, since responsibility for improving and updating the materials is shared among instructors, thus reducing each faculty member's workload.

Another benefit of creating shared course resources is that doing so creates an opportunity for continuous improvement of those resources. During each phase of implementation, redesign teams were able to modify, update, and revise learning activities based on what worked well and what did not. Student feedback on the clarity and number of assignments, as well as students' expressed need for greater explanations and more models, provided multiple indicators for areas needing change. The online environment permits flexibility in design and expansion, enabling timely changes to be made. In addition, many teams found that once the course resources had been developed, only a minimum amount of additional labor was necessary to improve the course content and keep it current. The shared course materials not only saved time for the original instructors involved in the redesign preparation and maintenance, but also enabled their use by new faculty members who otherwise would have had to develop the course from scratch.

Course management systems. Course management systems—software packages that are designed to help faculty members transfer course content to an online environment and assist them in administering various aspects of course delivery—played a central role in most of the redesigns. All of the projects used a course management system. Some used commercial products like WebCT and Blackboard; others used homegrown systems created centrally for campuswide use or specifically for the redesigned course. And still others used instructional software that includes an integrated course management system. Sophisticated course management software packages enabled faculty members to monitor student progress and performance, track students' time on task, and intervene on an individualized basis when necessary.

Course management systems can automatically generate many different kinds of tailored messages that provide needed information for students. They can also communicate automatically with students to suggest additional activities based on homework and quiz performance or to encourage greater participation in online discussions. Using course management systems radically reduces the amount of time that faculty members spend on nonacademic tasks like calculating and recording grades, photocopying materials, posting changes in schedules and course syllabi, and sending out special announcements to students, as well as documenting course materials like syllabi, assignments, and examinations so that they can be reused in multiple terms.
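A minimal sketch of this kind of rule-based messaging appears below. The thresholds, message text, and function name are hypothetical and are not drawn from WebCT, Blackboard, or any campus system.

```python
# Hypothetical sketch of automated, tailored messages keyed to performance and participation.

def tailored_message(name: str, quiz_avg: float, forum_posts: int) -> str:
    """Select a message using simple performance and participation rules (assumed thresholds)."""
    if quiz_avg < 0.70:
        return (f"{name}, your quiz average is {quiz_avg:.0%}. "
                "Try the review tutorial for this module and retake the practice quiz.")
    if forum_posts < 2:
        return (f"{name}, you have posted {forum_posts} time(s) this week. "
                "Joining the discussion forum is a good way to prepare for the next quiz.")
    return f"{name}, you are on track. Keep up the good work."

for student in [("Jordan", 0.62, 5), ("Priya", 0.88, 1), ("Sam", 0.91, 4)]:
    print(tailored_message(*student))
```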

Reduced space requirements. Using the Web to deliver particular parts of a course as a substitute for face-to-face classroom instruction enables institutions to use classroom space more efficiently. Because one of the goals of its redesign was to reduce the amount of rented space needed, UCF delivered portions of its American government course via the Web. Two or three course sections could be scheduled in the same classroom where only one could be scheduled before.

Delivering portions of the PSU Spanish course via the Web as a substitute for face-to-face classroom instruction brought significant space savings to this urban university with rapidly increasing enrollments. Online chat allowed communicative use and practice of Spanish to extend beyond the limits of the classroom while maintaining student-student contact and instructor supervision. FGCU's redesign helped the university deal with a space crisis caused by rapidly growing enrollment. Because the course was entirely online, the redesigned course no longer needed to use any classroom space.

Consolidation of sections and courses. By redesigning the whole course rather than a single class, it is possible to realize cost savings by consolidating the number of sections or courses offered. UTK increased the number of students served from 1,500 to 2,000. In the traditional format, 16 adjunct instructors and 6 GTAs taught 57 sections of about 27 students each. In the redesigned format, GTAs were paired with experienced instructors as support partners, reducing the number of sections from 57 to 38 and doubling the number of students in each section from 27 to 54. UTK reduced the cost-per-student by 74 percent.
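The enrollment arithmetic behind the consolidation follows directly from the figures above:

\[
57\ \text{sections} \times 27\ \text{students} \approx 1{,}500\ \text{students (traditional)}
\qquad
38\ \text{sections} \times 54\ \text{students} \approx 2{,}000\ \text{students (redesigned)}
\]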

In the emporium model used at UA and UI, multiple sections of a course were combined into one large course structure, replacing duplicative lectures, homework, and tests with collaboratively developed online materials. UA combined 44 intermediate algebra sections of about 35 students each into one 1,500-student section offered in its math emporium; UI moved two precalculus courses, previously organized in 60 sections of about 40 students each, into its Polya Math Center, treating each course as a coherent entity. By teaching multiple math courses in its computer lab facility, each university can share instructional person-power among courses, thereby significantly reducing the cost of teaching additional courses.

By using technology-based approaches and learner-centered principles to redesign their courses, these institutions are showing us a way out of higher education's historical trade-off between cost and quality. Some of them relied on asynchronous, self-paced learning modes, while others used traditional, synchronous classroom settings but with reduced student/faculty contact hours. Both approaches started with a careful look at how best to deploy all available instructional resources to achieve the desired learning objectives. Questioning the current credit-for-contact paradigm of instruction as well as thinking systematically about how to produce more-effective and more-efficient learning are fundamental conditions for success.


TECHNOLOGY ACCESS AND USE AMONG UNDERSERVED STUDENTS

The use of information technology is a cornerstone of these redesigns. The technology makes it possible to incorporate good pedagogical practice into courses with very large numbers of students, which in turn leads to greater learning. Within the higher education community, there are a number of assumptions about underserved students and technology use, which can be summed up as "the two do not mix." These assumptions relate to both access—the have and have-not issue—and use: that underserved students do not like to use technology or that use of technology is an obstacle to student success. Clearly, the redesigns could not have achieved the level of success that we report if these assumptions were correct.

When many of the projects launched their redesigns, they were concerned about underserved students' access to technology. IUPUI, for example, reported some initial concerns that low-income students would have difficulty because of access; UNM reported the same concerns about Hispanic and Native American students. Others echoed those concerns. All of the projects reported that in practice, these concerns were resolved over time.

An ever-increasing proportion of underserved students have personal access to the technology, and for those who do not, the easy availability of campus labs can address the problem. When the issue is handled properly, faculty receive no complaints. The key is to make sure that campus labs are open a sufficient number of hours to meet students' needs. UTK reported that early in the redesign some low-income students complained about having to do parts of the course online, but those objections diminished over time. UTK's language lab is now open from 8 a.m. to 8 p.m., providing greater access than was previously available and helping reduce complaints. As more and more students at a given campus own personal computers, lab space and time are increasingly freed for those who do not. This is not to say that access should not be a concern—it should be—but the solution is to provide on-campus lab space for those who need it.

Campus labs should not be sterile spaces but ones in which help is available. Most of the RCC students who were interviewed liked using the software in the lab environment to learn math. Some immediately verbalized their concerns about taking math and the fact that using the software helped them overcome some of their fears. They welcomed the individualized assistance that was available in the lab and readily offered that they would have much more difficulty learning math without the combination of software and personal assistance that supported them in their studies.

A second area of concern among the projects was the need for adequate bandwidth for students who accessed the course from home or work. Rio Salado reported that the need for high-speed access to use the course software resolved itself over time: bandwidth used to be a problem, but it no longer seems to be an issue, and more than 60 percent of Rio Salado students have high-speed Internet access. At IUPUI, there were a few reports of problems with access—especially with greater reliance on OnCourse, the campus course management system. Students were expected to post one response the night before each class and could do so over a dial-in connection. The quizzing feature of the course, however, worked best with high-speed access and became problematic for students who had dial-in access only.

Awareness of bandwidth issues and careful planning of all elements of course delivery can overcome most problems. UTK, for example, experienced initial difficulties regarding the variety of modem connection speeds and/or computer configurations from which students were accessing course materials. Based on feedback received from student questionnaires, the project team reviewed all of the more than 400 graphic, audio, and video files utilized in the course and optimized them for efficient download speed. A tutorial was developed to provide students with clear instructions on how to download the players needed to access the course audiovisual files and how to configure those players for their connection speeds.

A third area of concern involved adequate training and support to make sure that students were able to access and use the technology easily. This is an issue for all students—not just those who are underserved—when institutions offer online courses or courses with online components. Technology support personnel at FGCU, for example, reported there was no consistent pattern of people who needed help based on age; questions usually related to mechanical issues of logging on or dealing with pop-ups. At Southern Miss, there was some suspicion that low-income students had initial difficulties with the technology, but the university added training, which helped resolve the problem.

Generally, there did not seem to be a difference in student reaction to the technology aspects of the redesigned courses based on students' underserved status, as reported by the project leaders. Both Southern Miss and FGCU conducted follow-up surveys that confirm these anecdotal impressions. At Southern Miss, there was no significant difference between student responders who received financial aid and those who did not in terms of their reaction to the course on such variables as perception of course difficulty, value of online materials, quantity of work, and use of online materials in other courses. Of the adult students responding to an online survey of FGCU students at the end of the fall 2004 term, 85 percent said they experienced no significant technological problems while taking the course, and 85 percent agreed that the online learning materials helped them work on the course whenever they wanted.

Benefits of Technology-Enhanced Instruction

In addition to the ability of the technology to support good pedagogy, faculty and students identified other benefits of using technology that are particular to the underserved students who are the subjects of this study.

For adult and working students, the benefits that stood out most are the convenience and flexibility that technology-enhanced approaches provide. In response to an online survey at USM, where a large percentage of the students are both low income and adult, 97 percent of the students indicated that the online materials helped them work on the course whenever they wanted; 91 percent said they found these materials helpful; 85 percent disagreed or strongly disagreed with the statement, "I missed the chance to attend lecture on a regular basis"; and 94 percent indicated they would like to see the online features incorporated into other courses at USM. Students liked the ability to organize their study hours around their other obligations.

Adult students at FGCU, most of whom are part-time, have a hard time scheduling work and classes. The redesigned fine arts course allowed them to work from home. Students frequently commented that they appreciated the flexibility and the convenience of being able to do so. UNM adult students echoed this view: they liked being able to do much of their work at home. Given that the lectures are optional in UNM's redesigned course, students could adjust their study schedules if needed. The project leader at UTK reported that 53 percent of students in the redesigned course had a job of more than 20 hours per week. UTK believes that greater accessibility to learning resources benefits those who work because they can have access at any time.

If TCC students had to miss a class because of work or family obligations, they knew what was covered by what was posted on the course Web site and they did not fall behind on their assignments. At IUPUI, the forums and discussion groups were particularly important for adults and part-time students, since IUPUI is a commuter campus. The adult students in the Rio Salado focus group all worked from home, and the convenience of the course was paramount to them since they had jobs and families. They commented on the need for greater time management, but they appreciated the ability to arrange their studying to fit in with their other scheduled activities. Most Rio Salado students seemed to have a designated study time: for some of them, late at night; for others, early in the morning; and for one of them, during nap times, since she provides in-home day care for several children.

These reports are consistent with the literature on distance and adult learning, yet only three of the 15 projects (and only five of the 30 PCR projects) are fully online. The majority of the redesign projects blended online elements with face-to-face experiences on campus. Nevertheless, students consistently cited the convenience and flexibility provided by the technology as the most beneficial aspects of their course experience. The lesson for other institutions is that even if they do not want to offer a fully online course, they can still add convenience and flexibility—so appreciated by students—to on-campus courses by taking advantage of the capabilities of information technology.

It is difficult to separate the benefits that technology-enhanced approaches offer for adults, low-income students, and students of color since these categories of students tend to overlap: students of color tend to be low income; adults tend to be working students, as do low-income students; and so on. There is no indication that students of color had anything but positive attitudes toward the use of technology in the redesigned courses. In a few instances, they appear to have had more-positive attitudes than white students did.

At Southern Miss, African-American students ranked the redesigned course higher than white students did in terms of student satisfaction. On the 8 to 10 questions on student surveys that ask about discrete elements of the course (presentation, instructors' ability to explain, attitude toward students, and so on), African-American students routinely gave the course higher marks than white students did; the overall rating was 2.71 among African-American students and 2.35 among white students. Explaining the satisfaction difference is difficult. African-American students were just as likely as white students to attend the live presentations, to take mastery quizzes multiple times, to use the tutors to get help with writing assignments, or to have a part-time job.

TCC may have an explanation for the higher satisfaction ratings among African-Americans. The TCC faculty believe that the use of technology in the redesigned course provided a more-open, more-democratic environment and greater inclusion of all students. Previously, students of color would not speak out in class, but in the redesigned course they were more than willing to "speak up" while online. Both adults and students of color used the online resources for self-remediation—probably, the faculty surmise, because no one knew they were doing so. Rather than feeling stigmatized when seeking help, students could find what they needed on their own time and without anyone's knowing. The learning environment at UA, where students received individualized assistance in the MTLC, was much friendlier to students seeking help than the traditional classroom was, and it led to higher performance among African-American freshmen. In addition, the MyMathLab software allowed students to self-remediate.

Faculty members at Fairfield University commented that the use of visual aids and online demonstrations of biological concepts increased options for students for whom English is not the first language, since they needed to rely less on verbal explanations. While this change helped all students, Hispanic students at Fairfield have commented on how helpful they found these computer-based learning resources.

Rather than being an obstacle to student success, information technology has been an enabler of student success in these course redesign projects. In each instance, redesign teams have given careful consideration to how technology can best be used to support student learning. What the PCR institutions have in common is a commitment to ensuring learner readiness to engage in technology-based courses. Learner readiness involves more than access to computers and to the network. It also involves access to technical support as well as other forms of student support—such as help in using navigation tools and course management systems—and to processes that enable students to gain literacy if they do not already possess it. Thoughtful applications of technology that take into account the specific needs and interests of students can indeed produce positive outcomes.


CONCLUSION

Our experience in the Program in Course Redesign has promising implications for institutions seeking to increase student success. Three important lessons can be drawn from the results of our in-depth study of the impact of NCAT's method of course redesign on underserved students.

First, most of the weaknesses of introductory courses are generic in nature and have as their source the limitations of the predominant form of collegiate instruction: the didactic lecture. An overwhelming body of research shows that students do not learn effectively from lectures. The lecture method treats all students as if they were the same, as if they bring to the course the same academic preparation, the same learning style, the same motivation to learn, the same interest in the subject, and the same ability to learn. The lecture format simply cannot accommodate the broad range of differences among students. Lecture-based courses are notoriously ineffective in engaging students: they neither encourage active participation, nor offer students an opportunity to learn collaboratively from one another, nor provide adequate tutoring assistance. Smaller classes in theory allow greater interaction with students than large lecture halls do, but in practice, most small classes are dominated by the same presentation techniques as used in larger courses. As the PCR redesigns demonstrate, moving away from the lecture method is the key to increasing student success.

Second, information technology can be a solution rather than an obstacle to increasing success for underserved students. As this report and the case studies of the 15 projects that are the foci of this study emphasize, this means using information technology to support good pedagogical practice rather than using technology for technology's sake. It also means making sure that learners have access to the necessary technology and know how to use it comfortably. It suggests that institutions and faculty members must be conscientious in their planning to integrate technology in courses in order to make sure that students can use the technology appropriately.

Third, good ideas must be affordable in order for them to be implemented on a large scale. The predominant view of how to improve retention says that colleges and universities must provide additional services and support and that it will be impossible to improve retention if institutions do not have the necessary financial support from state and federal governments. As Watson Scott Swail, president of the Educational Policy Institute, says in a January 23, 2004, editorial in The Chronicle of Higher Education: "Regardless of the success of any of their other efforts, colleges without the necessary resources could not even come close to those that could invest substantially in retaining students. . . . Unless we recognize the different roles that various institutions play, and provide them with the resources needed to meet the challenge of college dropouts, the problem will only worsen."6 In contrast, NCAT's method of course redesign offers a concrete way for institutions to improve student success and retention without investing additional resources. Indeed, our redesigns generate additional resources that can be used for other institutional purposes such as developing new programs, serving more students, or responding to areas of pressing need.

Course redesign offers an important complement to ongoing attempts to integrate underserved students in the social and intellectual life of the institution. Most efforts to increase student success and retention heretofore have focused on institutional factors rather than on what happens in specific courses, yet success in first-year courses is critical to overall student success. NCAT's focus on what goes on within courses dovetails nicely with cross-course or extracurricular approaches to student engagement, and it advances the nation's understanding of what works effectively to increase student academic success among underserved students.

APPENDICES

APPENDIX A: IDENTIFICATION OF TARGET INSTITUTIONS

The following tables show the percentages of underserved students at PCR institutions, who were the initial target of this study, in relation to national averages in general and by sector.

Low-Income Students 
National average 34%
Public four-year institutions 28%
Private four-year institutions 30%
24 PCR institutions 26%
Public two-year institutions 37%
PCR two-year institutions 31%
   
University of Southern Maine 50%
The University of Tennessee, Knoxville 41%
Tallahassee Community College 40%
University of Idaho 38%
California State Polytechnic University, Pomona 37%
The University of Southern Mississippi 37%
University at Buffalo–SUNY 35%
Portland State University 30%
Rio Salado College 30%
Indiana University–Purdue University Indianapolis 27%

Data source: NCES/IPEDS 2001-2002 Student Financial Aid File: percent receiving federal student aid (full-time, first-time, degree/certificate-seeking freshmen), the best available data to determine income status.

African-American Students
  All Undergraduates   Freshmen
National average 12% 13%
Public four-year institutions 11% 11%
Private four-year institutions 11% 10%
24 PCR institutions 7% 8%
Public two-year institutions 12% 14%
PCR two-year institutions 15% 16%
  All Undergraduates   Freshmen
Tallahassee Community College 30% 34%
The University of Southern Mississippi 25% 37%
The University of Alabama 14% 10%
Riverside Community College 12% 13%
Indiana University–Purdue University Indianapolis 10% 9%
Drexel University 9% 8%
The Ohio State University 8% 10%
University at Buffalo–SUNY 8% 7%
University of Central Florida 8% 9%
The University of Tennessee, Knoxville 7% 9%

Data source: NCES/IPEDS fall 2002 enrollments.

Hispanic Students
  All Undergraduates   Freshmen
National average 11% 11%
Public four-year institutions 8% 8%
Private four-year institutions 9% 9%
24 PCR institutions 7% 7%
Public two-year institutions 14% 13%
PCR two-year institutions 13% 16%
  All Undergraduates   Freshmen
The University of New Mexico 33% 34%
Riverside Community College 31% 32%
California State Polytechnic University, Pomona 23% 24%
University of Central Florida 11% 12%
Rio Salado College 9% 12%
Florida Gulf Coast University 8% 9%
Carnegie Mellon University 5% 5%
Tallahassee Community College 5% 5%
Fairfield University 4% 5%
University at Buffalo–SUNY 4% 4%

Data source: NCES/IPEDS fall 2002 enrollments.

Adult Students
  25 Years and Older Part-Time
National average 32% 39%
Public four-year institutions 20% 21%
Private four-year institutions 21% 18%
24 PCR institutions 20% 25%
Public two-year institutions 44% 63%
PCR two-year institutions 37% 74%
  25 Years and Older Part-Time
Rio Salado College 46% 94%
Portland State University 40% 38%
Riverside Community College 39% 75%
University of Southern Maine 37% 49%
Indiana University–Purdue University Indianapolis 37%* 39%
Florida Gulf Coast University 31% 33%
Tallahassee Community College 25% 53%
The University of Southern Mississippi 25% 14%
The University of New Mexico 24%* 23%
University of Central Florida 20% 26%

Data source: NCES/IPEDS fall 2002 enrollments.
*Data supplied by institution to correct blank cells in IPEDS.

APPENDIX B: PCR PROJECT LEADER INTERVIEW PROTOCOL

1. Here are the data for your institution's underserved students.

Do you think that your course, especially during the term of your reported data, reflects the institutional percentage of these students? Do you have a breakdown of students by category—especially during the term(s) of your reported data—for the traditional course and for the redesigned course?

2. Do you think—or know whether—there is any difference in the impact of the redesign on the general student population and the target population? Why or why not?

3. Here are the pedagogical techniques that improved student learning in your course as you reported to us.

Which of these had the most impact on the target population? Why do you think so? Do you have any data to support this conclusion?

4. Here are the most-effective pedagogical techniques that improved student learning in the PCR as a whole.

• Continuous assessment and feedback
• Increased interaction among students
• Online tutorials
• Undergraduate learning assistants
• Individualized, on-demand support
• Structural supports that ensure engagement and progress

Did any of these have an impact on all students? On the target population? Why do you think so? Do you have any data to support this conclusion?

5. Are there any techniques that you tried that did not work? If so, what kinds of changes did you make when a technique was not successful (for example, starting with voluntary attendance and then requiring attendance)?

6. Are you still offering the course as you reported in your final report? If not, why not?

7. If you have made changes, what impacts have these changes had on the target population? Do you have any data to support these conclusions?

APPENDIX C: SITE VISIT PROTOCOL

People to Interview

  • Focus group(s) of underserved students who have taken the redesigned course
  • Faculty who worked on the project beyond the project leader
  • Information technology professionals if they provide direct assistance for students, such as orientation or help desk services
  • Higher Education Opportunity Program professionals or other student service professionals who work with underserved students

Questions to Ask Student Focus Groups

Introduction: Briefly explain the goals of the Lumina grant and the Program in Course Redesign and the specifics regarding the course at each institution.


1. Here are the most-effective pedagogical techniques that led to improved student learning in the course at your institution. Which of these had the most impact on your ability to learn? Why do you think so?

2. Were any of these techniques not useful to you? Why?

3. Did you learn strategies for success in this course? Have you had any opportunity to use these strategies in other courses?

4. Any other comments?

Questions to Ask Other Faculty
Introduction: Briefly explain the goals of the Lumina grant and why the people are being interviewed.

1. Here are the data for your institution's underserved students.

Do you think that your course—especially during the term of your reported data—reflects the institutional percentage of these students?

2. Do you think—or know whether—there is any difference in the impact of the redesign on the general student population and the target population? Why or why not?

3. Here are the pedagogical techniques that improved student learning in your course as you reported to us.

Which of these had the most impact on the target population? Why do you think so? Do you have any data to support this conclusion?

4. Here are the most-effective pedagogical techniques that improved student learning in the PCR as a whole. There are some listed here that you did not report.

  • Continuous assessment and feedback
  • Increased interaction among students
  • Online tutorials
  • Undergraduate learning assistants
  • Individualized, on-demand support
  • Structural supports that ensure engagement and progress

Did any of these have an impact on all students? On the target population? Why do you think so? Do you have any data to support this conclusion?

5. Are there any techniques that you tried that did not work? If so, what kinds of changes did you make when a technique was not successful (for example, starting with voluntary attendance and then requiring attendance)?

6. If you have made changes, what impacts have these changes had on the target population? Do you have any supporting data?

Questions to Ask Professionals
Introduction: Briefly explain the goals of the Lumina grant and why the people are being interviewed.

1. Have you had conversations with [type of underserved students] who took [name of course] after it was redesigned? Did you observe differences in the students' learning strategies? In their ability to use these strategies in other courses? In their attitudes toward the course or toward higher education?

2. What observations have you made, if any, about the success of these students based on their experiences in the redesigned course?

3. Have you recommended any of the pedagogical techniques used in the redesigned course as you work with other faculty who work with underserved students? Which ones and why?

Case Studies

Florida Gulf Coast University
FGCU redesigned Understanding the Visual and Performing Arts to accommodate enrollment growth and achieve greater coherence and consistency. All students were moved into a single, fully online section using a common syllabus, textbook, set of assignments, and course Web site. Students were placed into cohort groups of 60 and within these groups, Peer Learning Teams of six students each. The redesign allowed FGCU to maintain active engagement with ideas and a collaborative and experiential learning experience, while eliminating seat time completely.

Students demonstrated a markedly enhanced level of content learning in the redesigned course. The average score on standardized exams in the traditional course was 72 percent and in the redesigned course, 85 percent. The percentage of As and Bs on standardized exams went from 37 percent in the traditional course to 77 percent in the redesigned course, and the percentage of Ds and Fs went from 21 percent in the traditional course to 7 percent in the redesigned course.

Data from 2002 to 2004 showed that adults had a greater percentage of A and B grades and a lower percentage of drops, failures, and withdrawals (DFWs) than the total class. The overall percentage of A and B grades was 56 percent; the percentage for adults was 63 percent. The overall DFW rate was 29 percent; the rate for adults was 27 percent.

Two pedagogical techniques were particularly important in improving student learning at FGCU: (1) Low-stakes quizzes with automated feedback helped students master concepts; quizzes could be taken as often as desired so that students could practice as many of the questions in the test bank as possible. The highest score achieved on the practice test was the score recorded. (2) Web board discussions of sample essays in small peer learning teams increased interaction among students, created an atmosphere of active learning, and developed students' critical thinking skills.
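A minimal sketch of the "highest attempt counts" grading policy is shown below; the student names and scores are hypothetical.

```python
# Hypothetical sketch: record only the best score a student achieves across repeated attempts.
from collections import defaultdict

best_scores: dict[str, float] = defaultdict(float)

def record_attempt(student: str, score: float) -> float:
    """Store the attempt only if it beats the student's previous best."""
    best_scores[student] = max(best_scores[student], score)
    return best_scores[student]

for attempt_score in (0.60, 0.85, 0.78):
    record_attempt("lee", attempt_score)
print(best_scores["lee"])  # 0.85 -- the highest of the three attempts is the recorded score
```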

The cost-per-student went from $132 in the traditional course to $81 in the redesign enrolling 950 students. When 1,200 students took the course in the second year of full implementation, the cost-per-student decreased to $70.

http://www.theNCAT.org/PCR/R3/FGCU/FGCU_Overview.htm

Indiana University–Purdue University Indianapolis
IUPUI redesigned Introduction to Sociology to encourage greater collaboration among students, increase student learning, and improve student success rates. In the traditional format, 39 percent of students received a D or F or withdrew from the course. The course redesign introduced online learning modules, threaded discussions, interactive computer-based testing, and an interactive research module.

In the fall 2000 pilot, the percentage of students receiving a D or F or withdrawing dropped from 39 percent to 33 percent; in spring 2001, it was 30 percent; in fall 2001, it dropped to 25 percent. In fall 2000, students in redesigned sections had higher (.10 level) grades. In spring 2001, redesign students had significantly higher (.05 level) grades than those in the traditional format. In fall 2000, tests showed that students in redesigned sections scored significantly higher (.05 level) on common questions measuring understanding of key sociological concepts.

Three pedagogical techniques were particularly important: (1) The redesign introduced collaborative computer work on a research module common to all sections. An online common discussion space allowed all students (resident and commuter, traditional and nontraditional) to work collaboratively without location and time restrictions. (2) Interactive testing allowed students to take exams outside of class, which freed in-class time for additional student-faculty interaction. (3) A course management system allowed faculty to monitor students' progress and participation, thereby enabling faculty to intervene early in problem situations.

Participation in the online discussion forums was particularly effective for students of color as they prepared for biweekly quizzes. After analysis, faculty found that the number of log-ins to the online system and the forum grade were positively associated with better quiz grades. They also found that age was a significant predictor, suggesting that older students, regardless of performance in the forums, were more likely to do better than traditional students on the cumulative final exam.

Offering three large sections per semester instead of two will decrease the cost-per-student from $83 to $66, a decrease of 20 percent. The reduction in the DFW rate translates to an additional savings of $19,541, bringing the total cost reduction produced by the redesign to $53,541.

http://www.theNCAT.org/PCR/R1/IUPUI/IUPUI_Overview.htm

The Ohio State University
OSU redesigned Introductory Statistics—a five-credit course enrolling about 2,800 students per year—to increase student success levels, provide greater individualization of the student learning experience, and reduce the course repetition rate. In the traditional format, students met for three hourly lectures and two hourly labs. In the redesign, OSU implemented a buffet model, offering students a choice of interchangeable paths to learn each course objective. The buffet included lectures, discovery laboratories, live and remote reviews, small-group study sessions, videos, training modules, oral and written presentations, active large-group problem solving, teaching-assistant-graded or self-graded homework assignments, and individual and group projects.

Compared with the last four quarters before the buffet model was implemented, retention improved significantly. The percentage of students who withdrew from the course dropped from 11 percent to 8 percent. The percentage failing the course or receiving a grade that did not satisfy a requirement of their major declined from 7 percent to 3 percent. The percentage receiving an incomplete dropped from 2 percent to 1 percent. The percentage of African-American students receiving a C or better rose from 63 percent to 79 percent, compared with an increase for white students from 69 percent to 83 percent.

The grades of African-American students improved about the same as those of other students; grades of African-American students were about 5 percent lower than those of other students before and after the redesign. The African-American students tended to make different choices from white students among the buffet options and sought increased interaction with other students. They were more likely to choose group-activity lectures rather than reflective lectures. And they were less likely to choose out-of-class problem-solving and learning-by-discovery (intuitive) labs.

Three pedagogical techniques were particularly important in improving student learning at OSU: (1) Informed by an assessment of their learning styles and preferences, all students were able to select from a variety of learning modes, thus meeting the course's common learning objectives by using different pathways. (2) The team established a help room—staffed with teaching assistants, adjuncts, and full-time faculty—to provide on-demand assistance for students throughout the day. (3) A taxonomy of learning objectives linking all course components anchored the class and formed a framework to provide consistency.

OSU reduced the cost-per-student from $190 to $142, a reduction of 25 percent.

http://www.theNCAT.org/PCR/R3/OSU/OSU_Overview.htm

Portland State University
PSU redesigned its First-Year Spanish sequence, a yearlong, multiple-section course. Because of funding and space limitations, enrollment had been capped at about 690 students annually. In some academic years, current offerings could meet only 50 percent of the demand, forcing students to enroll at other institutions. The DFW rate was about 25 percent from fall to spring because of a wide variation in Spanish proficiency among students entering the course and because of the problem of false beginners—students with some basic language skills who can demonstrate proficiency early in the course but not later, leading to drops and withdrawals.

PSU reduced class meeting times from three per week to two while increasing the time students spent in the crucial area of interactive speaking. The redesign moved drilling activities usually performed in class to the online environment and devoted in-class time to oral communication. Online activities included testing, writing, grammar instruction, and small-group activities focused on oral communication. In-class time was further reduced for those students performing above standards, while low-achieving students were directed to small-group sessions for additional oral practice. Online chat exchanges prepared students for weekly discussion board activities, in which students summarized and presented the information they learned in the chat session.

The redesign de-emphasized rote grammar and focused on oral proficiency. End-of-year oral exam scores averaged 87.3 percent in the redesigned course compared with 85.8 percent in the traditional course. Students consistently reported greater satisfaction and a richer learning experience in redesigned sections than in traditional ones, including receiving individualized attention and more-timely feedback from the instructor; spending more time studying and reviewing; interacting with fellow students on course-related work; being able to communicate a complaint or suggestion to the instructor, to learn and master course material, and to keep up with the required work; and feeling more connected with the instructor and with other students.

The reduced in-class time and the flexibility of the online materials combined for a particularly useful learning environment for part-time, adult students. In 2003/04, more than half of the students received financial aid, and the grades of this group were comparable to those of students who did not receive financial aid, demonstrating the effectiveness of the redesign for this population.

Enrollment in the course increased substantially from 690 to 1,276 students annually, yielding a reduction in the cost-per-student from $127 to $88.

http://www.theNCAT.org/PCR/R3/PoSU/PoSU_Overview.htm

Rio Salado College
Rio Salado redesigned four precalculus mathematics courses. Before the redesign, the college had used mathematics software developed by Academic Systems (now Plato Learning) to deliver courses via the Internet. Although the Internet classes showed a modest retention increase of about 2 percent over the print/mixed-media format of distance delivery, the overall retention rate (the percentage of students who complete the course with a grade of A, B, C, D, or F) was only 59 percent. Rio Salado wanted to increase retention and to maintain or increase the number of students who completed the course with a grade of C or better.

Because the Academic Systems software presented course content so well, instructors did not need to spend time delivering content. Prior to the redesign, the majority of instructors' time had been spent dealing with logistical rather than academic interactions with students. The redesign added a nonacademic course assistant to address non-math-related questions (which constituted 90 percent of all interactions with students!) and to monitor students' progress. As a result, one instructor was able to teach 100 students concurrently enrolled in any of four math courses. The redesign yielded an increase in retention rates from 59 percent to 65 percent while tripling the number of students taught by one instructor.

Rio Salado took advantage of the Academic Systems software's large bank of problems and answers and automated grading to increase the amount and frequency of feedback to students. Students knew what they had not mastered and were able to take appropriate corrective actions. Students could take end-of-module quizzes as soon as they were ready, moving either quickly or slowly through the material. The software also provided a built-in tracking system that allowed the instructor and the course assistant to know every student's time on task and progress through the modules in each of the four courses.

Using the Academic Systems software ensured that all students who completed the course had the same kinds of learning experiences. This meant that they were more consistently prepared when they moved to the next course in the sequence or to other courses requiring a mathematical background. The greater consistency combined with allowing students to individualize their study patterns provided a flexible but structured course design well suited to part-time, working adult students.

The redesign reduced the cost-per-student from $49 to $31, a 37 percent decrease.

http://www.theNCAT.org/PCR/R1/RSC/RSC_Overview.htm

Riverside Community College
RCC redesigned Elementary Algebra, a four-credit course enrolling 3,600 students annually in 72 sections of 50 students each. Elementary Algebra is RCC's lowest-level math course that meets associate degree requirements and its highest-enrolled math course. For the decade preceding the redesign, the success rate (a grade of C or better) was about 50 percent with a repeat rate of 30 percent. Many students simply gave up and dropped out. RCC attributed these problems to the course's lecture format, which severely limited student interaction with materials, instructors and other students.

The goal of the redesign was twofold: (1) to encourage students to take an active role in their own learning according to their preferred learning styles, building on timely assessment and faculty guidance, and (2) to move from a seat-time model to one based on subject matter mastery. The redesign converted four hours of weekly lectures into two hours of participation in a math lab and two hours in class. Students used MyMathLab, an interactive instructional software program, and received individualized assistance from faculty, tutors, and other students.

Three learning areas were assessed: (1) elementary algebra performance by comparing common final exam results, (2) enrollment and performance in subsequent mathematics courses, and (3) gains in knowledge and skills by administering pre- and posttests. Six objectives were mapped to specific pretest and posttest questions. Students' learning gain in the redesigned courses as measured by pre- and posttesting (mean = 7.66) was significantly higher than the learning gain in the traditional course (mean = 6.38; t = –3.77, d.f. = 618, p < .001). Overall, students in the redesigned courses learned more on four of the six learning objectives.
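
The summary reports only the two means, the t statistic, the degrees of freedom, and the p value, so the exact computation cannot be reconstructed here; the figures are consistent, however, with a standard two-sample (pooled-variance) t-test of the form

\[
t \;=\; \frac{\bar{x}_{\text{redesigned}} - \bar{x}_{\text{traditional}}}{s_p\sqrt{\tfrac{1}{n_1} + \tfrac{1}{n_2}}},
\qquad \text{d.f.} = n_1 + n_2 - 2 = 618,
\]

which implies roughly 620 students across the two groups; the negative sign of the reported t value simply reflects the order in which the two means were subtracted.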

The most-important pedagogical changes made at RCC were (1) to use MyMathLab software, which supported verbal, visual, and discovery-based learning options and was accessible in the lab or from home, and (2) to require student participation in the math lab to ensure that they spent the needed time on task while receiving help when they needed it. Students enjoyed using the software in the lab environment and welcomed the individualized assistance that was available. They acknowledged that they would have had much more difficulty learning math without the combination of software and personal assistance that supported them in their studies.

RCC decreased the cost-per-student from $206 to $121, a reduction of 41 percent.

http://www.theNCAT.org/PCR/R2/RCC/RCC_Overview.htm

Tallahassee Community College
TCC redesigned College Composition, a required course serving approximately 3,000 students annually. The traditional format, which combined lecture and writing activities in sections of 30 students each, made it difficult to address individual needs. Considerable class time was spent reviewing and reteaching basic skills, thus reducing the amount of time students had for engaging in the writing process. Success rates were poor (less than 60 percent annually), and many students had to repeat the course, which placed a financial burden on the English Department and led to heavy dependence on adjunct instructors.

The redesign had two major components. The first involved using appropriate technologies to provide diagnostic assessments resulting in individualized learning plans; interactive tutorials in grammar, mechanics, reading comprehension, and basic research skills; online tutorials for feedback on written assignments; follow-up assessments; and discussion boards to facilitate the development of learning communities. Students submitted midstage drafts to online tutors at TCC or to SMARTHINKING, thereby reducing the amount of time faculty spent grading papers. These activities took place outside the classroom and were accessible to students at any time.

The second component involved restructuring the classroom to include a wide range of learner-centered writing activities that fostered collaboration, proficiency, and higher levels of thinking. By shifting many of the basic instructional activities to technology, faculty could focus the classroom portion of the course on the writing process. Students worked in small groups or on individual writing efforts depending on their identified needs.

During the 2002/03 academic year, students in redesigned sections had a 68.4 percent success rate compared with 60.7 percent in traditional sections. The overall success rate for all composition students was 62 percent for the 2002/03 year compared with 56 percent for the 1999/2000 year, representing a 13.6 percent decrease in the DFW rate. Faculty observed that redesign students were more actively engaged in the learning process, were taking greater responsibility for their learning, were more independent and self-sufficient as learners, and were more adept at collaborative processes.
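
The 13.6 percent figure follows from treating the DFW rate as the complement of the overall success rate:

\[
\text{DFW}_{1999/2000} = 100 - 56 = 44\ \text{percent}, \qquad
\text{DFW}_{2002/03} = 100 - 62 = 38\ \text{percent},
\]

\[
\frac{44 - 38}{44} \;\approx\; 13.6\ \text{percent}.
\]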

The pedagogical techniques that contributed most to improving student learning were greater course consistency via a menu of common assignments; increased interaction among students; online resources that included links to grammar review sites, quizzes with immediate feedback, textbook companion resources, and library orientation; and individualized, on-demand assistance that provided prompt, constructive feedback on writing assignments.

The redesign reduced the cost-per-student from $252 to $145, a 42 percent savings.

http://www.theNCAT.org/PCR/R3/TCC/TCC_Overview.htm

The University of Alabama
UA redesigned Intermediate Algebra—which enrolled about 1,500 students annually—in order to address poor student performance. Nearly 60 percent of students in the traditional course earned a D, F, or withdrawal grade, and students often needed to take the course two or three times before passing. A student's initial math course plays a key role in establishing either a successful or a problematic transition from high school to the university. Thirty percent of students who received a grade of D or F in Intermediate Algebra graduated in six years compared with the university average of 55 percent.

Modeled in part on the Math Emporium at Virginia Tech, UA's redesign allowed the individual student to focus precisely on his or her questions and difficulties. Each week, students spent 3.5 hours in the Math and Technology Learning Center (MTLC), where they worked with instructional software and received individualized assistance from full-time faculty, part-time faculty, graduate students, and undergraduate students, plus 30 minutes in group work sessions. Additional hours in the MTLC were optional depending on individual student needs.

The average success rate (grade of C– or better) for the redesigned format was 49.1 percent (fall 2000–spring 2002) compared with an average of 46.4 percent for the traditional format (fall 1998–spring 2000). The success rate for African-American freshmen was substantially higher than for white freshmen: in fall 2000, 71.4 percent of African-American freshmen were successful versus 51.8 percent of white freshmen; in fall 2001, the figures were 70.0 percent versus 65.3 percent.*

Three pedagogical techniques were particularly important in improving student learning: (1) The MyMathLab software supported verbal, visual, and discovery-based learning styles and provided quick feedback for students and a steady flow of information for instructors. (2) A flexible attendance policy allowed students to do math at times most convenient for them. While students were required to spend a minimum of 3.5 hours per week in the MTLC, they could use this time to move on to future topics or review material they found difficult. (3) All students were required to attend a weekly 30-minute class session, which focused on student problems and allowed instructors to follow up in areas of student weakness. It also helped build community among students and instructors.

UA reduced the cost-per-student from $122 to $82, a decrease of 33 percent.

http://www.theNCAT.org/PCR/R2/UA/UA_Overview.htm

*Annual success rates were generally lower than semester-by-semester success rates because spring semesters enrolled weaker students than fall semesters did; many spring enrollees were repeating the course after failing in the fall.

University at Buffalo–SUNY
UB redesigned its Computer Literacy course, which enrolled about 1,000 students each year. In the traditional format, there were three lecture sections of 200 students each with 19 lab sections of approximately 26 students each. Students attended three 1-hour lectures, one 2-hour formal lab, and one 2-hour open lab each five-week term. The lecture format did not promote active and collaborative learning and did not provide enough student support at the beginning of the semester, when most students need proportionately more individualized, face-to-face contact. Many of the graduate teaching assistants who provided direct lab contact were not native speakers of English and often found it difficult to communicate with beginning-level students.

The redesigned course reduced the number of lectures from three to two per week and added Web-based tutorials, diagnostic quizzes, short minilectures, and Web- and lab-based group activities designed to support collaborative learning. Lab hours were restructured so that more formal lab hours occurred at the start of the semester and more open lab hours occurred at the end to match student need. Undergraduate learning assistants increased the amount of individualized assistance available in labs.

With the spring 2000 term as the baseline, analysis of final grades showed that the percentage of students earning a grade of A– or higher rose from 27 percent to 56 percent in spring 2001. The mean grade earned in the course increased by a third of a letter grade, from a C+ to a B–.
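
On the conventional 4.0 scale, in which plus and minus grades sit one-third of a grade point apart (an assumption here, since UB's grading scale is not specified in this summary), the reported gain works out as

\[
\text{B–} - \text{C+} \;\approx\; 2.67 - 2.33 \;=\; 0.34,
\]

roughly a third of a letter grade.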

The pedagogical techniques that contributed most to improving the quality of student learning at UB were (1) the use of undergraduate learning assistants, who replaced graduate teaching assistants to provide help for students and who demonstrated superior communication skills and greater understanding of students' common misconceptions about computers; (2) increased lab hours, which enabled students to have more one-on-one assistance; and (3) self-paced learning materials provided by the textbook publisher to enhance learning and allow for greater individualization of the student's experience.

UB reduced the cost-per-student from $248 to $143, a decrease of 42 percent.

http://www.theNCAT.org/PCR/R1/UB/UB_Overview.htm

University of Central Florida
UCF redesigned its American National Government course—enrolling 2,200 students annually—to improve student performance (only about 78 percent of the students earned a grade of C or better) and retention (approximately 100 students needed to retake the course each year). In addition, the course required too much in-class lecture time in a campus environment with scarce space for large lectures. UCF's dynamic growth had created a shortage of classroom space, and the university was paying $1.8 million annually for rented classroom space.

The course redesign substituted Web-based, asynchronous, modular learning for two-thirds of the in-class time, thereby reducing the number of lectures per week from three to one and creating small, collaborative learning groups within this online structure. Examples of course activities included self-paced, autograded quizzes and games with instant feedback; interactive, Web-based election simulations; and test banks to review and prepare for exams. Communications software, bulletin boards, and chat rooms provided useful and convenient opportunities to increase discussion among students.

Using a content examination on knowledge of American government to measure learning, UCF found that students in the redesigned sections showed significantly greater pretest-to-posttest gains in content knowledge as well as significantly better absolute posttest performance. In addition, UCF discovered that the students in the redesigned course had less academic experience, less previous exposure to Web-based courses, and lower levels of motivation to learn about American politics. Students in the redesigned course expressed greater willingness to take another political science course employing the same format.

Two pedagogical techniques were most important in improving student learning: (1) An abundance of Web resources was available to students. Assignments were created around subject matter sites, and students analyzed material and summarized their findings in short papers. On interactive Web sites, students provided information and received immediate feedback (for example, in simulations of elections or public opinion quizzes). (2) Online discussion groups of 10 students each increased interaction among students, and students benefited from the informal learning communities that were created.

The total cost of the traditional American National Government course was $264,400. By implementing the redesign in all 24 sections, UCF can reduce the total cost of the course to $178,200, thereby producing an annual savings of $86,200.
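
Dividing the reported totals by the course's annual enrollment of 2,200 gives the implied per-student costs (these per-student figures are inferred here rather than stated in the summary):

\[
\frac{264{,}400}{2{,}200} \;\approx\; \$120 \quad\text{versus}\quad \frac{178{,}200}{2{,}200} \;\approx\; \$81,
\]

and the annual savings is the difference between the two totals: $264,400 minus $178,200, or $86,200 (a reduction of roughly one-third).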

http://www.theNCAT.org/PCR/R1/UCF/UCF_Overview.htm

University of Idaho
UI redesigned three courses based on the Math Emporium model first developed at Virginia Tech: Intermediate Algebra, Algebra, and Pre-Calculus, all of which review material taught in high school mathematics. Enrolling a total of 2,428 students, the traditional courses were taught in a lecture format and suffered from high DFW and repeat rates.

In the redesign, class meetings were eliminated; learning activities were moved to the Polya Math Center. The courses used commercially available mathematics instructional software that generated problems and offered immediate feedback. Short topical lectures were available via on-demand streaming video. Most of the course material was also Web accessible. Faculty, teaching assistants, and peer tutors worked with students individually and in groups. Students also met weekly in focus groups of 40 to 50 students to coordinate activities and discuss experiences. Aside from the weekly focus group meeting, students managed their learning time, types of learning activities, and rate of progress.

Overall student performance as measured by grades based on comparable examinations and assignments improved. In Algebra and Intermediate Algebra, the percentage of As and Bs was higher, and the percentage of Cs, Ds, and Fs was lower. In Pre-Calculus, the percentage of A and B grades also tended to be higher for redesign students, though the proportion of failures was not reduced dramatically.

The redesigned courses were particularly successful with Hispanic students in the College Assistance Migrant Program (CAMP). CAMP students met together in the learning center for two of the three required hours, working with a tutor.

During the fall 2002 semester, CAMP students achieved an 80 percent pass rate in Intermediate Algebra compared with the previous 70 percent pass rate in the traditional format. CAMP students also surpassed the success rate of the algebra population as a whole: in fall 2004, 73.6 percent of CAMP students earned an A, B, or C, while only 68 percent of other students passed. In addition, no CAMP students withdrew from the course.

The pedagogical techniques that contributed most to improving student learning were (1) the Polya Math Center described earlier, which moved students from a passive to an active learning experience; (2) student focus groups; (3) abundant online resources available to students; (4) weekly task lists that provided a step-by-step breakdown of assignments; and (5) weekly, one-hour mandatory tutor training sessions.

UI reduced the cost of offering all three courses from approximately $338,000 to about $235,000, a reduction of 31 percent.

http://www.theNCAT.org/PCR/R2/UId/UId_Overview.htm

The University of New Mexico
UNM redesigned General Psychology, its largest and most popular undergraduate course and one of its so-called killer courses, which enrolled 2,250 students annually. UNM's primary redesign goal was to reduce the course's extraordinarily high 42 percent DFW rate; failing grades alone accounted for 30 percentage points of that rate, and minority students were disproportionately represented among the Ds, Fs, and withdrawals. UNM has one of the lowest student retention rates among public research universities, and high failure rates in core curriculum courses such as General Psychology are known to have a strong negative impact on its overall retention and graduation rates.

The course redesign reduced the number of lectures each week from three to two and incorporated a weekly 50-minute studio session led by undergraduate teaching assistants (strong students from previous sections of General Psychology or upper-division honors students). In-class activities were supplemented by interactive Web- or CD-ROM-based activities and quizzes, available on a 24-7 schedule. Students were able to interact online with other students and to review concepts based on individual need. Online components used commercially available software that contained interactive activities, simulations, and movies. Students took repeatable weekly quizzes requiring a C level of mastery. An active intervention strategy ensured that students were making progress: graduate teaching assistants monitored quiz performance and counseled students with weak performance on how to improve.

UNM's goal of reducing drop and failure rates was achieved. The failure rate was reduced from previous levels of 30 percent to 12 percent, and the DFW rate fell from 42 percent to 18 percent. The percentage of students who received a C or higher rose from 60 percent to 76.5 percent, and there were more A and B grades than recorded in previous semesters. At the same time, the course was arguably more difficult, requiring students to cover a high-level introductory text in its entirety.

Three pedagogical techniques were particularly important in improving student learning at UNM: (1) Online mastery quizzes, which tested both factual and conceptual knowledge, structured students' learning and kept them on task. (2) Students who scored 75 percent or less on the first exam were required to attend a weekly 50-minute studio for the remainder of the semester for additional tutoring from undergraduate teaching assistants. (3) All sections used the same materials and required the same amount of work, which led to a more consistent learning experience for all students than in previous semesters.

The redesign of General Psychology reduced the cost-per-student from $72 to $37, a 49 percent reduction.

http://www.theNCAT.org/PCR/R3/UNM/UNM_Overview.htm

University of Southern Maine
USM redesigned Introductory Psychology to increase student understanding and retention of material by increasing active learning in the course. The traditional course enrolled about 875 students annually; faculty lectured three hours per week to 13 sections of about 75 students each. About 30 percent of the students did not pass, and many failed to retain course material in downstream courses. The large lecture sections and the absence of recitation sections did not support individualized instruction. Students received only a total score on tests, with no feedback about which answers were incorrect or where to find the correct information. Students had no resources to support learning other than the large lecture, since there were no teaching assistants and only a few student tutors.

The course redesign involved reducing lecture time by half, replacing that time with interactive Web-based learning activities, and increasing individualized attention to students by instructors. Students answered questions within each module, got immediate feedback, and had the chance to redo modules until they fully comprehended the concepts. Testing was also moved online.

Using pre- and postcourse assessment of important concepts in parallel sections running simultaneously in fall 2000, USM found a statistically significant (.001 level) 10 percent improvement in scores. Students in the redesigned course did significantly better (average score of 76.7 on the posttest) than those in the traditional format (average score of 67.3). Exam grades showed a significant increase; average grades on each of three tests were about 10 percent higher than before the redesign.

Three pedagogical techniques were most important in improving student learning at USM: (1) Online quizzing forced students to prepare for class and changed the role of the instructor from introducing material to reviewing and expanding what students had already studied. (2) A mastery approach to quizzing allowed students to take quizzes several times and to receive immediate feedback. (3) Links from quiz items to text review material enabled students to easily identify the content they had not yet mastered.

USM reduced the cost-per-student from $113 to $88, a decrease of 22 percent.

http://www.theNCAT.org/PCR/R1/USMe/USMe_Overview.htm

The University of Southern Mississippi
Southern Miss redesigned World Literature, enrolling more than 1,000 students each term, in order to eliminate course drift and inconsistent student learning experiences. The traditional course was offered in 16 sections of about 65 students each: eight sections taught by full-time faculty and eight by adjuncts. The redesign placed all students in a single, coherent online section and replaced the passive lecture environment with media-enriched presentations that required active student engagement. A course coordinator directed a team of four faculty members, each of whom taught his or her area of expertise for four weeks, and four graduate assistant graders. The faculty team offered course content through a combination of live lectures with optional attendance and required Web-delivered, media- and resource-enhanced presentations.

Using baseline data collected in fall 2001, grades on weekly quizzes with common content in spring 2003 showed an increase in grades of C or better from 68 percent in the traditional course to 88 percent in the redesigned course.

In comparison with fall 2001, writing scores of C or better increased from 61 percent to 77 percent in spring 2003. In a comparison of parallel sections running simultaneously, essay scores increased in spring 2002 from 7.11 in the traditional mode to 8.10 in the redesigned format. The impact of the redesign was positive for both African-American and white students, with scores improving for both groups, although African-American students averaged three to four points lower than white students. African-American students consistently ranked the course experience higher than white students did.

In the traditional version of the course, faculty-taught sections typically retained about 75 percent of their students, while adjunct- and teaching-assistant-taught sections retained 85 percent. In fall 2003, when the redesign was fully implemented, retention was 87 percent, with all students taught solely by faculty. At the same time, the rate of D and F grades dropped from 37 percent to 27 percent in the spring 2003 redesigned course.

The pedagogical techniques that contributed most to improving student learning were (1) low-stakes mastery quizzes that provided immediate feedback for students; (2) individualized, on-demand help that provided one-on-one assistance from graduate students to improve students' writing skills; and (3) accommodation of different learning styles by offering students an array of learning options.

As a result of the redesign, USM reduced the cost-per-student from $70 to $31, a 56 percent savings.

http://www.theNCAT.org/PCR/R3/USMs/USMs_Overview.htm

The University of Tennessee, Knoxville
UTK redesigned Intermediate Spanish Transition, an introductory course enrolling more than 60 percent of entering students as a result of language placement scores. During the 1999/2000 academic year, the course enrolled 1,539 students in 57 sections with 27 students per class. The traditional course was unable to provide enough sections to satisfy enrollment demands.

The redesigned course substituted online diagnostic homework exercises (grammar, vocabulary, and graded workbook assignments) for one in-class period per week. Immediate feedback on all graded assignments was given via online assessments, which eliminated the time-consuming grading of homework exercises, quizzes and examinations. Because they no longer dealt with skill-based practice in class, instructors had more time to emphasize active speaking skills and cultural awareness.

Students in both the traditional and redesigned courses were assessed by using the university's Spanish placement examination as well as midterm and final exams. Students also engaged in a simulated oral proficiency interview—a more complex measure of proficiency. Students in both formats were first given a pretest on these measures in which no differences were found. In the simulated oral proficiency interview, redesign students performed significantly better than traditional students on six of eight dimensions of language proficiency. There were no significant differences between the two groups on the remaining two dimensions.

Three pedagogical techniques contributed most to improving the quality of student learning at UTK: (1) Vocabulary and grammar practice was moved from the classroom to the online environment so that in-class time emphasized speaking, writing, and negotiating meaning and communication. (2) Online resources incorporated a rich array of learning materials and activities: more than 400 graphic, audio, and video files were keyed to course concepts, and the textbook and workbook exercises were moved online along with directions for use and model answers. Students received immediate (automated) feedback and detailed grammatical explanations about their work. (3) The flexibility of the online environment created opportunities for continuous course improvement, allowing faculty to modify, update, revise, and expand activities as needed.

By offering one-third more sections with lower personnel costs, UTK was able to reduce the cost-per-student from $109 to $28, a 74 percent decrease, while serving more than 500 additional students annually.

http://www.theNCAT.org/PCR/R2/UTK/UTK_Overview.htm

NOTES
1. Peter Ewell, vice president of the National Center for Higher Education Management Systems, served as the external evaluator of this study.

2. Watson Scott Swail, "Legislation to Improve Graduation Rates Could Have the Opposite Effect," Chronicle of Higher Education, January 23, 2004, p. B18.

3. Vincent Tinto, Leaving College: Rethinking the Causes and Cures of Student Attrition (Chicago, IL: University of Chicago Press, 1993).

4. Thomas R. Bailey and Mariana Alfonso, Pathways to Persistence: An Analysis of Research on Program Effectiveness at Community Colleges (Indianapolis, IN: Lumina Foundation for Education, 2005). The discussion of Tinto's model draws heavily from Bailey and Alfonso's summary.

5. For a full description of the Program in Course Redesign, including specific descriptions of each project and analyses of the outcomes achieved, see http://www.center.rpi.edu/PewGrant/OutAnaly.html (accessed July 2005).

6. Swail, p. B18.

Copyright 2005 The National Center for Academic Transformation