| ACADEMIC PRODUCTIVITY: DECISIONS, DECISIONS, DECISIONS
by Carol A. Twigg
"There is a feeling that American higher education costs too much and does not deliver."
"Can nothing stop this relentless inflation in college costs?"
Sound familiar? Not a day passes without some public statement decrying the state of American higher education and the consequences for the nation and its students. But these two statements were made more than 22 years ago: the first in 1990 and the second in 1991. I quoted both in my first published article on this subject, “Improving Productivity in Higher Education: The Need for a Paradigm Shift,” in 1992.
What’s different today is that the voices saying there is a problem have grown louder and more numerous. The solutions have not. Higher education’s leadership has more or less decided that colleges and universities cannot cut their costs and become more productive. Higher tuition and increased financial aid to counteract budget cuts seem to be the preferred solutions, just as they were in the early 1990s. In other words, the only solution is more money.
We at NCAT know that productivity in higher education can be improved substantially with no diminution in quality—and we have proved it. Many say we are the only ones in higher education showing how it can be done. We say that’s because we are the only ones in higher education who want to do it and, unfortunately, the only ones who seem to know how to do it.
The purpose of this article is to share with the higher education community some of what we have learned over the past 13 years about reducing instructional costs and to debunk some popular misconceptions about the relationship between cost and quality.
A Word about Our Methodology
NCAT has developed what we call the Course Planning Tool (CPT), a spreadsheet-based decision-making tool that enables institutions to compare the “before” activities and operating costs (the traditional course) with the “after” activities and operating costs (the redesigned course when it is fully implemented). It does not include the up-front development costs of either the traditional or the redesigned course. The CPT consists of four worksheets that capture the following data:

1) all personnel costs (using average salaries by personnel type—e.g., tenured faculty, non-tenured instructors, adjuncts, graduate teaching assistants) associated with preparing and delivering the course, expressed as an hourly rate;
2) the activities involved in, and the costs of, preparing and delivering the course in its traditional format;
3) the activities involved in preparing and delivering the course in its redesigned format when it is fully operational; and
4) a comparison of the costs of the traditional course and the redesigned course.

The outcome is the cost-per-student (total cost divided by the number of students enrolled in the course) for both the traditional and the redesigned course. All of the data presented in this article were captured by each institutional project using NCAT’s CPT. (More information about the CPT can be found at http://www.theNCAT.org/PlanRes/CPTdesc.htm.)
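The CPT’s core comparison boils down to simple arithmetic: total the personnel costs of each preparation and delivery activity (hours times hourly rate), divide by enrollment, and compare the traditional and redesigned figures. A minimal sketch in Python; the course activities, hours, and rates below are hypothetical illustrations, not drawn from an actual CPT workbook:

```python
# Minimal sketch of the CPT's core comparison: sum personnel costs for
# each activity (hours x hourly rate), then divide by enrollment.
# All figures below are hypothetical.

def cost_per_student(activities, enrollment):
    """activities: list of (hours, hourly_rate) pairs for preparing and
    delivering the course; returns total cost divided by enrollment."""
    total = sum(hours * rate for hours, rate in activities)
    return total / enrollment

# Hypothetical traditional offering: lecturing by tenured faculty,
# grading and office hours covered by graduate teaching assistants.
traditional = [(1200, 60.0),   # faculty preparation and lecturing
               (800, 20.0)]    # GTA grading and office hours

# Hypothetical redesign: fewer lecture hours, more low-cost online support.
redesigned = [(500, 60.0),     # reduced faculty lecture hours
              (900, 15.0)]     # online tutoring and automated quizzing support

before = cost_per_student(traditional, 600)
after = cost_per_student(redesigned, 600)
savings_pct = 100 * (before - after) / before
```

The same before/after subtraction, applied to real activity data, is what produces the cost-per-student figures quoted throughout this article.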
In both our national and statewide programs, we have not required institutions to achieve a minimum cost-reduction percentage for two reasons. First, our intention was to teach institutions how to reduce costs, so we believed that their gaining experience in doing so was more important than how large or small the reductions were. Second, if a course costs a lot to offer, a small percentage can produce a large amount of savings. Conversely, a large percentage of a relatively inexpensive course produces a relatively small amount of savings.
For example, when the University of Wisconsin Madison redesigned its introductory chemistry course in 1999, the cost of offering the course in its traditional format was $1,053,700. A 28% savings produced a dollar savings of $295,200. When the University of Central Florida redesigned its American government course in 1999, the cost of offering the course in its traditional format was $246,400. A 28% savings produced a dollar savings of $68,200. When the University of Southern Maine redesigned its introductory psychology course in 1999, the cost of offering the course in its traditional format was $98,875. A 49% savings produced a dollar savings of $48,125.
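The arithmetic behind these three examples makes the point directly: dollar savings depend on the base cost as much as on the percentage saved. Using the figures reported above (percentages reflect rounding in the source data):

```python
# Dollar savings = base cost x percentage saved: a modest percentage of an
# expensive course can outweigh a large percentage of a cheap one.
# Base costs and dollar savings are the figures reported above.
courses = {
    "UW Madison chemistry": (1_053_700, 295_200),
    "UCF American government": (246_400, 68_200),
    "USM psychology": (98_875, 48_125),
}

for name, (base_cost, dollars_saved) in courses.items():
    pct = 100 * dollars_saved / base_cost
    print(f"{name}: {pct:.0f}% of ${base_cost:,} saved ${dollars_saved:,}")

# Wisconsin's 28% reduction yields more than six times the dollars
# of Southern Maine's 49% reduction.
```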
Why Do Cost Savings among Projects Vary So Widely?
The discussion that follows is based on data from about 150 completed redesign projects, a significant minority of which involved more than one course. Overall, the reduction in the cost-per-student ranged from 4% to 81%, with an average of 36%. Why did some projects save more than others?
Is It the Discipline?
Some may think that discipline would affect the amount of savings.
What can we learn from these data? There is no relationship between the discipline of the traditional course and the percentages of savings that are produced. First, the large range among cost savings within the disciplinary categories is similar to the overall range. Second, the average cost reduction among projects within the disciplines is similar, with the sole exception of two-year college developmental math redesigns.
Is it the Low vs. High Cost of the Original Course?
Some may think that the low vs. high cost of the traditional course offering would affect the amount of savings. The argument goes that if a course is already low cost, the savings cannot be great; if the course is high cost, the potential for substantial savings is higher. We have been told repeatedly that “our course is so cheap, so bare bones, that we cannot possibly produce costs savings.”
What can we learn from these data? There is no relationship between the cost of the traditional course and the percentages of savings that are produced. Low-cost courses produced high savings; high-cost courses produced low savings; and vice versa.
Why do the costs of teaching the same courses vary so dramatically? Why did it cost the University of North Carolina at Chapel Hill $416 per student to teach introductory Spanish in the traditional format versus $79 at the University of North Carolina Charlotte, $127 at Portland State University, $152 at the University of Southern Mississippi, $256 at SUNY Fredonia and $326 at Texas Tech? Why did it cost Mid-State Technical College $467 per student to teach developmental math in the traditional format versus $118 at Iowa Western Community College, $148 at Guilford Technical Community College, $199 at Volunteer State Community College, $246 at Robeson Community College and $323 at Northern Virginia Community College? It’s not because faculty salaries are higher in the high-cost courses. As we shall see, it’s because each institution made different decisions about how to structure the course.
Is It the Type of Institution?
Some may think that the type of institution would affect the amount of savings.
What can we learn from these data? First, the large range among cost savings within the institutional categories is similar to the overall range. Some community colleges produced high savings; some four-year institutions produced low savings; and vice versa. Second, the average cost reduction for two-year institutions is significantly less than for four-year institutions.
But what is fascinating is the difference between the cost-per-student for the traditional offerings of the courses. For community colleges, the average is $224 with a range of $49 (Rio Salado College: Developmental Math) to $467 (Mid-State Technical College: Developmental Math). For four-year institutions, the average is $200 with a range of $32 (Truman State University: Health and Fitness) to $509 (Indiana University of Pennsylvania: Introductory Biology). It’s surprising that, at least at the introductory level, instruction at community colleges is more expensive than at four-year colleges and universities.
The primary reason the average community college cost-per-student is high is that their classes are small. In some cases, classes are kept small for ideological reasons; in others, the classrooms themselves are predominantly small. A major factor leading to high costs, not often mentioned, is that community colleges often do not “fill” their classes (e.g., the enrollment “cap” and the classroom size allow 25 students, but actual enrollment is 17). These unfilled sections result from offering too many sections for the overall course enrollment, whether out of a desire to offer students flexible scheduling, administrative ineptitude, or both. And as we shall see below, the notion that smaller class size leads to better learning outcomes is simply incorrect.
Cost Savings Are A Result of Decisions Made by Redesign Teams
Let’s look at a few examples from three different disciplines.
Introductory Psychology: 9% vs. 66% Cost Reduction
Introductory psychology courses are typically low-cost courses. The average cost-per-student is $89, with a range of $60 to $113. Yet one of our introductory psychology redesigns saved 66% (Frostburg State University) whereas another saved 9% (Missouri State University).
Frostburg’s traditional cost-per-student was $89 for 900 students; Missouri State’s was $73 for 2,700 students. Frostburg used nine full-time faculty members @ $7,006 each (total = $63,035) and nine adjuncts @ $1,941 each (total = $17,472) to teach 18 sections (~50 students) of the course in the traditional format. Missouri State used 12 full-time tenured faculty @ $161,599; one non-tenured full-time faculty @ $8,121 and five adjuncts @ $13,500 to teach 18 sections (~150 students) of the course. Frostburg kept costs down by using a higher percentage of adjuncts; Missouri State kept costs down by using a large section size.
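Frostburg’s staffing arithmetic can be reproduced directly from the decisions reported above (per-person costs are the article’s averages; small discrepancies with the article’s stated totals reflect rounding in the source):

```python
# Reproducing Frostburg's traditional-format cost-per-student from the
# staffing decisions reported above. Per-person costs are the article's
# rounded averages, so totals differ slightly from the article's figures.
full_time_cost = 9 * 7_006   # nine full-time faculty
adjunct_cost = 9 * 1_941     # nine adjuncts
students = 900               # 18 sections of ~50 students

per_student = (full_time_cost + adjunct_cost) / students
print(f"${per_student:.0f} per student")  # ~$89, matching the reported figure
```

Running the same calculation against a different staffing mix (more tenured faculty, fewer adjuncts) is exactly what drives Missouri State’s different cost profile.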
How were their redesigns similar and how did they differ?
Both institutions reduced the number of in-class meetings by half, replacing them with online activities, and increased section size. Frostburg tripled section size from 50 to 150 and reduced the number of sections from 18 to six. Missouri State doubled section size from 153 to 300 students and reduced the number of sections from 18 to nine. Both changed the nature of the in-class meeting to promote active learning. Both created a variety of online activities that included quizzing and small discussion groups. Both created a standard course with standard learning activities and assessments.
Both projects produced significant improvements in student learning outcomes. At Frostburg, the team compared performance on 43 common questions on a final examination. Students from the fully redesigned course performed significantly better (mean = 77%) than students from traditional sections (mean = 65%). An optional extra-credit essay that asked students to write about prejudice was also administered. A grading rubric provided points for each correctly used psychological concept in order to separate "general public" answers from answers by knowledgeable psychology students. Results indicated students in the redesign sections (mean = 2.85) performed significantly better than students in the traditional sections (mean = 1.09).
At Missouri State, the team compared performance on two pre/post comprehensive exams: a 30-item exam that has been used historically and a 50-item exam created specifically for this project by the course redesign team. Results indicated that, on the 30-item exam, students in the redesigned sections performed significantly better from pre to post (8.91 point gain) compared with the traditional comparison group (6.32 point gain). Similarly, students in the redesigned course demonstrated significantly more improvement from pre to post on the 50-item exam (12.63 point gain) compared with the traditional sections (7.73 point gain).
So what made the cost savings so different?
What happened here? Frostburg’s cost-per-student dropped to $31; Missouri State’s was $66, yet both achieved similar learning improvements. Missouri State retained full-time tenured faculty members and, in essence, reinvested the savings accrued from larger section size by adding expensive support in the form of adjuncts and graduate students. Hence, their cost reduction was only 9%. Frostburg, on the other hand, designed a much more cost-effective personnel structure and produced savings of 66% which could be used for a variety of purposes by the university. The design decisions made by each team were the cause of the significant differences in cost reduction.
Developmental Math: 4% vs. 54% Cost Reduction
Unlike introductory psychology, the cost of developmental mathematics instruction varies widely. Among NCAT’s Changing the Equation projects, for example, the cost-per-student in the traditional format of the course varied from $105 to $467. In their redesigns of their developmental math sequences (all sections of all developmental courses offered), each institution was required to use the Emporium Model. Interactive computer software combined with personalized, on-demand assistance and mandatory student participation are the model’s key elements. Each participating institution modularized its curriculum, allowing students to progress through the developmental course sequence at a faster pace if possible or at a slower pace if necessary. All projects were required to reduce costs, using similar proven strategies, yet the reduction in the cost-per-student varied from 4% to 54%.
How did the learning results compare among these institutions?
Introductory Biology: 22% vs. 41% Cost Reduction
Like developmental math, the cost of offering introductory biology varies widely. The average cost-per-student is $271, with a range of $127 to $509. Yet one of our introductory biology redesigns saved 41% (Salisbury University) whereas another saved 22% (Mississippi State University).
Salisbury’s traditional cost-per-student was $327 for 840 students; Mississippi State’s was $128 for 377 students. Salisbury used four full-time faculty members @ $22,208 each and eight lecturers @ $23,477 each to teach 12 lecture sections (~72 students) and 40 lab sections (~24 students) of the course in the traditional format, for a total annual course cost of $274,584. Mississippi State used one instructor (half-time) @ $21,316 and two graduate teaching assistants (GTAs) @ $26,832 total to teach four lecture sections (~94 students) and 16 lab sections (~24 students) each year, for a total annual course cost of $48,148. Salisbury’s costs were relatively high due to the percentage of full-time tenure-track faculty teaching the course, versus Mississippi State’s use of instructors and GTAs, as well as Salisbury’s somewhat larger section sizes.
How were their redesigns similar and how did they differ?
Both institutions reduced the number of in-class meetings per week. Mississippi State went from two to one; Salisbury went from three to one, replacing them with online learning activities. Both changed the nature of the in-class meetings to promote active learning. Both created a variety of online activities that included quizzing and small discussion groups. Both created a standard course with standard learning activities and assessments.
Both increased course enrollment: Salisbury increased course enrollment from 840 to 960; Mississippi State doubled enrollment from 377 to 754. Salisbury increased section size from 72 to 120 and reduced the number of sections from 12 to eight. At Mississippi State, section size stayed the same, but the instructor taught two sections rather than one for the same workload credit.
Salisbury showed significant improvement in student learning, whereas Mississippi State showed no significant difference. At Salisbury, the team compared performance on common questions on a final examination. The average percentage correct for the traditional students was 74% whereas for the redesigned students, the average was 82%. At Mississippi State, pre- and post-tests were used to assess learning gains. Students in the traditional course showed a gain of 2.7 points; redesign students showed a gain of 2.6 points.
So what made the cost savings so different?
Since Mississippi State doubled instructor workload, one would think that their costs would have been reduced by 50%, yet they were only reduced by 22%. The reason is that Mississippi State increased the number of GTAs from two to four to staff virtual labs, at an additional cost of $26,832. Thus, they reinvested a large part of the savings back into the course.
Salisbury kept a two-hour lab each week, which included time devoted to discussion in addition to lab activity. In the redesigned format, one full-time faculty member and three lecturers each taught one lecture section and five lab sections per term vs. two full-time faculty members and four lecturers in the traditional format.
What happened here? The cost-per-student at Salisbury dropped from $327 to $192, a 41% decrease, and at Mississippi State from $128 to $99, a 22% decrease, yet Salisbury increased student learning whereas Mississippi State remained the same. Mississippi State decided to reinvest part of the savings back into the course; Salisbury decided to invest the savings in other departmental needs. The design decisions made by each team were the cause of the significant differences in cost reduction.
The point is not to beat up on Mississippi State: the team achieved cost savings and produced equivalent learning outcomes. They also achieved their goal of eliminating an enrollment bottleneck to meet the demand for the course: twice the number of students can be enrolled in each redesigned course each semester. But Salisbury achieved greater cost savings with significantly better learning outcomes.
While we have featured a handful of redesigns to help you understand what lies behind the conclusions that we have drawn, these conclusions are based on the results of about 300 course redesigns conducted over the past 13 years. We have proved that:
As we said at the beginning of this article, not much has changed in higher education in the last two decades with regard to increasing academic productivity. To us, the reason is clear: higher education’s leadership seems unable or unwilling to make the kinds of decisions that are needed. Despite the wide recognition that NCAT’s work has received and the indisputable facts that back up the conclusions listed above, college and university presidents appear to be willfully ignorant of the possibilities that redesign offers and continue to trot out the same old clichés.
A 2008 study, The Iron Triangle: College Presidents Talk about Costs, Access, and Quality, conducted by Public Agenda and the National Center for Public Policy and Higher Education, documents widening gaps between the perceptions of civic, governmental and business leaders, higher education leaders, and the general public about the most fundamental issues confronting American higher education. The study is based on a series of interviews with more than two dozen college and university presidents. Three concepts dominated their concerns: the increasing cost of higher education; the challenge of providing access to new generations of students; and the need to maintain and improve educational quality (along with the need to be accountable for that quality).
As the authors note, “Any of these goals would be challenging enough, but most of the presidents see these three missions as being in tension—a change in one impacts the others. For example, while many of the presidents believe that greater efficiencies are possible, most also believe that, for the most part, efforts to enhance access or improve quality will ultimately drive up costs. By contrast, they believe reduced financial support from the states—something being talked about nationwide—will eventually either harm quality and/or force tuition increases that will reduce access.” Here are some sample quotes from the interviewees:
It seems clear that most higher education leaders hold a very different definition of the problem than what typically exists among the general public or other leadership groups.
Over the years, Public Agenda has found one factor that is essential for resolving large-scale public issues: the various stakeholders must agree on the definition of the problem. Once this is established, there is a much greater likelihood of productive debate and resolution. Without it, the parties simply talk past each other. Solutions are unlikely when the public, leadership groups, and higher education leaders each accept only the solutions that match their particular definition of the problem. Until these groups coalesce around a shared understanding, they will continue to draw farther apart through rising frustration rather than come together in consensus or compromise.
It is our hope that NCAT’s work can contribute to achieving that shared understanding.