
The Learning MarketSpace, October 2003

A quarterly electronic newsletter of the Center for Academic Transformation highlighting ongoing examples of redesigned learning environments using technology and examining issues related to their development and implementation.

TABLE OF CONTENTS

1. THE CAT VIEWPOINT

  • The KISS Approach to Costing

2. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

  • Ohio Commission Learns about the Program in Course Redesign
  • South Dakota's Governor Rounds Funds a Second Statewide Initiative in Course Redesign
  • Round III Final Report Highlights

3. CUTTING ACROSS

  • Deeper Learning at Reduced Cost

4. COMMON GROUND

  • Learning Options at Glendale Community College
  • A New Newsletter from The National Resource Center for the First-Year Experience
  • Frank Newman Goes to Congress (and Takes the Program in Course Redesign Along)

5. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

1. THE CAT VIEWPOINT

Perspectives on issues and developments at the nexus of higher education and information technology.

The KISS Approach to Costing

When we developed the costing methodology used in the Program in Course Redesign, we decided to keep it as simple as possible. Our goal was to make the cost comparisons easily comprehensible to academic decision-makers, both faculty members and administrators, and to create a comparative cost unit (the cost-per-student) that would illustrate the impact of different design decisions made by the individual projects.

We decided to be as accurate as possible in comparing the time spent by academic personnel in the traditional and redesigned course formats since time is the largest cost factor in instruction. We also decided to be clear and consistent in explaining what we are comparing—that is, the "before" and "after" course operating costs—and what we are not—the development or transition costs involved in the redesign.

The actual dollars cited in the cost-per-student data are, of course, somewhat arbitrary, reflecting the salaries of particular personnel, which vary by field, geographic location and level of faculty involved in the course. (A high cost-per-student may be the result of heavy involvement of senior faculty in an institution in the Northeast whereas a low cost-per-student may be due to the predominance of junior faculty in a course in the South.) Both the comparative cost-per-student and the total dollars saved can provide useful information for a particular institution, but the key comparative data are the percentage decreases, which reflect the impact of the design decisions. One can then examine, for example, why Virginia Tech saved 77% whereas IUPUI saved 20%. (Tables summarizing this comparative data can be found on our Web site at Outcomes Analysis.)

Having said that, a question that persistently arises about our costing methodology is, "Aren't you overstating the savings that the redesign projects realized because you don't include development costs in the comparative cost-per-student calculations?"

The reality is that we understate the cost savings achieved by the projects because of our desire to keep the cost comparisons as simple as possible. Here are three examples of savings that are not included in the comparative cost-per-student data:

  • 22 of the 30 projects increased student retention. With one exception, we do not include savings accrued through increased retention in the comparative cost-per-student numbers.

The exception is the University of Central Florida (UCF). Included in its 28% reduction in the cost-per-student are the savings that result from a 7% increase in the retention rate. Applying this increased retention rate to all 25 sections of UCF's redesigned course results in a one-course-section reduction, amounting to $8,239 in cost savings in teaching personnel and classroom space rental for one traditional section.

The other 21 projects that increased student retention do not include the cost implications in their cost-per-student comparisons. Using IUPUI as an example, here's how those retention savings can be calculated.

IUPUI reduced costs in Introduction to Sociology by increasing the number of large sections offered from two to three, producing an annual savings of approximately $34,000. In addition, IUPUI decreased its DFW rate from 38.9% to 24.8%, which means that fewer sections will be needed. Prior to redesign, 778 students needed to repeat the course; after redesign, that number dropped to 496. Since 282 students no longer need to repeat the course, IUPUI could offer one fewer large section (of 200 students) at $6,199 and two fewer small sections (of 40 students each) at $6,671 each. The reduction in the DFW rate thus translates to additional savings of $19,541 achieved by the redesign.

If these calculations were applied to each of the 22 projects that increased retention, further savings would result.
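The IUPUI retention arithmetic above can be sketched in a few lines of Python. One caveat: the annual enrollment of roughly 2,000 students is not stated in the text; it is inferred from the reported repeater counts (778 ÷ 0.389). The section costs are as reported.

```python
# Sketch of the IUPUI retention-savings arithmetic described above.
# ASSUMPTION: ~2,000-student annual enrollment, inferred from the
# reported repeater counts (778 / 0.389 = 2,000).

ENROLLMENT = 2000
DFW_BEFORE, DFW_AFTER = 0.389, 0.248

repeat_before = round(ENROLLMENT * DFW_BEFORE)  # 778 students repeat
repeat_after = round(ENROLLMENT * DFW_AFTER)    # 496 students repeat
fewer_repeaters = repeat_before - repeat_after  # 282 fewer repeaters

# 282 fewer repeaters free up one large section (200 seats) plus two
# small sections (40 seats each): 200 + 2 * 40 = 280, roughly 282.
large_section_cost = 6199   # cost of one 200-student section
small_section_cost = 6671   # cost of one 40-student section
savings = 1 * large_section_cost + 2 * small_section_cost

print(fewer_repeaters)  # 282
print(savings)          # 19541
```

The same three-line calculation, with each project's own DFW rates and section costs substituted, is what would be required to fold retention savings into the other 21 projects' numbers.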

  • 24 of the 30 projects have substantial space savings because of reduced seat-time. With one exception, we do not include savings accrued through reduced space needs in the comparative cost-per-student numbers.

The exception is again UCF. UCF's plan to reduce costs focused on saving funds spent on renting classroom space. At $22 per square foot, the cost to rent a 100-seat classroom is $44,000 annually. Based on UCF's scheduling policies and procedures, 37 traditional class sections can be placed in a single classroom annually, which translates to $1,189 per section. By reducing live-class meeting time by two-thirds in its redesigned course, UCF reduced the rental costs to $396 for each section. Due to the more efficient use of 100-seat classrooms in the redesigned course configuration, five fewer sections were needed to accommodate the same number of students each year, amounting to savings of $46,309, including faculty, GTAs, and classroom rental costs.

If these calculations were applied to each of the 24 projects that reduced or eliminated in-class time, further savings would result.
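The UCF per-section rental figures follow from simple arithmetic, sketched below. The 2,000-square-foot classroom size is implied rather than stated ($44,000 ÷ $22 per square foot), and the reported $46,309 total also includes faculty and GTA costs, which are not modeled here.

```python
# Per-section classroom rental arithmetic from the UCF example above.
# ASSUMPTION: classroom size (2,000 sq ft) is implied by the text,
# derived as $44,000 annual rent / $22 per square foot.

RATE_PER_SQFT = 22
ANNUAL_RENT = 44_000                    # 100-seat classroom, per year
sqft = ANNUAL_RENT / RATE_PER_SQFT      # 2,000 sq ft

SECTIONS_PER_ROOM = 37                  # traditional sections per room per year
rent_per_section = ANNUAL_RENT / SECTIONS_PER_ROOM  # about $1,189

# The redesign cut live meeting time by two-thirds, so each section
# occupies only a third of the room-time (and a third of the rent).
redesigned_rent = rent_per_section / 3              # about $396

print(round(rent_per_section))  # 1189
print(round(redesigned_rent))   # 396
```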

  • While we report what we call "additional savings"--by which we mean savings that were not part of the plan but occurred as a result of the redesign--in each project's Final Report: Impact of Cost Savings on our Web site, we do not include those savings in the comparative cost-per-student numbers.

Here are two examples of such additional savings. The University of Iowa redesigned its introductory chemistry sequence. Historically, the chemistry department offered a separate introductory chemistry course for chemical sciences majors. Iowa's College of Engineering was so pleased with the redesign that it moved all of its 70 students into the redesigned course. As a result, the chemistry department stopped offering the chemical sciences sequence, producing an additional cost savings of $25,959.

In response to the redesign success at Fairfield University, the biology department plans to change the entire introductory sequence for biology majors. Previously the department required four semesters of introductory course work in the freshman and sophomore years. The department plans to expand the freshman sequence to three semesters and incorporate the two second-year courses within it. With the updated method of instruction, students will cover the same amount of material that used to require four semesters. Associated with this new approach are some important cost savings. One fewer large lecture class will be necessary, and at least four laboratory sections will be eliminated, reducing the department's reliance on adjuncts and the number of faculty committed to the introductory sequence. Other additional savings at Fairfield included a 73% reduction in laboratory costs by replacing dissection labs with computer-based activities.

We did not ask the project institutions to calculate these three kinds of cost savings because of our desire to keep the process relatively simple. After all, the issue that we are trying to address is not how to count. The issue is how to get academics to behave differently—that is, how to convince them that by investing in IT-based course redesign, they can see a return on that investment in improved student learning and reduced instructional costs.

Returning to the issue of development costs, we know that when people raise this issue, they often have assumptions about the significance of development costs in the overall cost equation. Some assume that substantial investments must be made in instructional software development. Others assume that substantial investments must be made in the campus infrastructure, including hardware, software and staff support.

Consider first the common but deeply mistaken assumption that the time required to create new learning software is the largest single element of the cost of developing technology-enhanced courses: there is absolutely no evidence that this is true. While some online courses have relied heavily on expensive custom instructional software (recall the PLATO system of the 1980s), the vast majority do not. Most are faculty-led, discussion-based courses structured via course management systems like Blackboard or WebCT.

In the Program in Course Redesign, we explicitly discouraged the projects from developing software. In the selection process, we favored proposals in disciplines with large existing bodies of technology-based curriculum materials and/or assessment instruments. We also favored proposals from institutions that were willing to use existing materials in order to focus on redesign issues rather than materials creation. As a result, only five of the 30 projects developed software. Most used commercial software packages that students paid for, publisher-supplied software that accompanied textbooks, and existing Web sites from higher education, government or other public sources. In each case, there were no software development costs. Other projects were simply not materials-intensive, and still others used materials developed for the traditional format in different ways.

The second assumption points to something partially accurate: we do omit most infrastructure investments from our cost calculations. We distinguish between costs that are constant (existing both before and after the redesign) and those that are not. We do not believe that a modern college or university in the US can operate in the 21st century without a robust campus technological infrastructure, including the staff needed to support it. Indeed, just about every one of our 3,600 institutions of higher education has the needed infrastructure in place to support large-scale course redesign, and, in many cases, that infrastructure is under-utilized. Thus, the costs of that infrastructure are constant.

Even if one were to calculate the portion of the infrastructure used by a single redesign, the number would be so small that the effort required would far exceed the value in doing so. The typical campus infrastructure is used for instruction, research, email, Web support, administration, student services, library services, content-based teaching (e.g., computer science courses), downloading music files, and so on. A single course consumes a tiny fraction of the infrastructure. Furthermore, the costs of that infrastructure would be constant whether or not a redesign occurred.

Some redesigns, however, require the addition of servers, specialized software, technical staff, and so on, and in those cases, the costs of those additions are included in the comparative cost data. Virginia Tech, for example, added a database manager and a programmer to support its redesign of linear algebra, and Fairfield University adds $8,100 annually to purchase specialized software and supplies as part of its redesign of general biology.

There were, to be sure, other development costs associated with the redesign projects. In general, the grants that we awarded were spent on faculty development (not materials development) and on release time to enable the faculty to engage in the many tasks associated with course redesign. We know that, in general, the development cost for each project was $200,000. Some of the projects spent more; others spent less.

Does this tell us that the development cost of redesigning a large-scale course is $200,000? Unfortunately, no. If we had offered grants of $100,000, we are certain that we would report that, in general, the development cost for each project was $100,000. If we had offered grants of $1 million, we would find that, in general, the development cost for each project was $1 million. On the other hand, a Kansas State University faculty member who attended our workshops and learned our methodology redesigned her human development course without a grant.

This phenomenon is the course-development equivalent of economist Howard Bowen's "revenue theory of costs." In his classic 1980 study of higher education costs, Bowen found great variation in costs among seemingly similar institutions with seemingly similar outcomes. More recently, data reported by the National Center for Education Statistics show per-student spending for instruction of $7,573 at universities, $4,788 at other four-year institutions, and $2,727 at two-year colleges. Bowen argued that revenues determine costs: colleges and universities raise all the money they can and spend all that they raise. Bowen's concept emphasizes the fact that no absolute, objective standard exists by which we can say how much college should cost.

Similarly, no absolute, objective standard exists by which we can say what the development cost of a large-scale redesign should be. The reality is that one can only know what the development costs of a project are either by restricting development costs to a set figure (a development budget) or by assessing costs after the development is complete.

Even if one wants to calculate development costs, what do you attribute to a particular course redesign? Carnegie Mellon University (CMU), for example, has been building intelligent tutoring systems for years. These systems have wide applicability across disciplines and are used in K-12 as well as in higher education. Some have been commercialized and are generating a return on CMU's initial investment. What proportion of the cost of development of these systems should we attribute to CMU's redesign of introductory statistics?

To attribute development costs properly as part of the overall cost calculation, one must first resolve the proportional problem illustrated by CMU and then amortize the development costs over the lifespan of the course. Since most introductory courses have a relatively long shelf life--somewhere between five and ten years on average--this would mean aggregating the operating cost savings over the same five- or ten-year period and subtracting the development costs from that total. If we were to use the lifespan of the course as the unit of analysis, subtracting the development costs, the cost savings we now cite would easily increase fivefold. The question remains, however: is the time and effort required to do all of those calculations worth it?
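A minimal sketch of that amortization, using hypothetical figures rather than any actual project's numbers: aggregate annual operating savings over the course's lifespan, then subtract the one-time development cost.

```python
# Illustrative lifespan amortization sketch. The dollar figures below
# are HYPOTHETICAL, chosen only to show the shape of the calculation.

def lifespan_savings(annual_savings: float, dev_cost: float, years: int) -> float:
    """Net savings over the course lifespan, after development costs."""
    return annual_savings * years - dev_cost

# e.g., a redesign saving $100,000/year with $200,000 in development costs:
print(lifespan_savings(100_000, 200_000, 5))   # 300000.0 over 5 years
print(lifespan_savings(100_000, 200_000, 10))  # 800000.0 over 10 years
```

Even in this toy example, the development cost is recouped within the first two years, which is why using the course lifespan as the unit of analysis would enlarge, not shrink, the reported savings.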

The effect of keeping the costing process used in the Program in Course Redesign relatively simple is that while we may be understating the cost savings achieved by the redesign projects, we are successfully teaching institutions how to behave differently. We're willing to live with that.

--Carol A. Twigg

2. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

Featuring progress reports and outcomes achieved by the Program in Course Redesign.

Ohio Commission Learns about the Program in Course Redesign

On October 10, 2003, the results of the Program in Course Redesign were shared with the Governor's Commission on Higher Education and the Economy at its meeting in Columbus, Ohio. Carolyn Jarmon was invited to testify at a hearing of a commission sub-committee on Delivering Results. The Commission is grappling with the combined issues of access, quality and cost, and is seeking active solutions to recommend to Governor Taft. Although several of Ohio's college and university presidents are members of the commission, most members come from outside the higher education community. Carolyn's presentation featured the redesign projects at Ohio State University and the University of Tennessee to illustrate how technology can be used constructively to address the issues of concern to the commission.

South Dakota's Governor Rounds Funds a Second Statewide Initiative in Course Redesign

To encourage the six public higher education institutions in South Dakota to improve quality and reduce costs using technology, Governor Michael Rounds has funded a second year of the Rounds Grants Program in Course Redesign modeled on the Program in Course Redesign. Interested institutions have submitted a letter of intent to redesign a large enrollment course; they will participate in a November workshop conducted by the Center for Academic Transformation. The Center will also consult with inter-university project teams as they develop their redesign proposals. To learn more about how your institution, system or state can replicate the successes achieved in the Program in Course Redesign and the consulting options available from the Center for Academic Transformation, click here.

Round III Final Report Highlights

Round III final reports are due October 31, 2003. Some teams have already submitted their reports. Here are some of the highlights:

As reported in the July newsletter, the redesign of fine arts at Florida Gulf Coast University has resulted in significant increases in student learning and reduced cost in a growth environment. Some of the most important pedagogical techniques leading to FGCU's success included the use of low-stakes quizzing that provided immediate feedback to students and online discussions of model short essays before students started to write their own essays. Cost reductions resulted from the use of the Intelligent Essay Assessor, a software program used to automate the grading of short, focused essays; an alternative staffing model that will allow continued growth at a minimal cost; and reduced space needs because the entire course is now online. These last two factors are extremely important to this rapidly growing institution.

Portland State University's redesign of First-Year Spanish included three courses. At the end of the first full year of redesign, the cost-per-student decreased from $178 to $136 with no significant difference in learning as measured by external proficiency exams. Successful pedagogical techniques cited were 1) use of online materials to teach listening comprehension, reading comprehension, and grammar skills, and 2) proficiency training for graduate teaching assistants (GTAs). Effective cost reduction techniques included the ability to increase the number of students each graduate student worked with because of better online materials and automated grading of grammar exercises and other quizzes. The cost of assigning an additional section to a GTA was about 25% of the first section. Reduced seat-time and space savings also contributed to overall cost reduction.

Success in Tallahassee Community College's redesign of English Composition was the result of a number of pedagogical improvement techniques. Among these were a menu of common assignments requiring critical reading and the integration of reading and writing tasks, computer-based resources providing grammar skills practice, and the use of SMARTHINKING, a company that provides online tutoring when the students need it. Each of these provided faster feedback for students as well as greater consistency among the large number of sections required to teach 3,000 students each year. Accompanying the successful pedagogical changes were several effective cost reduction techniques: increased student success due to the effective online resources; increased use of adjuncts made possible by greater consistency of course materials and practices; and the use of SMARTHINKING, which reduced the faculty time needed to review essay drafts and led to better quality papers. These factors combined to increase the rate of C grades or better from 60.7% in the traditional format to 68.7% in the redesign.

The University of New Mexico's redesign of general psychology was extremely successful. Student learning increased significantly. In fall 2002, 77% of redesign students received a grade of C or better versus 59% in the traditional course. There were also more A's (35% in the redesign versus 18% in the traditional format). The cost of the course decreased slightly more than anticipated. The original plan called for five TAs to be part of the redesign; however, when the redesign was implemented, only four TAs were needed. The most important factor contributing to the 42% cost reduction was consolidating two sections into one. Two additional success factors were using publisher materials to reduce preparation and ongoing development costs and sharing the resources developed for the redesigned course with five smaller evening and weekend sections taught by others.

Using low-stakes quizzing with immediate feedback and accommodating different learning styles are two of the most important pedagogical techniques in the redesign of World Literature at the University of Southern Mississippi. Frequent self-assessments increased students' mastery of content and allowed faculty to adjust their teaching based on the concepts students missed on the quizzes. The redesign has increased course consistency by combining all students in one section taught by a single team with shared responsibilities using a single syllabus. At the same time, the wide array of learning options has provided greater flexibility for students. Students can attend live presentations or watch them online or both, take quizzes until they demonstrate mastery, view an array of complementary media linked to course content, explore recommended Web sites to deepen their understanding of literature, or consult with graduate assistants regarding their essays. Important cost reduction techniques included consolidating sections and leveraging the technology to provide a wide array of learning opportunities.

Final Round III reports will be posted on the Center's Web site in late fall. To learn more about these and other Round III projects, see Project Descriptions Sorted by Grant Rounds.

3. CUTTING ACROSS

Highlighting themes and activities that cut across redesign projects.

Deeper Learning at Reduced Cost

As we have reported previously, 23 of the 30 institutions in the Program in Course Redesign have thus far demonstrated greater student learning when compared to that achieved in traditional course formats, with the other seven showing learning equal to traditional formats. These findings have been documented as part of careful assessments using such methods as comparisons of exam questions used in both traditional and redesigned sections, comparisons of grades received using common rubrics, and so on. In these cases, the comparisons involve judging whether students have mastered similar content.

Three of these 23 projects—Carnegie Mellon University (CMU), Fairfield University and the University of Massachusetts (UMass)--have demonstrated student learning of much greater depth than had been achieved in traditional formats, learning that went well beyond what faculty had expected would be possible.

Carnegie Mellon University redesigned the laboratory portion of its introductory statistics course while leaving the lecture portion intact, targeting labs that teach statistical concepts known to be difficult for introductory students. The redesign uses SmartLab, an automated, intelligent tutoring system that monitors students' work as they go through lab exercises. SmartLab provides them with feedback when they pursue an unproductive path and closely tracks and assesses individual students' acquisition of skills in statistical inference—in effect, providing a personal tutor for each student.

CMU faculty found that students in the redesigned course showed a notable improvement in the percentage of exam questions answered correctly. Some of the questions students mastered asked them to choose an appropriate statistical test when the correct answer was either chi-square or t-test. Prior to the redesign, these exam questions were never posed to students because they were deemed too difficult. CMU students not only mastered the content covered in the traditional course more easily and effectively but also mastered content that had never been included in the course. Thus, faculty found that the statistical literacy of the students who completed the course increased significantly.

Fairfield University's goal in redesigning its introductory biology sequence was to change the focus of activity from memorization to a student-centered, inquiry-based pedagogy. The redesigned course emphasizes the application of scientific methodology and critical thinking skills and incorporates modern instrumentation, software modules and online databases in order to increase student comprehension and retention of material. The redesign condensed all sections into a single large-classroom format in which students work in teams of two to three around individual laptop computers, utilizing software modules that focus on inquiry-based instruction and independent investigations. Labs were likewise redesigned around online laboratory modules, and students use iBooks in labs to conduct independent investigations with modern software packages and Web-based exercises, creating a dynamic, inquiry-based environment.

Students in the redesigned course performed significantly better than traditional students on common benchmark exam questions. Other questions on exams in the redesigned course were crafted to test higher-order thinking, allowing students to synthesize material from the basic concepts. Students succeeded exceptionally well on these questions, questions that had not been asked in the traditional format. In addition, specific exam questions incorporated in the second-year genetics course (required of all biology majors) were used to measure the retention of key concepts and to compare the performance of traditional students and redesign students. Students from the redesigned course performed significantly better on this set of questions than did students from the traditional course (88% correct vs. 79% correct, respectively). Thus, students in the redesigned model not only acquired higher order thinking skills in the introductory biology course but also retained them in subsequent courses.

The goal of the University of Massachusetts' redesign of introductory biology was to create an active learning environment within a large lecture hall setting supplemented by a variety of out-of-class activities that ensured that students were prepared when they came to class. Before class, UMass students review learning objectives, key concepts, and supplemental materials posted on the class Web site. To assess their preparation for class, students then complete online quizzes, which provide immediate feedback to students and data for instructors to assess students' knowledge levels. Instructors are now able to reduce the amount of class time spent on topics that the students clearly understand, increase time spent on problem areas, and target individual students for remedial help.

Using an in-depth analysis of the exam questions used in the traditional and the redesigned course, the faculty at UMass found a profound shift in the focus and difficulty of the questions. In the traditional course, the vast majority of questions were designed to test recall of factual material or definitions of terms, and only a minority of questions (21%) required reasoning or problem solving skills. In the redesigned course, 67% of the questions required problem-solving skills. These results are consistent with the nature and intent of the redesign. Online resources were created to help students learn basic content and class time was modified to allow students to practice problem-solving and receive real-time feedback on specific challenges they experienced. Students not only learned considerable detailed content but also learned to apply that content to solve rich problems addressing a wide range of biological questions.

In each of these three redesign projects, the faculty measured both the knowledge expected of students in the traditional course as well as new knowledge that had not been required of earlier students. Students not only mastered the concepts that had been identified as important but also demonstrated higher-level learning that prepared them more effectively for future courses. These results support the conclusion that information technology, used thoughtfully in combination with proven pedagogy, can make a profound difference in student learning.

Details about the findings from each of these projects can be found by following the links at Program in Course Redesign.

4. COMMON GROUND

Reporting on initiatives that share the Center's goals and objectives.

Learning Options at Glendale Community College

At Glendale Community College in Phoenix, AZ, René Díaz-Lefebvre is working with his colleagues to create as many learning options as possible to accommodate individual differences among students. The Multiple Intelligences/Learning for Understanding (MI/LfU) initiative began at Glendale in 1994 as an experimental pilot study in the psychology department. Since then, this initiative has evolved into an effective, interdisciplinary approach to learning, teaching, and creative assessment, involving departments in math, art, biology, Spanish, psychology, chemistry, nursing, music, child and family studies, English and anthropology. Based on Howard Gardner's Theory of Multiple Intelligences, the Glendale College effort takes human differences seriously, helping students demonstrate their understanding of academic material through a performance of understanding. More than 2,400 students have completed courses offered where MI/LfU Learning Options are available. Among the outcomes the Glendale team has observed are a significant increase in student demand to study in classes that accommodate multiple intelligences and increased numbers of faculty who understand and appreciate teaching in a student-flexible environment. To learn more about the Glendale program, contact René Díaz-Lefebvre at r.diaz@gcmail.maricopa.edu.

A New Newsletter from The National Resource Center for the First-Year Experience

Increasing student success during the freshman year has been the goal of The National Resource Center for The First-Year Experience and Students in Transition program since its inception. For more than 20 years, this program has provided resources on assessment and innovative curricular reform initiatives, including supplemental instruction, service learning and learning communities. Recently, the National Resource Center has launched a new newsletter designed to address these issues in a bi-monthly, online format. E-SOURCE for College Transitions will include short articles and news briefs on such topics as teaching strategies in the first college year; strategies for addressing sophomore, senior, and transfer transitions; proven institutional initiatives that assess student learning; and programs addressing the needs of special student populations. For more information see http://www.sc.edu/fye or contact Tracy Skipper at tlskipper@sc.edu.

Frank Newman Goes to Congress (and Takes the Program in Course Redesign Along)

On May 13, 2003, Dr. Frank Newman, Director of the Futures Project at Brown University, testified before the House Committee on Education and the Workforce. The hearing was entitled, "The State of American Higher Education: What Are Parents, Students, and Taxpayers Getting for Their Money." Dr. Newman's focus throughout his remarks was on the key role that higher education plays in preparing students for a life of workforce and civic participation. He effectively described higher education's successes but cautioned that society's concerns about uneven learning achievement and the need for greater efficiency require careful and innovative solutions. He offered examples of institutions that focus on assessing student learning and highlighted the successes achieved by the Program in Course Redesign in reducing costs while improving learning. To read Dr. Newman's entire testimony, go to http://edworkforce.house.gov/hearings/108th/fc/hea51303/newman.htm.

5. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

The Center for Academic Transformation serves as a source of expertise and support for those in higher education who wish to take advantage of the capabilities of information technology to transform their academic practices.

  • To subscribe to The Learning MarketSpace, click here.
  • To submit items for inclusion in this newsletter, please contact Carolyn G. Jarmon.
  • This newsletter is a merger of The Learning MarketSpace and The Pew Learning and Technology Program Newsletter.
  • Archives of The Learning MarketSpace, written by Bob Heterick and Carol Twigg and published from July 1999 – February 2003, are available here.
  • Archives of The Pew Learning and Technology Program Newsletter, published from 1999 – 2002, are available here.
  • You are welcome to re-post The Learning MarketSpace without charge. Material contained in The Learning MarketSpace may be reprinted with attribution for non-commercial purposes.

Copyright 2003, The Center for Academic Transformation