
The Learning MarketSpace, April 2004

A quarterly electronic newsletter of the Center for Academic Transformation highlighting ongoing examples of redesigned learning environments using technology and examining issues related to their development and implementation.

TABLE OF CONTENTS

1. THE CAT VIEWPOINT

  • How Essential is Assessment?

2. THE ROADMAP TO REDESIGN (R2R)

  • R2R Selects Forty Prospective New Associates
  • New R2R Planning Resources Available
  • Blackboard and WebCT Participate in R2R

3. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

  • Projects Categorized by Models, Disciplines, and Degree of Success
  • Systemwide Redesign Initiatives Underway
  • Campuses Focus on Redesign
  • Round II Analysis Available on Web Site
  • Round III Final Reports Available on Web Site

4. CUTTING ACROSS

  • Automated Grading: Timely Feedback for Students at Reduced Cost

5. COMMON GROUND

  • Policy Issues for Secondary and Post Secondary Sectors
  • Lumina Foundation Announces New Grants
  • Publishers To Offer New Textbook Options

6. CALENDAR OF EVENTS

7. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

1. THE CAT VIEWPOINT

Perspectives on issues and developments at the nexus of higher education and information technology.

How Essential Is Assessment?

I must confess. I have never been much of an assessment advocate. It's not that I was against it. Like most in higher education, I thought that assessment fell under the category of "nice to do."

Years ago I remember an occasion when a faculty colleague of mine was asked by an accreditation team to define college-level learning. "College-level learning is what college faculty say is college-level learning," he responded. At the time, that seemed to me like the right response. I, like most in higher education, believed that individual faculty members should pretty much determine whether students did or did not learn what they were trying to teach. I also thought that writing down those expectations in the form of learning outcomes was, more or less, an exercise—and a pretty time-consuming exercise at that.

When we started the Program in Course Redesign, however, we knew that we would have to build in a strong assessment component so that our claims for improving student learning while reducing instructional costs would hold up to intense scrutiny. We knew that most faculty and administrators in higher education are primarily interested in quality improvements and they would want proof that quality did, in fact, improve as a result of redesign.

Consequently, a required part of the grant application process was to create a detailed assessment plan, and a required part of the grant implementation process was to carry out that plan and collect comparative learning outcome data that would be posted on the Center's web site. We were fortunate to enlist the help of Peter Ewell, who is widely regarded as one of the nation's leading experts on assessment, throughout the process. Peter had a lot of interaction with the grant applicants, the project leaders and the Center's staff. In that process, I, like all the others involved in the program, learned a lot about assessment.

I learned about the importance of reaching agreement in advance among all faculty teaching the course about what students were expected to learn and about how that learning would be measured. I learned about the importance of applying the agreed-upon measurement methods consistently.

I also learned how foreign the practice of assessment is to most faculty members and administrators. We struggled constantly with their tendency to confuse assessing student and faculty satisfaction with assessing student learning. Most of the project teams loved to survey students and faculty about how much they did or did not like the newly redesigned course formats (and we have reams of data on that subject!), but they were less eager to measure student learning thoroughly and consistently.

But the most important lesson that I learned is that a commitment to assessing student learning is fundamental to creating change in teaching and learning. Without that commitment, I do not think it is possible to make significant improvements. Here are the reasons why.

All college faculty think they do a good job. Moreover, most view themselves like the children of Lake Wobegon—all are above average in performance. Yet we all know that the quality of teaching varies widely and that large differences in student learning are the result. How can we overcome these differences when everyone thinks he or she is doing a good job? We need to measure what is going on. Without establishing baseline data about student learning, we really don't know how good a job anyone is doing. Without establishing a process of comparing the learning outcomes that result from different teaching methods, we really don't know which ones are better.

I am frequently asked, how should departments and institutions deal with faculty who resist trying new ways of teaching and learning or who oppose what may appear to be "unorthodox" teaching styles? Before the Program in Course Redesign, I used to talk about encouragement, carrots and sticks, cajoling, and so on. Now I answer: measure the results. I am convinced that the only way to deal with disputes about the "right" way to teach is to insist on measuring student learning outcomes and comparing the results. If the old ways are better, the results will show it. If the new ways are better, a framework for change has been created.

Our goal should be to establish an institutional assessment culture if we want to make real improvements in teaching and learning. Where should institutions begin if they want to use assessment of student learning as a way to create change? One of the roadblocks to establishing an institutional assessment culture, I believe, is that many academics think that assessment has to be complicated. By demonstrating that assessment does not have to be complicated and that a number of relatively simple techniques can be embedded in everyday academic practice, perhaps we can move closer to that goal.

In the Roadmap to Redesign, one of our objectives is to provide new institutions with a number of planning resources that will support a streamlined redesign process based on the experience gained in the Program in Course Redesign. In that program, we asked each institution to invent both its own redesign and its own assessment plan. As a result of that work, we learned a lot about what works well and what does not. We can now summarize the most effective and efficient ways to assess whether or not improved student learning has been achieved as a result of course redesign, and we can offer a menu of assessment options for selection as part of a two-step process rather than asking each institution to re-invent the wheel.

The first step is to establish the method of comparison. During a pilot phase, this comparison can be accomplished in one of two ways: 1) run parallel sections of the course in traditional and redesigned formats (ideally randomly assigning students to each format) and look at whether there are any differences in outcomes—a classic "quasi-experiment," or 2) establish baseline information about student learning outcomes from an offering of the traditional format "before" the redesign begins and compare the outcomes achieved in a subsequent ("after") offering of the course in its redesigned format.

During a full implementation phase, there will not be an opportunity to run parallel sections. The choices for comparison are to use baseline data from an offering of the traditional format "before" the redesign began or from the parallel sections of the course offered in the traditional format during the pilot phase. The key to validity in all cases is to use the same measures and procedures to collect data in both kinds of sections and to ensure as fully as possible that any differences in the student populations taking each section are minimized (or at least documented so that they can be taken into account.)
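To make the first step concrete, here is a minimal sketch in Python of the parallel-sections comparison, assuming final exam scores for each format are already in hand. The scores, and the choice of a Welch two-sample t-test (with scipy assumed to be available), are illustrative assumptions, not a prescription from the program; a before/after comparison would use the same logic with baseline scores in place of the traditional section's.

    # Minimal sketch: compare final exam scores from a traditional section
    # and a redesigned section. All scores below are hypothetical.
    from statistics import mean
    from scipy.stats import ttest_ind

    traditional = [68, 72, 75, 61, 80, 70, 66, 74, 69, 77]
    redesigned = [74, 79, 71, 83, 77, 85, 72, 80, 76, 81]

    # Welch's t-test does not assume equal variances across sections.
    t_stat, p_value = ttest_ind(redesigned, traditional, equal_var=False)

    print(f"Traditional mean: {mean(traditional):.1f}")
    print(f"Redesigned mean:  {mean(redesigned):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")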

The second step is to choose the measurement method. The degree to which students have actually mastered course content appropriately is, of course, the bottom line. Therefore, some kind of credible assessment of student learning is critical to the redesign project. Five measures that may be used are described below.

1) Comparisons of Common Final Exams

Common final examinations can be used to compare student learning outcomes across traditional and redesigned sections. This approach may include sub-scores or similar indicators of performance in particular content areas as well as simply an overall final score or grade. (Note: If a grade is used, there must be assurance that the basis on which it was awarded is the same under both conditions—e.g., not "curved" or otherwise adjusted.) These common examinations may be internal (designed by faculty) or external (available from outside sources such as the ACS Blended Exam in Chemistry, a well-accepted instrument that also allows inter-institutional comparisons according to common standards.)

2) Comparisons of Common Content Items Selected from Exams

If a common exam cannot be given—or is deemed inappropriate—an equally good approach is to embed some common questions or items in the examinations or assignments administered in the redesigned and traditional delivery formats. This design allows common baselines to be established but still leaves room for individual faculty members to structure the balance of the exam in their own ways where appropriate. For multiple-choice examinations, a minimum of twenty such common questions should be included; for other kinds of questions, at least one common essay or two or three common problems should be used.
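As an illustration of the embedded-items approach, the hypothetical sketch below tallies the proportion of students answering each common item correctly in each format. The response data are invented, and a five-item slice stands in for the twenty-item minimum for readability.

    # Sketch: compare performance on common embedded items across formats.
    # Each response matrix is students x items (1 = correct, 0 = incorrect).

    def item_difficulty(responses):
        """Proportion of students answering each common item correctly."""
        n_students = len(responses)
        n_items = len(responses[0])
        return [sum(row[i] for row in responses) / n_students for i in range(n_items)]

    traditional = [[1, 0, 1, 1, 0], [0, 1, 1, 0, 0], [1, 1, 0, 1, 1], [1, 0, 1, 0, 1]]
    redesigned = [[1, 1, 1, 1, 0], [1, 1, 1, 0, 1], [1, 1, 0, 1, 1], [1, 0, 1, 1, 1]]

    for i, (t, r) in enumerate(zip(item_difficulty(traditional), item_difficulty(redesigned)), 1):
        print(f"Item {i}: traditional {t:.0%}, redesigned {r:.0%}")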

3) Comparisons of Pre- and Post-tests

A third approach is to administer pre- and post-tests to assess student learning gains within the course in both the traditional and redesigned sections and to compare the results. By using this method, both post-test results and "value-added" can be compared across sections. Pre- and post-tests can be constructed to cover a range of behaviors from recall of knowledge to higher-order thinking skills.
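One common way to operationalize "value-added" is a gain score. The sketch below computes both the mean raw gain (post minus pre) and a normalized gain (gain divided by the room each student had left to improve) for hypothetical sections; the normalized-gain measure is our illustrative choice, not a requirement of the approach described above.

    # Sketch: pre/post comparison with raw and normalized gain per section.
    def gains(pre, post, max_score=100):
        raw = [b - a for a, b in zip(pre, post)]
        # Normalized gain rescales by how much room each student had to improve.
        norm = [(b - a) / (max_score - a) for a, b in zip(pre, post) if a < max_score]
        return sum(raw) / len(raw), sum(norm) / len(norm)

    # Hypothetical scores for one traditional and one redesigned section.
    trad_raw, trad_norm = gains(pre=[40, 55, 48, 62], post=[58, 70, 66, 75])
    re_raw, re_norm = gains(pre=[42, 50, 47, 60], post=[68, 77, 72, 85])

    print(f"Traditional: raw gain {trad_raw:.1f}, normalized gain {trad_norm:.2f}")
    print(f"Redesigned:  raw gain {re_raw:.1f}, normalized gain {re_norm:.2f}")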

4) Comparisons of Student Work Using Common Rubrics

Naturally occurring samples of student work (e.g., essays, papers, lab assignments, problems, etc.) can be collected and their outcomes compared—a valid and useful approach if the assignments producing the work to be examined really are quite similar. Faculty must have agreed in advance on how student performance is to be judged and on the standards for scoring or grading (a clear set of criteria or rubrics for grading assignments.) Faculty members should practice applying these criteria in advance of the actual scoring process to familiarize themselves with them and to align their standards. Ideally, some form of assessment of inter-rater agreement should be undertaken, as in the sketch below.
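For the inter-rater check, a chance-corrected statistic such as Cohen's kappa is one reasonable option. The sketch below compares raw agreement with kappa for two hypothetical raters scoring the same ten papers against a common rubric; scikit-learn is assumed to be available.

    # Sketch: inter-rater agreement on a common sample before full scoring.
    from sklearn.metrics import cohen_kappa_score

    rater_a = ["A", "B", "B", "C", "A", "B", "C", "C", "B", "A"]
    rater_b = ["A", "B", "C", "C", "A", "B", "C", "B", "B", "A"]

    # Raw agreement: share of papers where the two raters assign the same grade.
    agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    # Kappa corrects that figure for agreement expected by chance alone.
    kappa = cohen_kappa_score(rater_a, rater_b)

    print(f"Raw agreement: {agreement:.0%}")
    print(f"Cohen's kappa: {kappa:.2f}")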

5) Comparisons of Course Grades Using Common Criteria

Course grades may be used as the measure of learning if—and only if—grades are assigned on the basis of comparable performances on common instruments using common grading standards. Faculty must have agreed in advance on standards for scoring or grading. Specific descriptions within each area of learning can be developed to distinguish between grades of A, B, C, D, and F, and faculty members can be trained in the interpretation of the criteria.

In addition to choosing one of these five measures, institutions may want to conduct other comparisons between the traditional and redesigned formats such as performance in follow-on courses, attitude toward subject matter, deep vs. superficial learning, increases in number of majors in the discipline, student interest in pursuing further coursework in the discipline, differences in performance among student sub-populations, and student satisfaction measures.

An institution may choose to compare learning outcomes among traditional sections to identify teaching and learning problems that need attention or to compare learning outcomes between traditional and redesigned formats to make better choices about how courses are structured. We need to create an assessment culture if we want to foster institutional change and improve student learning. How essential is assessment? It is indispensable.

For an expanded discussion of the assessment models that will be used in R2R, see Project Descriptions Sorted by Model.

--Carol A. Twigg

2. THE ROADMAP TO REDESIGN (R2R)

Featuring progress reports and outcomes achieved by the Roadmap to Redesign.

R2R Selects Forty Prospective New Associates

Forty applicants have been selected to participate in the Roadmap to Redesign as prospective new practice members. These applicants will participate in a workshop on June 2, 2004 in Baltimore, Maryland. At the workshop, three-person redesign teams from each institution will learn how to use the program's streamlined tools and techniques in preparation for developing their final redesign proposals. The teams will also meet with faculty from the core practice institutions in each of the four academic areas (precalculus mathematics, psychology, Spanish, and statistics) to discuss how and why those faculty used specific interactive course materials and to gain advice about how to deal with a variety of planning issues as they implement their redesigns.

The new participating institutions are Buffalo State College, Calhoun Community College, Carlos Albizu University, Central Michigan University, Chattanooga State Technical Community College, College of DuPage, Concordia University, East Carolina University, Eastern Washington University, Georgia State University, Louisiana State University, Macomb Community College, Mohave Community College, Montclair State University, Ocean County College, Seton Hall University, SUNY College at Cortland, Texas Tech, Towson University, The University of Alabama, University of Arkansas – Fort Smith, University of California, Los Angeles, University of Missouri – St. Louis, The University of North Carolina at Chapel Hill, The University of North Carolina at Greensboro, University of South Alabama, The University of Southern Mississippi, University of Wisconsin Colleges, and Wayne State University.

Several of the applicants are proposing to redesign courses in more than one academic discipline. Not surprisingly, more proposals were received for redesigns of precalculus mathematics and psychology than for Spanish and statistics, since the former enroll far more students nationally than the latter. The next step in the application process will be to submit a full redesign proposal to the Center by August 1, 2004. To learn more about the R2R program, visit The Roadmap to Redesign.

New R2R Planning Resources Available

As part of the R2R program, the Center for Academic Transformation has developed a series of online resources to streamline the process of course redesign and reduce the amount of time and effort needed in planning. These resources include "Five Principles of Successful Course Redesign" (a summary of the redesign techniques that are essential to improving student learning while reducing instructional costs); "Five Models for Course Redesign" (a summary of the characteristics of the five course redesign models that emerged from the Program in Course Redesign); "Five Critical Implementation Issues" (a summary of the most common implementation issues encountered by the projects in the Program in Course Redesign); "Cost Reduction Strategies" (a summary of the most effective strategies that can reduce instructional costs); and "Five Models for Assessing Student Learning" (a summary of the most effective and efficient ways to assess student learning.) These resources will be used by the institutions participating in the R2R program as they develop their redesign plans, and they also can be used by others in the higher education community as they engage in course redesign. To explore these resources, see Planning Resources.

Blackboard and WebCT Participate in R2R

A key component of the R2R program is the establishment of virtual repositories of research-based interactive learning materials. Proven to increase student learning or contribute to cost reduction, these materials can be used by other colleges and universities to redesign courses. Some of these materials were developed at the core practice institutions for use in Blackboard, others for use in WebCT, and still others for use in homegrown course management systems. Both Blackboard and WebCT are helping the Center convert all locally developed materials so that they can be used in either of these two predominant course management systems. The contributions of Blackboard and WebCT will be of great benefit to the new practice institutions, since the availability of proven learning materials is a key component of a successful, cost-effective learning experience for students.

3. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

Featuring progress reports and outcomes achieved by the Program in Course Redesign.

Projects Categorized by Models, Disciplines, and Degree of Success

The Program in Course Redesign has produced a great deal of material about the thirty participating institutions. To make it easier to find projects of particular interest, the Center has established a series of new pages on our web site. Faculty members who would like to explore redesign projects in a particular academic discipline can easily identify them. Institutions interested in gaining a better understanding of the five redesign models that emerged can look at all of the projects that used a particular model. Perhaps of most interest is our sort of the projects by level of success. While all projects achieved some level of success, some were more successful than others in improving learning and reducing costs. Our new sort makes it easy to identify the projects that achieved the most progress toward these two goals. Each sort includes links to one-page project abstracts, full academic plans, full cost savings plans, ongoing progress reports, final outcome reports and contact information for each of the 30 institutions. To explore these new pages, visit Program in Course Redesign.

Systemwide Redesign Initiatives Underway

A systemwide initiative to redesign large-enrollment courses for greater student learning and lower cost is underway in Oklahoma, sponsored by The Oklahoma State Regents for Higher Education. Two academic areas have been selected as a focus: mathematics and ecology. A two-day kickoff event was held on February 26–27, 2004, for more than 100 faculty and administrative participants from institutions across the state. The Center for Academic Transformation played a major role in the program. Joining Carolyn Jarmon were Malcolm Hill, project leader in biology at Fairfield University, and Kirk Trigsted, project leader in precalculus mathematics at the University of Idaho, so that participants gained both an overview of the redesign methodology and an understanding of two specific implementations.

The University of Hawaii System has joined a growing number of state systems moving to establish redesign initiatives that improve the quality of student learning in the context of limited resources. The new initiative will be launched in May 2004 with a visit from Carol Twigg. Carol will meet with faculty members and administrators from the ten institutions that make up the system as a kickoff to the planning phase. Future steps include a systemwide Request for Proposals, a series of redesign workshops, and follow-up assistance from the Center to selected redesign teams. The Center's goal is to teach system leaders how to conduct redesign projects so that they can manage future projects independently. For more information about the State-based Redesign Initiative, contact Carol Twigg.

Campuses Focus on Redesign

The University of Kentucky is launching a campuswide redesign project in May 2004 with a one-day workshop conducted by the Center for Academic Transformation to introduce faculty, administrators and staff to the redesign process. The workshop is being organized by the Teaching and Academic Support Center, and the University has committed to a series of sessions throughout the next academic year to build on this initial workshop. Faculty from other institutions across Kentucky have been invited to participate as well, and more than 40 faculty members are expected at this kickoff event. This workshop immediately follows a statewide faculty development conference, sponsored by Kentucky's Council on Postsecondary Education (CPE), that has three themes: "Course and Program Redesign and Collaboration," "Improving Teaching and Learning with Technology," and "Improving Student Retention"—all of which resonate strongly with the Center's goals. For more information about the Center's redesign consulting services, see NCAT Staff and Board of Directors.

Round II Analysis Available on Web Site

An overview of the Round II Program in Course Redesign projects and an analysis of what was learned from the perspective of the program staff are now available at Program in Course Redesign: Round 2. The analysis compares the pedagogical and cost reduction techniques used by the Round II projects with those used by the Round I projects, as well as the implementation issues encountered by both groups. Final reports for each of the ten Round II projects can be found by following the links at Project Descriptions Sorted by Grant Rounds.

Round III Final Reports Available on Web Site

Final reports for nine of the ten Round III projects have been posted on the Center's Web site and can be found by following the links at Project Descriptions Sorted by Grant Rounds. Highlights include:

Eight of the ten institutions demonstrated increased learning; two showed learning equivalent to the traditional format.

All ten institutions reduced costs. Round III projects planned to reduce costs by about 44 percent on average, with a range of 20 percent to 84 percent, for a projected total annual savings of $1,195,028. They actually reduced costs by 39 percent on average, with a range of 15 percent to 56 percent, for a total annual savings of $999,214. A detailed summary comparing the projected savings to the actual savings achieved by the projects is available at Projected Savings.

All five models of redesign are represented among the ten institutions. No two model implementations are identical since institutions customized their selected model to meet the needs of their particular students and the capabilities of their faculty while working within common redesign principles. Some of the models chosen may seem counter-intuitive. For example, humanities courses at the University of Southern Mississippi and Florida Gulf Coast University were offered totally online to on-campus students.

Tallahassee Community College (TCC) deserves special mention. Because of the predominance of four-year institutions participating in the Program in Course Redesign (27 of 30), some have wondered whether the redesign methodology applies to community colleges. Although it is difficult to identify the "best" project among the 30, it is worth noting that TCC produced the highest annual dollar savings among all 30 institutions, exceeding $320,000 annually, while making significant improvements in student learning and increasing course completion rates from 54 percent to 75 percent. Way to go, TCC!

An overview of the Round III findings and an analysis of what was learned from the perspective of the program staff will be posted on the Center's web site in early summer. To view the final reports for each project, begin at Project Descriptions Sorted by Grant Rounds and follow the individual links.

4. CUTTING ACROSS

Highlighting themes and activities that cut across redesign projects.

Automated Grading: Timely Feedback for Students at Reduced Cost

One of the cross-cutting principles of effective course redesign is to shift the traditional assessment approach used in large introductory courses (which typically employ only midterm and final examinations) toward frequent assessment and prompt feedback to students about how well they are (or are not) mastering course content. Rather than relying on individual faculty members in small sections to provide feedback—a technique known to increase faculty workload significantly—successful redesign models use computer-based feedback strategies. In some cases, a large bank of problems for each course topic is built into instructional software, and assignments are graded on the spot. In other cases, publisher-provided test banks accompany textbooks, enabling faculty to create low-stakes mastery quizzes. When the feedback process is automated, every problem or question is graded, and students receive specific information about their performance. This, in turn, leads to more efficient and focused time on task and higher levels of learning, since students can concentrate on the areas where their understanding is weakest. Building in ongoing assessment and automated feedback also lets faculty know how well students are (or are not) doing and enables them to take timely corrective action.

At the same time, automated assessment and feedback allow these benefits to scale to large numbers of students. When instructors and/or teaching assistants are responsible for grading, they typically must make compromises such as spot-grading or returning only composite scores to students. By replacing hand grading with automated grading, it is possible to reduce the cost of providing feedback because fewer human graders are needed.
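As a concrete illustration of this kind of feedback loop, here is a minimal sketch of automated grading with per-item hints and a highest-score-counts mastery policy. The item bank, hints, and gradebook are hypothetical stand-ins, not any project's actual software.

    # Sketch: auto-grade a quiz attempt, give per-item feedback, keep best score.
    ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
    HINTS = {"q1": "Review section 2.1", "q2": "Review section 2.3",
             "q3": "Review section 3.2", "q4": "Review section 3.4"}

    gradebook = {}  # student -> highest score so far

    def grade_attempt(student, responses):
        """Grade one attempt, point to review material for each missed item."""
        missed = [q for q, ans in ANSWER_KEY.items() if responses.get(q) != ans]
        score = 100 * (len(ANSWER_KEY) - len(missed)) / len(ANSWER_KEY)
        for q in missed:
            print(f"{student}: missed {q} -- {HINTS[q]}")
        gradebook[student] = max(gradebook.get(student, 0), score)  # only the highest counts
        return score

    grade_attempt("pat", {"q1": "b", "q2": "a", "q3": "a", "q4": "c"})  # 75.0, hint on q2
    grade_attempt("pat", {"q1": "b", "q2": "d", "q3": "a", "q4": "c"})  # 100.0
    print(gradebook)  # {'pat': 100.0}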

At The University of New Mexico, students in general psychology earned credit for completing mastery quizzes. Students were encouraged to take quizzes as many times as needed until they attained a perfect score; only the highest scores counted. The more time students spent taking quizzes and the higher their scores, the better they performed on in-class exams. A similar approach was used in the redesign of psychology at the University of Southern Maine, where students who missed quiz items were directed to online resources for review of the concepts in question before taking the quiz again.

At the University of Iowa, an analysis of the introductory chemistry course structure showed that more than 16,000 homework problems were assigned every term. Four TAs spent all of their assigned time grading these problems. In the redesign, Iowa replaced hand grading with Chem SkillBuilder, an online homework software package that provides immediate, automated evaluation of all assigned items. Now the four TAs are available for other assignments, and students receive feedback on every problem they complete.

The design team at Carnegie Mellon University incorporated a "smart" feedback system called StatTutor, an automated, intelligent tutoring system that monitors students' work as they progress through lab exercises. StatTutor provides feedback when students pursue an unproductive path and closely tracks and assesses each student's acquisition of skills in statistical inference—in effect, providing a personal tutor for each student. After using StatTutor, students were able to achieve a level of statistical literacy not deemed possible in the course before its redesign.

Florida Gulf Coast University's redesign used a software program called the Intelligent Essay Assessor (IEA) to grade short, well-structured student essays between 100 and 500 words. The IEA, once programmed, assessed student essays based on content, grammar, mechanics, etc. This software requires careful preparation for use, but once fine-tuned, it reliably scores short paragraphs and saves faculty a lot of grading time.

Providing students with timely, automated feedback increases their motivation and gives them focused information about what they already understand and what they still need to master. In addition, by assessing and aggregating what students do and do not understand, both individually and collectively, faculty are able to spend class time on what students do not know rather than wasting time on what they already understand, a great improvement over the one-size-fits-all lecture method.

Details about the findings from each of these projects can be found by following the links at Program in Course Redesign.

5. COMMON GROUND

Reporting on initiatives that share the Center's goals and objectives.

Policy Issues for Secondary and Post Secondary Sectors

In the February 13, 2004 edition of the Chronicle of Higher Education, Kristen Conklin of the National Governors Association and Travis Reindl of the American Association of State Colleges and Universities make the case that, in an environment of greater demand for higher education and shrinking state appropriations, higher education must do business differently. Their cogent analysis of the history and causes of the current situation is followed by concrete ideas about what to do, including more strategic state funding of higher education with a clear focus on state and regional priorities. At the same time, institutions must establish priorities for programs and focus on cost-effectiveness and student success. Because leadership in higher education will inevitably change, the authors also cite the need for a forum where representatives of states, institutions and businesses can continue to examine these issues. Subscribers can read the full article at http://chronicle.com/prm/weekly/v50/i23/23b02001.htm.

Lumina Foundation Announces New Grants

With the goal of increasing college access and student success, the Lumina Foundation for Education recently announced eleven grants totaling $1.25 million. Among these, seven grants focus on the problem of access, including strategies to overcome financial and non-financial barriers; two address the stumbling blocks that keep students from reaching their educational goals once they are in college; and one expands a National Governors Association program to encourage state policies that expand community college access and attainment for low-income adults. To learn more about these Lumina grants, see http://www.luminafoundation.org/newsroom/news_releases/042104.html.

Publishers To Offer New Textbook Options

Recognizing the value of online resources and the rising cost of textbooks, Thomson Higher Education and Pearson Education recently announced new product lines that will be available to students starting in the summer of 2004. The Advantage Series at Thomson Higher Education will coordinate print-based books with online resources and cost about 25 percent less. Thomson is also offering Digital Discounts, a limited-time offer on digital versions of selected texts. In a joint endeavor with O'Reilly Media, Inc., Pearson Education's SafariX Textbooks Online, a comprehensive line of digital texts, will cost 50 percent less than the hardcover texts for the same courses. To learn more about the new textbook options, see the Thomson Press Room and Pearson Education.

6. CALENDAR OF EVENTS

JUNE

  • R2R Workshop for three-person institutional teams in Baltimore, MD

JULY

  • Publication of The Learning MarketSpace

AUGUST

  • Full redesign proposal deadline for R2R finalists
    August 1
  • 20 new institutions selected to participate in R2R
    August 15

SEPTEMBER

  • 20 new R2R redesign projects begin

OCTOBER

  • Publication of The Learning MarketSpace

7. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

The Center for Academic Transformation serves as a source of expertise and support for those in higher education who wish to take advantage of the capabilities of information technology to transform their academic practices.

  • To subscribe to The Learning MarketSpace, click here.
  • To submit items for inclusion in this newsletter, please contact Carolyn G. Jarmon.
  • Archives of The Learning MarketSpace, written by Bob Heterick and Carol Twigg and published from July 1999 – February 2003, are available here.
  • This newsletter is a merger of The Learning MarketSpace and The Pew Learning and Technology Program Newsletter.
  • Archives of The Pew Learning and Technology Program Newsletter, published from 1999 – 2002, are available here.
  • You are welcome to re-post The Learning MarketSpace without charge. Material contained in The Learning MarketSpace may be reprinted with attribution for non-commercial purposes.

Copyright 2004, The Center for Academic Transformation