The Learning MarketSpace, July 2004

A quarterly electronic newsletter of the Center for Academic Transformation highlighting ongoing examples of redesigned learning environments using technology and examining issues related to their development and implementation.

TABLE OF CONTENTS

1. THE CAT VIEWPOINT

  • A Little Knowledge is A Dangerous Thing

2. WHAT'S NEW

  • The University of Hawaii Launches Strategic Initiative
  • DOE Digital Summit: Translating Research Into Practice
  • The National Center for Academic Transformation Formed
  • Jack Wilson Appointed President of the University of Massachusetts System
  • Twigg Recognized by NUTN

3. THE ROADMAP TO REDESIGN (R2R)

  • R2R Practice Applicants Participate in Workshop

4. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

  • Two New Articles Focus on Urban and At-Risk Students
  • External Evaluation of Pew Grant Process Completed

5. CUTTING ACROSS

  • Virtual Emporium: Meeting Individual Student Needs

6. COMMON GROUND

  • Florida State Models New Initiative on Program in Course Redesign
  • Online Chemistry Preparation at East Carolina University

7. CALENDAR OF EVENTS

8. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

1. THE CAT VIEWPOINT

Perspectives on issues and developments at the nexus of higher education and information technology.

A Little Knowledge is a Dangerous Thing

Bill Massy and Bob Zemsky have recently published what they call "a major new study" entitled, Thwarted Innovation: What Happened to e-learning and Why. It is based primarily on a series of surveys and interviews with "the people responsible for e-learning" at a "wide sample" of colleges and universities (specifically six): Foothill College, Hamilton College, Michigan State University, Northwest Missouri State University, the University of Pennsylvania, and the University of Texas at Austin. The authors claim to answer the question, "Why did the boom in e-learning go bust?" by supplying a "strategic story that explains what happened to e-learning and why."

I must admit this phraseology caught my attention. I'm not used to hearing someone talk about "e-learning" in the past tense. This rhetorical device is repeated throughout the document: "Did e-learning simply flame out upon takeoff?" "Perhaps the most productive way to decipher what happened to e-learning . . . is to examine the three basic assumptions that defined its promise." It's almost as if Bill and Bob are historians writing about how the promise of the American West, after several hundred years, turned out to consist largely of LA freeways and Las Vegas casinos.

Nevertheless, I thought, this must be a real study. Although no one in higher education would consider Bob and Bill to be experts on "e-learning," many (including me) consider them to be both distinguished researchers and leading thinkers on higher education in general. So, I assumed, their "strategic story" would be built on research.

The first statement I encountered is one that I whole-heartedly agree with: "If you want to know where e-learning is heading, watching the leading edge proves to be a useful strategy." This is good, I thought. Bill and Bob emphasize this point throughout the report, noting that "we are reporting the experiences and opinions of e-learning's early adopters--that is, the innovation's leading edge rather than its lagging center."

Now, whenever anyone asks me which institutions are on the "leading edge," the first name that pops into my head is Northwest Missouri State, immediately followed by Hamilton College. Give me a break!

Despite their assertions that their sample reflects a desire "to have as broad a mix of institutions as possible" and is "representative of higher education," it's curious that four of the six serve the "Name Brand/Medallion segment of the market." Two-thirds of American colleges and universities are not "medallion" institutions. In addition, doctoral research universities comprise one-half of their sample but only 6.6 percent of U.S. higher education. I'll leave it to those more expert in research methods and statistics to determine whether one can generalize to all of higher education the views of 77 faculty and 78 staff at six institutions.

What about their methodology for assessing "e-learning usage"? "Given the absence of standard institutional data reflecting e-learning usage," they assert that their "weatherstations" methodology is superior to anything in the field. Specifically, they claim that the dominant e-learning measurement strategy is the "one-time survey asking university administrators and the heads of corporate training departments about their current use of e-learning," citing the Sloan-sponsored study, Sizing the Opportunity, as the exemplar. They criticize such surveys because they are "snapshots that report frequencies at a single point in time." Their solution is to stretch the study period to 15 months (!).

Is it possible that they have never heard of Casey Green's Campus Computing Project, which has been surveying more than 600 campuses a year for the last 15 years about the role of information technology in teaching, learning, and scholarship? Somehow data from 600 institutions seems more representative of higher education's usage than data from six; somehow tracking trends over a fifteen-year period seems more substantive than over fifteen months. I'm pretty sure that Casey hasn't concluded that the boom in e-learning has gone bust.

As an example of the differences in data analysis, here is Thwarted Innovation's finding about academe's financial state: "The tracking data suggest first and foremost that the chill of budget reductions was settling over e-learning. There was a growing perception that e-learning's priority within institutional budgets was declining." Here is Casey's on the same topic: "The 2003 survey confirms that budget cuts continue to cast a shadow over campus IT activities and investments. Fully two-fifths (41.3 percent) of the survey participants report budget cuts affecting academic computing this year, up from 32.6 percent in 2002 and just 18 percent in 2001."

But then I thought, even though six institutions have a hard time competing with 600 and a fifteen-month window comes up a little short when compared to 15 years, maybe what Bob and Bill are asking is so unique, so compelling, as to make their study worthwhile.

Bob and Bill begin by asking faculty specific questions about usage: do you use multi-media presentations, a course management tool, off-the-shelf software packages? If I were a respondent and answered any of those questions affirmatively, I guess I'd conclude that I was "doing e-learning." Then they ask questions about workload reductions and technical support for faculty "engaged in e-learning." Here I begin to get confused. Let's say I use SPSS or Excel in my course, but I don't get a workload reduction for doing so. How do I answer the workload question? Or suppose I use a textbook that offers my students and me access to ancillary web sites, but I don't get technical support to do so. Maybe I'm not "doing e-learning" after all.

The survey questions about direct usage are confusing enough because e-learning is never clearly defined, but things really get rolling when faculty are asked to predict e-learning's future. What are we talking about here: e-learning, e-learning products, e-learning software, e-learning activities, e-learning courses, e-learning initiatives? All of these phrases and more are used. Look at this question: "What is the capacity of e-learning to serve new markets?" What the heck does this mean? I think they meant to ask whether online courses and programs would attract adult students to the institution (or to higher education?) who would or could not enroll on campus. Or perhaps they meant to ask whether using information technology on campus would attract traditional age students to UT-Austin who might otherwise have gone elsewhere. Then again, maybe they meant to ask the faculty's opinion about the University of Phoenix's use of online programs to serve new markets. Frankly, I don't know what they were trying to ask.

But where it really gets interesting is when we arrive at the conclusion they draw from this confusion: that "e-learning" has gone "bust." Once again, the problem proves to be definitional. At any point where Bob and Bill move into their "theorizing" or "editorializing" mode, their use of the phrase "e-learning" becomes yet more strangled. Examples are myriad and include:

  • Phrases like "a complex process such as e-learning," or "e-learning's troubling assumptions." How does e-learning have assumptions? Some advocates of technology-mediated instruction have one set of assumptions about how IT might best be used, others have yet another set, and so on.
  • Questions like "to what extent does e-learning's adoption of the market model . . . help to explain what happened?" How does e-learning adopt a market model? Folks involved in using technology in teaching and learning have widely differing models of what they're doing in mind, ranging from the University of Phoenix's highly sophisticated—and successful—market model, to Penn State's outreach and service model, to individual faculty members' models of using technology in their courses, and so on.
  • Sentences like "It's high time for e-learning to get real." (my personal favorite)

When I heard Bob give a talk about this study at an ECAR conference last November, my response was the same as when I read the full report: What is "it"? ("The hard fact is that e-learning took off before people really knew how to use it.") Is "it" a faculty member at Foothill College using PowerPoint? a company like UNext trying to attract venture capital? a distance learning program serving off-campus students? an Ivy League university setting up ivy.com? Well, the answer is clear: it's all of the above and more.

Most of the "study" is based on their views of what "happened to e-learning" rather than on their research. (Curiously, there are no references or footnotes throughout the report.) If one were asked to name "leading edge" thinkers on higher education's use of information technology in academic programs, neither Massy nor Zemsky would readily spring to mind. These are the folks who crafted the sentence, "Illustrated lectures do not constitute electronically mediated learning any more than courses that use Blackboard or WebCT to distribute learning materials without introducing learning objects." I personally dismissed the "editorializing" parts of the report: their four-step theory of "e-learning adoption" (it's amazing how all of our PCR redesigns attained step 4 while skipping step 3), their "framing" questions that are interesting but tellingly never answered, their three "troubling assumptions" that led to e-learning's "bust." These assertions have no basis in their research.

Much of Thwarted Innovation reads like we're in a time warp, which is a by-product of a lack of knowledge of the field. In describing the need for interchangeable learning objects using the railroad track analogy, Bob and Bill seem to be unaware of the work of the Instructional Management System (IMS) Global Learning Consortium that was launched by Educom almost 10 years ago as well as other related standards efforts like The Advanced Distributed Learning (ADL) Initiative (SCORM). In citing RPI's studio physics course redesign, which was created more than 10 years ago, as "the exception, not the rule" in changing the way faculty teach, they ignore the 30 Pew-funded course redesign projects created in the past several years and the hundreds, perhaps thousands, of faculty who have changed the way they teach when they move from the classroom to an online environment. They quote my view of MERLOT at length, but they miss the main point of the article that was its source, which is a critique of the learning objects strategy in general, and they miss the title's irony by saying that I "wistfully" asked, "Build It, But Will They Come?"

No one knowledgeable about the use of information technology in teaching and learning would conclude that "e-learning has gone bust." As ThinkEquity, an education industry research and analysis firm, summed up its assessment of the study: "The report suffers from an academic myopia that ignores the hundreds of thousands of students that have been able to return to school or stay in school while supporting themselves thanks to online education alternatives that, in most cases, represent a measurable improvement of large lecture-style courses."

Bob and Bill may not have heard of Arthur C. Clarke's oft-repeated observation that "when it comes to technology, most people overestimate it in the short term and underestimate it in the long term." They seem shocked that a number of people who have seen the potential of information technology for transforming American higher education may have overestimated the speed at which change would happen. I must admit that, while I have met many folks in this field who are disappointed at the pace of change in higher education, I have never met anyone who would conclude that "the experience with e-learning has been disappointing." I also doubt that the faculty and staff who were surveyed as part of their study would reach that conclusion.

I believe that the vast majority of those in higher education who are interested in using IT in collegiate teaching and learning are doing so because they are trying to make things better—to make higher education more accessible, to improve the student learning experience, and—in some cases—to reduce instructional costs. That doesn't mean that everyone does everything flawlessly. But what characterizes the efforts of faculty and staff in this field is excitement, dedication, care and sincerity, which stands in bold relief against the cynicism of "Why did the boom in e-learning go bust?"

Because of the way it's written, this report will be used to attack online learning in a variety of settings. The Miami Herald's recent headline says it all: "Study Debunks Value of Online Learning." It will be used as a weapon by those in higher education who refuse to acknowledge, in the report's own words, "that there is a need to substantially improve educational quality, especially for undergraduates" simply because Bill Massy and Bob Zemsky wrote it. Ordinarily I am a great admirer of their work, but in this case, a little knowledge is a dangerous thing.

--Carol A. Twigg

2. WHAT'S NEW

Featuring updates and announcements from the Center.

The University of Hawaii Launches Strategic Initiative

The University of Hawaii System (UH) has joined with The National Center for Academic Transformation to develop a program among the 10 UH institutions to redesign large enrollment courses using technology. The UH System will provide institutional grants to assist with implementation of the redesign plans. While UH has had a long-standing faculty development effort aimed at increasing the use of technology in teaching, this new initiative is more strategic and focuses on the need to contain instructional costs while increasing student learning. The University will kick off the initiative in September with a systemwide Call to Participate and an orientation featuring Carol Twigg. Next steps will include systemwide workshops conducted by the Center and development of campus proposals. For more information about the UH program, contact Hae Okimoto. For additional information about Center collaborations, contact Carol Twigg.

DOE Digital Summit: Translating Research Into Practice

The U.S. Secretary of Education held an invitational Digital Summit on July 12, 2004 in Orlando, FL, "Secretary's No Child Left Behind Leadership Summit—Increasing Options Through e-Learning," targeted at state policy makers. Carol Twigg was invited to participate in a three-person panel discussing virtual education as a powerful innovation that can expand opportunities for learning any time, any place in support of NCLB. This event was part of a national series of summit meetings sponsored by DOE with the goal of translating research on improving student achievement into practice in the classroom.

The National Center for Academic Transformation Is Formed

In order to pursue a wider range of activities, the principals of the Center for Academic Transformation have formed a new 501(c)(3) organization called The National Center for Academic Transformation (NCAT). The establishment of NCAT is our first step in moving from a university center affiliated with Rensselaer to a self-sustaining, independent entity. The purposes of NCAT are to (1) serve as a source of expertise and support for those in higher education who wish to take advantage of the capabilities of information technology to transform their academic practices; (2) support and conduct non-partisan research, education, and informational activities to increase public awareness of the benefits of using information technology in higher education; (3) collaborate with other associations, organizations or agencies interested in similar and related activities; and (4) engage in other related activities.

Jack Wilson Appointed President of the University of Massachusetts System

Jack Wilson was recently appointed President of the University of Massachusetts system. He formerly served as the Vice President for Academic Affairs of the UMass system and CEO of UMassOnline. Prior to moving to Massachusetts, Jack was the J. Erik Jonsson '22 Distinguished Professor of Physics, Engineering Science, Information Technology, and Management at Rensselaer Polytechnic Institute where he also served as a Dean and Interim Provost. Jack has been a close friend of the Center for Academic Transformation as a partner in developing the Program in Course Redesign, as an active member of the RPI Center's Advisory Board and as Chairman of the Board of Directors of the newly created National Center for Academic Transformation. Congratulations, Jack!

Twigg Recognized by NUTN

In June, The National University Telecommunications Network (NUTN) awarded its 2004 Distinguished Service Award to Carol Twigg. The mission of NUTN is to provide "dynamic professional development opportunities in support of emerging and current technology applications to professionals involved in higher education." Carol was recognized for her contributions to the field at the NUTN annual meeting in Kennebunkport, Maine, where she delivered the keynote address at the Awards luncheon. For more information about NUTN, see http://www.nutn.org/. Kudos, Carol!

3. THE ROADMAP TO REDESIGN (R2R)

Featuring progress reports and outcomes achieved by the Roadmap to Redesign.

R2R Practice Applicants Participate in Workshop

About 125 people--including teams of three or more representatives from applicant institutions, core practice members, Center staff and corporate representatives from Blackboard, SunGard Collegis, Thomson Learning and WebCT--participated in an R2R training workshop on June 2 in Baltimore, MD. Focused on orienting new institutions to the R2R program, the workshop was organized around a set of planning resources developed for R2R, which include redesign models, principles of successful course redesign, strategies for cost reduction, assessment approaches and implementation issues.

Prior to attending the workshop, participants were asked to prepare a five-minute description of their anticipated redesigns, including a preliminary explanation of the model they intend to use. Applicants met in small groups to discuss their plans for redesign with representatives from other institutions and with representatives of the practices. Because institutional teams were split up to talk with others in small groups, all team members had to be equally prepared, which fostered team planning. R2R course redesigns will potentially affect more than 56,000 students nationally.

Participants also completed a workshop evaluation form. The overall feedback was quite positive. Those who had begun an in-depth study of R2R web materials and the streamlined process found the workshop useful for clarifying any confusion among their team members. Those who had not spent much time reviewing the web materials found that the workshop highlighted the need to do so. Overall the teams reported that they felt much more prepared to work on their redesigns and to formulate concrete proposals after attending the workshop.

The next step in the R2R application process is for teams to submit a full redesign proposal to the Center by August 1, 2004. To learn more about the R2R program and to access the planning resources, visit The Roadmap to Redesign.

4. UPDATES FROM THE PROGRAM IN COURSE REDESIGN

Featuring progress reports and outcomes achieved by the Program in Course Redesign.

Two New Articles Focus on Urban and At-Risk Students

Carol Twigg has recently published two articles focused on particular aspects of the Program in Course Redesign. In Vol. 15, No. 2 of Metropolitan Universities Journal, an article titled "Improving the First-Year Experience: The Impact of Course Redesign" focuses on the first-year student experience, especially in urban universities. The article features case studies of five of the PCR projects: Florida Gulf Coast University, Indiana University-Purdue University Indianapolis (IUPUI), Portland State University, the University of Central Florida and the University of New Mexico. This journal is the quarterly publication of the Coalition of Urban and Metropolitan Universities, a membership organization of about 70 urban/metropolitan universities (including a few members from the UK, Canada, and Australia). An abstract is available at http://muj.uc.iupui.edu/15_2.asp. Follow the links for ordering information.

An article titled "Using Asynchronous Learning in Redesign: Reaching and Retaining the At-Risk Student," was published in the February 2004 issue of the Journal of Asynchronous Learning Networks. The article features brief case studies of six of the PCR projects that had particular success in improving learning and retention among at-risk students: Florida Gulf Coast University, IUPUI, Rio Salado College, Tallahassee Community College, the University of Idaho and the University of New Mexico. It can be found at http://www.aln.org/publications/jaln/v8n1/index.asp.

External Evaluation of Pew Grant Process Completed

As part of the continuing effort to learn how well the redesign approaches used by the Program in Course Redesign (PCR) are working, the Center commissioned the National Center for Higher Education Management Systems (NCHEMS) to evaluate the process used in the Pew Grant Program in Course Redesign. The goal was to learn how campus participants viewed both the resources developed by the Center and the process it used in supporting grant applicants, whether or not they ultimately received a grant. Respondents were split almost equally between those who received a grant and those who did not. Most respondents indicated that participating in the proposal process was a worthwhile learning experience. The most useful features cited were attending the workshops and completing the Course Readiness Criteria and the Course Planning Tool. Eighty-five percent of those institutions that did not receive a grant found the process worthwhile, and 39% reported that their institution had redesigned a course without the grant funding.

5. CUTTING ACROSS

Highlighting themes and activities that cut across redesign projects.

Virtual Emporium: Meeting Individual Student Needs

One of the five models that emerged from our analysis of the thirty projects in the Program in Course Redesign (PCR) is the emporium. In this model, students study in a large computer lab, guided through the course by instructional software that provides immediate feedback on problems and assessments. When students get stuck and need help, a variety of personnel are available, including full-time faculty, part-time faculty, graduate teaching assistants, and undergraduate learning assistants as appropriate. The emporia are open for long periods of time during the week (some 24/7), and all engage students in active learning and provide more individualized assistance for those who need it than a traditional class can offer.

Initially developed by Virginia Tech, this model was adopted by two additional PCR institutions--the Universities of Alabama and Idaho--and several more are considering it in the Roadmap to Redesign program. All three institutions teach more than one math course in their emporium, and emporium personnel are prepared to work with students at various stages in multiple courses simultaneously.

The emporium model was designed to be face-to-face with the expectation that students would spend time in the lab. Since many emporium students are new to college and find math challenging, the face-to-face format was considered important. However, as the faculty gain more experience and more clearly recognize the range of student backgrounds and abilities, the emporium model is expanding to become more virtual, allowing more options for students within the model. For example, many students are able to complete the math problems quickly with little assistance. Consequently, several of the institutions now permit students to reduce their time in the lab or to come to the lab only to take proctored quizzes and exams.

At Virginia Tech, students in linear algebra and precalculus are only required to visit the emporium for proctored tests. In other courses offered in the emporium, there are expectations of greater face-to-face presence. At the University of Idaho, students were initially required to spend three hours each week in the lab, called the Polya Math Center. Beginning in fall 2004, students will be able to demonstrate their ability to complete the work virtually and earn reduced face-to-face time requirements. The virtual emporium still allows faculty to see usage patterns and to monitor how well students are doing. If students fall behind or encounter problems, they will be required to spend more time in the lab to take advantage of available assistance.

There are a relatively small number of computers available to students in the Polya Math Center. By moving successful students to "Virtual Polya," there will be more computers available to other students during peak periods, and the math program will be able to provide more individualized assistance to those who need it. The University of Alabama also faces space constraints. While students are still required to spend some time in the lab, they may work fewer hours if they successfully complete their weekly assignment in a shorter period of time. This change in the course requirements has been extremely helpful in keeping students engaged.

To learn more about how these three institutions are modifying the emporium model to meet the individual needs of their students, contact John Rossi at Virginia Tech, Monte Boisen at the University of Idaho or Hank Lazer at the University of Alabama.

6. COMMON GROUND

Reporting on initiatives that share the Center's goals and objectives.

Florida State Models New Initiative on Program in Course Redesign

Florida State University (FSU) is developing a course redesign initiative based on the Program in Course Redesign. Offered by the Office for Distributed and Distance Learning, the FSU program will focus on large enrollment courses with the goals of increasing student learning and reducing instructional costs. The University has issued a Call to Participate and reports expressions of interest in such varied fields as accounting, economics and modern languages. A statement of intent is due in August, and full proposals are due in late September. To learn more, contact Larry Dennis or visit their web site at http://online.fsu.edu/proposal/cri/.

Online Chemistry Preparation at East Carolina University

To increase learning in chemistry and achieve greater consistency in student learning experiences for 1,000 students annually, East Carolina University (ECU) has moved all of its chemistry lab preparation experiences online. Replacing a text of more than 330 pages, ECU's new interactive manual provides a consistent approach to the lab experiences as well as additional resources that support students who need them. Students now come to lab ready to spend time actively engaged in discovery learning; teaching assistants no longer need to review the preparation needed for each lab because students have already done so online. The online manual includes a structure for writing lab reports as well as suggestions for and models of successful reports. Students can review the online resources as many times as needed so that they feel well-prepared to engage in lab work. Students have enthusiastically embraced the new method. In addition, fewer teaching assistants are now needed. For additional information contact Dorothy H. Clayton.

7. CALENDAR OF EVENTS

AUGUST

  • Full R2R proposal deadline for R2R finalists
    August 1
  • Meeting of R2R core practice members
    August 10
  • 20 new institutions selected to participate in R2R
    August 15

SEPTEMBER

  • 20 new R2R redesign projects begin

OCTOBER

  • Publication of The Learning MarketSpace

8. SUBSCRIPTIONS, SUBMISSIONS, ARCHIVES, REPOSTING

The Center for Academic Transformation serves as a source of expertise and support for those in higher education who wish to take advantage of the capabilities of information technology to transform their academic practices.

  • To subscribe to The Learning MarketSpace, click here.
  • To submit items for inclusion in this newsletter, please contact Carolyn G. Jarmon.
  • This newsletter is a merger of The Learning MarketSpace and The Pew Learning and Technology Program Newsletter.
  • Archives of The Learning MarketSpace, written by Bob Heterick and Carol Twigg and published from July 1999 – February 2003, are available here.
  • Archives of The Pew Learning and Technology Program Newsletter, published from 1999 – 2002, are available here.
  • You are welcome to re-post The Learning MarketSpace without charge. Material contained in The Learning MarketSpace may be reprinted with attribution for non-commercial purposes.

Copyright 2004, The Center for Academic Transformation