Implementation Issues

University of Illinois at Urbana-Champaign

Looking back on the course pilot itself, what worked best?

The data analysis component of the projects, with student work evaluated by Mallard, has been the most successful part of the redesign. The students report that this is how they learn the material for the course. Automating the assessment of significant student work has proved to be a good idea, one that has gained acceptance from the TAs as well as the students.

We are doing well at this point on both the faculty development front and the student attitudes front. The fall 1999 offering produced some fallout from the students because of the problems discussed above, and we have made numerous changes to address those initial difficulties. Our concern is that students who take the course in fall 2000 or later semesters not come in with a jaded view based on the fall 1999 experience.

What worked least well?

We faced a number of issues related to the TAs:

  • We were aware that our TAs would need training in the new approach, to facilitate group learning and foster team building. We had not anticipated, however, that they would also need training in using Excel, and specifically in using Excel for statistics. We are now much more cognizant of the full range of TA preparation issues.
  • The TAs hadn't received prior training on how the labs worked.
  • The grading scheme on the projects was originally set on a curve. On the raw scale underlying the curve, most teams earned nearly all the points. As a result, there was a great deal of haggling over grades worth an insignificant amount of raw-scale credit, and students would burden the TAs with rough drafts of their projects to make sure they received full credit.

The projects require a write-up in addition to the data analysis evaluated by Mallard. The TAs were to evaluate the write-ups in the traditional way; technology would be used for file transfer but not for assessing the write-ups. Consistent with a writing-across-the-curriculum approach, drafts were encouraged (but optional) so that students could get feedback to which they could respond before turning in the final version. We thought the grading effort would be significant but manageable. A seemingly innocent grading scheme (the TAs would rank the projects and then grade them on a curve) proved to have pernicious consequences. The students, many of whom care mainly about their grades, would lobby the TAs for intensive consulting on their drafts so they could land at the top of the curve. The TAs, who care quite a bit about evaluations of their performance because of the potential impact on their future employability as economics professors, capitulated to these demands. That undermined the efficiency of the design, and the course became extremely labor intensive. Ultimately we dropped the curve and went to a flat grading scheme for the write-ups.

The economics department faces a shortage of TAs to put in front of students in recitation sections. Given the limited pool, all of the TAs who taught Economic Statistics II in fall 1999 were again involved in spring 2000, along with some more senior TAs, since spring enrollments are double those in the fall. Because some of the fall 1999 TAs were burnt out from that initial experience, it would have been preferable to rotate a few of them out, but there simply wasn't sufficient other staff to do so. These problems, too, were eventually overcome, but at the beginning they created a lot of stress for the TAs, and many complained of overwork.

Early feedback from fall 2000 suggests things went much better than a year earlier. The results from a student focus group indicate a high level of satisfaction and a sense that the students have learned the core concepts of the course. Our conjecture is that much of this perceived improvement is attributable to TA turnover, as the fall offering was otherwise not much different from the prior spring. We need another semester of student response before we can be fully confident in that conjecture.

What are the biggest challenges you face in moving from the course pilot to the project's next phase?

The Web-based technology has been quite reliable, with no complaints about server slowness or stability. Computer lab space, however, is still an issue and will become an even bigger one as we ramp up next semester. The campus is quite decentralized, and while there are campus computer labs that support class use, growth in demand has outstripped growth in capacity. Moreover, the campus model calls for the department or college to provide a substantial share of computer lab capacity, and there is currently no space for building a new undergraduate computer lab in economics. Wireless may offer a solution, allowing regular classroom space to be converted to computer lab space, but we are about a year away from that being a realistic possibility. In the meantime, we have tried to get as much out of existing campus capacity as possible by scheduling recitation sections at non-peak times where possible.

At present, Economic Statistics II has too much course content. The students view that content as a list of topics to cover rather than as one big picture. We are definitely taking a less-is-more approach with the redesign, but we are sweating out what to keep and what to pitch. Since students now have to learn Excel with a reasonable level of fluency, some room in the sequence has to be made for that. Economic Statistics I, by contrast, does not have enough content in the new version. The plan is to move some of the Excel training and the topic of decision theory to that course. Even with that, some additional content might be lopped off, but we have decided to do such paring in small steps rather than all at once.

There is an issue at UIUC concerning the Mallard software that is so central to this project. The campus has to decide how much it will invest in Mallard in the future. Given the commercial success of Blackboard and WebCT, funding Mallard development seems less sensible at this juncture. But the Blackboard quiz engine is comparatively weak, and even the WebCT quiz engine can't do some of the things this project needs; that is an argument for sustaining Mallard for projects such as this one. The campus is now part of the ADL Co-Lab project, and partly for that reason it is investigating whether the quiz component of Mallard can be made IMS-compliant and perhaps used as a module within one of the other learning management systems. If that can be done, it would make Mallard, and the content that resides within it, more durable.
