Lessons Learned

Iowa State University

Pedagogical Improvement Techniques

What techniques contributed most to improving the quality of student learning?

Continuous assessment and feedback. In the experience of most college mathematics instructors, student learning is directly related to the amount of time students spend working problems. Homework is assigned in most courses, but usually the instructor is not able to grade more than a small part of it, and students do not take it seriously. In the redesigned Discrete Mathematics course, frequent homework assignments (usually three per week) took the place of lectures and formed an important part of the course and of the students' final grade. Computer grading of all exercises ensured that every assignment was counted and that students received immediate feedback.

Improved communication. Even though the redesigned course had no lecture meetings, WebCT's built-in communication tools (bulletin boards and email) ensured that students were always aware of upcoming deadlines and special announcements, as well as their current standing in the course.

Increased interaction among students. The online bulletin board let all students see the responses to other students' questions, and in some cases students answered one another's questions. The team also facilitated the formation of study groups: students arranged the meetings themselves, but the instructor helped them find one another, and WebCT's tools let the study groups stay in contact.

Cost Savings Techniques

What techniques contributed most to reducing costs?

Online delivery and online testing. Traditionally, this course was taught by a total of 12 instructors and 15 teaching assistants annually. In the redesigned format, only 3 instructors and 12 TAs were required, savings directly attributable to online delivery and online testing. Because the instructor no longer had to meet students in the classroom or design several exams per term, each instructor could handle between 500 and 600 students; based on their experience with other online courses, the team believes that this is about the limit for one person. The teaching assistants no longer had to grade exams, so they could be assigned more hours to interact with students in the computer lab and in office hours.

Implementation Issues

What implementation issues were most important?

Lack of computer labs. At the time the grant was received, the College of Liberal Arts and Sciences was planning to create a centralized computer lab. Those plans did not materialize on schedule, so the course could not be fully implemented at the planned scale. The problem has now been resolved: about one-third of the course was redesigned in fall 2003, and the full course will be redesigned in spring 2004 and beyond.

Keeping students engaged. The team discovered that online course delivery polarized student grades: students who completed most of the assigned work typically earned As and Bs, while students who did not earned Ds and Fs, with relatively few Cs in between. Personalized emails to students who were lagging behind had only moderate success. In fall 2003, the team implemented two further changes to address this issue. One was mandatory attendance at computer lab sessions, which counted for a small part of the grade. The other was forwarding the names of failing students to the College of Business advisors, in the hope that students would pay more attention to their advisors than to their instructors.

Creation of exams. Creating question banks for the homework and exams took considerably more time than expected. All assignments were administered using MapleTA (formerly called EDU), a program created specifically for administering mathematical questions with written-out answers. The syntax for creating MapleTA's algorithmic questions, in which the computer generates different numbers for each student, is somewhat peculiar; error messages are often meaningless or misleading, and the documentation is sparse. However, the effort was well worth it: the team now has question banks for homework assignments and exams that can be reused term after term.
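For readers unfamiliar with the idea, the short sketch below illustrates what an algorithmic question does. It is plain Python, not MapleTA's own syntax, and the question text, parameter ranges, and use of a student ID as a random seed are all hypothetical; the point is only that each student receives different numbers together with a matching answer key.

```python
import math
import random

def generate_algorithmic_question(seed):
    """Generate one instance of an algorithmic question.

    Illustrative sketch only (not MapleTA syntax): the question wording and
    parameter ranges are hypothetical, but the mechanism is the same --
    each seed yields different numbers and the corresponding correct answer.
    """
    rng = random.Random(seed)
    n = rng.randint(6, 12)   # size of the underlying set
    k = rng.randint(2, 4)    # size of the subsets being counted
    question = (f"How many {k}-element subsets does a set "
                f"with {n} elements have?")
    answer = math.comb(n, k)  # binomial coefficient C(n, k)
    return question, answer

# Each (hypothetical) student ID acts as a seed, so every student
# sees a different version of the same problem.
for student_id in (101, 102, 103):
    q, a = generate_algorithmic_question(student_id)
    print(student_id, q, "->", a)
```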

Technical issues. MapleTA lost a few scores in spring 2003 (3 out of several thousand), but more annoying was that it often told students their scores were gone when in fact they were not; logging out and back in made the scores reappear. That problem went away with the new version installed in fall 2003, but a new one appeared: sometimes a formula or image in a problem is not displayed. A lesser headache was the transfer of scores from MapleTA to WebCT. The team wrote Perl scripts that cut the process down to less than 10 minutes, but it still required some action on the instructor's part; the Instructional Technology Center at ISU is working on scripts that will fully automate the process.
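The team's scripts were written in Perl, and the exact MapleTA export and WebCT import layouts are not documented here. The Python sketch below is therefore only a rough illustration of the kind of reshaping such a transfer involves, with hypothetical file names and column headings; it is not the team's actual tool.

```python
import csv

# Hypothetical layouts: assume the MapleTA export has one row per
# (student, assignment) score, and the gradebook upload wants one row
# per student with one column per assignment.
MAPLETA_EXPORT = "mapleta_scores.csv"   # assumed columns: Login, Assignment, Grade
WEBCT_IMPORT = "webct_upload.csv"       # assumed: Login plus one column per assignment

def convert(export_path=MAPLETA_EXPORT, import_path=WEBCT_IMPORT):
    scores = {}          # student login -> {assignment name: grade}
    assignments = set()

    # Collect all scores from the (assumed) long-format export.
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            scores.setdefault(row["Login"], {})[row["Assignment"]] = row["Grade"]
            assignments.add(row["Assignment"])

    # Write one wide-format row per student; missing scores stay blank.
    columns = ["Login"] + sorted(assignments)
    with open(import_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=columns)
        writer.writeheader()
        for login, grades in sorted(scores.items()):
            writer.writerow({"Login": login, **grades})

if __name__ == "__main__":
    convert()
```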
