Course Development Issues
Iowa State University
To what extent have you been able to use previously developed materials in your re-design instead of creating new materials?
The textbook (Barnett/Ziegler/Byleen, Finite Mathematics, Prentice-Hall) came with WebCT content provided by the publisher. We found the content to be well prepared and suitable for study assignments with hardly any modification. It also created a nice framework around which we could build the complete course web site. The quizzes provided with the WebCT content are useful for the students for self-testing, but we did not consider them suitable as the basis for grades. There is a fairly small number of questions, all of the multiple-choice type. A list of answers would spread quickly among the students. For testing, we used a separate testing program called EDU, which allows write-out answers and algorithmic questions (that is, each student gets different numbers for the same question). We had to develop all the EDU tests from scratch. This took much more time than expected.
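The idea behind EDU's algorithmic questions can be illustrated with a minimal sketch, assuming a simple linear-equation template. The function name, seeding scheme, and question format here are hypothetical illustrations, not EDU's actual internals.

```python
import random

def make_question(seed):
    """Generate one parameterized question: solve a*x + b = c for x.

    Each student (seed) gets different numbers for the same question,
    so a circulated list of answers is useless.
    """
    rng = random.Random(seed)   # deterministic per student
    a = rng.randint(2, 9)
    x = rng.randint(1, 20)      # intended integer answer
    b = rng.randint(1, 50)
    c = a * x + b
    text = f"Solve for x: {a}x + {b} = {c}"
    return text, x

# Two students receive structurally identical but numerically different questions:
q1, ans1 = make_question(seed=1001)
q2, ans2 = make_question(seed=1002)
```

Because the generator is seeded per student, the same student always sees the same numbers on reload, while different students cannot share answers.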
What kinds of activities took up the most time in your course development effort?
By far the largest amount of time was spent in preparing the EDU assignments. The course designer had experience with an earlier version of EDU (called eGrade), but the new algorithmic features of EDU involved a considerable learning curve. The EDU software is still being updated, and there is little documentation. Error messages are frequently very cryptic. Sometimes it took hours to track down a particular "feature."
Another activity that took far too much time was collecting scores. We were using some EDU assignments, some WebCT quizzes (substituting for EDU assignments that were not ready in time), and some ALEKS scores. This required downloading the scores from three different computer systems, each with different login names for the students, and combining them into a single gradebook. In the future, we will do all the testing in EDU alone, which will resolve this issue.
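The score-combining step above can be sketched as follows. This is only an illustration of the bookkeeping involved; the actual download formats of the EDU, WebCT, and ALEKS files, and the login-to-student mapping, are assumptions, not documented here.

```python
def merge_scores(downloads, login_to_id):
    """Combine per-system score downloads into one table keyed by campus ID.

    downloads:   {system_name: list of (login, score) rows}
    login_to_id: {system_name: {login: campus_id}} -- each system uses
                 its own login names, so a per-system mapping is needed.
    Returns {campus_id: {system_name: score}}.
    """
    combined = {}
    for system, rows in downloads.items():
        for login, score in rows:
            cid = login_to_id[system][login]
            combined.setdefault(cid, {})[system] = score
    return combined

# Hypothetical example: the same student under two different logins.
downloads = {"EDU": [("jsmith", 90)], "WebCT": [("smith.j", 85)]}
login_to_id = {"EDU": {"jsmith": "S1"}, "WebCT": {"smith.j": "S1"}}
table = merge_scores(downloads, login_to_id)
```

Consolidating testing in a single system eliminates exactly this mapping step, which is why moving everything to EDU resolves the issue.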
December 2002 Update: Considerable effort went into redoing the homework assignments, and into recording weekly mini-lectures.
The homework assignments now follow the text of the book and the exercises embedded in the text, rather than the problem sections of the book. The general format of a homework problem is now “Read (a 2-3 page section in the book). This problem is similar to exercise x on page y.” This forces the students to actually read along with the text as they do their assignments. Homework assignments are broken into relatively short parts that can be completed in about an hour. There are usually three homework assignments per week. Students are encouraged to think of them as study sessions.
Making up the new EDU homework assignments took a lot of work. It went better than last term thanks to the experience we gained, but writing new problems in EDU is still a major undertaking. The publisher will need to provide better tools and better documentation before the product can gain wider adoption.
The mini-lectures and accompanying PowerPoint slides are available for the students to view online. Each mini-lecture gives an overview of the material for one week. Preparing the mini-lectures was also quite a bit of work, mostly in preparing the PowerPoint slides.
There will only be minor changes to the course materials for spring 2003, so we anticipate less effort.
In fall 2002 we only had to move scores from EDU to WebCT, which made life a lot easier. On the other hand, the newly created study groups complicated things again. Study groups could log into EDU using the account of any one of the students and complete a homework assignment, and the score would be propagated to the other group members. This process was partially automated, but still took about 90 minutes of file transfer and editing each time. We are planning to write a more comprehensive script to speed up the process.
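The propagation step described above could be scripted along these lines. This is a minimal sketch under stated assumptions: the score-file format and the rule for reconciling multiple submissions (taking the best) are illustrative choices, not a description of our actual files.

```python
def propagate_group_scores(individual_scores, groups):
    """Copy each study group's submitted homework score to every member.

    individual_scores: {login: score} as downloaded from EDU, where only
                       the submitting member's login carries the score.
    groups: list of lists of logins, one inner list per study group.
    Returns a new score table with the group's score given to all members.
    """
    scores = dict(individual_scores)
    for members in groups:
        submitted = [scores[m] for m in members if m in scores]
        if submitted:
            best = max(submitted)   # assumed rule if several members submitted
            for m in members:
                scores[m] = best
    return scores
```

Run against each assignment's download, a script like this replaces the 90 minutes of manual file transfer and editing with a single pass over the group roster.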
Have students been directly involved in the re-design process?
Both student focus groups and anonymous online surveys were used for formative evaluation of the course redesign. The two types of evaluation were very similar in their conclusions and in their recommendations for improving the pilot course design. Changes were made to the pilot course design mid-semester based on student input gathered during the evaluation process. Formative evaluation will be included in the final course design as well, using the anonymous online survey tool in WebCT.
Two members of the redesign team planned and conducted focus groups during the eighth week of the semester. All students enrolled in the pilot sections were invited via email to attend a focus group; seven participated.
Students reported the following as positives: EDU and the option of taking tests multiple times; the flexibility and convenience of learning online; one-on-one assistance from the teaching assistant; the ability to work ahead in the course; and clearly communicated expectations and deadlines.
Students reported the following challenges: high levels of general frustration with ALEKS; difficulty and frustration with learning four software applications (WebCT, EDU, ALEKS, Excel); difficulty in keeping up with assignments; feelings that they were putting more time into "learning the routine" and "memorizing the software steps" than into "understanding the concepts"; a desire for a more structured instructional environment, such as a required weekly meeting; feelings of isolation from other students; a desire for a stronger thread between the tools used in the course; and a lack of awareness of WebCT links to video clips, movies, and software help.
To better understand students’ perceptions of course delivery methods, students were asked to participate in a survey; thirty-five students responded. Among the findings:
December 2002 Update: As in the previous semester, we conducted a student focus group and asked the students to fill out an anonymous online survey. Students reported the same things listed above as positives.
There were fewer complaints than last term. We got rid of ALEKS, and the study groups helped to alleviate the feelings of isolation. The highlights of the negative comments were
We will make some minor changes to the course organization based on these comments. Specifically,
The online survey was set up somewhat differently this term, with three sections of questions.
Interestingly, even though only 20% of the students reported being an active participant in a study group, more than 50% thought that study groups were a good idea and should be continued in the present form (except for starting a bit later in the term, after students have had more of a chance to meet each other).
What kinds of support for your project have you been receiving from your department or the institution more broadly?
We received very good support from the Instructional Technology Center (WebCT setup, posting of grades, administering surveys, and recording sample lectures on video and making them available online). The College of Business helped by conducting student focus groups, providing some case studies, and recording a brief motivational lecture. The Statistics Department is doing the statistical evaluation. The Department of Mathematics will help by reducing the teaching load of the main course developer for fall term, since the course redesign has ended up taking much more time than expected.
We did not receive any support from the college or higher levels of the administration. Our most pressing need is another computer lab. We have requested this repeatedly, to no avail. We are unable to increase the number of students in the online sections due to lack of computer lab space.