Course Development Issues

Iowa State University

To what extent have you been able to use previously developed materials in your re-design instead of creating new materials?

The textbook (Barnett/Ziegler/Byleen, Finite Mathematics, Prentice-Hall) came with WebCT content provided by the publisher. We found the content to be well prepared and suitable for study assignments with hardly any modification. It also created a nice framework around which we could build the complete course web site. The quizzes provided with the WebCT content are useful to students for self-testing, but we did not consider them suitable as the basis for grades. The question bank is fairly small, and all of the questions are multiple choice, so a list of answers would spread quickly among the students. For testing, we used a separate program called EDU, which allows write-out answers and algorithmic questions (that is, each student gets different numbers for the same question). We had to develop all the EDU tests from scratch, which took much more time than expected.

What kinds of activities took up the most time in your course development effort?

By far the largest amount of time was spent in preparing the EDU assignments. The course designer had experience with an earlier version of EDU (called eGrade), but EDU's new algorithmic features involved a considerable learning curve. The EDU software is still being updated, and there is little documentation. Error messages are frequently very cryptic. Sometimes it took hours to track down a particular “feature.”

Another activity that consumed far too much time was collecting scores. We were using some EDU assignments, some WebCT quizzes (to substitute for EDU assignments that were not ready in time), and some ALEKS scores. This required downloading the scores from three different computer systems, with different login names for the students, and combining them into a single gradebook. In the future, we will do all the testing on EDU alone, which will resolve this issue.
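The merging chore described above can be sketched roughly as follows. The file format, column names, and login-to-ID mapping here are all illustrative assumptions; the actual exports from WebCT, EDU, and ALEKS differed and used different student login names per system.

```python
import csv

# Hypothetical sketch of combining score exports from several systems.
# Column names and the login-to-ID mapping are illustrative, not the
# actual WebCT/EDU/ALEKS export formats.

def load_scores(path, login_col, score_col):
    """Read one system's CSV export into a {login: score} dict."""
    with open(path, newline="") as f:
        return {row[login_col]: float(row[score_col])
                for row in csv.DictReader(f)}

def merge_scores(systems, login_maps):
    """Combine per-system scores into one record per student.

    systems:    {system_name: {login: score}}
    login_maps: {system_name: {login: canonical_student_id}}
    Returns {canonical_student_id: {system_name: score}}.
    """
    combined = {}
    for name, scores in systems.items():
        for login, score in scores.items():
            # Translate each system-specific login to one student ID;
            # fall back to the login itself if no mapping is known.
            student = login_maps.get(name, {}).get(login, login)
            combined.setdefault(student, {})[name] = score
    return combined
```

The key step is the per-system login map: because each system assigned its own login names, every score has to be translated to a single canonical student ID before the rows can be joined.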

December 2002 Update: Considerable effort went into redoing the homework assignments, and into recording weekly mini-lectures.

The homework assignments now follow the text of the book and the exercises embedded in the text, rather than the problem sections of the book. The general format of a homework problem is now “Read (a 2-3 page section in the book). This problem is similar to exercise x on page y.” This forces the students to actually read along with the text as they do their assignments. Homework assignments are broken into relatively short parts that can be completed in about an hour. There are usually three homework assignments per week. Students are encouraged to think of them as study sessions.

Making up the new EDU homework assignments took a lot of work. It was better than last term due to the experience we gained, but writing new problems in EDU is still a major undertaking. The publisher will need to come up with better tools and better documentation before their product will catch on on a larger scale.

The mini-lectures and accompanying PowerPoint slides are available to the students to view online. Each mini-lecture gives an overview of the material for one week. Preparing the mini-lectures was also quite a bit of work, mostly for preparing the PowerPoint slides.

There will only be minor changes to the course materials for spring 2003, so we anticipate less effort.

In fall 2002 we only had to move scores from EDU to WebCT, which made life a lot easier. On the other hand, the newly created study groups complicated things again. Study groups could log into EDU using the account of any one of the students and complete a homework assignment, and the score would be propagated to the other group members. This process was partially automated, but still took about 90 minutes of file transfer and editing each time. We are planning to write a more comprehensive script to speed up the process.
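The propagation step could be scripted along these lines. This is a sketch under assumed data shapes; the group rosters, the score records, and the "best member score wins" rule are illustrative, not the course's actual grading policy.

```python
# Hypothetical sketch of propagating a study group's homework score
# to all group members. Data shapes and the scoring rule are assumed.

def propagate_group_scores(scores, groups):
    """Give every member of a study group the best score any member
    earned on a group-completed assignment.

    scores: {student_id: score or None if not yet recorded}
    groups: list of rosters, each a list of student_ids
    Returns a new {student_id: score} dict; ungrouped students keep
    their original scores.
    """
    result = dict(scores)
    for members in groups:
        earned = [scores[m] for m in members
                  if scores.get(m) is not None]
        if earned:
            best = max(earned)
            for m in members:
                result[m] = best
    return result
```

Automating this step end to end (export from EDU, propagate, import into the gradebook) is what would replace the 90 minutes of manual file transfer and editing.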

Have students been directly involved in the re-design process?

Both student focus groups and anonymous online surveys were used for formative evaluation of the course redesign. The results from the two types of evaluations were very similar in their conclusions and recommendations for pilot course design improvements. Changes were made in the pilot course design mid-semester based upon student input gathered during the evaluation process. Formative evaluation will be included in the final course design as well using the anonymous online survey tool in WebCT.

  • Focus Groups

Two members of the redesign team planned and conducted focus groups during the eighth week of the semester. All students enrolled in the pilot sections were invited via email to attend a focus group; seven participated.

Students reported the following as positives: EDU and the option of taking tests multiple times; flexibility and convenience of learning online; one-on-one assistance from the teaching assistant; ability to work ahead in the course; and clearly communicated expectations and deadlines.

Students reported the following challenges: high levels of general frustration with ALEKS; difficulty/frustration with learning four software applications (WebCT, EDU, ALEKS, Excel); difficulty in keeping up with assignments; feelings that they were putting more time into “learning the routine” and “memorizing the software steps” rather than “understanding the concepts”; desire for a more structured instructional environment, such as a required weekly meeting; feelings of isolation from other students; desire for a stronger thread between the tools used in the course; and a lack of awareness of WebCT links to video-clips, movies and software help.

  • Online Survey

To better understand students’ perceptions of course delivery methods, students were asked to participate in a survey; thirty-five students responded. Among the findings:

  • The majority (62%) rate the overall experience either good (31%) or average (31%).
  • The majority (60%) study five or more hours per week for the course.
  • Among the help resources available to students, other students outside class (66%) and classmates (57%) are the two most frequently used; online help pages in WebCT (40%), the teaching assistant (40%), and lab monitors (40%) are the next most frequently used; the instructor (29%) follows.
  • Among the course materials and study resources, the majority of the students surveyed indicate that the textbook is average (51%); ALEKS is not helpful (51%); EDU is rated between helpful (46%) and average (43%). However, the students surveyed have split opinions on the additional study materials in WebCT: 15 students (43%) indicate that they did not use them and 14 students (40%) think they are average.
  • Among the optional course support activities for learning course materials, the majority of the students surveyed indicate that they did not use the Math help room (74%) or take advantage of teaching assistant office hours (57%); the computer lab is rated between helpful (40%) and average (49%). However, the students surveyed are split in their views on recitation: 13 students (37%) indicate that they do not use recitation, 11 students (31%) think recitation is average, and 9 students (26%) think recitation is not helpful.
  • Regarding course communication tools, the majority of the students surveyed indicate that the email tool is helpful (57%); the bulletin board is rated between helpful (40%) and average (40%).
  • Most of the students surveyed believe that maintaining about the same level of interaction with the instructor (57%), teaching assistant (71%), and other students (60%) would help their success. Among those who indicate that more interaction would help, more students say additional interaction with the instructor (43%) and with peer students (34%) would be helpful than say the same of the teaching assistant (17%).

December 2002 Update: As in the previous semester, we conducted a student focus group and asked the students to fill out an anonymous online survey. Students reported the same things listed above as positives.

There were fewer complaints than last term. We got rid of ALEKS, and the study groups helped to alleviate the feelings of isolation. The highlights of the negative comments were:

  • Requests for a regular weekly lecture. The recorded mini-lecture and the meetings with the TA in the computer lab are appreciated, but the students would like even more classroom contact, especially early in the term. In particular, they would like to see an actual professor more often, rather than just the TA.
  • More explanation of the web site structure (which parts are important, which parts are optional).
  • More instruction on using Excel.
  • More frequent updates of the scores.
  • More convenient meeting times.

We will make some minor changes to the course organization based on these comments. Specifically,

  • The TA will still conduct the majority of the weekly meetings, but the course designer will also check in regularly to improve student contact. Regular full-length lectures would defeat the purpose of the course redesign, but we will try to incorporate brief mini-lectures into the weekly lab meeting times.
  • We will make sure to cover web site structure during the weekly meetings.
  • We will make sure to include more instruction on using Excel during the weekly meetings.
  • We plan to write a better script so we can do daily rather than weekly score updates.
  • For the meeting times, we have to take what the department gives us, which happens to be late-afternoon hours (3-4 pm or 4-5 pm). The computer lab is occupied at other times.

The online survey was set up a bit differently this term. There were three sections of questions.

  • Section 1 asked students to rate many aspects of the course on a scale from 1 (best) to 5 (worst); this section corresponds to the usual course evaluation survey done in all classes. Responses generally were between 2.0 and 2.3 (B to B-), which is above average for a freshman-level class.
  • Section 2 was mostly a write-out section asking students to report problems they experienced with WebCT, EDU or the study groups, and asking for suggestions. Most of the comments were positive, or left blank. WebCT received an overall score of 1.9 (B), EDU a score of 2.3 (B-). The negative comments were essentially the same as in the focus group, and mostly dealt with EDU problems.

Interestingly, even though only 20% of the students reported being an active participant in a study group, more than 50% thought that study groups were a good idea and should be continued in the present form (except for starting a bit later in the term, after students have had more of a chance to meet each other).

What kinds of support for your project have you been receiving from your department or the institution more broadly?

We got very good support from the Instructional Technology Center (WebCT setup, posting of grades, administering surveys, and recording sample lectures on video and making them available online). The College of Business helped by conducting student focus groups, providing some case studies, and recording a brief motivational lecture. The Statistics Department is doing the statistical evaluation. The Department of Mathematics will be helping by reducing the teaching load of the main course developer for fall term, since the course redesign has ended up taking much more time than expected.

We did not receive any support from the college or higher levels of the administration. Our most pressing need is another computer lab. We have requested this repeatedly, to no avail. We are unable to increase the number of students in the online sections due to lack of computer lab space.
