Implementation Issues

University of Dayton

Looking back on the course pilot itself, what worked best?

Aside from isolated server access problems, we did not experience any other failures, even minor ones. Students were able to access all of the course content, including multimedia materials. Lotus LearningSpace 4, our learning management system, performed admirably. Students participated in and seemed to enjoy the online collaborative assignments. One of our favorite features of this course is that it includes content developed by five separate faculty members. This course really introduces students to the department as well as to the field. There are many ways that this departmental collaboration might be expanded in the future.

December 2001 Update: Three things that worked well during the full implementation were (1) interactive online content, (2) collaborative assignments, and (3) development team collaboration. Students expressed a high level of satisfaction with the interactive (e.g., simulations, demonstrations, self-paced quizzes) and collaborative activities. Students did not seem as impressed with the streaming audio lectures and corresponding static lecture content. Students appeared to prefer reading the text of the audio lectures rather than listening to them. In hindsight, there are a number of good reasons for this. First, the audio lectures were recorded by faculty members reading from a script. As such, they do not capture the natural paralinguistic features of the instructor’s speech. We are considering re-recording the lectures using a more natural, engaging speaking voice. Second, numerous students reported that it was difficult to attend to the audio lectures when studying in the library or in their rooms. We did not mention to students that they might use headphones to listen to the lectures, and we do not know how many did so. However, some students reported being distracted while listening to the lectures and found it difficult to listen for long periods. Finally, the availability of the scripts and corresponding slides as PDF files made it very convenient for students to simply print, highlight, and annotate the content. This may have led students to conclude that the audio lectures were superfluous.

Although a bit chaotic at times, the collaborative assignments worked well and were generally appreciated by the students. One major difference between the pilot and full implementation was that we placed more weight on the collaborative assignments. During fall 2001, the collaborative assignments were worth 20% of the students’ grade, equal to one exam. Each assignment lasted three weeks, with teams of approximately 25 students each discussing questions about a journal article and then generating a group response. This response was then graded. Students received credit for their participation at every stage of the activity.

What worked least well?

We had expected that students would access the online course materials on a somewhat regular basis. Our analysis of the server log files showed that access peaked just prior to exams and that procrastination may have been a problem for students in both the traditional and online versions of the course. The consequences of procrastination may be more severe for online students than for traditional students, given that online students are not forcibly exposed to the material during class. We will take steps to alleviate this problem in the fall by sending students a weekly newsletter that will include a recommended study schedule for that week.
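For context, the log analysis behind this finding amounts to little more than tallying requests per day and looking for spikes. The sketch below illustrates the general idea in Python; the file name "access.log" and the Common Log Format timestamp are assumptions made for illustration, not details of our actual server configuration.

    # Illustrative sketch only: tally requests per day from a web server access log.
    # The log path and timestamp format are assumptions, not the project's actual setup.
    import re
    from collections import Counter

    timestamp = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")  # e.g., [15/Oct/2001:09:30:00 -0500]
    daily_hits = Counter()

    with open("access.log") as log:
        for line in log:
            match = timestamp.search(line)
            if match:
                daily_hits[match.group(1)] += 1  # count one request toward that date

    # List dates from busiest to quietest to spot peaks (e.g., the days before each exam).
    for date, hits in daily_hits.most_common():
        print(date, hits)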

Although the proportion of students having difficulty has been relatively small (about 15% at the most), faculty have spent more time resolving technical issues than expected. Even with a Help Desk in place on campus, students have been contacting faculty directly when they have trouble.

What are the biggest challenges you face in moving from the course pilot to the project's next phase?

Our general goals for the redesign project have not changed. We continue to strive toward a cost-effective, redesigned course that retains all of the positive attributes of the traditional course and that leverages technology to facilitate learning outcomes in ways that were previously infeasible. The scope of the course and course activities will also remain the same. Our positive experience with the collaborative activities in the pilot study convinced us to retain them in the next phase. We will deploy the course to all sections of Introductory Psychology during the 2001-2002 academic year.

During summer 2001, we delivered the online course to a smaller group of students as a "Summer Study at Home" course and experimented with online assessment, with students taking all four midterm examinations online from a distance. At the end of the summer, the students came to campus for a traditional, comprehensive, final exam.

As we moved to full implementation, our biggest fear concerned student attitudes and reactions. Going into the summer, we were trying to determine how we would respond to an enrollment drop in the fall. We had even prepared the department chair to expect a large enrollment hit. In fact, this enrollment problem did not materialize. We actually had to increase the number of seats in the course to accommodate all of the interested students.

We currently have approximately 630 undergraduate students registered for the course in fall 2001. Students are learning about the course via Dayton’s Virtual Orientation application, a Web-based community where students meet future roommates and interact with instructors during the summer before they arrive on campus. Incoming students are able to access a demo version of the course and informational materials.

Students come to class for an initial orientation and attend four exam sessions in person during the semester. Based on feedback from last winter's pilot, a number of changes were made for this semester: a weekly newsletter designed to guide students through the online material and provide a sense of connection to the instructor; more interactive “Test Your Knowledge” exercises; greater use of synchronous online events (e.g., chat rooms) for facilitating conversations with the instructor and other content experts; and undergraduate mentors (i.e., upper level majors) who will be supervising the group writing assignments that will begin after the first exam.

The new course includes a collaborative learning project, in which teams of 10 to 12 students meet over a three-week period using an online collaborative workspace application called Lotus QuickPlace. The quantity and quality of participation have been excellent, and the online team projects have proven to be very manageable. Students have taken good advantage of the online office hours and study sessions, using Live Sessions, a real-time collaboration feature in Lotus LearningSpace 4.

Our greatest challenge involves institutional support. Some administrators view this redesign project as a grand experiment or test case. In fact, the project has exposed a number of issues that will need to be addressed soon, regardless of the success of our redesign. Our intellectual property policy needs to be revised to cover the development of online courses. The university needs to develop and communicate to parents and students a coherent and compelling description of our e-learning initiatives that addresses common misconceptions and concerns (e.g., that the university is turning into a “distance learning” campus). Far from being an insulated and isolated project, this redesign is simply the first of many such efforts. The more that the university can do now to learn from and address the larger support and public relations issues raised by this project, the easier it will be for future redesign teams.

December 2001 Update: Thankfully, our greatest fear going into the full implementation, a student mutiny, failed to materialize. Although the fall 2001 students were not extremely enthusiastic about the course, they were also not extremely negative. The enrollment drop we had feared, and had even prepared the department chair to expect, likewise never materialized.

After our experience with the full implementation, our general goals for the redesign project have not changed. We continue to strive toward a cost-effective, redesigned course that retains all of the positive attributes of the traditional course and that leverages technology to facilitate learning outcomes in ways that were previously infeasible. The scope of the course and course activities also remains the same. One of the major lessons learned from the winter 2001 pilot was that the students in the online section tended to procrastinate more than students in the traditional, face-to-face section. Our solution to this challenge, implemented during the fall 2001 semester, involved sending a weekly newsletter to all students and enhancing the syllabus to clarify the recommended study schedule. Although we are still analyzing the data from fall 2001, these measures may not have been enough to reduce procrastination. Here is a typical, procrastination-related comment from the student evaluation:

I was nervous about taking an on-line course, however it fit into my schedule very nicely. It was difficult to stay on task because of all the other work I had to do, and then I'd remember OH YEAH-PSYCH! I worked hard when I did it, but keeping on track was very difficult for me.

We have already made changes to the course to further increase our students’ awareness of course requirements. We have enhanced the recommended study schedule to include day-by-day recommendations regarding online and text content that should be covered. In addition, we have asked the undergraduate mentors to increase their level of contact with students and to respond to students’ general questions. Hopefully, this will increase students’ engagement and awareness.

Two other issues that have emerged in our discussions following the fall 2001 full implementation are (1) the wisdom of asking incoming, first-year students to participate in a fully distributed, online course and (2) the consequences of effectively forcing all Introductory Psychology students to participate in the online course. During fall 2001, students could not opt into a traditional section of the course. Numerous students complained on the student evaluation at the end of the semester, arguing that students should be given a choice. In addition, several student advisors argued that a completely distributed, online course was inappropriate for incoming students. These are legitimate, important arguments that will need to be addressed if this application is to move past the grant period.
