Implementation Issues

Brigham Young University

Looking back on the course pilot itself, what worked best?

From our initial analysis, we feel that the level and quality of student and instructor interaction was quite high. We are also pleased with the learning outcomes from our portfolio analysis. We were surprised by the extent to which instructors in the “traditional” sections appropriated the assignments and methods from the online section and feel that a hybrid model is naturally emerging in the course.

  • Student/Instructor Interaction

Students were generally very satisfied with the level and quality of student/instructor interaction. When asked whether they would like more instructor interaction, online students averaged 4.59 on a 7-point scale, versus 4.20 in the traditional course. These near-neutral responses, and the similarity between the two versions of the course, suggest that students are satisfied with their instructor interaction.

When asked whether they would like more peer interaction, online students averaged 3.85, versus 4.00 in the traditional course. However, online students made it clear that instructor conferences were a must, and said that a purely online distance version of the course would be very difficult. They were pleased with how accessible the instructors were: instructors allowed students to call them at home and responded to their emails promptly. Students in both versions of the course rated instructor feedback highly: 6.12 in the traditional version and 6.03 in the online version.

Despite the lack of regular face-to-face interaction, every online student who participated in the focus group felt that their instructor truly cared for them as individuals and that the instructors were genuinely concerned with their progress in the course.

  • Student Writing

Student writing in the online course was found to be significantly better. In particular, students improved in two areas: (1) introductions and conclusions and (2) focus and organization. This makes intuitive sense, given that these are the areas of writing most easily taught online. Overall, the online students seemed satisfied with their experience. Despite technical problems and a certain amount of isolation, they were generally pleased with their participation and performance in the redesigned course.

  • Savings in Instructor Time

We were also pleased to find a 25% savings in overall instructional time. In our Fall 2002 implementation, we plan to increase the number of students in 10 online sections and keep the number of students constant in 10 other online sections so that we can study the effects of class size on instructor time. We are also taking measures to improve training and increase the efficiency of grading.

December 2002: Considering the scale of the implementation for fall 2002, we were pleased with how well everything went. With 30 sections all taught by inexperienced instructors, we were bracing ourselves for a possible disaster. But based on the data we have as of December 2002, some elements of the course continue to work well in a large-scale implementation.

Students continue to like the flexibility the online course provides. They also enjoyed the small class size. A few students felt that they had more one-on-one instructor interaction, and that the online course was easier than the traditional one.

Although instructors generally thought that the class should meet more frequently (two hours a week rather than one), they considered some elements of the online course to be effective. Because they were limited to one hour a week, instructors valued that hour and felt that in-class activities were more effective. Many instructors said peer review was very valuable and used much of their class time having students work together in group activities. This use of class time reflected the training they had received at the beginning of the semester. Instructors generally enjoyed the one-on-one interaction they had with their students in student conferences, and most found the discussion board in Blackboard to be a valuable addition to the course.

What worked least well?

  • Instructor Training

We were surprised by the extent to which instructors needed training for this course. The instructors were put into the course with little training on how to use Blackboard or how to use the course material effectively. In our interviews, instructors mentioned that they spent most of their time administering the course rather than teaching their students. This is mostly attributable to the fact that instructors were solving technical problems, learning to use Blackboard, and learning to implement the course.

It is clear from our evaluation that instructors need training and direction on how to use the course delivery system and how to implement the courseware effectively. For example, instructors need to know how to use classroom time effectively, how to use communication technology effectively, and how to structure student learning to encourage a more complete usage of the courseware.

Based on these findings, we have developed instructor training designed to help instructors more effectively implement the course and to teach them to use communication features in the Blackboard course delivery system. We hope this training will increase time-savings and efficiency of the instructors and help increase the effectiveness of the redesigned course.

  • Technology Problems

Many of the students and most of the instructors are unhappy that they must go to two different Web sites to complete the course. For the first three weeks of the fall 2001 semester, students accessed the course online through BYU’s Route Y server. When the server proved to be less than reliable, the instructors decided to run the course through Blackboard.com. Students must now use two sites to access the course: one site contains the course materials and lessons, the other the communication tools and quizzes. Students feel that the course materials and assessments should be on the same site. Broken links, typos, and video problems were also noted, as well as the ever-pressing problem of server downtime.

When asked whether they felt that their performance in the course had been harmed by the problems, nearly all said no. Overall, the focus group students seemed to be satisfied with the functionality of the technology. There was really no significant difference in student perception of technology in the two versions of the course.

December 2002 Update: After the winter 2002 implementation, we discovered that instructors were not using the course materials we had developed as much as we had hoped. To a certain extent, this can be attributed to design flaws in the materials themselves that were not apparent until instructors had taught with them for a full semester. A few objects did not work as designed, students complained about having to read materials from the screen, and there were some inconsistencies between the text materials and the online materials.

We also found that instructors may not have used these materials as much because they were not familiar with them. These instructors had been trained to teach the traditional version of the course and were then asked to teach the online version the next semester with only a brief orientation to the online materials. (Time and resources just weren’t available for a more extensive training.)

In preparation for the fall 2002 implementation, we built the training of new instructors around the course modules and Blackboard. In addition, in our weekly in-service meetings, we gave instructors step-by-step training on how to use the course modules. Unfortunately, despite this extra emphasis on training, we discovered at semester’s end that instructors were still not using the course materials as much as we had hoped. To be more precise, we found that instructors were picking and choosing among the online materials we had provided. A few students weren’t even aware of the course modules, and many of the rest reported very limited experience with specific tools, such as the thesis machine and the annotation exercises. Only one student said she used the online course rather than the book.

We attribute some of this to a few lingering problems with the modules themselves, to inexperience, and to lingering unfamiliarity with what is available. But we attribute some of it to the natural desire of graduate instructors in English to assert their autonomy as developing teachers.

Graduate instructors in English tend to be very pragmatic, and they will use materials that they deem useful. Although these course modules were developed with a great deal of input from graduate instructors, the instructors who worked on the modules have since graduated, and these new instructors don’t feel the same sense of ownership in the online course materials. As we prepare for our training next year, we need to focus even more than we anticipated on leading new instructors through the materials in the course modules as well as involving them in the development of new instructional materials so that they feel a sense of ownership. But given the high rate of turnover in our graduate program (with nearly half of our instructors being new each year), familiarizing new instructors with the course will continue to be an issue in implementation.

We have come to accept that the training of instructors is a much bigger contributor to the success of this project than the design itself. The course redesign presents inexperienced instructors with a new paradigm for teaching, and where that paradigm is poorly understood and accepted, it will be poorly implemented. Improving the quality and consistency of our training is a primary objective. In addition, we need to be sure that the course materials are used as intended. In preparation for fall 2003, we will provide new instructors with a list of the specific activities that they should have their students do on a weekly basis.

What are the biggest challenges you face in moving from the course pilot to the project's next phase?

Our preliminary study of instructor time revealed that any significant cost savings would have to come from reducing the amount of time instructors spend evaluating essays. Evaluation is a significant part of instructor time in a writing class, much greater than any savings that could come from reducing preparation time. The amount of time instructors spend grading varies considerably between the traditional class and the redesigned course. We hope to correlate the time instructors spend grading with the scores students received on portfolios to find out whether greater time spent grading leads to improvement in student writing. Based on previous research conducted in this area, we predict that more time spent grading will show no improvement in student writing. For the next semester, we will approach time spent grading as a training issue, helping our instructors learn more efficient and effective ways of grading. We hope that this training will reduce the amount of time spent grading in both the traditional and redesigned courses and will give us a better idea of what an appropriate amount of time would be.

Our biggest challenge in moving to the next phase of our project will likely be faculty development and support. In the fall, we are experimenting with having all of our new instructors teach the redesigned course (25 sections) to see whether more focused training can take full advantage of the efficiencies of the redesign. We will need to teach these instructors how to manage a course very different from any they have ever taken, and how to interact with students face to face and through technology rather than only in the traditional classroom space. Some instructors will need to be trained in how to use the technology, although we considered familiarity with technology as part of the hiring process. We will also need to find ways to teach them how to grade more efficiently and effectively.

In our next phase, we are also testing some of the physical limits of the redesign by expanding the number of sections. We are concerned about having enough office space for our instructors to meet regularly with students, and we know that the Writing Center doesn’t have adequate resources to meet four times with every student from 25 sections. Our new instructors and students have shown a willingness to try this experiment, but we have been getting some negative reactions from parents, who perceive the redesigned course as inferior to the traditional class because of its technological component.

December 2002 Update: As the course redesign becomes a more familiar part of our program, we expect that some of the problems we have had with implementation will disappear. We are in a transitional phase at the moment, and the success of the project relies quite a lot on the hard work and good will of the instructors in our program. Despite some of the problems, our instructors have worked very hard to make the project succeed. They have not in any way tried to circumvent our program, but we have found that they have understood our training in different ways (due in part to their inexperience and in part to the complexity of teaching writing). The problem of instructor turnover and the need to continually train new instructors will persist, so we need a model of continual improvement in our training of new instructors, based on our assessment of implementation each fall semester.
