Course Development Issues
Florida Gulf Coast University
To what extent have you been able to use previously developed materials in your re-design instead of creating new materials?
In our original plan, we intended to use a great deal of previously developed material: student essays for programming the Intelligent Essay Assessor and for providing students with sample essays; a test bank developed by the publisher; and lectures developed by full- and part-time faculty. We have been fortunate that most of this material has worked extremely well, with the exception of the test bank that we received from the publisher. Unfortunately, many of the questions in the test bank were vague, unclear, or incorrect. In addition, the questions were not worded in a way that required a high level of analysis.
What kinds of activities took up the most time in your course development effort?
Creating an adequate test bank has consumed most of our time in the redesign effort. In the course design, we included a course coordinator position to oversee the mechanics of the class; this person has spent a great deal of time developing the test bank, so oversight of the course has devolved to the project coordinator. Fortunately, the pilot of the redesigned course has gone very smoothly, and we limited the number of students enrolled in the course.
Three other development issues have also arisen; their resolution will have a major impact on this course and on other courses that use similar technology.
December 2002 Update: In the full implementation of the course, we have focused our development issues on making the online materials – which are extensive – as user friendly as possible. We have tracked student use of these materials in order to ascertain where they got lost or confused; further refinements are happening for the spring 2003 offering of the course. We have not worked to integrate Banner or the IEA with WebCT, primarily because we have put so much effort into the design of the course. Once we are certain about the effectiveness of all of the materials and the various technologies, we will work towards integration.
We have worked closely with the staff at Knowledge Analysis Technologies to increase the inter-rater reliability of the Intelligent Essay Assessor. The original reliability of the IEA in the pilot, based on the Category IEA score, was 60%; this reliability increased to 72% when we used the Continuous IEA score and shifted the assigned values. In the full implementation, 62% of the 208 essays read by humans in the holistic scoring session were assigned the same score on the first two reads (thus, 38% of the essays had to be read a third or fourth time to reach a consensus score). When these 208 essays were used to program the IEA, 70% of the essays scored by the IEA received the same score as in the holistic scoring session. We have not yet applied the Continuous IEA score and shifted values to see whether a higher inter-rater reliability could be achieved. We will be analyzing these data further in the spring and programming the IEA for two new responses in the spring 2003 semester.
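The reliability figures above are simple percent-agreement statistics: the fraction of essays on which two raters (human-human, or human-IEA) assigned the same score. A minimal sketch of that calculation, with hypothetical reader scores (not the project's actual data or code):

```python
# Illustrative sketch: simple percent agreement between two raters,
# the statistic reported for the holistic scoring session and the IEA.
def percent_agreement(scores_a, scores_b):
    """Fraction of items on which two raters assign the same score."""
    if len(scores_a) != len(scores_b):
        raise ValueError("rater score lists must be the same length")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical holistic scores from two human readers:
reader1 = [4, 3, 5, 2, 4, 6, 3, 4]
reader2 = [4, 3, 4, 2, 4, 6, 3, 5]
print(round(percent_agreement(reader1, reader2) * 100))  # prints 75
```

Note that percent agreement does not correct for chance agreement; measures such as Cohen's kappa do, which is one reason the raw figures above should be read cautiously.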
Have students been directly involved in the re-design process?
Both students and faculty have been involved in the development of the course. We have hired two students as preceptors, and they have provided invaluable help and insight in the creation and refinement of the course.
What kinds of support for your project have you been receiving from your department or the institution more broadly?
The faculty in the Division of Humanities and Arts have been supportive of the new course format and have been involved in the development of the various modules in the course. The most significant contribution comes from the Office of Course and Faculty Development, whose staff has been instrumental in the development and creation of the course. Indeed, the Director of Course and Faculty Development has served as a co-project leader.