Course Development Issues

Florida Gulf Coast University

To what extent have you been able to use previously developed materials in your re-design instead of creating new materials?

In our original plan, we intended to use a great deal of previously developed material – student essays for programming the Intelligent Essay Assessor and for providing students with sample essays; a test bank developed by the publisher; and lectures developed by full- and part-time faculty. We have been fortunate that most of this material has worked extremely well, with the exception of the test bank we received from the publisher. Unfortunately, many of its questions were vague, unclear, or simply incorrect, and few were worded in a way that required a high level of analysis.

What kinds of activities took up the most time in your course development effort?

Creating an adequate test bank has consumed most of our time in the redesign effort. The course design included a course coordinator position to oversee the mechanics of the class; this person has spent so much time developing the test bank that oversight of the course has devolved to the project coordinator. Fortunately, the pilot of the redesigned course has gone very smoothly, in part because we limited the number of students enrolled.

Three other development issues have also arisen which, when resolved, will have a major impact on this course and on other courses that use similar technology.

  • First, our student registration system (Banner) is not tied to our course platform system (WebCT), which means that students must register for the course twice – first in Banner and then in WebCT. For the 2002-2003 academic year, we will create a patch that resolves this issue for the class.
  • Second, our essay grading software, Intelligent Essay Assessor, is not integrated with WebCT, so we have had to collect student essays, format them, and then send them on to IEA. What we envision is a seamless integration between WebCT and IEA that automatically scores each student's essay and posts the grade in WebCT.
  • Third, we need to increase the inter-rater reliability, which is currently at an acceptable level (72%) but could be higher. To program the IEA, we had to input a digitized version of the text along with 100 to 200 scored essays; we were able to include only 120 scored essays because that is what we had accumulated from previous semesters. We have developed a total of 12 prompts (6 in the visual arts and 6 in the performing arts) and use 2 per semester (1 visual and 1 performing). As each semester unfolds and a new prompt is used, we will capture 100 essays, add them to the 100 essays we already have, and score all 200 in order to program the software. This should increase the inter-rater reliability to 80% or higher.
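The reliability figures above are exact-agreement rates: the fraction of essays to which the human raters and the IEA assigned the same score. As a rough illustration of the calculation (not the IEA's actual method), this can be sketched as follows; the function name and all sample scores are hypothetical.

```python
# Hypothetical sketch: exact-agreement inter-rater reliability between
# human consensus scores and machine-assigned scores. The sample data
# below is invented for illustration; it is not FGCU's actual data.

def exact_agreement(human_scores, machine_scores):
    """Fraction of essays where both raters assigned the same score."""
    if len(human_scores) != len(machine_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(h == m for h, m in zip(human_scores, machine_scores))
    return matches / len(human_scores)

# Example: 18 of 25 hypothetical essays scored identically -> 72%
human   = [3, 4, 2, 5, 3, 4, 4, 2, 3, 5, 1, 4, 3, 2, 5, 4, 3, 3, 2, 4, 5, 3, 4, 2, 3]
machine = [2, 4, 2, 5, 3, 3, 4, 2, 3, 4, 1, 4, 2, 2, 5, 4, 3, 3, 1, 4, 5, 2, 4, 1, 3]
print(f"exact agreement: {exact_agreement(human, machine):.0%}")  # prints "exact agreement: 72%"
```

A simple percent-agreement measure like this overstates reliability somewhat, since raters will agree by chance part of the time; chance-corrected statistics such as Cohen's kappa are often reported alongside it.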

December 2002 Update: In the full implementation of the course, we have focused our development issues on making the online materials – which are extensive – as user friendly as possible. We have tracked student use of these materials in order to ascertain where they got lost or confused; further refinements are happening for the spring 2003 offering of the course. We have not worked to integrate Banner or the IEA with WebCT, primarily because we have put so much effort into the design of the course. Once we are certain about the effectiveness of all of the materials and the various technologies, we will work towards integration.

We have worked closely with the staff at Knowledge Analysis Technologies to increase the inter-rater reliability of the Intelligent Essay Assessor. The original reliability of the IEA in the pilot, based on the Category IEA score, was 60%; this increased to 72% when we used the Continuous IEA score and shifted the assigned values. In the full implementation, 62% of the 208 essays read by humans in the holistic scoring session were assigned the same score on the first two reads (thus, 38% of the essays had to be read a third or fourth time to reach a consensus score). When these 208 essays were used to program the IEA, 70% of the essays scored by the IEA received the same score as in the holistic scoring session. We have not yet applied the Continuous IEA score and shifted values to see whether a higher inter-rater reliability can be achieved. We will analyze this data further in the spring and program the IEA for two new responses in the spring 2003 semester.
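The two-read consensus process described above can be sketched as follows. This is a hypothetical illustration of the general holistic-scoring practice (two reads; additional reads until two raters agree), not FGCU's documented procedure, and the function and sample reads are invented.

```python
# Hypothetical sketch of a holistic consensus-scoring pass: each essay is
# read twice; if the two reads agree, that score is the consensus.
# Otherwise, further reads are taken until some two raters agree.
# All example data is invented for illustration.

def consensus_score(reads):
    """Return (consensus, reads_used): the first score given by two raters."""
    seen = []
    for count, score in enumerate(reads, start=1):
        if score in seen:
            return score, count
        seen.append(score)
    raise ValueError("no two reads agreed; escalate to a table leader")

print(consensus_score([4, 4]))     # first two reads agree -> (4, 2)
print(consensus_score([3, 4, 3]))  # third read settles it  -> (3, 3)
```

Under this rule, the figures reported above mean that 62% of essays resolved with `reads_used == 2`, while the remaining 38% required a third or fourth read.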

Have students been directly involved in the re-design process?

Both students and faculty have been involved in the development of the course. We have hired two students as preceptors, and they have provided invaluable help and insight in the creation and refinement of the course.

What kinds of support for your project have you been receiving from your department or the institution more broadly?

The faculty in the Division of Humanities and Arts has also been supportive of the new course format and has been involved in the development of the various modules in the course. The most significant contribution comes from the Office of Course and Faculty Development; the staff has been instrumental in the development and creation of the course. Indeed, the Director of Course and Faculty Development has been active as a co-project leader.
