Implementation Issues

Penn State University

Looking back on the course pilot itself, what worked best?

  • Readiness Assessment Testing

Assessing student understanding of concepts with RATs has proven very effective: it detects areas in which students are not grasping the concepts, enables corrective action to be taken in a timely manner, and prepares students for higher-level activities in the computer labs than were previously possible. As a result, students have been helped in building skills, as the pre- and post-test evidence shows. The Web enables more rapid feedback to students, another crucial element in the learning process.

Student perception of the importance of RATs is evident in the Innovation and Quality (IQ) data, where the majority of students (55%) rated the RATs as the most important aspect of the class. Seventy-five percent of respondents believed that periodic RATs helped them keep up with the readings and were vital for their learning and understanding of the content. As voiced in focus groups, students felt that the RATs helped by exposing holes in their understanding. In addition, students liked the opportunity to work in groups and interact with others in the class. Most students suggested keeping Readiness Assessment Testing for the following semesters.


  • Individual and group collaborative activities

We were pleasantly surprised by the students' reaction to not being lectured to and instead being able to work in groups in the labs, applying what they had learned from the resource materials (we also used a portion of the printed draft of the text being written by the two authors cited earlier).

What worked least well?


  • Web-based courseware Cyberstats

Initially, we anticipated using the Web-based courseware Cyberstats as the backbone of the course. However, the quality of Cyberstats turned out to be highly variable, ranging from a few excellent units to several we regarded as unsatisfactory (from both the faculty members' experience and from the students' perspective, as voiced in focus groups). Many students wanted a hard copy to refer to, and the cost in terms of time and money to print materials was troublesome to them.

We felt that only about half of the units in Cyberstats were acceptable at the beginning of the fall semester. As a result, instead of utilizing 30-32 units in Cyberstats, we now expect to use at most 5 or 6. Coincidentally, two of the authors (one at Penn State and the other at one of the other universities) are writing a text that reflects our philosophy and goals--these two authors were also the best Cyberstats authors. Fortunately, we were able to combine their Cyberstats units with their printed text to constitute the resources needed by the students. We have been in close contact with the developers and anticipate that these materials will be more useful later (certainly by Fall 2001). We also intend to provide greater motivation for using Cyberstats in the spring semester and to link it with the print-based text material.

  • Getting students to understand the course structure and reasoning behind it, especially the linkage of activities with RATs.

Although we thought that the RATs were closely linked to lab activities, students apparently did not perceive this to be the case. The majority of focus group participants failed to see the connection between what was tested on the Readiness Assessment Tests and what was done in the lab the following day. Results from the Innovation and Quality (IQ) teams support this finding: 64% of students felt that lab activities did not help reinforce the weekly readings, and another 64% failed to see the purpose of some lab activities. Students suggested including a mini-lecture to point out this relationship or an outline that would make it more obvious. This experience has been valuable in redoubling our efforts to ensure that the linkage is clear and effectively implemented.

What are the biggest challenges you face in moving from the course pilot to the project's next phase?

Our primary concerns are technological:

  • Completion of the Web site in time for full usability

There are two primary components of the Web site: the student side and the instructor side. The student side is in very good shape, but the instructor interface has several unresolved issues, both minor and major, that need to be worked out before we will be ready.

In the summer of 2001, administration of the course Web site will move from the Penn State Center for Academic Computing to the Department of Statistics. To facilitate this, a new version of the course Web site with improved administrative capabilities will be implemented. The revised version will include utility forms that make it possible for an instructor or graduate student with good Internet skills to administer the site on a day-to-day basis. Previously, only an SQL programmer could perform many of the administrative duties, such as adding and dropping TAs. In addition, the new site will be flexible enough to accommodate scheduling or other logistical changes that may occur during the semester.

The technology team also hopes to focus on training instructors so that they will be able to update online material independently. Since many faculty already know and use the Macromedia Dreamweaver software package to compose pages in HTML, it may become the platform of choice for updating and creating HTML pages.

  • Scaling TESTPILOT

We have had three tests of TESTPILOT with no problems, but none on the scale we will face when the restructured course is fully implemented. We have no specific reason for concern, but the number of students involved will be many times greater than in our testing, and we hope no problems will arise.

  • Collecting grades and maintaining a grade book

We will be collecting the following grades: (a) one mid-term and the final exam; (b) individual and group RAT scores; (c) short quizzes/attendance checks using TESTPILOT; and (d) project scores. These scores will be collected in different ways, and we do not yet have a convenient grade book that handles all four in a simple manner. Some manual entry will initially be required, but eventually we expect to have an appropriate piece of software that meets our needs.
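To illustrate the kind of consolidation such software would need to perform, here is a minimal sketch of merging the four separately collected score sources into one record per student. The function name, column labels, and sample student IDs are hypothetical, assuming each source arrives as a mapping from student ID to a numeric score:

```python
# Hypothetical sketch: consolidating four separately collected score
# sources (exams, RATs, TESTPILOT quizzes, projects) into one grade
# book keyed by student ID. Names and structure are illustrative
# assumptions, not the course's actual system.

def merge_gradebook(exams, rats, quizzes, projects):
    """Combine per-source score dicts into one record per student.

    Each argument maps student_id -> numeric score. A student missing
    from a source is recorded as None, so gaps needing manual entry
    are easy to spot.
    """
    all_ids = set(exams) | set(rats) | set(quizzes) | set(projects)
    book = {}
    for sid in sorted(all_ids):
        book[sid] = {
            "exams": exams.get(sid),
            "rats": rats.get(sid),
            "quizzes": quizzes.get(sid),
            "projects": projects.get(sid),
        }
    return book

if __name__ == "__main__":
    book = merge_gradebook(
        exams={"s001": 88, "s002": 76},
        rats={"s001": 92, "s002": 85, "s003": 79},
        quizzes={"s001": 100},
        projects={"s002": 90},
    )
    for sid, record in book.items():
        print(sid, record)
```

Recording missing entries explicitly as None, rather than dropping them, mirrors the expectation above that some manual entry will be required at first: the gaps surface immediately rather than silently disappearing.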

  • Some procedures for managing the group component of RATs need to be resolved.

We have administered RATs in a room with 243 seats and 240 students, working in 60 groups of four. We have purchased a machine for scoring the group RATs. The process of retrieving the Scantron sheets from the students, running them through the machine, posting the group grades, and then returning the sheets has gone without any major problems. Students quickly learned the class routine, resulting in successful administration of the Readiness Assessment Tests. However, several issues of concern still need to be addressed, including probable student cheating on RATs as well as the possibility that some students are passing test questions on to successive sections. As a result of these concerns, we have decided to move the administration of RATs from the large lecture class to the smaller lab sections and to do them online using TESTPILOT.

Finally, from the focus group data, student attitudes and reactions are of concern, but most of the frustrations students voiced in the pilot are being addressed in the spring. Students found the TAs very helpful and felt that their presence was extremely valuable and beneficial. However, through a mix-up and/or misunderstanding, we failed to set up tutorial sessions led by TAs to give students additional assistance in learning the concepts. This is an area of great concern to us, and we will implement the tutorial sessions in the spring 2001 semester. Students also expressed a need for more lecture time (at the expense of lab time, not as added course time) as a way of promoting understanding and mastery of the material read. This presents a problem in that it countermands a major thrust of the redesign. Both faculty and students feel that a one-hour question-and-answer period before taking the RATs would be highly beneficial.


