Impact on Students
Carnegie Mellon University
In the redesign, did students learn more, less, or the same compared to the traditional format?
StatTutor modules targeted labs that taught statistical concepts known to be difficult for introductory students: correlation, boxplots, scatterplots, contingency tables, the chi-squared statistic, and t-tests.
The table below compares student performance on a variety of final exam questions testing those concepts. Redesign students showed a notable improvement in the percentage correct. Note that the questions in the bottom two rows of the table – asking students to choose an appropriate statistical test when the correct answer was either chi-square or t-test – were never posed to students before the redesign because they were deemed too difficult.
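The chi-squared statistic mentioned above is one of the concepts students found hardest. As an illustrative sketch (not taken from the course materials), the statistic for a contingency table can be computed directly from observed and expected counts; the `chi_squared` helper and the example counts below are hypothetical.

```python
# Illustrative sketch: computing the chi-squared statistic for a
# contingency table of observed counts (hypothetical data).
def chi_squared(table):
    """table: list of rows of observed counts; returns the chi-squared statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns.
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Example 2x2 table: does the outcome depend on group membership?
observed = [[30, 10],
            [20, 40]]
print(round(chi_squared(observed), 3))  # → 16.667
```

A large statistic relative to the chi-squared distribution's critical value suggests the row and column variables are not independent; choosing this test over a t-test is exactly the kind of decision the hardest exam questions probed.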
A more open-ended assessment asked students to solve several data-analysis problems without the instructions or support typically given in the lab sessions. (That is, neither group had access to StatTutor or paper instructions when completing the assessment.) Redesign students made fewer than one error per problem, whereas traditional students made approximately six errors per problem. Moreover, the traditional students' errors clustered precisely in the difficult skills of planning/selecting appropriate analyses and evaluating the validity of statistical inferences.
Another assessment presented students with a variety of problem descriptions (in this case, data-analysis questions, each with a description of the accompanying data) that shared various features, some superficial and some conceptual. Students were asked to sort the problems into groups in whatever fashion they deemed relevant, and the groupings were then evaluated for the degree to which they relied on superficial versus conceptual features. Both groups of students decreased their tendency to sort by superficial features, but redesign students showed a larger shift toward conceptually based groupings.
This course has not had a high failure rate. Indeed, the course's grade distribution was and continues to be centered on the letter grade B, with fewer than 5% of students failing. Thus, the project goal was not to decrease failures but rather to increase students' understanding and skill level at a given final course grade.
Other Impacts on Students
Student choice of downstream courses and majors. In tracking students' downstream course selections and choices of major, the team observed a sharply increasing number of statistics majors coincident with the project redesign.
Students' interaction with TAs on more meaningful statistical issues. TAs were asked to categorize student questions in labs: Was it a question about the statistical software? Did it involve interpreting a statistical display? If it was a conceptual question, did it concern a deep/difficult concept or an easy one? In the redesigned lab, students asked TAs half as many questions, and the mix of questions shifted: questions about the statistical software package fell from 23% in the traditional lab to 7% in the redesign, and questions about easy concepts fell from 20% to 7%, while questions about interpreting statistical analyses rose from 34% to 43% and questions about deep concepts rose from 11% to 25%.