State University of New York: SUNY Course Redesign Initiative

Buffalo State College

Course Title: The Economic System
Redesign Coordinator: Karen O'Quinn

Project Abstract

Buffalo State College plans to redesign its introductory economics course for non-majors, The Economic System. During the 2001-2006 period, the course enrolled a total of 4,041 students, making it the ninth highest-enrolled course at the college. Structured in a traditional lecture format, the course enrolls ~500 students annually and typically has a DFW (D, F, or withdrawal) rate of 15%.

The traditional course places an increasing strain on the Economics and Finance Department's resources. The department has grown considerably over the past five years, doubling its number of majors and adding a successful new graduate program. As a result, its ability to offer non-major courses and meet student demand is declining. The redesign will allow the department to serve more students with fewer faculty resources while maintaining high-quality instruction.

Using the Replacement Model, the economics course will be redesigned in two phases. Phase I will pilot a hybrid section that reduces formal class meetings by 50%, to one meeting each week. This meeting will combine lectures, question-and-answer sessions, previews of upcoming material and highlights of key concepts. The second weekly meeting will be replaced with online class discussions in groups of ~15, conducted by trained undergraduate learning assistants (ULAs). In Phase II, the course will be offered as one large, 240-student section, with additional online learning materials replacing 11 of the 15 face-to-face meetings. The four remaining class meetings will be spread over the semester: one at the beginning, two in the middle and one at the end.

When fully implemented, the redesigned course will move from a traditional lecture format to a learner-centered, active-learning model, enhancing course quality and improving student learning outcomes. Learning teams of ~15 students will work collaboratively on assignments. Group discussions and assignments will be actively monitored, and students will receive individualized assistance from the instructor and the ULAs both online and during office hours. Online quizzes will also provide immediate feedback to students.

The impact of the course redesign on student learning outcomes will be assessed using common examinations. In Phase I, performance data from parallel sections, one redesigned section (~120 students) and one traditional section (~120 students) will be compared. The Phase II full implementation assessment plan, using the same common examination, will compare the one large redesigned section (~240 students) to the Phase I sections.

The redesigned course will reduce the cost of instruction by reducing the number of sections from two per semester to one and increasing section size from ~120 to ~240 students. The number of full-time faculty will be decreased from two to one each semester. These actions will decrease the cost per student from $94 to $51, a savings of 46%. The growth of Buffalo State's graduate program has placed pressure on the department to add more graduate courses, and the redesign will allow the department to schedule an additional graduate section.

Final Report (as of 3/15/10)

Impact on Students

In the redesign, did students learn more, less or the same compared to the traditional format?

Improved Learning

Learning outcomes were assessed by comparing performance on common items on the final exam. The team's assessment plan called for comparing common final exam items between the traditional course taught in fall 2008 and the spring and fall 2009 redesigned sections. Due to an inadvertent record-keeping oversight, it was not possible to compare the specific 14 common items from fall 2008 with those from the two large 2009 redesigned sections. However, each student's percentage correct was available and could be compared even though the number of items differed.

In the fall 2008 pilot of the redesign, a traditional lecture section was compared with a partially redesigned section. Only 14 common items were available for comparison. Results for these 14 common items showed that students in the partially-redesigned course (M = 67.9, SD = 19.15, n = 138) had a significantly higher percentage correct on the common items than those in the traditional course (M = 57.7, SD = 16.1, n = 158), p < .001.

As the redesign progressed, however, 25 common items were available to compare the 2009 sections. In both spring and fall 2009, only one large redesigned section was offered each semester. In spring 2009, the redesign was not implemented as fully as originally planned.

In fall 2009, the redesign was well on the way to becoming a fully hybrid course. Results showed that students in the fall 2009 section (M = 66.28, SD = 18.36, n = 209) had significantly higher percentage scores on these 25 common final exam items than students in the spring 2009 section (M = 60.64, SD = 18.53, n = 157), p < .004. The internal consistency reliability (Cronbach's alpha) of the 25-common-item scale was good (.79 for spring 2009 and .82 for fall 2009).
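
For readers who wish to reproduce the reliability figure, Cronbach's alpha can be computed directly from the matrix of item scores. The following is a minimal sketch in Python, not the team's actual analysis; the file and variable names are hypothetical, and it assumes one row per student and one column (scored 0/1) per common item.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_students, n_items) matrix of item scores."""
        k = items.shape[1]                         # number of items
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of students' total scores
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical usage: a CSV of 0/1 scores, rows = students, columns = the 25 items
    # item_matrix = np.loadtxt("fall09_common_items.csv", delimiter=",")
    # print(f"alpha = {cronbach_alpha(item_matrix):.2f}")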

A one-way analysis of variance comparing the percentage correct on the available common items across all four sections showed a significant difference (p < .001). Post hoc analysis showed that the spring 2009 mean percentage was not significantly different from that of the traditional class from fall 2008; both were significantly lower than the fall 2008 partially-redesigned course and the fall 2009 fully redesigned course.
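
As a point of reference, this kind of cross-section comparison (an omnibus test followed by pairwise post hoc comparisons) could be reproduced with standard statistical software. The sketch below is illustrative only and is not the team's analysis script; it assumes a hypothetical CSV of per-student percentage-correct scores with columns named section and pct_correct.

    import pandas as pd
    from scipy import stats

    # Hypothetical export: one row per student, with the section label and the
    # student's percentage correct on the common final exam items.
    df = pd.read_csv("common_item_scores.csv")

    # One array of scores per section (e.g., fall08_traditional, fall08_partial,
    # spring09_redesign, fall09_redesign).
    groups = [g["pct_correct"].to_numpy() for _, g in df.groupby("section")]

    f_stat, p_value = stats.f_oneway(*groups)   # omnibus one-way ANOVA
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # Post hoc pairwise comparisons (Tukey HSD, available in recent SciPy versions)
    print(stats.tukey_hsd(*groups))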

Improved Completion

In fall 2008, the traditional lecture section showed a success rate (grades of A, B or C) of 67%. This was significantly lower (chi-square, p < .002) than the success rate in the partially-redesigned section in the fall 2008 pilot, which was nearly 85%. However, subsequent semesters showed success rates between those two extremes: 76% for the spring 2009 partial redesign and 79% for the fall 2009 fully redesigned course. The difference between the latter two semesters was not significant.
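
The success-rate comparison amounts to a chi-square test on a 2x2 table of successful (A/B/C) versus unsuccessful students in each section. A minimal sketch follows; the counts shown are placeholders, not the actual enrollment figures.

    from scipy.stats import chi2_contingency

    # Placeholder counts of [successful, unsuccessful] students per section;
    # replace with the actual A/B/C vs. D/F/W counts from the gradebook.
    traditional_fall08 = [100, 50]
    redesigned_fall08 = [120, 20]

    chi2, p, dof, expected = chi2_contingency([traditional_fall08, redesigned_fall08])
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")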

Other Impacts on Students

The analysis of the fall 2009 data also examined the contribution to exam grades made by two pedagogical improvements to the course: discussion forums and online quizzes. There were five discussion forums and eight online quizzes during the semester. A summary score was created for each technique by taking each student's mean score across the forums and across the quizzes. Correlations (all n = 230) were computed between the discussion and quiz summary scores and students' performance on the exams. For the discussion summary score, the correlations with the exams ranged from .34 to .48 (exam 4 had the highest correlation); the average correlation was .39. For the quiz summary score, the correlations ranged from .39 to .43 (exam 4 again had the highest correlation); the average correlation was .40. All of these correlations were significant, p < .001. The correlation between the discussion summary score and the quiz summary score was .61. These correlations indicate that students who participated more in the online quizzes and discussion forums achieved higher scores on the exams. The correlations with final grades were higher (.75 for discussion and .66 for quizzes); however, because discussions and quizzes counted toward the final course grade, those correlations are spuriously high.
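
As an illustration of how the summary scores and their correlations with exam performance could be computed, the sketch below assumes a hypothetical gradebook export with one row per student and columns forum_1 through forum_5, quiz_1 through quiz_8, and exam_1 through exam_4; the file and column names are illustrative, not the team's actual names.

    import pandas as pd

    grades = pd.read_csv("fall09_gradebook.csv")   # hypothetical gradebook export

    forum_cols = [f"forum_{i}" for i in range(1, 6)]
    quiz_cols = [f"quiz_{i}" for i in range(1, 9)]
    exam_cols = [f"exam_{i}" for i in range(1, 5)]

    # Summary score for each technique = the student's mean across its items
    grades["discussion_summary"] = grades[forum_cols].mean(axis=1)
    grades["quiz_summary"] = grades[quiz_cols].mean(axis=1)

    # Pearson correlations between the two summary scores and each exam
    corr = grades[["discussion_summary", "quiz_summary"] + exam_cols].corr()
    print(corr.loc[["discussion_summary", "quiz_summary"], exam_cols])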

Student feedback in fall 2009 showed that 67.3% of students preferred online "active" learning to the traditional textbook method. Students were asked to rate the usefulness or helpfulness of various components of the course. The components given the most positive ratings (percentage of "very" or "somewhat" answers) were the online PowerPoint lectures (92.3%), the e-review sheets on key terms (83.5%), prompt feedback on online quizzes (79.7%) and the online readings (69.6%). The components with the lowest percentages of positive ratings were the videos streamed through ANGEL (36.6%), the online Jeopardy game (50.6%) and the team/group activities (46.8%).

Impact on Cost Savings

Were costs reduced as planned?

The redesigned course reduced the cost of instruction by reducing the number of sections from two per semester to one and increasing section size from ~120 to ~240 students. The number of full-time faculty teaching the course decreased from two to one each semester. These actions decreased the cost per student from $94 to $51, a savings of 46%. The growth of Buffalo State's graduate program has placed pressure on the department to add more graduate courses, and the redesign has allowed the economics department to schedule an additional graduate section.
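
The reported savings figure follows directly from the two per-student costs; a quick check of the arithmetic:

    # Reported cost per student before and after the redesign
    cost_traditional = 94.0   # dollars per student
    cost_redesigned = 51.0    # dollars per student

    savings = (cost_traditional - cost_redesigned) / cost_traditional
    print(f"per-student savings: {savings:.0%}")   # prints 46%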

Lessons Learned

Pedagogical Improvement Techniques

What techniques have contributed the most to improving the student experience?

Effective online pedagogical tools. An important component of the successful redesign was the identification, modification and incorporation of effective online pedagogical tools to help students understand economic history. The redesigned class met face-to-face only once per week; the second weekly meeting was replaced by online activities, including videos from Discovery Education, TOONDOO (online cartoon creation), discussion forums and online quizzes. In fall 2009, the face-to-face meetings were enhanced with TurningPoint clickers. The clickers provided two benefits: they increased student participation and gave the instructor feedback about how well students comprehended the material in the face-to-face meetings. The professor used the online discussion forums (in ANGEL, the campus CMS) to pose questions to students, who were graded on whether or not they responded, not on the content of their responses. Five such forums were used. Online quizzes were required eight times during the semester (quizzes were not required in the first and last weeks, nor during exam weeks). The quizzes served to ensure sufficient time on task and to monitor students' progress. Assessment data showed significant positive correlations between students' exam scores and their summary scores for both the online quizzes and the discussion forums.

Well-trained undergraduate learning assistants. A second factor crucial to the success of the redesign was the work of the undergraduate learning assistants (ULAs). The ULAs were recruited in two ways: a) by encouraging good students from the previous semester to participate and b) by working with teacher education candidates majoring in Social Studies Education or in Elementary Education with a Social Studies concentration. Economics is a required topic for both majors, and the pedagogical nature of the candidates' ULA participation was valuable to their career preparation. Each semester, six to eight trained ULAs monitored the online activities of the class; each ULA supervised the activities of approximately 35-40 students. The work of the ULAs included managing small-group activities; communicating, updating and releasing content on the ANGEL site for students; and holding office hours in a computer lab to give students the one-to-one assistance they needed.

At the beginning of each semester, many changes needed to be made to the CMS (currently ANGEL) course site, including due dates for assignments, office hours, the content release schedule and exam dates. These updates were made by the Lead ULA. As the redesign is institutionalized, separate training for the Lead ULA will also be instituted so that these course site changes are completed before the course starts. Careful planning is needed for the transition to a new Lead ULA each semester or every other semester.

Cost Reduction Techniques

What techniques contributed most to reducing costs?

Increased section size. The redesigned course reduced the cost of instruction by reducing the number of sections from two per semester to one and increasing section size from ~120 to ~240 students. The number of full-time faculty teaching the course decreased from two to one each semester.

Implementation Issues

What implementation issues were most important?

ULA training. As the redesign progressed, formative assessment (student feedback) told the team that it was necessary to formalize the role of the ULAs so that their duties and responsibilities were very clear. Similarly, formative assessment led to considerable improvement in ULA training over the course of the project, increasing both its duration and its content. Lasting two days, the ULAs' preparation included instruction on economics content, TurningPoint clickers, an overview of FERPA, classroom techniques to encourage class participation, and intensive ANGEL training. As the redesign has proceeded, the ULAs have spent an increasing percentage of their pedagogical time online.

Sustainability

Will the redesign be sustained now that the grant period is over?

The economics department has been pleased with the results of the redesign: cost savings that have allowed the reallocation of faculty resources to the graduate program, and a student learning experience that has equaled or surpassed that of past approaches to the course. The department is therefore committed to the redesign approach.

In the short run, the three faculty members with 10+ years of experience teaching the course will use the redesign approach. In the intermediate term, younger faculty members will be recruited to teach the course in this new format after appropriate training and mentoring. Over the longer term, when recruiting new faculty members, serious consideration will be given to finding candidates with a genuine interest in teaching a hybrid general education economics survey course that is largely online and supported by ULAs.

 

 
