Missouri Course Redesign Initiative
Missouri State University
Course Title: Introductory Psychology
Introductory Psychology is a semester-long, general education course at Missouri State University (MSU). The course falls within the self-understanding/social-behavioral perspective area of general education and is, by far, the most popular choice for students within that area. Each academic year, at least 18 traditional face-to-face sections are offered, with a total enrollment of 2,500-2,700 students. The course is lecture-based; approximately 65% of sections are taught by full-time faculty and 35% by adjunct instructors. While there are common general education goals across all sections, each instructor makes individual choices about content and delivery.
Despite being a popular choice among students, Introductory Psychology has traditionally experienced a high DFW rate (approximately 25%). The department wants to develop a more engaging course that will result in improved student learning outcomes and student satisfaction. Due to the large enrollment and the numerous sections offered throughout the year, a significant number of adjunct instructors teach the course, which has contributed to some course drift and grade inflation. Currently, each section of 153 students is taught by one instructor, which significantly limits the types of activities that can be assigned and graded. The vast majority of the final course grade is derived from a series of multiple-choice exams. The goal is to redesign the course to be much more engaging and interactive, with an emphasis on true mastery of the course material.
The redesign will use the Replacement Model, replacing about 50 percent of traditional lecture time with online learning tools and activities. The time spent lecturing will focus on difficult concepts and specific student needs, which can be identified via online quizzes administered prior to class. This approach will likely improve student preparedness for class. Section size will increase to 300 students and a senior learning assistant (SLA) will be added to each section to facilitate student learning and development. Each 300-person section will be divided into smaller learning groups led by undergraduate learning assistants (ULAs). Each group will complete a total of four online group activities, which will focus on experiential learning of important concepts in psychology. Therefore, although students will be part of a large section, each student will have an opportunity to work collaboratively and engage in experiential, active-learning assignments with their peers and a peer ULA.
The course redesign will address all of the academic problems faced by the traditional course. The quality of the course will be significantly improved in the following ways: 1) standardization of course material based on agreed-upon student learning outcomes will prevent course drift; 2) 100% of sections will be taught by full-time faculty, providing more consistency in grading and greater engagement with full-time faculty; and, 3) the incorporation of publisher-ready, online learning materials will improve engagement with and retention of material. The team expects that these changes will significantly reduce the DFW rate and improve overall learning and satisfaction with the course.
The department has traditionally collected data on student learning through a comprehensive pre-post exam and maintains records of students’ grades and the number of drops per section. Scores on the pre-post exam from the redesigned sections will be compared with those from traditional sections. Questionnaires regarding student satisfaction and desire to take future psychology courses will be utilized in addition to traditional course evaluations. This multimodal assessment approach will provide the team with the flexibility to examine a variety of factors related to the redesign, including the impact of using different types of senior learning assistants.
The redesigned course will produce cost savings through a combination of larger class size (from 153 to 300 students) and the restructuring of personnel. There will also be a small enrollment increase of 72 students annually. The addition of an SLA to each section will allow full-time faculty members to teach double the number of students and to teach an additional course in the department. This approach will significantly reduce the number of adjunct faculty needed to teach various courses (including Introductory Psychology) in the department. The cost-per-student will decline from $73 to $60, a savings of 17.8%. The cost savings will remain in the psychology department and will be used to support the redesigned course in the future, faculty who wish to take on additional course redesign projects, and faculty travel to present at conferences related to the scholarship of teaching and learning.
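As a quick sanity check of the cost arithmetic above (an illustrative Python sketch, not part of the report itself):

```python
# Cost-per-student figures quoted in the report.
traditional_cost = 73.0  # dollars per student, traditional format
redesigned_cost = 60.0   # dollars per student, redesigned format

# Fractional savings relative to the traditional cost.
savings = (traditional_cost - redesigned_cost) / traditional_cost
print(f"{savings:.1%}")  # prints "17.8%"
```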
In the redesign, did students learn more, less, or the same compared to the traditional format?
The fall 2011 semester served as a baseline semester where two faculty members, who were both members of the redesign team, each taught a traditional section of Introductory Psychology. Great care was taken to ensure that the faculty members taught the course in its traditional format (i.e., did not make any pedagogical or technological changes to the course). The primary measures of learning outcomes included two pre-post comprehensive exams: 1) a 30-item comprehensive exam which was developed a number of years ago by members of the department of psychology and has traditionally been used as a measure of student learning in Introductory Psychology, and 2) a 50-item comprehensive exam created specifically for this project by the course redesign team.
Analyses confirmed that there were no significant differences between the two baseline sections with respect to pretest scores on the 30-item exam (t(276) = .92, p > .05) or the 50-item exam (t(282) = .04, p > .05); thus, the sections were combined into one comparison group (n = 302).
The full implementation occurred during the fall 2012 semester and consisted of 5 sections, all taught by members of the redesign team. For the purposes of comparison, the five sections were combined into one "redesign group" of 1,340 students. Results indicated that, on the 30-item comprehensive exam, students in the redesigned sections showed significantly greater pre-to-post gains (84% improvement) than the traditional comparison group (54% improvement), t(317.54) = -7.50, p < .001. Similarly, students in the redesigned course demonstrated significantly greater gains from pre to post on the 50-item comprehensive exam (62% improvement) than the traditional sections (37% improvement), t(429.41) = -12.55, p < .001.
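The non-integer degrees of freedom reported above (e.g., t(317.54)) indicate that Welch's unequal-variance t-test was used. As a sketch of how that statistic is computed, here is a minimal pure-Python version; the gain scores below are hypothetical, since the raw data are not reproduced in this report.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

# Hypothetical pre-to-post gain scores, for illustration only.
traditional = [12, 15, 11, 14, 13, 16, 12, 15]
redesigned = [18, 21, 17, 20, 19, 22, 18, 21]
t, df = welch_t(traditional, redesigned)  # t is negative, as in the report
```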
A significant portion of the redesigned course utilized publisher-customized digital learning technology. A correlation was calculated between students' total online score on assigned material and the total of the five exam scores. This correlation was .68 (p < .001), suggesting a strong relationship between completion of the online learning activities and exam performance.
Completion rates (grades of C or better) were 76% in both the traditional and redesign sections. There were slightly fewer D’s and F’s in the redesigned course (16.9% compared to 18.5% in the traditional course); however, the withdrawal rate increased from 6% in the traditional course to 7.4% in the redesigned course. Furthermore, while the DFW rate did not change significantly in the redesign, it does appear that the distribution of A's, B's, and C's shifted such that in the redesign, there were more A's and B's and fewer C's compared to the traditional course.
Other Impacts on Students
Student attendance is considered a measure of student engagement with the course and was assessed each class period. One baseline (traditional) section recorded attendance each week and was compared with the same faculty member's redesigned class in fall 2012. Overall, attendance improved substantially in the redesigned section (fall 2011 traditional mean attendance = 75% versus fall 2012 redesign mean attendance = 83%).
Student perceptions of the effectiveness of course components were assessed during both the pilot and full implementation. Students evaluated 10 dimensions of the in-class and online components of the course using a 7-point rating scale (1 = ineffective, 4 = somewhat effective, 7 = extremely effective). Ratings below 3 were considered ineffective, and ratings of 4 and above were considered effective.
Generally, students found both the online and in-class components of the redesigned course to be effective. In-class components (video clips, class activities/demonstrations, class PowerPoint slides, clicker questions, and the overall seated experience of the course) were, on average, perceived as more effective than the online components. The lowest-rated component was the MPL media assignments, which fell just below "somewhat effective." Perceptions of a number of course components improved significantly from spring 2012 (pilot) to fall 2012 (full implementation).
Were costs reduced as planned?
MSU's cost savings plan was implemented as originally planned and consisted of a combination of restructuring course personnel and increasing section size. The course staff changed from one faculty member or adjunct instructor teaching 153 students to a teaching team of seven individuals for a section of 300 students: one full-time faculty member, one graduate student or adjunct instructor serving as a senior learning assistant (SLA), and five undergraduate learning assistants (ULAs). The anticipated reduction in the cost-per-student was 18%.
Pedagogical Improvement Techniques
What techniques contributed most to improving the quality of student learning?
MyPsychLab. A Pearson product, MyPsychLab, was used to deliver online activities, quizzing, media assignments, and study material. The redesign team reviewed all potential quiz questions and chose the items believed to be most relevant to the learning objectives associated with each chapter. With help from the Pearson team, the question pool was limited to these selected items. Prior to attending class, students completed a pretest, which generated an individualized study plan. After completing the study plan activities, students were tested again with an online post-test. This quizzing was low-stakes in that students could take the post-test as many times as they wished and their highest grade was recorded. In addition, the redesign team reviewed all media materials/assignments and chose the most relevant and best exemplar for each topic. MyPsychLab also contains numerous resources and review materials that students can access. Another benefit of MyPsychLab was the ability to monitor students' engagement with the resources. Instructors could access information including the amount of time students spent on assignments, total time spent interacting with all available material, and the number of attempts and submission times for assignments.
Tailored lectures. The post-test quizzes assigned for completion prior to the seated portion of class each week were designed to assist the instructors in identifying material which students found challenging. In the 24 hours between the submission of the post-test and the seated class period, instructors revised lectures to include class activities, discussion topics, video material, and specific content, relevant to the areas of difficulty identified by the post-tests. As a result, rather than the lecture serving as a review of the material students had read in their textbook, the “lecture” became a tailored class period devoted to the difficult and/or more interesting topics for students. Students’ comments on course evaluations suggested they enjoyed the interactive demonstrations and believed this type of “lecture” was effective in helping to further illustrate various concepts and principles.
Clickers in the classroom. All students in the course were required to purchase a clicker and were instructed to bring it to each seated class. The instructors incorporated questions from the MyPsychLab post-tests into the lecture, and students provided answers during class via their clickers. The results gave the instructor immediate feedback as to whether students' understanding of the identified difficult material improved after classroom demonstrations and discussion. If needed, the instructor could then employ peer instruction or further demonstrations and discussion until students were performing at an acceptable level on the clicker quiz items. Clickers were also used to monitor participation each class period, which counted toward the overall course grade. This information was also used to follow up via email with students who were not in class, as well as to reach out to students who had not been attending class on a regular basis.
Senior learning assistants (SLAs). Graduate students and adjunct faculty were chosen to serve as SLAs. Each section of the course had one assigned SLA. The role of the SLA was to assist the instructor with managing email, grading, and class preparation, and to work directly with students and the five ULAs in the course. The team has found a combination of three graduate students and two adjunct instructors to be an acceptable balance; graduate students tend to be more available and easier to recruit but are significantly more expensive to employ.
Undergraduate learning assistants (ULAs). Undergraduate students who had successfully completed Introductory Psychology (having received an A or B) were contacted via email and informed of the opportunity to apply to serve as a ULA in the upcoming semester. Applications were reviewed, and an early set of applicants was invited to interview. Based upon ratings by the faculty interviewers, a cohort of ULAs was selected. In exchange for their work as ULAs, the students enrolled in a three-credit-hour, "by permission" course, The Teaching of Psychology. They were assigned to a specific section of Introductory Psychology and, within that section, were assigned a subgroup of students. For this group of students, the ULA was the first point of contact for questions. ULAs also held office hours in a central location in the library, where they were available for individual tutoring and assistance with course material and/or technological issues.
Experiential group activities. Students were placed in groups within their class using a randomized grouping procedure in Blackboard. Within their small groups of 15-20, students were expected to participate in four online discussions. In response to questions posed by the instructor, they were required to make one initial response followed by two more substantive posts directed toward postings made by other students in their group. Participation in the discussion boards accounted for 12% of the overall grade. ULAs monitored the discussions and encouraged students to thoughtfully incorporate class material into their responses.
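Blackboard performed the random group assignment; conceptually it amounts to a shuffle-and-chunk operation, as in this illustrative sketch (the function name and the fixed group size of 15 are assumptions, not details from the report):

```python
import random

def assign_groups(student_ids, group_size=15):
    """Randomly partition students into groups of roughly group_size."""
    ids = list(student_ids)
    random.shuffle(ids)
    return [ids[i:i + group_size] for i in range(0, len(ids), group_size)]

# A 300-student section yields 20 groups of 15.
groups = assign_groups(range(300))
```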
Student interventions. Students who received a grade of D or F following the first exam were required to attend a study skills session conducted by the SLAs. These sessions focused on the requirements of the course, time management, and effective study strategies. Students who had a grade of D or F at midterm were contacted via email and encouraged to schedule an individual meeting with the instructor. During this meeting, the student's online activity was reviewed, study habits were discussed, and campus and course resources were presented. These sessions were designed to encourage students to take accountability for their course performance and to develop a plan for improvement in the remainder of the course.
Positive feedback. In addition to providing feedback to students who did not perform well, the course staff also sent positive notes of encouragement and praise to students who received A’s and B’s on selected assignments and exams. While this strategy was not formally evaluated, instructors commented on the number of students who replied to their emails with statements of appreciation.
Cost Reduction Techniques
What techniques contributed most to reducing costs?
Restructuring of course personnel. Restructuring the way the course was taught, by adding an SLA and five ULAs to each section, allowed for much larger sections to be taught while simultaneously reducing costs and improving the course staff to student ratio. In the traditional course the instructor to student ratio was 1:153. In the redesigned course the course staff to student ratio is 1:43.
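The quoted ratio follows directly from the staffing numbers (a trivial check in Python):

```python
# Teaching team per 300-student redesigned section, per the report.
faculty, slas, ulas = 1, 1, 5
staff = faculty + slas + ulas        # 7-person team
students_per_staff = 300 / staff     # ~42.9, i.e., roughly 1:43
print(round(students_per_staff))  # prints "43"
```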
Increased section size. The restructuring of personnel allowed for the section size to increase from 153 students to 300 students. The increased class size significantly reduced the number of sections to be offered in the fall and spring. As a result, all sections in fall 2012 were taught by full time faculty members in the department of psychology, where previously, approximately 35% of sections were taught by adjunct faculty. Having 100% of the sections taught by full-time faculty allowed for a true team-approach to teaching which considerably decreased the amount of course drift that was previously identified as problematic.
What implementation issues were most important?
Technology issues. The redesigned course depended heavily on a variety of technologies for course content provision, automated grading, exam administration, and seated class participation. The pilot course was instrumental in resolving many of the technological issues. Campus support from a variety of places including the Faculty Center for Teaching and Learning and Computer Services was excellent and individuals within these departments were active members of the redesign team from conceptualization to implementation. The pilot course and the full implementation certainly had technological issues on all fronts at some point or another, but our team considered it to be a success if the impact of these problems on students was minimized. As a result of campus support, support from the publishing team, and frequent team problem-solving, students rarely complained about technology problems in their course evaluations. One of the biggest technological challenges involved the fact that a new digital platform for the course did not become available until the first week of classes during the full implementation semester. Therefore, there were issues with automatic grading schemas, transferring grades to Blackboard, and videos not presenting properly during the full implementation. Working closely with the publisher throughout that semester, these problems have now been resolved and the current course has been proceeding smoothly.
Group seating. The plan was to have students sit within their small groups to allow more personal interactions with the 15-20 students collaborating on the online group activities. Additionally, each ULA was responsible for 4 small groups, and having those groups seated together allowed the ULA to take attendance and interact with his/her groups of students more effectively. However, with a class of 300, some groups of students always had to sit in the far back or corner of the auditorium, which was not ideal for some students. Furthermore, when students were absent from their group, the vacant seats between students made group discussion and interaction more difficult. Therefore, students currently sit within a larger group of 60 students, so that ULAs can still take attendance in their groups while students have more flexibility in choosing their own seats.
Small group experiential learning activities. The course included online group experiential learning activities, which first consisted of wiki projects in the pilot course. Students had to complete four wiki projects in their small groups by making contributions and edits to a single group webpage. Given that the project grade depended on a combination of group and individual effort, students became frustrated and resentful of group members who were perceived to be failing to contribute appropriately. In the end, these group projects did not serve to enhance students' connections to their peers or facilitate experiential learning. In the full implementation, small groups participated in discussion boards rather than wiki projects. While students complained much less about the discussion boards because their grades were individually based, it was discovered that students at this level were not prepared to engage in meaningful discussions about applied topics. After the full implementation, this aspect of the course was revised again to require all students to attend a small group study session, led by the ULAs, prior to each exam. Prior feedback had suggested that the ULAs wanted a more active role in promoting learning and that the introductory psychology students wanted study sessions. Early indications suggest that this new format for small group interactions has been more successful in promoting experiential learning and feelings of connectedness to other students and the ULAs. The ULAs also appear to greatly enjoy and value their role as study group leaders.
Online exam administration and proctoring. Because the course is now blended, using classroom time to administer exams is not practical. The midterm and final exam continue to be administered in the seated class, but there are 3 additional unit exams that need to be administered outside of class time. The initial plan involved administering these exams online and required students to take the exams in campus computer labs using Respondus LockDown Browser, which does not allow test-takers to open any other documents or windows while the test is being administered. A large pool of test items was chosen and the exams were timed in order to reduce the likelihood of cheating. However, these campus computer labs are not proctored, and the university does not have a large proctoring center. After several faculty members witnessed cheating in the computer labs and other students in the class reported seeing some students cheat, it was decided to use the SLAs, in addition to other paid personnel, to proctor online exams in 2-3 different computer labs on campus. The process of scheduling these computer labs, arranging proctoring schedules, and paying extra money for proctoring has been a significant challenge. The team is currently considering scheduling the classroom space on the "off" day for the blended course (e.g., if the seated class is on Tuesday at 11:00 AM, scheduling the classroom for Thursday at 11:00 AM as well) to allow each instructor to proctor his or her own exams and to offer study skills sessions and technical support in the classroom on the other off-days.
Will the redesign be sustained now that the grant period is over?
The support for the redesigned course has not wavered within the department and university, and there is every indication that the course will continue as currently designed. The department is currently searching for a tenure-track assistant professor who will coordinate the redesigned introductory psychology course and the staff teaching it each semester. The coordinator will also serve as a liaison with the publishing company and have responsibility for recruiting, training, and often teaching the ULAs. All of the members of the original redesign team had their teaching responsibilities modified to accommodate the course redesign process, and now these faculty wish to return to teaching other courses in addition to introductory psychology. This change will require the recruitment of additional departmental faculty interested in teaching the redesigned course (this has already begun with a combination of existing faculty and new hires).
The redesigned course has evolved since its original conception and will likely continue to do so. The original team members were compensated by the university for their time and effort during the development of the redesigned course. Since the state-sponsored involvement in course redesign is complete, the question of how faculty will be supported in future course redesign efforts needs to be explored. Overall, the redesign team developed a robust, engaging course, based on best practices in teaching, that has been shown to significantly improve learning outcomes. Therefore, any revisions that take place in the future are likely to be relatively minor and are not expected to place any significant burden on the existing redesign team.