THE STATE OF DEVELOPMENTAL MATH: IT’S FAR WORSE THAN YOU THINK

by

Carol A. Twigg

According to the Bill & Melinda Gates Foundation, “Every year millions of young adults stride onto their local community college campus with aspirations of obtaining a college degree. But even though most of those new enrollees graduated from high school, nearly 60 percent will have to take a remedial class before earning college credit.”

How well do they do if they enroll in a developmental math class (and many who should enroll do not because their institution’s placement policy is not mandatory)? In the January 2011 issue of The Learning MarketSpace, we reported, “Among the Changing the Equation institutions, the average percentage of students who receive a grade C or better in developmental math in the spring semester is 48.2%. In the fall semester, that rate is 50.7%. Passing rates in spring terms are typically lower than in fall terms since spring includes those who failed to pass in fall, math avoiders, etc.” We also provided considerable evidence that our database likely reflects the state of developmental math across the nation.

The Gates Foundation continues, “For most students, these remedial classes do not lead to a college degree or certificate. Studies have shown that three out of every four students who take remedial classes will not graduate within eight years compared to 40 percent of students not required to take remedial courses.”

So we know that we have a problem.

What is new in this article is the observation that the actual achievement rate in developmental math is far worse than you think it is.

In the words of one of the Changing the Equation participants: “It appears that completion rates for traditional courses are artificially increased [our emphasis] due to students passing the course without mastering all concepts necessary for success.”

Learning Goes Up But Completion Goes Down

During the pilot term of Changing the Equation, we observed scores on direct measures of student learning (common exams, common exam items, pre/post-tests) going up while completion rates (final grades of C or better) went down. We have observed this phenomenon in the past and discussed it at some length in the July 2010 issue of The Learning MarketSpace. Some readers of that article asked us if that was because more “weaker” students had dropped out of the redesign, thus “inflating” the learning results. While that could be a factor in some situations, we can show that it is not when the overall “retention” rate is the same for both formats—i.e., the same percentage of students stay enrolled in the course until the end. Here are two examples:

Community College #1

  • Common exam scores of 98 traditional students and 127 redesign students in Pre-Algebra were compared. The mean score increased from 75.1 to 84.1. At the same time, the completion rate (final grade of C or better) declined from 45.6% to 35.1%.
  • Common exam scores of 207 traditional students and 146 redesign students in Basic Algebra with Measurement were compared. The mean score increased from 71.8 to 84.2. At the same time, the completion rate (final grade of C or better) declined from 64.6% to 36.8%.
  • The percentage of students who stayed enrolled in both courses in both formats was consistent at about 86%, indicating that the increased learning was not a product of weeding out “weaker” students.                           

Community College #2

  • The percentage of correct answers to common exam items went from 60.9% in the traditional format to 85% in the redesigned format. Yet the average pass rate (final grade of C or better) declined from 47.6% (2,769 students) to 40.7% (756 students).
  • The percentage of students who stayed enrolled in both formats of the course was consistent at about 89%, indicating that the increased learning was not a product of weeding out “weaker” students.                           

We saw this pattern repeatedly in the pilot results.

How Was The Grade Inflation Problem Discovered?

Since we knew about this contradiction in prior course redesign programs, we asked each Changing the Equation participant, “If your learning outcomes went up or stayed the same and your completion rates went down, why do you think this happened? Was it due to differences in students? Was it due to prior grade inflation or curving? Please analyze and explain for each course where these discrepancies occurred.”

One of the responses summed it up, “I have had two main concerns in the past about the math department. One has been grade inflation and the other has been the difference in the way instructors teach and grade students. I believe that the redesign data more than proved those points.  Students passed the previous course and were not prepared for the next course in which they enrolled.”

Here’s how the problem was discovered at this particular institution.

Students who took a previous course in the developmental math sequence in a traditional setting were not prepared for the redesigned course that followed. Students who passed the traditional DEV 1 course showed severe deficits in the topics it covered when they reached the redesigned DEV 2 course. This was especially true of intermediate algebra (the third course in the sequence) students: many were stuck on their first module, which included graphing inequalities. The faculty discovered that many of these students had passed the previous course with a D and did not know how to do basic graphing.

Confronted with this situation, some colleges created “review modules” to help address the deficiencies. The problem then became that many students got bogged down in those review modules: they were enrolled in the higher course but could not do the “review”—i.e., the content of the lower course. The reality was that they should not have been in the higher course at all; they had simply been passed along without the requisite skills.

So think again what those average pass rates of 48-50% represent.

What Actually Goes On in Most Traditional Developmental Math Courses

As reported by the Changing the Equation participants, in traditional developmental math courses, there are no common standards.  Even the course content is subject to wide variation from one section to another, along with variation in grading standards.  When a student passes a traditional developmental math course, his or her success in the next course is possible but not probable.

No consistency in grading/standards

Frequently each instructor is allowed the autonomy to weight categories differently. Instructors in the traditional course have more opportunities to influence grades by

  • giving partial credit on quizzes, tests and exams
  • giving extra credit
  • giving a “mercy” grade—i.e., students receive a passing grade for their effort
  • allowing take-home quizzes and tests 
  • not checking/grading homework
  • not giving a comprehensive final exam

As one participant put it, “Looking over the grades for the traditional mid-terms and final exams, we found that some instructors would never count a problem wrong--they always gave at least partial credit.”

The point is not that any one of these practices is wrong in itself; it is that they are applied so inconsistently that the final grade means little. An A from one instructor may mean something completely different from an A from another, and so on.

No consistency in course content 

Even though all instructors may be given common course objectives, different instructors place more emphasis on some concepts and less on others. Many important objectives, ones needed to ensure success in the next course in the math sequence, are not taught or are barely covered. Since final exams are created individually, some concepts may be omitted from assessment entirely. Again, an A on one instructor’s exam may mean something completely different from an A on another instructor’s exam, and so on.

No requirement for mastery

In the traditional course format, students do not have to master each chapter/concept/module. They can survive a failing grade on a section test and continue on; they just have to make sure that the average of their exam scores is at least 70. This means they can fail a couple of chapters that contain important concepts and still pass the course. Frequently, students are permitted to attempt the final exam even if they have failed unit tests, have not completed homework and have had poor attendance. Some students are counted as “successfully completing” the course as long as their final exam score is above 50%.
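To see how averaging can mask failed units, here is a minimal sketch with hypothetical chapter-test scores; the scores and the 70% thresholds below are illustrative assumptions, not data from any Changing the Equation institution.

```python
# Hypothetical chapter-test scores (percentages) for one student.
scores = [95, 90, 40, 55, 80, 70]

PASSING_AVERAGE = 70   # traditional rule: an overall average of 70 or better earns a C
MASTERY_CUTOFF = 70    # mastery rule: every chapter test must reach at least 70

average = sum(scores) / len(scores)
passes_by_average = average >= PASSING_AVERAGE                 # True: average is about 71.7
passes_by_mastery = all(s >= MASTERY_CUTOFF for s in scores)   # False: two chapters were failed

print(f"Average score: {average:.1f}")
print(f"Passes under the averaging rule: {passes_by_average}")
print(f"Passes under a mastery requirement: {passes_by_mastery}")
```

The hypothetical student clears the 70 average even though two chapters were failed outright, which is exactly the gap that a module-by-module mastery requirement is intended to close.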

What Happens in a Changing the Equation Redesigned Course

A redesigned course is characterized by one word: consistency--consistent standards, consistent content coverage, consistent grading and consistent mastery of content. When a student passes a redesigned developmental math course, his or her preparation for the next course is guaranteed.

Consistent objective grading

Grading policies in all sections of the redesigned course are uniform. Grading in the redesign is done by the computer, so the grades are more objective. No partial credit is assigned; because all students do the same work and receive the grade they actually earn, there is no grade inflation or curving.

Consistent content coverage

Content coverage in all sections of the redesigned course is uniform. All faculty know what the grades in each course represent. The redesign addresses the issue of adjunct instructors from other campuses who teach the traditional courses with no real supervision or review, a common occurrence in developmental math. (One adjunct was discovered to have given every student a grade of 100%.) Because adjuncts in the redesign operate under exactly the same conditions as the full-time instructors, their participation in the redesign is consistent and constructive.

Mastery of course material is required

The redesigned courses have higher standards than the traditional courses. In redesign, grades are based on mastery only.  Students in the redesign typically need 70% - 80% on each module test before moving on. Only students who complete all course modules are eligible to take the final exam and since they are required to master each module, final exam scores tend to be high. Students who complete the redesigned sections demonstrate greater understanding of the material compared with students in the traditional sections. 

Just To Reinforce the Point

One community college saw both student learning outcomes and completion rates increase in the redesign but decided to investigate the issue of grade inflation. Here’s what they found:

  • In the traditional version of Essential Mathematics, 195 students completed the course. Of those, 72 completed the course with a C or higher. Only 14 (19%) of those students passed every chapter exam with a score of 70 or higher. And only 47 (65%) passed the final exam with a minimum score of 70. This means that 58 students will progress to the next course without mastering all of the concepts necessary for success.
  • In the traditional version of Introductory Algebra, 187 students completed the course. Of those, 103 completed the course with a C or higher. Only 23 (22%) of those students passed every chapter exam with a score of 70 or higher. And only 71 (69%) passed the final exam with a minimum score of 70.  This means that 80 students will progress to the next course without mastering all of the concepts.
  • In the traditional version of Intermediate Algebra, 11 students completed the course. Of those, five completed the course with a C or higher.  No student passed every chapter exam with a score of 70 or higher. And only three of the five (60%) passed the final exam with a minimum score of 70. This means that five students will progress to a curriculum level mathematics course without mastering all concepts necessary for success.

Here are a few examples of students who were unable to demonstrate mastery on a comprehensive final exam yet were still counted as successful and progressed to the next course.

 

            Final Exam Grade    Final Course Grade
Student A          21                   C
Student B          30                   C
Student C          46                   C

As the project leader concluded, with considerable understatement, “It appears completion rates for traditional sections are artificially increased due to students passing the course without mastering all concepts necessary for success.” This also means, of course, that the success rates for this redesign project are much better than they look because of prior grade inflation in the traditional courses.

Conclusion

The ultimate impact of this rampant grade inflation is, of course, on students who are passed along into college-level math. It is not surprising that our success rates in college-level math are poor as well since so many students are coming to those courses ill prepared.

During the past 11 years, we at NCAT have learned a lot about the relationship between assessing learning outcomes and creating sustainable change. In order to create real change, you must measure what you do and understand what you are measuring. Doing so reveals problems in academic practice that are frequently glossed over in higher education. Many (most?) of the developmental math reform efforts currently underway use final course grades as a comparative measure with little or no regard for the complexity of this issue. As we said in the opening, even if you think things are bad, they are in reality worse than you think.