Regional Accreditation and the Problems in Developmental Mathematics

This post is directed at my colleagues in community colleges and similar institutions … and the bodies that conduct our accreditation processes.  My conjecture is that the accreditation process contributes to the problems we have in developmental mathematics, and that this situation deserves corrective action on the part of the regional accreditation bodies.

The regional accreditation bodies use criteria for faculty credentials; in the case of the HLC, the specific wording is:

Faculty teaching general education courses, or other non-occupational courses, hold a master’s degree or higher in the discipline or subfield. If a faculty member holds a master’s degree or higher in a discipline or subfield other than that in which he or she is teaching, that faculty member should have completed a minimum of 18 graduate credit hours in the discipline or subfield in which they teach.
(see http://download.hlcommission.org/FacultyGuidelines_2016_OPB.pdf)

In all cases that I am aware of, remedial courses are not included in the ‘other non-occupational courses’ category.  The result is the common practice:

Anybody holding a bachelor’s degree, in any field, is qualified to teach developmental mathematics.

Within this common practice, a significant portion of faculty teaching developmental mathematics were originally credentialed for high school teaching … usually in mathematics, but not always.  Teaching high school mathematics is a worthy profession, often undertaken by dedicated individuals who are either unappreciated or blatantly disrespected.  However, the context for teaching developmental mathematics is fundamentally different from teaching high school mathematics.

Among those fundamental differences is the fact that developmental mathematics at an institution is directly connected to college-level math courses.  The developmental algebra courses are expected to prepare students for specific college-algebra or pre-calculus courses, with an expectation of content mastery and retention … those elements have a much lower priority in the high school setting.

Another critical difference between the high school and developmental math contexts is that developmental math faculty need to interact positively with the faculty teaching college-level courses.  Since so many developmental mathematics faculty hold lesser qualifications, this presents a cultural and social problem:

How can faculty of college-level mathematics have professional respect for faculty of developmental math courses with ‘lower’ qualifications?

A typical developmental math course has a focus on procedural skills and passing, while the college-level math courses tend to emphasize application and theory … sometimes with a much lower emphasis on passing.  In many colleges, this difference in emphasis leads to either a de facto or official separation of developmental math from college math.

The biggest single problem we have in developmental mathematics is the emphasis on a long sequence of courses — 3 or 4 courses below college level.  The inertia for this structure is based, in large part, on the parallel to grade levels in K-12 work … arithmetic (K-6), pre-algebra (7-8), beginning algebra (9) and intermediate algebra (10 or 11).  I have found that many faculty in developmental mathematics have a difficult time letting go of this grade-level focus (courses in K-12).

The fact that the accreditation process ‘ignores’ developmental math teaching qualifications is the problem I think needs to be addressed.  Should faculty teaching developmental mathematics have the same credential requirement as college-level math faculty?  There are strong arguments for this approach.  Should faculty teaching developmental mathematics have credential requirements beyond those of K-12 math teachers?  In my view, definitely yes.

At this point in time, it is not realistic to hold developmental math faculty to the same credential requirement as college-level math faculty — we just don’t have enough people qualified at that level.  However, I think we can develop some reasonable standard which approaches that goal.  Perhaps ‘a master’s in math education, or a minimum of 9 graduate credits in mathematics’ could be used as an alternative (in addition to the ‘regular’ credential for general education).  The professional organizations, primarily AMATYC, could develop such a criterion in collaboration with the accrediting bodies.

My purpose is more to point out the problem and the need to develop a solution than to advocate a particular criterion.  Achieving a solution could be measured practically:

Can all mathematics faculty in a community college, regardless of normal teaching assignments, understand and contribute to all curricular discussions involving any math course at the institution?

Until we see this result, students will continue to experience a developmental math program that tends to be too long and overly connected to the K-12 ‘grade level’ structure.


Data on Co-requisite Statistics (‘mainstreaming’)

Should students who appear to need beginning algebra be placed directly in a college statistics course?  For some people, this is no longer a question — they have concluded that the answer is an unqualified ‘yes’.  A recent research study appears to provide evidence; however, the study measured properties outside of what the authors intended and does not answer a basic question.

The study is “Should Students Assessed as Needing Remedial Mathematics Take College-Level Quantitative Courses Instead? A Randomized Controlled Trial” by Logue et al.  You can read their report at http://epa.sagepub.com/content/early/2016/05/24/0162373716649056.full.pdf

The design is reasonably good.  About 2000 students who had been placed into beginning algebra at CUNY community colleges were invited to participate in the experiment.  Of those who agreed (about 900), participants were randomly assigned to one of three treatments:

  1. Elementary Algebra, regular sections: 39% passed
  2. Elementary Algebra with weekly workshops: 45% passed
  3. College Statistics with weekly workshops: 56% passed

At these colleges, the typical pass rate for elementary algebra was 37% while statistics had a normal pass rate of 69%.

The first question about this study should be … Why is the normal pass rate in elementary algebra so appallingly low?  I suspect that the CUNY community colleges are not isolated in having such a low pass rate, but that does not change the fact that the rate is unacceptable.

The second question about the study should be … Would we expect a strong connection between completing remediation (or not) and performance in elementary statistics?  The authors of this study make the following statement:

it has been proposed that students can pass college-level statistics more easily than remedial algebra because the former is less abstract and uses everyday examples

In other words, statistics is not abstract … not mathematics at the college level.  The fact that statistics focuses on ‘real world’ data is not the problem; the fact that the study of statistics does not involve properties and relationships within a mathematical system IS a problem.  I’ve written on that previously (see “Plus Four: The Role of Statistics in Mathematics Education” at http://www.devmathrevival.net/?p=976).

The study uses ‘mainstreaming’ to describe the statistics sections in the experiment; I find that an interesting, and perhaps better, phrase than ‘co-requisite’, though it’s unlikely that policy makers will move to a different phrase.

The authors of this study conclude that many students who place into elementary algebra could take college-level math (represented by statistics in their study) with additional support.  The problem is that they never dealt with the connection question:  How much algebra does a student need to know in order to succeed in basic statistics?  The analysis I am aware of suggests “not much”; in the Statway™ program, most of the remediation is in the domains of numeracy and proportional reasoning … very limited algebra.

This is the basic problem posed in all of the ‘research’ on co-requisite remediation:  students are placed into low-algebra courses (statistics, liberal arts math), and … when they generally succeed … the proclamation is that ‘co-requisite remediation works!’.  That’s not what is happening at all.  Mostly, what the research is ‘proving’ is that those particular college ‘math’ courses had an inappropriate prerequisite of algebra (beginning or intermediate).

Part of our responsibility is to explain to non-math experts what the relationships are between various math courses, using language and concepts that they can understand while preserving fidelity with our own work.  We need to make sure that policy makers understand that it is not an issue of us ‘not wanting to change’ … the issue is that we have a different understanding of the problem and potential solutions.  In many colleges, the math department is already ahead of where the policy makers want us to ‘go’.

I encourage you to read this study thoroughly.  Because it uses a ‘control’ and ‘random assignment’ design, this study is likely to become a star for policy makers.  We need to understand the study and provide a better interpretation.


Where Dreams go to Thrive … Part III (More Evidence)

The leading cause of bad policy decisions is the phrase “Research clearly shows …”, which suggests that all of us should accept one interpretation of some unnamed set of ‘research’ (most of which is not research at all).  Understanding the needs of students not prepared for college mathematics is a long-term process, involving prolonged conversations among professionals as we attempt to understand what the data and the research say about our work and our students.

My goal is to present another scientific research article on the impacts of developmental education — remedial mathematics in particular.  The article, by Bettinger & Long, is titled “Addressing The Needs Of Under-Prepared Students In Higher Education: Does College Remediation Work?”, and you can download it at http://www.nber.org/papers/w11325.pdf

This research is based on a large sample of students in Ohio.  The strategy is to adjust for the selection bias that is so strong in all studies on remediation — students referred to remediation tend to have both lower specific skills (math) and more academic challenges.  The authors define a series of variables for this purpose, and eventually calculate a ‘local average treatment effect’ (LATE), which is partially based on the fact that cutoffs for remediation vary significantly among the 45 institutions of higher education in the data.  The analysis of “LATE” involved a restriction on the sample — towards the middle, where the cutoffs have more impact; this analysis excludes the weakest students (roughly 10% of the overall sample).
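To make the LATE logic concrete, here is a minimal simulation sketch; it is not the authors’ model or data.  Everything in it (the variable names, effect sizes, and the simple two-stage estimator) is hypothetical, chosen only to show how institution-to-institution variation in cutoffs can separate the effect of remediation from the selection bias described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Hypothetical data-generating process (illustrative only, not the Ohio data)
ability = rng.normal(size=n)                     # unobserved preparation
score = ability + rng.normal(size=n)             # observed placement test score
cutoff = rng.choice([-0.5, 0.0, 0.5], size=n)    # institution-specific remediation cutoffs
assigned = (score < cutoff).astype(float)        # placement rule says "take remediation"

# Imperfect compliance: most, but not all, students follow the placement rule
u = rng.random(n)
remediated = np.where(assigned == 1, u < 0.85, u < 0.10).astype(float)

# Outcome index: driven mostly by preparation, plus a true +0.10 benefit of remediation
outcome = 0.3 * ability + 0.1 * remediated + rng.normal(scale=0.5, size=n)

def ols(X, y):
    """Ordinary least squares coefficients."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)

# Naive comparison: biased, because remediated students start with lower preparation
b_naive = ols(np.column_stack([ones, remediated]), outcome)[1]

# Two-stage (IV-style) estimate: the placement rule, which varies by institution,
# instruments for actually taking remediation, controlling for the test score
first_stage = np.column_stack([ones, assigned, score])
rem_hat = first_stage @ ols(first_stage, remediated)      # predicted remediation
b_late = ols(np.column_stack([ones, rem_hat, score]), outcome)[1]

print(f"naive estimate of remediation effect: {b_naive:+.3f}")   # typically negative
print(f"IV / LATE-style estimate:             {b_late:+.3f}")    # close to the true +0.10
```

In the sketch, the naive comparison understates the benefit of remediation because remediated students start with weaker preparation; the instrumental-variable step, which relies only on the placement rule, recovers something close to the simulated effect.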

Key Finding #1: Equal Outcomes for those in Remediation
For outcomes such as dropping out and degree completion (bachelor’s), students who had remediation achieved outcomes similar to those who did not, once the selection bias was accounted for.

Key Finding #2: For those most impacted by remediation cutoffs, outcomes are improved
The “LATE” analysis showed that remedial students had a lower rate of dropping out and a higher rate of degree completion compared to similar students without remediation.  The authors regard this as an accurate (perhaps even conservative) estimate of the benefits of remediation.

Here is a nice quote from their summary:

We estimate that students in remediation have better educational outcomes in comparison to students with similar backgrounds and preparation who were not required to take the courses.  [pg 19]

The research also explored the impact of remediation on student interest (as measured by type of major); you might find that discussion interesting, though it is not directly related to the question of ‘thrive’ in remedial math.  I say that because the initial data on majors was taken from the survey attached to the ACT exam — usually completed long before a student examines the actual choices at the college where they enroll.  The authors do find an interaction between remediation and changing type of majors (specifically, changing out of math-related majors).

This study, like the others I’ve listed lately, provides a different picture of developmental mathematics than we hear in the loud conversations by policy makers (Complete College America, for example) and proponents of ‘co-requisite remediation’.  Those external forces almost always refer to ‘research’ that is simple (few variables) and aggregated; they have not dealt with the selection bias problem at all.  If you read the pronouncements carefully, you’ll notice that the biggest evidence of our failure in remedial mathematics is the large group of students who never attempt their remedial math course(s); this ‘damning conclusion’ is presented without any evidence that the nature of the remedial math courses had any causative connection to that lack of attempt.

As professionals, it is our job to both learn about the valid research on our work (the good and not-so-good) and to inform others about what this research says.

Evidence exists which truly does indicate that remedial mathematics is where dreams go to thrive.


Dev Math: Where Dreams go to Thrive … Part II (Evidence)

Developmental mathematics is where dreams go to thrive; we have evidence that even the traditional courses help students succeed in college.  The narrative suggested by external political forces is often based on a simplistic view of students which is out of touch with reality.  Let’s help by spreading the word on a more complete understanding.

Students who need to take developmental math courses have a wide range of remediation needs.  Peter Bahr’s study on pathways with single or multiple domains of deficiency (http://www.devmathrevival.net/?p=2458) concluded that basic college outcomes (such as earning a degree) are equivalent for students who needed remediation and for students who did not.

A totally different analysis, by Attewell et al. 2006 (see http://knowledgecenter.completionbydesign.org/sites/default/files/16%20Attewell%20JHE%20final%202006.pdf), also reaches a conclusion of equal results between groups in many ways.  Many studies of remediation are simple summaries of enrollment and grades over a short period of time.  The Attewell research, by contrast, was based on a longitudinal study begun with 8th graders in 1988 (thus the acronym “NELS:88”) done by the National Center for Education Statistics.  Over a 12-year period, the study collected high school and college information as well as additional tests and surveys on this sample.

A key methodology in this research is ‘propensity matching’ — using other variables to predict the probability of an event and then using this probability to analyze key data.  For example, high school courses and grades, along with tests, were used to calculate the probability of needing remediation in college … where one sample of students with given probabilities did not take any remediation while another sample did.  A curious finding in the results is that low-SES and high-SES students have equal enrollment rates in remedial math when ‘propensity matched’.
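As a concrete illustration of the method (not the NELS:88 analysis itself), here is a minimal propensity-matching sketch.  The variables, effect sizes, and the simple nearest-neighbor matching step are all hypothetical; the point is only to show how comparing ‘similar students’ changes the picture relative to a raw comparison.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10000

# Hypothetical data (illustrative only, not NELS:88)
hs_gpa = rng.normal(size=n)                            # standardized high-school GPA
test = 0.7 * hs_gpa + rng.normal(scale=0.7, size=n)    # standardized test score
X = np.column_stack([hs_gpa, test])

# Weaker preparation means a higher chance of taking remediation (selection)
p_remed = 1.0 / (1.0 + np.exp(1.5 * hs_gpa + 1.0 * test))
remediated = (rng.random(n) < p_remed).astype(int)

# Degree completion depends on preparation, plus a small true benefit of remediation
logit = -0.7 + 0.8 * hs_gpa + 0.4 * test + 0.15 * remediated
degree = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Step 1: estimate each student's propensity to take remediation from observables
ps = LogisticRegression().fit(X, remediated).predict_proba(X)[:, 1]

# Step 2: match each remediated student to the non-remediated student with the
# closest propensity score (1-nearest-neighbor, with replacement)
treated = np.flatnonzero(remediated == 1)
control = np.flatnonzero(remediated == 0)
order = np.argsort(ps[control])
ps_ctrl = ps[control][order]
pos = np.clip(np.searchsorted(ps_ctrl, ps[treated]), 1, len(ps_ctrl) - 1)
pick = np.where(ps[treated] - ps_ctrl[pos - 1] <= ps_ctrl[pos] - ps[treated], pos - 1, pos)
matches = control[order][pick]

# Step 3: compare degree completion for the raw groups and for matched pairs
raw_gap = degree[treated].mean() - degree[control].mean()
matched_gap = degree[treated].mean() - degree[matches].mean()
print(f"raw gap (remediated minus not):  {raw_gap:+.3f}")      # looks bad for remediation
print(f"propensity-matched gap:          {matched_gap:+.3f}")  # near zero, slightly positive
```

The raw gap makes remediation look harmful only because remediated students start with weaker preparation; the matched comparison, like the propensity analysis described above, lands close to the small simulated benefit.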

Thrive: Key Result #1
Students taking remedial courses have a higher rate of earning a 2-year degree than students with similar propensity scores (for needing remediation) who do not take remedial courses.  Instead of comparing students who take remediation with the entire population, this study compared students taking remediation with similar students who did not take remediation.  The results favor remediation (34% versus 31%).

In the bachelor’s degree setting, the results go in the other direction — which the authors analyze in a variety of ways.  One factor is the very different approach to remediation in the two sectors (4-year colleges over-avoid remediation, while 2-year colleges slightly over-use it).  However, the time-to-degree for the two groups is very similar (4.97 years with remediation, 4.76 years without).

Thrive: Key Result #2
Students taking three or more remedial courses have only slightly reduced results.  This study shows a small decline for students needing multiple remedial courses: 23.5% earn a 2-year degree, versus 27.5% of similar students without multiple courses.  The Bahr study, using a local sample, produced equivalent results in this same type of analysis.

It’s worth noting that the results for multiple remedial courses are pretty good even before we use propensity matching: 25.9% complete a 2-year degree with multiple remedial courses, versus 33.1% without.  This clearly shows that dreams thrive in developmental mathematics, even among students with the largest need.

Thrive: Key Result #3
Students taking two or more remedial math courses have results almost equivalent to other students.  The predicted probability for students with multiple remedial math courses is 23.8%, compared to 26.7% for similar students without multiple remedial math courses.

Note that this study was based on data from prior to the reform movements in developmental mathematics.  Even then, the results were reasonably good and indicate that the remediation was effective at leveling the playing field.

Thrive: Key Result #4
This is the best of all: students who complete all of their math remediation have statistically equivalent degree completion (2-year) compared to similar students (34.0% versus 34.7%).

This result negates the common myth that taking multiple remedial math courses spells doom for students.  The data shows that this is not true, that completing math remediation does what it is meant to do — help students complete their degree.


I encourage you to take a look at this research; it’s likely that you will spot something important to you.  More than that, we should all begin to present a thrive narrative about developmental mathematics — because that is what the data is showing.
