
TBR and the Co-Requisite Fraud

Since many policy makers and academic leaders are telling us that we need to do (or consider) co-requisite remediation because of the results from the Tennessee Board of Regents (TBR), the TBR should release valid results … results which are consistent with direct observations by people within their system.  #TBR #Co-Requisite

Earlier this year, during a visit to my college, one of the TBR colleges shared its internal data for the past academic year.  This particular institution is not unusual in its academic setting, which is quite diverse.  Here is a summary of their pass rates:

  • Foundations (intermediate algebra): 61%
  • Math for Liberal Arts: 52%
  • Statistics (Intro): 40%

The TBR lists 51.7% as the completion rate for the same time period.  [See https://www.insidehighered.com/sites/default/server_files/files/TBR%20CoRequisite%20Study%20-%20Update%20Spring%202016%20(1).pdf]

Recently, I was able to have a short conversation with a mathematics faculty member within the TBR system.  The college administrator who visited earlier this year said that their mathematics faculty “would never go back” now that they have tried co-requisite remediation, suggesting that most faculty are now supporters.  The faculty member I talked with had some very strong language about the validity of the TBR data; the phrase “cooked the books” was used.  This internal voice certainly does not sound like a strong supporter, and suggests that there is deliberate tampering with the data.

There are two direct indicators of fraud in the TBR data.

  1. Intermediate Algebra (Foundations) was included in the data, even though it does not meet any degree requirement in the system.  [It is “college level” but does not count toward an AA or AS degree.]  Foundations had the highest pass rate at the visiting college; however, the TBR does not release course-by-course results.
  2. “Passing” is defined as a 1.0 or higher, even though the norm for general education is a 2.0 or higher.  Again, the TBR does not release the actual grade distribution.  The rate of D/1.0–1.5 grades varies but is often 10% or higher.
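The combined effect of these two choices can be illustrated with a small sketch.  Every grade below is invented (the TBR has not released its distributions), so the numbers show only the mechanics of the inflation, not actual Tennessee results:

```python
# Illustrative only: hypothetical grade distributions, NOT actual TBR data.
# Shows how (a) counting 1.0-1.5 ("D") grades as passing and (b) pooling a
# non-degree course (Foundations) with degree-applicable courses can raise
# an aggregate "completion" rate.

def pass_rate(grades, cutoff):
    """Fraction of grades at or above the cutoff (4.0 scale)."""
    return sum(1 for g in grades if g >= cutoff) / len(grades)

# Hypothetical 4.0-scale grades for three courses (invented numbers).
foundations  = [3.0, 2.5, 2.0, 1.5, 1.0, 3.5, 2.0, 1.0, 0.0, 2.5]  # non-degree
liberal_arts = [2.0, 1.0, 0.0, 3.0, 1.5, 2.5, 0.0, 2.0, 1.0, 3.0]
statistics   = [1.0, 0.0, 2.0, 3.0, 0.0, 1.5, 2.0, 0.0, 1.0, 2.5]

degree_courses = liberal_arts + statistics
all_courses = foundations + degree_courses

# Degree-applicable courses, conventional 2.0 (C) standard:
print(f"degree courses, C or better: {pass_rate(degree_courses, 2.0):.0%}")  # → 45%
# Add the non-degree course and lower the bar to 1.0 (D):
print(f"all courses, D or better:    {pass_rate(all_courses, 1.0):.0%}")     # → 80%
```

With these invented grades, the conventional measure lands far below the inflated one; the two reporting choices compound each other.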

The data is presented as students passing (implying a 2.0) a college math course (implying not developmental); the TBR’s reporting violates both of these conditions.  If the data were financial instead of academic, this practice would be called fraud … as when a corporation manages to report a large profit instead of the reality of a very small one.

Perhaps the TBR did not intentionally commit this fraud.  However, given that the leaders involved are experienced academics, that does not seem likely.  The errors I am seeing are too fundamental.

Of course, it is possible that both views from internal sources are incorrect.  I do not think that is as likely as the TBR data being incorrect.

My estimate of the ACTUAL completion rate of college math courses (liberal arts math and statistics) with a 2.0/C or higher:

30% to 40% completion of college mathematics in co-requisite remediation … NOT the 50% to 60% claimed by the TBR.

Whether or not I am correct in claiming fraudulent data reporting by the TBR, I am sure that the TBR needs to provide much better transparency in its reporting.  Developmental education is being attacked and fundamentally altered by policy makers and external influencers whose most common rationale is the statement “Co-requisite remediation has to be a good thing … that has been proven by the Tennessee data!”

Some readers may suggest that the wording of this post is overly dramatic and not in keeping with the norms of academic discourse.  I think the dramatic tone is quite warranted, considering the manner in which the TBR data has been used by Complete College America and others.  I agree that this post is not within the norms of academic discourse, but I believe the tone is entirely within the norms of the new reality of higher education:

Instead of discourse that builds over time upon prior results, we have allowed external influencers to determine the agenda for higher education.

If policy makers and leaders seek to push us in the direction they prefer and then use selected data to support this direction, then those policy makers and leaders can expect us to call them out for fraud and other inappropriate behavior.

It is time for the Tennessee Board of Regents to report their data in a way that allows the rest of us to examine the questions of ‘what is working’ in ‘which course’ under ‘what conditions’.

Enough of the fraud; it’s time to show us the truth about what, which, and conditions.

 Join Dev Math Revival on Facebook:

 

Trump Method: Complete College America

Whatever your political persuasion, I hope this comparison makes sense to you.  Most politicians use facts selectively, and it’s normal for candidates to repeat ‘information’ that fails the fact-checking process.  Mr. Trump is just a bit more extreme in his use of these strategies.  I’m actually not saying anything against “the Donald”.

However, the Trump Method is being employed by the folks at Complete College America (CCA).  The CCA is a change agent, advocating for a select set of ‘game changers’ … which are based on a predetermined conclusion about whether remedial education is a useful construct.  The CCA repeats the same information that fails the fact-checking process, much to the detriment of developmental education and community colleges in general.

It’s not that professionals in the field believe our traditional curriculum and methods are anywhere near what they should be.  I’ve talked with hundreds of teaching faculty over the past ten years about various constructs and methods; though we differ on eventual solutions and how to get there, we have a strong consensus that basic changes are needed in remedial mathematics.

However, the CCA brings its anvil-and-hammer communication … promising simple solutions to complicated problems (just like Mr. Trump).  A recent email newsletter carries this headline:

Stuck at Square One
College Students Increasingly Caught in Remedial Education Trap
[http://www.apmreports.org/story/2016/08/18/remedial-education-trap?utm_campaign=APM%20Reports%2020160902%20Weekend%20Listening&utm_medium=email&utm_source=Eloqua&utm_content=Weekend%20listening%3A%20New%20education%20documentaries]

Following up on this headline leads one to a profession-bashing ‘documentary’ about how bad things are.  Did you notice the word “increasingly”?  Things getting worse would clearly call for change … if only there were evidence of things getting worse.  Not only are the facts cited in the documentary old (some from 2004), but there is also no discussion of any change in the results.

Like “immigrants” for Mr. Trump, remedial education is a bad thing in the view of the CCA.  Since remedial education cannot be deported or locked up, the only option is to get rid of it.  The headline says that we ‘trap’ students in our remedial courses, as if we had criminal intent to limit students.  No evidence is presented that the outcomes are a ‘trap’; the word ‘trap’ is simply more negative than ‘limitations’ or ‘inefficient’ … never mind the lack of accuracy.

Some people have theorized that Mr. Trump appeals to less educated voters.  Who does the CCA material appeal to?  Their intended audience is not ‘us’ … it’s policy makers and state leaders.  These policy makers and state leaders are neither generally ignorant nor mean-spirited.  However, the CCA has succeeded in creating an atmosphere of panic relative to remedial education.  Because of the long-term repetition of simplistic conclusions (lacking research evidence), we have this situation at state-level groups and on college campuses:

Remedial education is a failure, because the CCA has data [sic].
Everybody is working on basic changes, and getting rid of stand-alone remediation.
We better get with the band-wagon, or risk looking like we don’t care (‘unpatriotic’).

This is why the CCA’s work is so harmful to community colleges.  Instead of academia and local needs driving changes, we have a ‘one size fits all’ mania sweeping the country.  Was this the intent of the CCA?  I doubt it; I think their intent was to destroy remediation as it’s been practiced in this country.  Under the right conditions, I could even work with the CCA on this goal: if ‘destroy’ involved a reasoned examination of all alternatives within the framework present at individual community colleges, with transparent use of data on results.

Sadly, the debate … the academic process for creating long-lasting change … has been usurped by the Trump Method of the CCA.  I can only hope that our policy makers and college leaders will rediscover proven methods of change; at that point, all of us can work together to create changes that both serve our students and have the stability to remain in place after the CCA is long gone.


Alignment of Remediation with Student Programs

My college is one of the institutions in the AACC Pathways Project; we’ve got a meeting coming up, for which we were directed to read some documents … including the famous (or infamous) “Core Principles” for remediation.  [See http://www.core-principles.org/uploads/2/6/4/5/26458024/core_principles_nov4.pdf]  In that list of Core Principles, this is #4:

Students for whom the default college-level course placement is not appropriate, even with additional mandatory support, are enrolled in rigorous, streamlined remediation options that align with the knowledge and skills required for success in gateway courses in their academic or career area of interest.

What does that word “align” mean?  It seems to be a key focus of this principle … and the principle also implies that colleges are failing if they cannot implement co-requisite remediation.  In earlier posts, I have shared data suggesting that stand-alone remediation can be effective; the issue is length of sequence, meaning that we cannot justify a sequence of 3 or 4 developmental courses (up to and including intermediate algebra).

In general, “align” simply means to put items in their proper position.  The ‘align’ in the Core Principles must mean something more than that … ‘proper position’ does not add any meaning to the statement.  [The principle already says ‘streamlined’ and later says ‘required for’.]  What do they really mean by ‘align’?

In the supporting narrative, the document actually talks more about co-requisite remediation than alignment.  That does not help us understand what was intended.

The policy makers and leaders I’ve heard on this issue often use this type of statement about aligning remediation:

The remediation covers skills and applications like those the student will encounter in their required math course.

In other words, ‘align’ means “restricted” … restricted to those mathematical concepts or procedures that the student will directly use in the required math course.  The result is that the remedial math course will consist of the same material included in the mandatory support course in the co-requisite model.  The authors, then, are saying that we need to do co-requisite remediation … or co-requisite remediation; the only choice is concurrent versus preceding.

If the only quantitative needs a student faced were restricted to the required math course, this might be reasonable.

I again find a basic flaw in this use of co-requisite remediation in its two flavors (concurrent, sequential): it fails to serve our fundamental charge to prepare students for success in their PROGRAM … not just one math course.  As long as the student’s program requires any quantitative work in courses such as these, the ‘aligned’ remediation will fail to serve student needs:

  • Chemistry
  • Physiology
  • Economics
  • Political science
  • Psychology
  • Basic Physics

Dozens of non-math courses on each campus have strong quantitative components.  Should we avoid remedial math courses just to get students through one required math course … and cause them to face unnecessary challenges in several other courses in their program?

In some rare cases, the required math course actually covers most of the quantitative knowledge a student needs for their program.  However, in my experience, the required math course only partially provides that background … or has absolutely no connection to those needs.

Whom does remediation serve?  Policy makers … or students?


Where Dreams go to Thrive … Part III (More Evidence)

The leading cause of bad policy decisions is the phrase “Research clearly shows … ” which suggests that all of us should accept one interpretation of some unnamed set of ‘research’ (most of which is not research at all).  Understanding the needs of students not prepared for college mathematics is a long-term process, involving prolonged conversations among professionals as we attempt to understand what the data and the research say about our work and our students.

My goal is to present another scientific research article on the impacts of developmental education — remedial mathematics in particular.  The article, by Bettinger & Long, is titled “Addressing The Needs Of Under-Prepared Students In Higher Education: Does College Remediation Work?”; you can download it at http://www.nber.org/papers/w11325.pdf

This research is based on a large sample of students in Ohio.  The strategy is to adjust for the selection bias that is so strong in all studies on remediation — students referred to remediation tend to have both lower specific skills (math) and more academic challenges.  The authors define a series of variables for this purpose, and eventually calculate a ‘local average treatment effect’ (LATE), which draws partly on the fact that cutoffs for remediation vary significantly among the 45 institutions of higher education in the data.  The LATE analysis involved restricting the sample toward the middle, where the cutoffs have more impact; this analysis excludes the weakest students (roughly 10% of the overall sample).
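The logic of a cutoff-based estimate can be sketched with synthetic data.  Everything below is invented — the +5 “true benefit”, the cutoffs, the score scale — purely to show why a naive comparison misleads and why comparing students near varying cutoffs helps; it does not reproduce Bettinger & Long’s actual model:

```python
# Synthetic illustration of selection bias vs. a near-cutoff ("LATE"-style)
# comparison.  All numbers are invented; this is NOT the paper's analysis.
import random

random.seed(1)

def simulate_student():
    score = random.uniform(0, 100)        # placement-test score
    cutoff = random.choice([40, 50, 60])  # institutions vary their cutoff
    remediated = score < cutoff
    # Outcome depends mostly on preparation (score); remediation adds an
    # invented true benefit of +5 for those who receive it, plus noise.
    outcome = score + (5 if remediated else 0) + random.gauss(0, 10)
    return score, remediated, outcome

students = [simulate_student() for _ in range(20000)]

def mean(xs):
    return sum(xs) / len(xs)

# Naive comparison: remediated students look far worse, because they
# started with much lower scores (selection bias).
naive_gap = (mean([o for s, r, o in students if r])
             - mean([o for s, r, o in students if not r]))

# Restrict to the middle band, where differing cutoffs put similar
# students on both sides of the remediation line.
band = [(s, r, o) for s, r, o in students if 40 <= s < 60]
late_gap = (mean([o for s, r, o in band if r])
            - mean([o for s, r, o in band if not r]))

print(f"naive gap (remediated minus not): {naive_gap:+.1f}")  # strongly negative
print(f"near-cutoff gap:                  {late_gap:+.1f}")   # positive
```

In this invented data the naive gap is large and negative even though remediation genuinely helps; the near-cutoff comparison recovers a positive effect, which mirrors (loosely) why the paper’s LATE estimate can differ so sharply from simple aggregated pass-rate comparisons.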

Key Finding #1: Equal Outcomes for those in Remediation
For outcomes such as dropping out and degree completion (bachelor’s) students who had remediation achieved similar outcomes to those who did not, once the selection bias was accounted for.

Key Finding #2: For those most impacted by remediation cutoffs, outcomes are improved
The “LATE” analysis showed that remedial students had a lower rate of dropping out and a higher rate of degree completion compared to similar students without remediation.  The authors regard this as an accurate (perhaps even conservative) estimate of the benefits of remediation.

Here is a nice quote from their summary:

We estimate that students in remediation have better educational outcomes in comparison to students with similar backgrounds and preparation who were not required to take the courses.  [pg 19]

The research also explored the impact of remediation on student interest (as measured by type of major); you might find that discussion interesting, though it is not directly related to the question of ‘thrive’ in remedial math.  I say that because the initial major data came from the survey attached to the ACT exam — usually completed long before a student examines the actual choices at the college where they enroll.  The authors do find an interaction between remediation and changing type of major (specifically, changing out of math-related majors).

This study, like the others I’ve listed lately, provides a different picture of developmental mathematics than we hear in the loud conversations of policy makers (Complete College America, for example) and proponents of ‘co-requisite remediation’.  Those external forces almost always refer to ‘research’ that is simple (few variables) and aggregated; they have not dealt with the selection-bias problem at all.  If you read the pronouncements carefully, you’ll notice that the biggest evidence of our failure in remedial mathematics is the large group of students who never attempt their remedial math course(s); this ‘damning conclusion’ is presented without any evidence that the nature of the remedial math courses had any causative connection to that lack of attempt.

As professionals, it is our job to both learn about the valid research on our work (the good and not-so-good) and to inform others about what this research says.

Evidence exists which truly does indicate that remedial mathematics is where dreams go to thrive.

