Multiple Measures: How Consistent are ACT Math and Accuplacer?

Like many institutions, mine allows students to place into a math course via a variety of methods.  The most common methods are the ACT Math score and the Accuplacer College Level Math (CLM) test.  I ran into a reference to a university that concluded the ACT Math score was not a reliable predictor.

So, I’m posting a quick summary of how those two instruments agree (or not).  As part of our normal program improvement and curricular work, I have gathered information on about 800 students who were enrolled in our pre-calculus course.  Obviously, this is not a random sample of all ACT Math and all CLM scores.  However, given the selection, the two instruments should have a reasonable amount of consistency.

There were 122 students with both ACT Math and CLM scores.  Of these:

  • 74 had scores on both that produce the same course placement (61%)
  • 48 had scores such that different course placements result (39%)

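As a quick check on the arithmetic above, a minimal sketch using the reported counts:

```python
# Agreement between ACT Math and Accuplacer CLM course placements,
# using the counts reported above.
agree = 74     # same course placement from both instruments
disagree = 48  # different course placements
total = agree + disagree

print(total)                          # 122 students with both scores
print(round(100 * agree / total))     # 61 (% agreement)
print(round(100 * disagree / total))  # 39 (% disagreement)
```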
The vast majority of the ‘disagreement’ involved a higher ACT Math placement than CLM placement.  A quick comparison shows that students who place based on ACT Math have a lower pass rate than those who place based on the CLM.  I’ve got some more work to do in analyzing the data before forming a hypothesis about that pattern.

For that sample of 122 students with both scores, there is a statistically significant correlation (about 0.32).  That correlation is somewhat limited by the sample, which emphasizes relatively high scores (a skewed, range-restricted distribution).  Even with that limitation, I was concerned about the small size of the correlation … I’d expect a ‘native’ correlation (all data) of about 0.7, and a reduction to 0.5 would be reasonable given the restricted sample.  That 0.32 is quite small for these two measures.
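The attenuation described here can be quantified with Thorndike's Case II correction for direct range restriction.  A minimal sketch; the standard-deviation ratios below are hypothetical, since the post does not report the restricted and unrestricted score spreads:

```python
import math

def correct_range_restriction(r_obs, sd_ratio):
    """Thorndike Case II correction for direct range restriction.

    r_obs    : correlation observed in the restricted sample
    sd_ratio : SD(unrestricted) / SD(restricted) on the selection variable
    """
    u = sd_ratio
    return (r_obs * u) / math.sqrt(1 - r_obs**2 + (r_obs**2) * (u**2))

r_obs = 0.32  # correlation observed in the 122-student sample

# HYPOTHETICAL SD ratios, for illustration only:
for u in (1.25, 1.5, 2.0):
    print(u, round(correct_range_restriction(r_obs, u), 2))
# 1.25 -> 0.39,  1.5 -> 0.45,  2.0 -> 0.56
```

Even under a fairly strong assumed restriction (an SD ratio of 2), the corrected value stays well below the ~0.7 one might expect for two math placement instruments, which is consistent with the concern above.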

Most of us use “alternate measures” (this method OR that method); low consistency between the methods means the ‘or’ rule increases our placement error rates, since a student can place via whichever instrument gives the more favorable result.  If the low consistency holds up in further analysis, we should either use the single most reliable predictor … or true multiple measures, where some combination of data determines the course placement.

I began looking at our data because I could not find studies looking at the correlation and relative placement strength of our two measures.  If you are aware of a study that provides that type of information, I’d appreciate hearing about it.

 Join Dev Math Revival on Facebook:

TBR and the Co-Requisite Fraud

Since many policy makers and academic leaders are telling us that we need to do (or consider) co-requisite remediation because of the results from the Tennessee Board of Regents (TBR), the TBR should release valid results … results which are consistent with direct observations by people within their system.  #TBR #Co-Requisite

Earlier this year, one of the TBR colleges shared its internal data for the past academic year during a visit to my college.  This particular institution is not unusual in its academic setting, which is quite diverse.  Here is a summary of its pass rates by course:

  • Foundations (intermediate algebra): 61%
  • Math for Liberal Arts: 52%
  • Statistics (Intro): 40%
The TBR lists 51.7% as the completion rate for the same time period.  [See]

Recently, I was able to have a short conversation with a mathematics faculty member within the TBR system.  The college administrator who visited earlier this year had said that their mathematics faculty “would never go back” now that they have tried co-requisite remediation, suggesting that most faculty are now supporters.  The faculty member I talked with, however, used some very strong language about the validity of the TBR data; the phrase “cooked the books” came up.  That internal voice certainly does not sound like a strong supporter, and it suggests deliberate tampering with the data.

There are two direct indicators of fraud in the TBR data.

  1. Intermediate Algebra (Foundations) was included in the data, even though it does not meet any degree requirement in the system.  [It is “college level”, but does not count toward an AA or AS degree.]  Foundations had the highest pass rate at the visiting college; however, the TBR does not release course-by-course results.
  2. “Passing” is defined as a 1.0 or higher, even though the norm for general education is a 2.0 or higher.  Again, the TBR does not release the actual grade distribution.  The rate of D/1.0-1.5 grades varies but is often 10% or higher.

The data is presented as passing (implied 2.0 or higher) a college math course (implied non-developmental); the TBR violates both of these conditions.  If the data were financial instead of academic, this would be called fraud … as when a corporation manages to report a large profit instead of the reality of a very small one.

Perhaps the TBR did not commit this fraud intentionally.  However, given that the leaders involved are experienced academics, that does not seem likely; the errors I am seeing are too fundamental.

Of course, it is possible that both views from internal sources are incorrect.  I do not think that is as likely as the TBR data being incorrect.

My estimate of the ACTUAL completion rate of college math courses (liberal arts math and statistics) with a 2.0/C or higher:

30% to 40% completion of college mathematics in co-requisite remediation … NOT the 50% to 60% claimed by the TBR.
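To see how an estimate in that range could follow from the reported numbers, a rough back-of-envelope sketch; the 10% share of sub-2.0 passing grades is an assumed illustrative value drawn from the range mentioned earlier, not a TBR-reported figure:

```python
claimed = 51.7  # TBR-reported completion rate (%)
d_share = 10.0  # ASSUMED share of D/1.0-1.5 grades (%), per the range cited above

# D-range grades count as "passing" under the TBR's 1.0 threshold,
# so removing them from the claimed rate approximates a 2.0+ standard.
after_grades = claimed - d_share
print(round(after_grades, 1))  # 41.7 (% passing with a 2.0 or higher)
```

Further excluding Foundations, which does not apply to AA/AS degrees and had the highest pass rate at the visiting college, would pull the figure lower still, toward the 30% to 40% range.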

Whether or not I am correct in claiming fraudulent data reporting from the TBR, I am sure that the TBR needs to provide much better transparency in its reporting.  Developmental education is being attacked and fundamentally altered by policy makers and external influencers whose most common rationale consists of the statement “Co-requisite remediation has to be a good thing … that has been proven by the Tennessee data!”.

Some readers may suggest that my wording of this post is overly dramatic and not in keeping with the norms of academic discourse.  I think the dramatic tone is quite warranted considering the manner in which the TBR data has been used by Complete College America and others.  I agree that this post is not within the norms of academic discourse, but I believe that the tone is totally within the norms of the new reality of higher education:

Instead of discourse, over time, building upon prior results, we have allowed external influencers to determine the agenda for higher education.

If policy makers and leaders seek to push us in the direction they prefer and then use selected data to support this direction, then those policy makers and leaders can expect us to call them out for fraud and other inappropriate behavior.

It is time for the Tennessee Board of Regents to report their data in a way that allows the rest of us to examine the questions of ‘what is working’ in ‘which course’ under ‘what conditions’.

Enough of the fraud; it’s time to show us the truth about what is working, in which courses, and under what conditions.



Looking for new Textbooks? Be an Author!

Are you looking for a math literacy book that is different from those available now?  Are you looking for any algebraic literacy textbook?

These books get written by people who want to teach the courses.  We understand the goals of the courses, the type of content that should be present, and how to present this material so that students can succeed.

Perhaps you are somebody who might be interested in writing either a Math Literacy or an Algebraic Literacy text … either by yourself or as part of a writing team.   If so, you can certainly approach any publishing company to start the process.

In particular, Pat McKeague of XYZ Textbooks is willing to work with potential authors of textbooks for our new courses.  He is excited about developing more textbook choices for us, while providing materials to students at a lower cost.  When XYZ publishes a textbook, they do some of the wrap-around work (such as videos).

I appreciate Pat’s support of our work and his willingness to work with authors.  If you are interested in learning more, contact Pat at

The textbooks should be a close approximation to the course goals & outcomes:

Clearly, the intent is that any textbook focus on understanding and reasoning.  The level of “context” and “small group work” can vary (though both should always be part of the package); some of this could be left up to the instructor using the materials.

If you have questions, feel free to contact me!


Algebraic Literacy Presentation (AMATYC 2016)

The presentation includes some additional data on HS math course-taking trends over a 25-year period … which definitely impacts our college mathematics curriculum.

Here are the documents for the session:

Slides:  a-bridge-to-somewhere-amatycalgebraic-literacy-sample-lesson-rate-of-change-exponential-2016

References: references-bridge-to-somewhere-amatyc-2016

Algebraic Literacy Goals & Outcomes: algebraic-literacy-goals-and-outcomes-oct2016-cross-referenced

Sample Lessons:
Trig Basics: algebraic-literacy-sample-lesson-trig-functions-basics-2x
Rational Exponents:  algebraic-literacy-sample-lesson-rational-exponents-stem-boosting
Rates of Change: algebraic-literacy-sample-lesson-rate-of-change-exponential

