Category: placement

Multiple Measures: How Consistent are ACT Math and Accuplacer?

Like many institutions, mine allows students to place into a math course via a variety of methods.  The most common methods are the ACT Math score and the Accuplacer College Level Math (CLM) test.  I ran into a reference to a university which concluded that the ACT Math score was not a reliable predictor.

So, I’m posting a quick summary of how those two instruments agree (or not).  As part of our normal program improvement and curricular work, I have gathered information on about 800 students who were enrolled in our pre-calculus course.  Obviously, this is not a random sample of all ACT Math and all CLM scores.  However, given the selection, the two instruments should have a reasonable amount of consistency.

There were 122 students with both ACT Math and CLM scores.  Of these:

  • 74 had scores that produced the same course placement (61%)
  • 48 had scores that produced different course placements (39%)
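For the record, the percentages above are simple proportions of the 122 matched records; a quick sketch in Python:

```python
# Agreement between ACT Math and Accuplacer CLM placements for the
# 122 students with both scores (figures from the text above).
both_scores = 122
same_placement = 74       # both instruments yield the same course
different_placement = 48  # the instruments yield different courses

agree_pct = round(100 * same_placement / both_scores)        # 61
differ_pct = round(100 * different_placement / both_scores)  # 39
print(f"{agree_pct}% agree, {differ_pct}% disagree")
```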

The vast majority of the ‘disagreement’ involved a higher ACT Math placement than CLM placement.  A quick comparison shows that students placing based on ACT Math have a lower pass rate than those who place based on the CLM.  I’ve got some more work to do in analyzing the data before identifying a hypothesis about that pattern.

For that sample of 122 students with both scores, there is a significant correlation (about 0.32).  That correlation is somewhat limited by the sample, which tends to emphasize relatively high scores (skewed distribution).  Even with that limitation, I was concerned about the small size of the correlation … I’d expect a ‘native’ correlation (all data) of about 0.7, and a reduction to 0.5 would be reasonable given the skewed sample.  That 0.32 is pretty small for these two measures.
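The attenuation from a skewed (range-restricted) sample can be illustrated with a short simulation; the 0.7 ‘native’ correlation and the truncation point below are assumptions for illustration, not estimates from our actual data:

```python
import numpy as np

# Simulate two test scores with a true ('native') correlation of 0.7,
# then keep only a high-scoring subsample -- mimicking a sample that
# emphasizes relatively high scores. Restricting the range attenuates
# the observed correlation.
rng = np.random.default_rng(0)
n = 100_000
r_true = 0.7
cov = [[1.0, r_true], [r_true, 1.0]]
scores = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Full-range correlation recovers roughly 0.7.
r_full = np.corrcoef(scores[:, 0], scores[:, 1])[0, 1]

# Keep only students scoring well above average on the first measure.
kept = scores[scores[:, 0] > 0.5]
r_restricted = np.corrcoef(kept[:, 0], kept[:, 1])[0, 1]
print(round(r_full, 2), round(r_restricted, 2))
```

With a truncation point half a standard deviation above the mean, the observed correlation drops to roughly the 0.4–0.5 range, which is why a 0.5 would be unsurprising for this sample while 0.32 is notably low.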

Most of us use “alternate measures” (this method OR that method); low consistency between methods means our error rates will increase with the ‘or’.  If the low consistency holds up in further analysis, we should either use the most reliable predictor … or true multiple measures where we use some combination of data to determine a course placement.
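To see why the ‘or’ combination inflates error rates, consider a small simulation; the noise levels and cutoff are invented for illustration, with two noisy measures of the same underlying readiness and placement granted if either score clears the cutoff:

```python
import numpy as np

# A minimal sketch (invented noise levels and cutoff, not our actual data):
# two noisy measures of the same underlying readiness. The 'or' rule lets
# through more students who are not actually ready than either measure alone.
rng = np.random.default_rng(1)
n = 100_000
ability = rng.normal(0.0, 1.0, n)           # true readiness
test_a = ability + rng.normal(0.0, 0.8, n)  # noisy measure 1
test_b = ability + rng.normal(0.0, 0.8, n)  # noisy measure 2

cutoff = 0.0
ready = ability > cutoff                          # actually ready
pass_a = test_a > cutoff                          # single-measure rule
pass_or = (test_a > cutoff) | (test_b > cutoff)   # 'alternate measures' rule

def false_positive_rate(placed):
    # Share of placed students who are not actually ready.
    return (placed & ~ready).sum() / placed.sum()

fp_a = false_positive_rate(pass_a)
fp_or = false_positive_rate(pass_or)
print(round(fp_a, 3), round(fp_or, 3))
```

In this sketch the share of misplaced students is consistently higher under the ‘or’ rule, and the gap widens as the two measures agree less.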

I began looking at our data because I could not find studies looking at the correlation and relative placement strength of our two measures.  If you are aware of a study that provides that type of information, I’d appreciate hearing about it.

 Join Dev Math Revival on Facebook:

Mathematical Literacy WITHOUT a Prerequisite

Starting this Fall (August 2016) my department will begin offering a second version of our Mathematical Literacy course.  Our original Math Lit course has a prerequisite similar to beginning algebra (it’s just a little lower).  The new course will have NO math prerequisites.

So, here is the story: Last year, we were asked to classify each math course as “remedial, secondary level” or “remedial, elementary level” or neither.  This request originated with the financial aid office, which is charged with implementing federal regulations that use those classifications.  Our answer was that our pre-algebra course was “remedial, elementary level” because the overwhelming majority of the content corresponded to the middle of the elementary range (K-8).  We used the Common Core and the state curriculum standards for this determination, though the result would be the same with any reference standard.

Since students cannot count “remedial, elementary level” toward their financial aid enrollment status, our decision had a sequence of consequences.  One of those results was that our pre-algebra course was eliminated; our last students to ever take pre-algebra at my college finished the course this week.

We could not, of course, leave the situation like that — we would have no option for students who could not qualify for our original Math Literacy course (hundreds of students per year).  Originally, we proposed a zero credit replacement course.  That course was not approved.

Our original Math Literacy course is Math105.  We (quickly!) developed a second version … Math106 “Mathematical Literacy with REVIEW”.  Math106 has no math prerequisite at all.  (It’s actually got a maximum, not a minimum … students who qualify for beginning algebra cannot register for Math106.)  The only prerequisites for Math106 are language skills — college level reading (approximately) and minimal writing skills.

Currently, we are designing the curriculum to be delivered in Math106.  We are starting with some ‘extra’ class time (6 hours per week instead of 4) and hope to have tutors in the classroom.  Don’t ask how the class is going because it has not started yet.  I can tell you that we are essentially implementing the MLCS course with coverage of the prerequisite skills, based on the New Life Project course goals & outcomes.

We do hope to do a presentation at our state affiliate conference (MichMATYC, at Delta College on October 15).  I would have submitted a presentation proposal for AMATYC, but all of the work on Math106 occurred well after the deadline of Feb 1.

One of the reasons I am posting this is to say: I am very proud of my math colleagues here at LCC who are showing their commitment to students with courage and creativity.  We will deliver a course starting August 25 which did not exist anywhere on February 1 of this year.


Placement in Math … What’s the GOAL?

So, I’ve been thinking quite a bit about college placement.  People are advocating ‘multiple measures’ (often done as ‘alternative measures’); Accuplacer is announcing “The Next Generation Accuplacer” during their summer conference (https://accuplacer.collegeboard.org/professionals/national-conference/concurrent-session-highlights).  There are people who believe that all placement tests are ‘evil’ (including some professional friends).  The placement tests are a tool used in a process … do we agree on the goal of that process?

At the highest level of analysis, there are 3 competing goals for math placement:

  1. Which course best reflects the student’s current skills?
  2. What is the highest course in which the student has a reasonable chance of success?
  3. What skills does a student have now, or lack now?

We often make the mistake of equating these 3 goals; they are separate ‘hypotheses’, and the outcome for a given student can be quite different.

My perception is that we, in the mathematics community, have tended to fixate on the first and last goal (skills oriented placement).  To consider how different the 3 goals are, consider this student prototype:

Cassie can solve basic equations (linear, and quadratic by factoring), with a sound understanding of polynomial basics.  Cassie knows some function basics, including graphing; however, she lacks flexibility in using slope.  Cassie’s main weakness is fractions, both arithmetic and algebraic: she has some memorized rules that help in some cases but produce wrong answers in general.

Goal #1 tends to find a ‘mean skill level’; since Cassie lacks some arithmetic, this mean is likely to be beginning algebra.  Goal #2 tends to find a ‘floor’ level of functioning, a minimum for success in the maximum course; depending on her program, Cassie would end up in Algebraic Literacy/Intermediate Algebra or a College Level course.  Goal #3 tends to push students down to the lowest level of gaps in skills [goal #3 is used in modular and emporium designs]; Cassie would end up in an arithmetic or pre-algebra class.

Many of us are drawn to the two skill-based goals, and that is a problem.  The first goal, with its focus on ‘mean skill level’, tends to under-place students (as seen by outsiders); more importantly, students always start in a course with less challenge and more review than they need.  The third goal creates the longest sequence of learning for every student.  Both of the skill-based goals operate from the assumption that reviewing skills produces the intended results; in 40+ years in the business, I have only seen modest effect sizes for review courses in general.

The second goal has more potential to both serve our students well and provide faculty with reasonable teaching challenges.  This goal operates from a viewpoint that skills are general indicators of capabilities, and it is these capabilities that allow a student to pass math courses.  The prototype student, Cassie, is based on a student I had in beginning algebra … she is currently taking a test in my intermediate algebra class; Cassie has been very pleasant about the two courses, but I think she’s wasted a semester of math.  [My department tries to do goal #2, but our operational system produces #1 in most cases.]

Of course, there are many basic connections between the placement system goals and the design of our curriculum.  If we already had Algebraic Literacy, my student (Cassie) could have succeeded in that course in her first semester.  Our traditional curriculum (basic math, pre-algebra, beginning algebra, etc) tends to be consistent with goal #3 (the maximum number of developmental math courses).

One of the steps we often take when setting cutoffs for placement is that we ask other colleges what they use … or we see a reference online or in manuals.  It’s very rare for the goal of placement to be shared when we get this information; for years, all assumed that we shared one goal (one of the skill-based goals).

We live in a period of change in both our math courses and in the placement system.  Our developmental math courses are trending towards Math Literacy & Algebraic Literacy; the placement tests are undergoing similar shifts.  [The new Accuplacer tests will be less skill-based, more diverse, and include some reasoning.]  These changes provide us with an opportunity to re-focus our goal in placing students, and I am hoping we can develop a professional consensus on a goal of placement:

What is the highest course in which the student has a reasonable chance of success?

I am hoping that the changing tests (Accuplacer Next-Gen) will allow my institution to produce results consistent with this goal.  My draft of a Mathematical Literacy Placement Assessment is written with this goal in mind (https://www.devmathrevival.net/?p=2480).

The big question, though, is what our profession does!

 

Placement for a Modern Curriculum: A Fresh Start

The college mathematics profession has been dealing with a number of criticisms of how students are placed in their initial mathematics course.  Even before recent curricular changes, evidence suggested that the tests used for placement were likely to under-place students; a modern curriculum focusing on reasoning as well as procedure increases that placement struggle.  I’ve been working on a fresh start, and I am ready to share an initial version of a new assessment that might help solve both dimensions of the problem.

First, we need to understand the legacy of the current placement tests.  The genetic background of both Accuplacer and Compass is firmly grounded in the basic skills movements of the 1980s.  Specifically, Accuplacer grew out of instruments such as the New Jersey Basic Skills test.  The primary goal during that era was “fix all of the skills that students might lack”; the only reasoning typically included was word problems, with perhaps a bit of estimation.

Our placement needs have shifted greatly since that time.  Courses like Mathematical Literacy (and siblings Quantway [Carnegie Foundation] and Foundations of Mathematical Reasoning [Dana Center]) depend less on procedural skills like those found on placement tests; rather, the issues lie in the answers to this question:

What mathematics is this student capable of learning now?

As a coincidental benefit of the New Life Project, we have information on what these prerequisite abilities should be.  In the design of Mathematical Literacy (ML) and Algebraic Literacy (AL), our teams articulated a list of these prerequisite outcomes:

  • Mathematical Literacy prerequisite outcomes:
    1. Understand various meanings for basic operations, including relating each to diverse contextual situations
    2. Use arithmetic operations to solve stated problems (with and without the aid of technology)
    3. Order real numbers across types (decimal, fractional, and percent), including correct placement on a number line
    4. Use number sense and estimation to determine the reasonableness of an answer
    5. Apply understandings of signed numbers (integers in particular)
  • Algebraic Literacy prerequisite outcomes:
    1. Understand proportional relationships in a variety of settings, including paired data and graphs
    2. Apply properties of algebraic expressions, including distributing, like terms, and integer exponents
    3. Construct equations and inequalities to represent relationships
    4. Understand how to solve linear equations by reasoning
    5. Understand how to write and use linear and exponential functions

Using these 10 outcomes, I’ve written a “Mathematical Literacy Placement Assessment” (MLPA).  The draft 0.5 of the MLPA has 30 items, with slightly more than half on the AL prerequisites.  Here is the MLPA:

Mathematical Literacy Placement Assessment 2016 version 0.5

Note two things about the MLPA:  (1) the copyright license is Creative Commons by Attribution, which allows others to both use and modify the material as long as attribution is provided (the owner is Jack Rotman); (2) the MLPA has not been validated in any way … any use should begin with preliminary validation followed by improvements to the assessment.

The intent of the MLPA is to provide a modern assessment of students who might need a Math Literacy course.  The initial 12 items are prerequisites to that course, so a score on those 12 provides a measure (with reliability and validity to be determined) of a student’s readiness for Math Literacy.  The other 18 items are intended to assess whether the student needs the Math Literacy course; a score on those 18 items would indicate whether the student is ready for an Algebraic Literacy course (or possibly intermediate algebra).
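The two-section structure lends itself to a simple scoring rule.  Here is a hypothetical sketch of such a rule in Python; the cutoff scores are invented placeholders, since real cutoffs would have to come from validation:

```python
# Hypothetical scoring sketch for the MLPA structure described above:
# 30 items, the first 12 assessing Math Literacy prerequisites and the
# remaining 18 assessing readiness beyond Math Literacy. The cutoffs
# (ml_cutoff, al_cutoff) are invented placeholders, not validated values.
ML_ITEMS = range(0, 12)    # items 1-12: Math Literacy prerequisites
AL_ITEMS = range(12, 30)   # items 13-30: Algebraic Literacy prerequisites

def mlpa_placement(responses, ml_cutoff=8, al_cutoff=12):
    """responses: list of 30 booleans (True = item answered correctly)."""
    ml_score = sum(responses[i] for i in ML_ITEMS)
    al_score = sum(responses[i] for i in AL_ITEMS)
    if al_score >= al_cutoff:
        return "Algebraic Literacy"
    if ml_score >= ml_cutoff:
        return "Mathematical Literacy"
    return "additional preparation"

# A student strong on the first 12 items but weak on the remaining 18
# would place into Math Literacy under this rule.
print(mlpa_placement([True] * 12 + [False] * 18))  # Mathematical Literacy
```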

If the MLPA has content validity, we would expect a significant but small correlation with other placement results (Accuplacer, Compass) — because the MLPA is intended to measure different properties than those assessments.  The content validity would need to be established directly, possibly by use in Math Literacy courses (as a pre-test and post-test).  Variations in the two sections of the MLPA should be highly correlated with the students’ work in the Math Lit course.

My goal in developing the MLPA version 0.5 is to provide a starting point for the community of practitioners — both mathematics faculty and companies involved in testing.  Ideally, people involved in this work would collaborate so that an improved MLPA would be available to others.  The hope is that many different institutions and organizations will become involved in developing and using a useful modern placement test, which would benefit both colleges and students.

If you would like an MS Word document of the MLPA, please send me a direct email; you are also free to work from the “pdf” version posted above.

 
