Placement in Math … What’s the GOAL?

So, I’ve been thinking quite a bit about college placement.  People are advocating ‘multiple measures’ (often done as ‘alternative measures’); Accuplacer is announcing “The Next Generation Accuplacer” during their summer conference (https://accuplacer.collegeboard.org/professionals/national-conference/concurrent-session-highlights).  There are people who believe that all placement tests are ‘evil’ (including some professional friends).  The placement tests are a tool used in a process … do we agree on the goal of that process?

At the highest level of analysis, there are 3 competing goals for math placement:

  1. Which course best reflects the student’s current skills?
  2. What is the highest course in which the student has a reasonable chance of success?
  3. What skills does a student have now, or lack now?

We often make the mistake of equating these 3 goals; they are separate ‘hypotheses’, and the outcome for a given student can be quite different.

My perception is that we, in the mathematics community, have tended to fixate on the first and last goals (skills-oriented placement).  To see how different the 3 goals are, consider this student prototype:

Cassie can solve basic equations (linear, quadratic by factoring), with a sound understanding of polynomial basics.  Cassie knows some function basics, including graphing; however, her use of slope in a flexible way is lacking.  Cassie’s main weakness is fractions, both arithmetic and algebraic; she has some memorized rules that help in some cases but produce wrong answers in general.

Goal #1 tends to find a ‘mean skill level’; since Cassie lacks some arithmetic, this mean is likely to be beginning algebra.  Goal #2 tends to find a ‘floor’ level of functioning, a minimum for success in the maximum course; depending on her program, Cassie would end up in Algebraic Literacy/Intermediate Algebra or a College Level course.  Goal #3 tends to push students down to the lowest level of gaps in skills [goal #3 is used in modular and emporium designs]; Cassie would end up in an arithmetic or pre-algebra class.
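To make the contrast concrete, here is a small sketch in Python; every skill name, score, and cutoff below is invented for illustration (a thought experiment, not any institution’s actual rules):

```python
# A toy comparison of the three placement goals.
# All skill names, scores, and course cutoffs are invented for illustration.
skills = {
    "linear_equations": 85,
    "quadratics_by_factoring": 80,
    "polynomial_basics": 80,
    "function_basics": 65,
    "slope_flexibility": 35,
    "fractions": 15,
}

# Course ladder with (hypothetical) minimum-score cutoffs.
courses = [
    ("pre-algebra", 0),
    ("beginning algebra", 45),
    ("algebraic literacy", 65),
    ("college level", 75),
]

def highest_course(score):
    """Highest course whose cutoff the given score meets."""
    return max((c for c in courses if score >= c[1]), key=lambda c: c[1])[0]

# Goal 1: 'mean skill level' -- place at the average score.
goal1 = highest_course(sum(skills.values()) / len(skills))

# Goal 2: highest course with a reasonable chance of success,
# modeled here as "at least 2/3 of the skills meet the cutoff".
goal2 = courses[0][0]
for name, cutoff in courses:
    if sum(s >= cutoff for s in skills.values()) * 3 >= 2 * len(skills):
        goal2 = name

# Goal 3: push down to the lowest skill gap.
goal3 = highest_course(min(skills.values()))

print(goal1, "|", goal2, "|", goal3)
# -> beginning algebra | algebraic literacy | pre-algebra
```

The same profile lands in three different courses, one per goal, which is the point: the goals are separate hypotheses about placement.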

Many of us are drawn to the two skill-based goals, and that is a problem.  The first goal, with its focus on ‘mean skill level’, tends to under-place students (as seen by outsiders); more importantly, students always start in a course with less challenge and more review than they need.  The third goal creates the longest sequence of learning for every student.  Both of the skill-based goals operate from the assumption that reviewing skills produces the intended results; in 40+ years in the business, I have seen only modest effect sizes for review courses in general.

The second goal has more potential to both serve our students well and provide faculty with reasonable teaching challenges.  This goal operates from a viewpoint that skills are general indicators of capabilities, and it is these capabilities that allow a student to pass math courses.  The prototype student, Cassie, is based on a student I had in beginning algebra … she is currently taking a test in my intermediate algebra class; Cassie has been very pleasant about the two courses, but I think she’s wasted a semester of math.  [My department tries to do goal #2, but our operational system produces #1 in most cases.]

Of course, there are many basic connections between the placement system goals and the design of our curriculum.  If we already had Algebraic Literacy, my student (Cassie) could have succeeded in that course in her first semester.  Our traditional curriculum (basic math, pre-algebra, beginning algebra, etc) tends to be consistent with goal #3 (the maximum number of developmental math courses).

One of the steps we often take when setting cutoffs for placement is to ask other colleges what they use … or we see a reference online or in manuals.  It’s very rare for the goal of placement to be shared when we get this information; for years, we all assumed that we shared one goal (one of the skill-based goals).

We live in a period of change in both our math courses and in the placement system.  Our developmental math courses are trending towards Math Literacy & Algebraic Literacy; the placement tests are undergoing similar shifts.  [The new Accuplacer tests will be less skill-based, more diverse, and include some reasoning.]  These changes provide us with an opportunity to re-focus our goal in placing students, and I am hoping we can develop a professional consensus on a goal of placement:

What is the highest course in which the student has a reasonable chance of success?

I am hoping that the changing tests (Accuplacer Next-Gen) will allow my institution to produce results consistent with this goal.  My draft of a Mathematical Literacy Placement Assessment is written with this goal in mind (https://www.devmathrevival.net/?p=2480).

The big question, though, is what our profession does!

 

Placement for a Modern Curriculum: A Fresh Start

The college mathematics profession has been dealing with a number of criticisms of how students are placed in their initial mathematics course.  Even before recent curricular changes, evidence suggested that the tests used for placement were likely to under-place students; a modern curriculum focusing on reasoning as well as procedure increases that placement struggle.  I’ve been working on a fresh start, and I am ready to share an initial version of a new assessment that might help solve both dimensions of the problem.

First, we need to understand the legacy of the current placement tests.  The genetic background of both Accuplacer and Compass is firmly grounded in the basic skills movements of the 1980s.  Specifically, Accuplacer grew out of instruments such as the New Jersey Basic Skills test.  The primary goal during that era was “fix all of the skills that students might lack”; the only reasoning typically included was word problems, with perhaps a bit of estimation.

Our placement needs have shifted greatly since that time.  Courses like Mathematical Literacy (and siblings Quantway [Carnegie Foundation] and Foundations of Mathematical Reasoning [Dana Center]) depend less on procedural skills like those found on placement tests; rather, the issues lie in the answers to this question:

What mathematics is this student capable of learning now?

As a coincidental benefit of the New Life Project, we have information on what these prerequisite abilities should be.  In the design of Mathematical Literacy (ML) and Algebraic Literacy (AL), our teams articulated a list of these prerequisite outcomes:

  • Mathematical Literacy prerequisite outcomes:
    1. Understand various meanings for basic operations, including relating each to diverse contextual situations
    2. Use arithmetic operations to solve stated problems (with and without the aid of technology)
    3. Order real numbers across types (decimal, fractional, and percent), including correct placement on a number line
    4. Use number sense and estimation to determine the reasonableness of an answer
    5. Apply understandings of signed numbers (integers in particular)
  • Algebraic Literacy prerequisite outcomes:
    1. Understand proportional relationships in a variety of settings, including paired data and graphs
    2. Apply properties of algebraic expressions, including distributing, like terms, and integer exponents
    3. Construct equations and inequalities to represent relationships
    4. Understand how to solve linear equations by reasoning
    5. Understand how to write and use linear and exponential functions
Using these 10 outcomes, I’ve written a “Mathematical Literacy Placement Assessment” (MLPA).  The draft 0.5 of the MLPA has 30 items, with slightly more than half on the AL prerequisites.  Here is the MLPA:

Mathematical Literacy Placement Assessment 2016 version 0.5

Note two things about the MLPA:  (1) the copyright license is Creative Commons by Attribution, which allows others to both use and modify the material as long as attribution is provided (you can use it, as long as you state the owner [Jack Rotman] … you can even modify it); (2) the MLPA has not been validated in any way … any use should begin with preliminary validation followed by improvements to the assessment.

The intent of the MLPA is to provide a modern assessment of students who might need a Math Literacy course.  The initial 12 items are prerequisites to that course, so a score on those 12 provides a measure (with reliability and validity to be determined) of a student’s readiness for Math Literacy.  The other 18 items are intended to assess whether the student needs the Math Literacy course; a score on those 18 items would indicate whether the student is ready for an Algebraic Literacy course (or possibly intermediate algebra).
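As a sketch of how the two-section scoring might work (the cutoffs of 8 and 12 here are placeholders I made up; real cutoffs would have to come out of the validation process):

```python
# Sketch of a two-section scoring rule for the MLPA draft:
# items 1-12 cover the Math Literacy prerequisites, items 13-30 cover
# readiness for Algebraic Literacy.  The cutoffs (8 and 12) are
# placeholders for illustration only; actual cutoffs require validation.
def mlpa_placement(responses):
    """responses: list of 30 booleans (item correct / incorrect)."""
    if len(responses) != 30:
        raise ValueError("MLPA draft 0.5 has 30 items")
    ml_prereq_score = sum(responses[:12])     # Math Literacy readiness
    al_readiness_score = sum(responses[12:])  # Algebraic Literacy readiness
    if al_readiness_score >= 12:
        return "Algebraic Literacy (or intermediate algebra)"
    if ml_prereq_score >= 8:
        return "Mathematical Literacy"
    return "pre-Math-Literacy support"

# Example: strong on the ML prerequisites, weaker on the AL items.
print(mlpa_placement([True] * 12 + [True] * 6 + [False] * 12))
# -> Mathematical Literacy
```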

If the MLPA has content validity, we would expect a significant but small correlation with other placement results (Accuplacer, Compass), because the MLPA is intended to measure different properties than those assessments.  The content validity would need to be established directly, possibly by use in Math Literacy courses (as a pre-test and post-test).  Variations in the two sections of the MLPA should be highly correlated with the students’ work in the Math Lit course.

My goal in developing the MLPA version 0.5 is to provide a starting point for the community of practitioners, both mathematics faculty and companies involved in testing.  Ideally, people involved in this work would collaborate so that an improved MLPA would be available to others.  The hope is that many different institutions and organizations will become involved in developing (and using) a useful modern placement test, which would benefit both colleges and students.

If you would like an MS Word document of the MLPA, please send me a direct email; you are also free to work from the “pdf” version posted above.

 

Placement: Does HS GPA Add EQUAL Value?

Many people are talking about ‘multiple measures’ placement, especially the option of using high school grade point average as an input variable.  In some locations (like mine), ‘multiple measures’ is translated as ‘HS GPA instead of a placement test’, where ‘multiple’ means ‘alternative’.  True multiple measures has some appeal: conceptually, there is an advantage to using more than one input variable when the variables measure different traits.  The HS GPA, however, involves several issues, with equity being high on my list.

As mathematicians, the first thing we should say about HS GPA is that this variable is a misuse of the raw data.  The grades in any class are barely ordinal in nature (rankings); the average used (the mean) assumes a ratio scale (equal intervals, AND a 4.0 representing twice as much as a 2.0).  When a variable has statistical flaws such as this, any further use in analysis should be suspect.  Whatever the disadvantages of tests (ACT, SAT, or placement), at least they involve appropriate use of the measures from a statistical point of view.
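A tiny illustration of the statistical point, using fabricated transcripts: once ordinal grades are averaged as if they were ratio data, very different records collapse to the same number:

```python
# Two fabricated transcripts on the usual 4-point scale.
# Grades are rankings (ordinal); the mean treats them as ratio-scale
# quantities, so very different records collapse to the same GPA.
points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

steady = ["B", "B", "B", "B", "B", "B"]    # consistent B work
erratic = ["A", "A", "A", "F", "A", "C"]   # strong work with a failure

def gpa(grades):
    return sum(points[g] for g in grades) / len(grades)

print(gpa(steady), gpa(erratic))  # -> 3.0 3.0
```

Two students with identical GPAs may be prepared for college mathematics in very different ways, and the averaging hides that.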

High school GPA has a number of confounding variables, some of which are shared by most tests used today.  In particular, economic level (SES) and ethnicity are both factors in the HS GPA picture (as they are in college GPA).  This type of analysis is widespread and the results are consistent; one such report is from the Educational Testing Service (see http://www.ets.org/Media/Research/pdf/RR-13-09.pdf).  Using HS GPA does not level the playing field, given the high correlations normally found between the measures.  In fact, my view is that using HS GPA in addition to a test will benefit mostly majority students from comfortable homes … and will again place minority and poor students in lower levels.

As an anecdotal piece of data, I was at a conference session recently on co-requisite remediation where the placement method involved tests or HS GPA.  Through the first year of their work, the co-requisite ‘add-on’ sections were almost totally minority … even more than their traditional developmental classes had been.  [The institution used a co-requisite model where all students enroll in the college course, and those not meeting a cutoff were required to enroll in the add-on section as well.]

When people try to explain the predictive ‘power’ of HS GPA, they often use ill-defined phrases such as ‘stick-to-itness’.  I suspect that our friends teaching high school would have a different point of view, where grades in the C+ to B range reflect not skills but attitudes (primarily compliance).  How can we justify using an inappropriate statistic (grades are ordinal) which measures “who knows what”?  Whatever the HS GPA measures, it is only indirectly related to preparation for college mathematics.  The connections are likely to be stronger for writing.

The arguments FOR using the HS GPA in placement are based on studies which indicate an equal or higher predictive validity, versus tests alone.  One of the better studies within mathematics is one done by ACT (see http://www.act.org/content/dam/act/unsecured/documents/5582-tech-brief-joint-use-of-act-scores-and-hs-grade-point-average.pdf).  Here is their graph:

[Figure: ACT Math score and HS GPA versus college algebra success, 2016]

This graph shows the probability of passing college algebra, with the 5 curves representing ACT Math bands (10-14, 15-19, etc.).  If a student’s ACT Math is below 20, their HS GPA does not improve their probability of success until the GPA reaches the 3.5 to 4.0 range.  The bands from 20 to 24 and above show a pattern indicating that it might help to include the GPA; since most of us use cutoffs in the 19 to 22 range, this shows some promise for using BOTH variables.

However, notice the negative indications … if the ACT math is high (over 25) and the GPA is low, the data indicates that we should place the student differently because the student has a dramatically lowered pass rate.  Perhaps THIS is the place for co-requisite remediation!  I would also point out the overall picture for HS GPA at the high end … the probability of success is varied, and depends upon the test score.
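Translated into a rule-of-thumb sketch (the bands and GPA thresholds below are hypothetical, loosely patterned on the shape of the graph; they are not taken from ACT’s tables):

```python
# Hypothetical two-variable placement rule loosely patterned on the
# shape of the ACT graph; all bands and GPA thresholds are invented
# for illustration, not taken from ACT's published tables.
def college_algebra_placement(act_math, hs_gpa):
    if act_math >= 25:
        # High score but low GPA: dramatically lower pass rates, so
        # flag for support rather than placing on the score alone.
        if hs_gpa < 2.5:
            return "college algebra + co-requisite"
        return "college algebra"
    if act_math >= 20:
        # Mid band: the GPA adds real information.
        return "college algebra" if hs_gpa >= 3.0 else "algebraic literacy"
    # Below 20: GPA helps only at the very top (3.5 and up).
    return "college algebra" if hs_gpa >= 3.5 else "algebraic literacy"

print(college_algebra_placement(27, 2.0))  # -> college algebra + co-requisite
print(college_algebra_placement(18, 3.7))  # -> college algebra
```

The design point is that the two variables interact; neither a single test cutoff nor a single GPA cutoff reproduces the pattern in the graph.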

SUMMARY:
We know that both HS GPA and tests tend to reflect inequities, where the results tend to place more minorities in developmental courses.  Although the predictive value (correlation) increases, we are using an inappropriate statistic (HS GPA) with little connection to preparation for college mathematics.  The available research suggests minor gains from using HS GPA in two situations:

  • for students just below the college math test cutoff who have a very high HS GPA
  • for students with high test scores and a low HS GPA

The same research suggests that using HS GPA alone results in an almost random assignment of students.

Placement has never been a confident endeavor; even the best measures (tests or other) are incomplete and impacted by other variables.  Placement Tests have taken a beating in recent years, a treatment which I think was not justified.  Modernizing the placement tests is a more appropriate response … an idea which I will pursue in an upcoming post.

Join Dev Math Revival on Facebook:


 

Implementing Better Math Courses, Part III: Connecting All the Dots

The traditional developmental math sequence focuses on school mathematics, biased by an algebra fixation … narrowly defined to be algebraic procedures.  Although some have the perception that this sequence serves ‘STEM students’ well, professional standards and research indicate that it does not.  In this post, I will focus on truly connecting all the dots: to STEM math and most college mathematics. #NewLifeMath #AlgebraicLiteracy

The prior posts on implementing better math courses focused on the beginning algebra level.  The Level I implementation (Pathways) described a side-by-side model; the Level II implementation (Medium) provided a total replacement of beginning algebra as well as all courses prior to that.  The next level (III) involves replacing intermediate algebra with Algebraic Literacy (AMATYC New Life project).

Here is an image of this implementation model:

[Figure: Implementation Map, “High” level, March 2016]

Algebraic Literacy provides a modern course connecting students to STEM and related college mathematics.  I’ve posted information on the course and research for Algebraic Literacy at https://www.devmathrevival.net/?page_id=2312; here, I will focus on the implementation aspects.

One benefit of this ‘high’ implementation is that we can minimize remedial enrollments while providing intentional preparation.  Because Algebraic Literacy focuses on communication and reasoning, we provide an accessible course with higher expectations — more students can start in the 2nd course, and they will be better prepared for what follows.  For example, if intermediate algebra required a 77 on a placement test, algebraic literacy can succeed with a cutoff of 60 to 65; if an ACT Math 19 is required for intermediate algebra, algebraic literacy can manage with a 17.  These numbers are very generic, and are simply meant to illustrate the increased access.

The preparation is also improved in this model.  The cumulative message of the college math standards is:

Focus on learning core ideas in mathematics to a high level. (AMATYC; MAA – CRAFTY and CUPM)

Even if students flow from Algebraic Literacy to a traditional college algebra course, they will have more capabilities.

However, the curriculum at the college algebra level (and above) is in desperate need of modernization.  Those courses are almost all either modifications of a 19th-century college algebra course or slight variations of mid-20th-century calculus.  We live in a golden age of the mathematical sciences, but our students still take courses on dead (aka obsolete) mathematics.  Having the Algebraic Literacy course in place will provide both the motivation and the safety needed for our departments to begin updating the STEM math courses.

This “High” implementation results in a total replacement of obsolete dev math courses and the beginning of renewal in the courses which follow.  The New Life Project dev math courses share much with the work of the Carnegie Foundation (Pathways) and the Dana Center (New Mathways).    The Carnegie work builds an option after the pathways courses (Statway or Quantway) to enable the student to take college algebra; the Dana Center work provides a different replacement model, where the STEM path (pre-calculus) begins right after a Math Literacy-type course.

Many in our profession would like to teach Algebraic Literacy instead of intermediate algebra; Algebraic Literacy is better mathematics and is consistent with modern teaching methods.  The main barrier to progress right now is ‘textbooks’, since there are no commercial materials available (Pearson; McGraw Hill; Cengage; Hawkes; etc).   The path out of this ‘chicken-egg’ dilemma is YOU … talk to the publisher representatives at every opportunity about the books you want to see.

A primary goal of this “High” implementation is a combination of improved preparation and minimized remedial math enrollments.  I believe that we can achieve a situation where the most common number of remedial math courses per student is 0 or 1, with a mean between those values.  We don’t need to eliminate remedial math courses … we need to modernize them to better serve our students.
