Placement: Does HS GPA Add EQUAL Value?

Many people are talking about ‘multiple measures’ placement, especially the option of using high school grade point average (HS GPA) as an input variable.  In some locations (like mine), ‘multiple measures’ is translated as ‘HS GPA instead of placement test’, where ‘multiple’ means ‘alternative’. True multiple measures has some appeal: conceptually, there is an advantage to using more than one input variable when the variables measure different traits.  Using the HS GPA, however, involves several issues, with equity high on my list.

As mathematicians, the first thing we should say about HS GPA is that this variable is a misuse of the raw data.  The grades in any class are barely ordinal in nature (rankings); the average used (the mean) presumes a ratio scale (equal intervals, AND a 4 representing twice as much as a 2).  When a variable has statistical flaws such as this, any further use in analysis should be suspect.  Whatever the disadvantages of tests (ACT, SAT, or placement), at least they involve appropriate use of the measures from a statistical point of view.
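To make the scale-of-measurement point concrete, here is a minimal sketch (hypothetical grades, standard library only): the mean treats the A-to-F ranking as if it were a ratio scale, while the median uses only the ordering, which is all an ordinal scale supports.

```python
from statistics import mean, median

# Hypothetical transcript -- letter grades are an ordinal ranking,
# but GPA converts them to numbers and averages them.
grade_points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
transcript = ["A", "C", "C", "B", "F"]

values = [grade_points[g] for g in transcript]

# The mean assumes equal intervals (the A-to-B gap equals the D-to-F gap)
# and that a 4.0 is 'twice' a 2.0 -- a ratio-scale assumption.
gpa = mean(values)       # 2.2

# The median uses only the ordering of the grades.
middle = median(values)  # 2.0, i.e., a 'C'

print(gpa, middle)
```

The point is not that the median is the ‘right’ statistic for placement, only that the familiar GPA already bakes in assumptions the underlying grades do not support.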

High school GPA has a number of confounding variables, some of which are shared by most tests used today.  In particular, economic level (SES) and ethnicity are both factors in the HS GPA picture (as they are in college GPA).  This type of analysis is widespread and the results are consistent; one such report is from the Educational Testing Service (see http://www.ets.org/Media/Research/pdf/RR-13-09.pdf ).  Using HS GPA does not level the playing field, given the high correlations normally found between the measures.  In fact, my view is that using HS GPA in addition to a test will benefit mostly majority students from comfortable homes … and will again place minority and poor students in lower levels.

As an anecdotal piece of data, I was at a conference session recently on co-requisite remediation where the placement method involved tests or HS GPA.  Through the first year of their work, the co-requisite ‘add-on’ sections were almost totally minority … even more than their traditional developmental classes had been.  [The institution used a co-requisite model where all students enroll in the college course, and those not meeting a cutoff were required to enroll in the add-on section as well.]

When people try to explain the predictive ‘power’ of HS GPA, they often use ill-defined phrases such as ‘stick-to-it-ness’.  I suspect that our friends teaching high school would have a different point of view, where grades in the C+ to B range reflect not skills but attitudes (primarily compliance).  How can we justify using an inappropriate statistic (grades are ordinal) which measures “who knows what”?  Whatever the HS GPA measures, it is only indirectly related to preparation for college mathematics.  The connections are likely stronger for writing.

The arguments FOR using the HS GPA in placement are based on studies which indicate an equal or higher predictive validity compared to tests alone.  One of the better studies within mathematics was done by ACT (see http://www.act.org/content/dam/act/unsecured/documents/5582-tech-brief-joint-use-of-act-scores-and-hs-grade-point-average.pdf).  Here is their graph:

[Figure: probability of success in college algebra by HS GPA, with separate curves for ACT Math score bands, 2016]

This graph shows the probability of passing college algebra, with the five curves representing ACT Math bands (10–14, 15–19, etc.).  If a student’s ACT Math is below 20, their HS GPA does not improve the probability of success until the GPA reaches the 3.5 to 4.0 range.  The 20 to 24 band and above show a pattern indicating that it might help to include the GPA; since most of us use cutoffs in the 19 to 22 range, this shows some promise for using BOTH variables.

However, notice the negative indications … if the ACT Math score is high (over 25) and the GPA is low, the data indicate that we should place the student differently, because that student has a dramatically lowered pass rate.  Perhaps THIS is the place for co-requisite remediation!  I would also point out the overall picture for HS GPA at the high end … the probability of success varies, and depends upon the test score.
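The two findings above suggest what a joint rule might look like.  Here is a sketch of a placement function using BOTH variables; every cutoff in it is a hypothetical illustration, not a value taken from the ACT study:

```python
# A sketch of a joint placement rule using BOTH measures.
# All cutoffs below are hypothetical illustrations, not validated values.

def place(act_math: int, hs_gpa: float) -> str:
    """Return a placement recommendation from two measures."""
    if act_math >= 25 and hs_gpa < 2.5:
        # High test score but low GPA: the graph shows a dramatically
        # lowered pass rate, so add support rather than place down.
        return "college algebra + co-requisite support"
    if act_math >= 22:
        return "college algebra"
    if act_math >= 19 and hs_gpa >= 3.5:
        # Just below the test cutoff, but a very high GPA raises
        # the predicted pass rate enough to place up.
        return "college algebra"
    return "developmental mathematics"

print(place(26, 2.0))  # college algebra + co-requisite support
print(place(20, 3.7))  # college algebra
print(place(20, 3.0))  # developmental mathematics
```

Any real rule would need local validation; the sketch only shows that the GPA matters in different directions at different test levels.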

SUMMARY:
We know that both HS GPA and tests tend to reflect inequities, with the results placing more minority students in developmental courses.  Although predictive validity increases slightly (correlation), we are using an inappropriate statistic (HS GPA) with little direct connection to preparation for college mathematics.  The available research suggests only minor gains from using HS GPA alongside a test:

  • for students just below the college math test cutoff who have a very high HS GPA
  • for identifying students with high test scores but a low HS GPA, who pass at lower rates

The research also suggests that using HS GPA alone results in an almost random assignment of students.

Placement has never been a certain endeavor; even the best measures (tests or otherwise) are incomplete and impacted by other variables.  Placement tests have taken a beating in recent years, a treatment which I think was not justified.  Modernizing the placement tests is a more appropriate response … an idea which I will pursue in an upcoming post.

Join Dev Math Revival on Facebook:


 

Implementing Better Math Courses, Part III: Connecting All the Dots

The traditional developmental math sequence focuses on school mathematics, biased by an algebra fixation … narrowly defined as algebraic procedures.  Although some have the perception that this sequence serves ‘STEM students’ well, professional standards and research indicate that it does not serve them well.  In this post, I will focus on truly connecting all the dots — to STEM math and most college mathematics. #NewLifeMath #AlgebraicLiteracy

The prior posts on implementing better math courses focused on the beginning algebra level.  The Level I implementation (Pathways) described a side-by-side model; the Level II implementation (Medium) provided a total replacement of beginning algebra as well as all courses prior to that.  The next level (III) involves replacing intermediate algebra with Algebraic Literacy (AMATYC New Life project).

Here is an image of this implementation model:

[Figure: implementation map for the “High” model, March 2016]
Algebraic Literacy provides a modern course connecting students to STEM and related college mathematics.  I’ve posted information on the course and research for Algebraic Literacy at http://www.devmathrevival.net/?page_id=2312; here, I will focus on the implementation aspects.

One benefit of this ‘high’ implementation is that we can minimize remedial enrollments while providing intentional preparation.  Because Algebraic Literacy focuses on communication and reasoning, we provide an accessible course with higher expectations — more students can start in the 2nd course, and they will be better prepared for what follows.  For example, if intermediate algebra required a 77 on a placement test, Algebraic Literacy can succeed with a cutoff of 60 to 65; if an ACT Math score of 19 is required for intermediate algebra, Algebraic Literacy can manage with a 17.  These numbers are very generic, and are simply meant to illustrate the increased access.

The preparation is also improved in this model.  The cumulative message of the college math standards is:

Focus on learning core ideas in mathematics to a high level. (AMATYC; MAA – CRAFTY and CUPM)

Even if students flow from Algebraic Literacy to a traditional college algebra course, they will have more capabilities.

However, the curriculum at the college algebra level (and above) is in desperate need of modernization.  Those courses are almost all modifications of either a 19th-century college algebra course or slight variations on mid-20th-century calculus.  We live in a golden age of the mathematical sciences, but our students still take courses on dead (aka obsolete) mathematics.  Having the Algebraic Literacy course in place will provide both the motivation and the safety needed for our departments to begin updating the STEM math courses.

This “High” implementation results in a total replacement of obsolete dev math courses and the beginning of renewal in the courses which follow.  The New Life Project dev math courses share much with the work of the Carnegie Foundation (Pathways) and the Dana Center (New Mathways).    The Carnegie work builds an option after the pathways courses (Statway or Quantway) to enable the student to take college algebra; the Dana Center work provides a different replacement model, where the STEM path (pre-calculus) begins right after a Math Literacy-type course.

Many in our profession would like to teach Algebraic Literacy instead of intermediate algebra; Algebraic Literacy is better mathematics and is consistent with modern teaching methods.  The main barrier to progress right now is ‘textbooks’, since there are no commercial materials available (Pearson; McGraw Hill; Cengage; Hawkes; etc).   The path out of this ‘chicken-egg’ dilemma is YOU … talk to the publisher representatives at every opportunity about the books you want to see.

A primary goal of this “High” implementation is a combination of improved preparation and the minimizing of the remedial math enrollment function.  I believe that we can achieve a situation where remedial math enrollments cluster at 0 and 1 courses per student, with a mean between those values.  We don’t need to eliminate remedial math courses … we need to modernize them to better serve our students.
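That enrollment goal can be stated numerically.  A sketch with purely hypothetical counts (not institutional data):

```python
from statistics import mean, mode

# Hypothetical distribution of remedial math courses per student under the
# "High" implementation -- illustrative numbers only.
courses_per_student = [0] * 55 + [1] * 40 + [2] * 5   # 100 students

print(mode(courses_per_student))   # most common count: 0
print(mean(courses_per_student))   # 0.5, between 0 and 1
```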


Why We NEED Stand-Alone Remedial Courses

Extremes are seldom a good thing.  At one extreme, we had 4 or more developmental math courses at many institutions.  In the future, we may end up with zero dev math courses — as people drink the ‘co-requisite Kool-Aid’.  Moderation is usually better than either extreme. We need to consider the diverse reasons why remedial math courses make sense.

Let’s begin with a conjecture … that it is feasible to use co-requisite remediation for students beginning any college math course.  Each of the 3 major types of introductory math courses would have the needed remediation (pre-calculus, statistics, quantitative reasoning), with each of these remediation needs being different from the others.  In some implementations, the co-requisite remediation is built on the entire content of the old dev math course; however, students typically do not need to pass the remedial component — if the college course is passed, the remedial portion is either automatically passed or does not count.

This conjecture follows a common theme in the policy world — ‘stand-alone developmental courses are a barrier to student success’.  We have some evidence that the research data does not support this conclusion — the article recently cited here, written by Peter Bahr, as well as the CUNY “ASAP” program (I’ll post about that research in the near future).  The ‘data’ used for the stand-alone statement is demographic — students who place into a dev math course (especially multiple levels below college) are far less likely to complete a college math course.

Let’s pretend that the research in favor of dev math courses is mistaken, and that the true situation is better estimated by those attacking stand-alone courses.  What are the overall consequences of ‘no more dev math courses’?

In community college programs, students are faced with quantitative issues in a variety of courses outside of mathematics.  Here is a realistic scenario:

  • In a biology course, a student needs to understand exponential functions and perhaps basic ideas of logarithms.
  • In a nursing course, a student needs to apply dimensional analysis to convert units and determine dosage.
  • In an economics class, a student needs to really understand slopes and rate of change (at least in a linear way).
  • In a chemistry class, a student needs to apply equation concepts in new ways.
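The nursing item above is essentially dimensional analysis: multiply by conversion ratios equal to 1 until the units cancel.  A small sketch, with hypothetical orders and concentrations (illustrative only, not clinical guidance):

```python
# Dimensional analysis: converting units by multiplying by ratios equal to 1.
# All orders, concentrations, and weights below are hypothetical.

def dose_in_ml(ordered_mg: float, concentration_mg_per_ml: float) -> float:
    """Convert an ordered dose in mg to a volume in mL.

    ordered [mg] * (1 mL / concentration mg) -> mL
    """
    return ordered_mg / concentration_mg_per_ml

def dose_from_weight(mg_per_kg: float, weight_lb: float) -> float:
    """Weight-based dose: lb -> kg (1 kg = 2.2 lb), then mg/kg * kg -> mg."""
    weight_kg = weight_lb / 2.2
    return mg_per_kg * weight_kg

# Order: 250 mg of a drug supplied at 125 mg/mL
print(dose_in_ml(250, 125))        # 2.0 mL

# Order: 5 mg/kg for a 110 lb patient
print(dose_from_weight(5, 110))    # about 250 mg
```

This is exactly the kind of developmental-level skill a statistics or quantitative reasoning course is unlikely to cover.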

If we no longer have stand-alone developmental math courses, there are basic consequences:

  1. ALL courses in client disciplines will also need to do remediation (unless they require a college-level math course).
  2. Courses in client disciplines that do require a college math course will need to have that course listed as a prerequisite — even if the math needed is at the developmental level — OR such client discipline courses will also need to do remediation.
  3. Courses in client disciplines will always need to do remediation if they require a college math course that does not happen to include all of the background needed.

We might face similar consequences within mathematics, though those seem minor to me.  The consequences are trivial within STEM programs, but that is small consolation to the majority of our students (and colleagues).  The mismatch situation (#3) occurs with stand-alone courses too, but will be more widespread without them.

Getting rid of stand-alone dev math courses is extremely short-sighted.  The premise is that all of a student’s needs in developmental mathematics relate to the college math course they will take.  If a student’s program is well served by statistics, does this mean that all courses in the program are well served by a statistics course?

Even if co-requisite remediation produces sustainable high levels of success, the methodology fails to support our student needs — ‘solving’ one problem while creating several others.  Eliminating stand-alone developmental math courses is not a solution at all … eliminating stand-alone courses puts our students at risk AND harms our colleagues in partner disciplines.  I would also predict that co-requisite remediation will disproportionately ill-serve those who most need our help — students of color and students from lower “SES” (the low-power students).

The root-problem is not stand-alone courses — the root problem is that we have a too-long sequence of antiquated dev math courses.  We have a model for solving this problem in the New Life Project, with two modern courses: Mathematical Literacy, and Algebraic Literacy.  Both courses modernize the curriculum so that it serves mathematics as well as our client disciplines, with a structure that allows most students to have one (at most) pre-college course.

The co-requisite movement states that our responsibility ends with the college math course.  Our relationships with other disciplines are based on a larger responsibility; our work on student success factors within our courses is based on a larger responsibility.  Declaring that “the results are in” and “co-requisite remediation WORKS” … amounts to defining a problem out of existence while ignoring the problem itself.

Nobody needs co-requisite remediation; nobody needs 4 or 5 developmental math courses.  Our students need an efficient modern system for meeting their quantitative needs in college, regardless of their prior level of success.

 

The Case for Remediation

Today, I am at a state-wide conference on developmental education (“MDEC”), where two presenters have addressed the question “is remediation a failure?”.  As you likely know, much of the recent conversation about developmental mathematics is based on a conclusion that the existing system is a failure.  The ‘failure’ or ‘success’ conclusion depends primarily on who is asking — not on the actual data itself.

The “failure” conclusion is presented by a set of change agents (CCA, CCRC, JFF); if you don’t know those acronyms, it’s worth your time to learn them (Complete College America; Community College Research Center; Jobs For the Future).  These conclusions are almost always based on a specific standard:

Of the students placed into developmental mathematics, how many of them take and pass a college-level math course.

In other words, the ‘failure’ conclusion is based on reducing the process of developmental mathematics down to a narrow and binary variable.  One of today’s presenters pointed out that the ‘failure’ conclusion for developmental math is actually an initial-college-course issue — most initial college courses have high failure rates and reduced retention to the next level.
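A quick illustration of why this standard is harsh for any multi-course sequence (the rates below are hypothetical, chosen only to show the multiplication effect):

```python
# Illustrative only: why the binary 'take and pass college math' standard
# produces low numbers for any multi-step sequence. Rates are hypothetical.
PASS_RATE = 0.70      # per-course pass rate
PERSIST_RATE = 0.80   # fraction of passers who enroll in the next course

def throughput(levels_below_college: int) -> float:
    """Fraction of entering students who eventually pass the college course."""
    p = 1.0
    for _ in range(levels_below_college):  # each dev course: pass, then persist
        p *= PASS_RATE * PERSIST_RATE
    return p * PASS_RATE                   # finally, pass the college course

print(round(throughput(0), 3))  # students placed directly into college math
print(round(throughput(2), 3))  # students placed two levels below
```

With these (made-up) rates, the same per-course performance yields roughly 70% throughput for college-ready students but only about 22% for students two levels down, which is how attrition across a sequence, not the courses themselves, drives the ‘failure’ statistic.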

The ‘success’ conclusion is reached by some researchers who employ a more sophisticated analysis.  A particular example of this is Peter Bahr, who has published several studies.  One of these is “Revisiting the Efficacy of Postsecondary Remediation”, which you can see at http://muse.jhu.edu/journals/review_of_higher_education/v033/33.2.bahr.html#b17.

My findings indicate that, with just two systematic exceptions, skill-deficient students who attain college-level English and math skill experience the various academic outcomes at rates that are very similar to those of college-prepared students who attain college-level competency in English and math. Thus, the results of this study demonstrate that postsecondary remediation is highly efficacious with respect to ameliorating both moderate and severe skill deficiencies, and both single and dual skill deficiencies, for those skill-deficient students who proceed successfully through the remedial sequence.  [discussion section of article]

In other words, students who arrive at college needing developmental mathematics achieve similar academic outcomes in completion, compared to those who arrived college-ready.  There is, of course, the problem of getting students through a sequence of developmental courses … and the problems of antiquated content.  Fixing those problems would further improve the results of remediation.

One of the issues we discuss in statistics is “know the author” … who wrote the study, and what was their motivation?  The authors who conclude ‘failure’ (CCA, CCRC, JFF) are either direct change agents or organizations designed to support change; in addition, these authors have seldom included any depth in their analysis of developmental mathematics.  Compare this to the Bahr article cited: Bahr is an academic (a sociologist) looking for patterns in data relative to larger issues of theory (equity, access, etc.), and he did extensive analysis of the developmental math curriculum within the study, prior to producing any conclusions.

Who are you going to believe?

Some of us live in places where our answer does not matter … for now, because other people in power roles have decided who they are going to believe.  We have to trust that the current storms of change will eventually subside and a more reasoned approach can be applied.

In mathematics, we have our own reasons for modernizing the curriculum; sometimes, we can make progress on this goal at the same time as the ‘directed reforms’.  Some of us may have to delay that work, until the current storm fades.

Our work is important; remediation has value.  Look for opportunities to make changes based on professional standards and decisions.

I’ll look for other research with sound designs to share.  If you are aware of any, let me know!

 
