Category: placement

The Placement Test Disaster?

For an internal project at my institution, I’ve been looking at the relationships between Accuplacer test scores, ACT Math scores, and performance in both developmental and college-level courses.  Most of the results are intended for my colleagues here at LCC.  However, some patterns in those relationships are important for us to explore together.

So, the first pattern that is troubling is this:

Students who place into a pre-calculus course based on their ACT Math score have lower outcomes than those who place based on the Accuplacer “College Level Math” test … and lower than those who needed to take intermediate algebra before pre-calculus.

We use the ‘college readiness’ standard on the ACT Math test of 22 (see https://www.act.org/content/act/en/education-and-career-planning/college-and-career-readiness-standards/benchmarks.html ).  The pattern in our data for the ACT Math is similar to results reported at other institutions … though we tend not to talk about this.

Of course, using an admissions test (ACT or SAT) for course placement is “off label” … the admissions tests were not designed for this purpose.  We tend to use the ACT option for placement in response to political pressure from administrators (internally) and from stakeholders (externally), and sometimes under the guise of “multiple measures”.  The patterns in our data suggest that the ACT Math score is only valid for placement when used in a true multiple measures system … where two or more data sources are combined to create a placement.  However, most of us operate under ‘alternative measures’, where there are different options … and a student can select the highest outcome if they wish.  Alternative measures are guaranteed to maximize the error rate in placement, with a single-measure placement test almost always providing better results.

The second pattern reflecting areas of concern:

The correlations are low between (A) the ACT Math and Accuplacer College Level Math test, and (B) the Accuplacer Algebra and Accuplacer Arithmetic tests.

The second combination is understandable in itself; the content of the Algebra and Arithmetic tests has little overlap.  The problem deals with our mythology around a sequence of math courses … that the prerequisite to algebra is ‘passing’ basic math.  Decades of our research on algebra success provide strong evidence that there is little connection between measures of arithmetic mastery and passing a first algebra course.  In spite of this, we continue to test students on arithmetic when their curricular needs are algebraic:  that is a disaster, and a tragedy.

The first ‘low correlation’ (ACT Math, College Level Math) is not what we would expect.  The content domains for the two tests have considerable overlap, and both tests measure aspects of ‘college readiness’.  As an interesting ‘tidbit’, we find that a higher proportion of minority students (African American in particular) place into pre-calculus based on the more reliable College Level Math test, while majority (white) students are more often placed based on the ACT Math … creating a bit of a role reversal (white students placed at a disadvantage).

Placement testing can add considerable value … and placement testing can create extreme problems.  For example, students with an average high school background will frequently earn a ‘college ready’ ACT Math score when they have too many gaps in their preparation for pre-calculus.  A larger problem (in terms of number of students) comes from the group of students a bit ‘below average’ … who tend to do okay on a basic algebra test but not so well on arithmetic, which results in thousands of students taking an arithmetic-based course when they could have succeeded in a first algebra course (or Math Literacy).

Those two problems are symptoms of a non-multiple-measures use of multiple measures, where alternative measures allow students to select the ‘maximum placement’ while other measures (with higher reliability) suggest a placement better matched for a success situation.

As a profession, we are under considerable pressure to avoid the use of placement tests.  Policy makers have been attacking remediation for several years now, and more reasonable advocates suggest using other measures.  The professional response is to insist on the best outcomes for students — which is true multiple measures; if that is not viable, a single-measure placement test is better than either a college-admission test or a global measure of high school (like HS GPA).

And, all of us should deal with this challenge:

Why would we require any college student to take a placement test on Arithmetic, when their college program does not specifically require proficiency in the content of that type of test?

At my institution, I don’t think that there are any programs (degrees or certificates) that require basic arithmetic.  We used to have several … back in 1980!  Technology in the workplace has shifted the quantitative needs, while our curriculum and placement have tended to remain fixated on an obsolete view.

Join Dev Math Revival on Facebook:

Are Those Tests Evil? (ACT, SAT)

So, I have been doing some work on my college’s data, relating success in a math class to either a placement test score or a score on the ACT Math section.  I shared some correlation data in an earlier post … this post is more about the validity of the measures.

One issue that has been impacting 4-year colleges & universities is the ‘test optional’ movement, where institutions make admissions decisions based on factors other than standardized tests.  This is an area of some research; one example is at http://www.act.org/content/dam/act/unsecured/documents/MS487_More-Information-More-Informed-Decisions_Web.pdf if you are interested.  Since I work at a community college, all of our admissions decisions are ‘test optional’.

Michigan uses standardized tests (ACT or SAT) as part of the required testing for students who complete high school, and the vast majority of our students did complete high school in Michigan.  Curiously, fewer than half of our students have standardized test scores on their college record.  This creates both interesting statistical questions and practical problems.

For the past several years, the ACT has been that test for Michigan high school students (a switch was made to the SAT this year).  We use the ACT standard for ‘college readiness’, which is a 22 on the ACT Math section.  That standard was determined by ACT researchers, using a criterion of “75% probability of passing college algebra” based on a very large sample of data.

A problem with this standard is that “college algebra” has several meanings in higher education.  For some people, college algebra is synonymous with a pre-calculus course; for others, college algebra is a separate course from pre-calculus.

My institution actually offers both flavors of college algebra; we have a “Pre-Calculus I” course as well as a “College Algebra” course.  The College Algebra course does not lead to standard calculus courses, but does prepare students for both applied calculus and a statistics course.  The Pre-Calculus I course is a very standard first semester course, and has a lower pass rate than College Algebra.  The prerequisite to both courses is one of (a) ACT Math 22 (b) Accuplacer College Level Math (CLM) test 55, or (c) passing our intermediate algebra course; all three of these provide the student with a “Math Level 6” indicator.  We assign a higher math level for scores significantly above the thresholds listed here.
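The prerequisite structure described above amounts to an ‘or’ of three measures.  A minimal sketch of that logic (the function name and the below-threshold behavior are my own assumptions; the actual system also assigns higher math levels for scores well above these cutoffs, which is not modeled here):

```python
def math_level(act_math=None, clm=None, passed_int_algebra=False):
    """Placement sketch based on the thresholds described above.

    Alternative measures: any ONE qualifying measure is sufficient.
    """
    if (act_math is not None and act_math >= 22) \
            or (clm is not None and clm >= 55) \
            or passed_int_algebra:
        return 6   # "Math Level 6": eligible for Pre-Calculus I / College Algebra
    return None    # below Level 6 (placement details not covered in this post)

print(math_level(act_math=23))   # 6
print(math_level(clm=40))        # None
```

Note that the ‘or’ is exactly what makes this an alternative-measures system rather than true multiple measures: no score is ever combined with another.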

So, here is what we see in one year’s data for the Pre-Calculus course:

  • ACT Math 22 to 25: 63% pass Pre-Calculus I
  • CLM 55 to 79: 81% pass Pre-Calculus I
  • Passed Intermediate Algebra: 71% pass Pre-Calculus I

The first two proportions are significantly different, and the first proportion is significantly different from the ‘75%’ threshold used by ACT.  One conclusion is that the ACT College Readiness standard is based more on other “college algebra” courses (not as much pre-calculus).
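For readers who want the significance claim made concrete: a two-proportion z-test with a pooled standard error is the standard comparison.  The group sizes below are hypothetical (the actual n’s are not reported in this post), so the z value is purely illustrative:

```python
from math import sqrt

def two_prop_ztest(p1, n1, p2, n2):
    """Two-proportion z-test using the pooled standard error."""
    x1, x2 = p1 * n1, p2 * n2              # successes in each group
    p_pool = (x1 + x2) / (n1 + n2)         # pooled pass rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical group sizes of 200 each; pass rates from the bullets above
z = two_prop_ztest(0.63, 200, 0.81, 200)
print(round(z, 2))  # |z| > 1.96 indicates significance at the 5% level
```

With groups of even modest size, an 18-point gap in pass rates is well past the usual significance threshold, which is consistent with the claim above.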

One of the things we find is that there is little correlation between the ACT and passing Pre-Calculus.  In other words, students with a 25 ACT Math are not any more likely to pass than those with a 22.  This is not quite as true with the CLM; the probability of passing increases (though slightly) with scores as they increase from the cutoff.

Now, a question is “why did so many students NOT provide the College with their ACT scores”?  Well, perhaps the better question … “Are those who did not provide the scores significantly different from those who did provide them?”  That is a workable question, though the data is not easy to come by.  The concern is that some types of students are more likely to provide the ACT scores (either white students or students from more affluent schools).

We’ve got reason to have doubts about using the ACT Math score as part of a placement cutoff, and reason to prefer the CLM for predictive power.

More of us need to analyze this type of data and share the results; very little practitioner research on the validity of standardized tests is available.

 

Multiple Measures: How Consistent are ACT Math and Accuplacer?

Like many institutions, mine allows students to place into a math course via a variety of methods.  The most common methods are the ACT Math score and the Accuplacer College Level Math (CLM) test.  I ran into a reference to a university which concluded that the ACT Math score was not a reliable predictor.

So, I’m posting a quick summary of how those two instruments agree (or not).  As part of our normal program improvement and curricular work, I have gathered information on about 800 students who were enrolled in our pre-calculus course.  Obviously, this is not a random sample of all ACT Math and all CLM scores.  However, given the selection, the two instruments should have a reasonable amount of consistency.

There were 122 students with both ACT Math and CLM scores.  Of these:

  • 74 had scores on both that produce the same course placement (61%)
  • 48 had scores such that different course placements result (39%)
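With 74 agreements out of 122, one can attach a rough uncertainty range to that 61% agreement rate.  A minimal sketch using the normal approximation (the choice of a 95% interval is mine, not from the original analysis):

```python
from math import sqrt

agree, total = 74, 122           # counts from the sample above
p = agree / total                # observed agreement rate (~0.61)
se = sqrt(p * (1 - p) / total)   # normal-approximation standard error
ci = (p - 1.96 * se, p + 1.96 * se)
print(f"agreement = {p:.1%}, 95% CI = ({ci[0]:.1%}, {ci[1]:.1%})")
```

Even at the top of that interval, roughly a third of students would receive a different placement depending on which test they happened to use.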

The vast majority of the ‘disagreement’ involved a higher ACT Math placement than CLM placement.  A quick comparison shows that students placing based on ACT Math have a lower pass rate than those who place based on the CLM.  I’ve got some more work to do in analyzing the data before identifying a hypothesis about that pattern.

For that sample of 122 students with both scores, there is a significant correlation (about 0.32).  That correlation is somewhat limited by the sample, which tends to emphasize relatively high scores (skewed distribution).  Even with that limitation, I was concerned about the small size of the correlation … I’d expect a ‘native’ correlation (all data) of about 0.7, and a reduction to 0.5 would be reasonable given the skewed sample.  That 0.32 is pretty small for these two measures.
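The expectation that restriction of range would shrink a 0.7 correlation toward 0.5 can be checked against the standard direct range-restriction formula (Thorndike Case II).  The SD ratio below is a hypothetical value I chose for illustration:

```python
from math import sqrt

def restricted_r(r_full, u):
    """Correlation after direct range restriction (Thorndike Case II).

    r_full: correlation in the full (unrestricted) population
    u:      (restricted-sample SD) / (full-population SD)
    """
    return r_full * u / sqrt(1 - r_full**2 + (r_full * u)**2)

# Hypothetical: full-range r of 0.7, restricted sample keeps ~60% of the SD
print(round(restricted_r(0.7, 0.6), 2))  # ~0.51, near the 'reasonable' 0.5
```

Under that assumption the formula lands near 0.5, so an observed 0.32 is low even after allowing for the skewed, high-scoring sample.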

Most of us use “alternate measures” (this method OR that method); low consistency between methods means our error rates will increase with the ‘or’.  If the low consistency holds up in further analysis, we should either use the most reliable predictor … or true multiple measures where we use some combination of data to determine a course placement.

I began looking at our data because I could not find studies looking at the correlation and relative placement strength of our two measures.  If you are aware of a study that provides that type of information, I’d appreciate hearing about it.


Mathematical Literacy WITHOUT a Prerequisite

Starting this Fall (August 2016) my department will begin offering a second version of our Mathematical Literacy course.  Our original Math Lit course has a prerequisite similar to beginning algebra (it’s just a little lower).  The new course will have NO math prerequisites.

So, here is the story: last year, we were asked to classify each math course as “remedial, secondary level” or “remedial, elementary level” or neither.  This request originated with the financial aid office, which is charged with implementing federal regulations that use those classifications.  Our answer was that our pre-algebra course was “remedial, elementary level” because the overwhelming majority of the content corresponded to the middle of the elementary range (K-8).  We used the Common Core and the state curriculum standards for this determination, though the result would be the same with any reference standard.

Since students cannot count “remedial, elementary level” courses toward their financial aid enrollment status, our decision had a sequence of consequences.  One of those results was that our pre-algebra course was eliminated; our last students ever to take pre-algebra at my college finished the course this week.

We could not, of course, leave the situation like that — we would have no option for students who could not qualify for our original Math Literacy course (hundreds of students per year).  Originally, we proposed a zero credit replacement course.  That course was not approved.

Our original Math Literacy course is Math105.  We (quickly!) developed a second version … Math106, “Mathematical Literacy with REVIEW”.  Math106 has no math prerequisite at all.  (It actually has a maximum, not a minimum … students who qualify for beginning algebra cannot register for Math106.)  The only prerequisites for Math106 are language skills: college-level reading (approximately) and minimal writing skills.

Currently, we are designing the curriculum to be delivered in Math106.  We are starting with some ‘extra’ class time (6 hours per week instead of 4) and hope to have tutors in the classroom.  Don’t ask how the class is going because it has not started yet.  I can tell you that we are essentially implementing the MLCS course with coverage of the prerequisite skills, based on the New Life Project course goals & outcomes.

We do hope to do a presentation at our state affiliate conference (MichMATYC, at Delta College on October 15).  I would have submitted a presentation proposal for AMATYC, but all of the work on Math106 occurred well after the deadline of Feb 1.

One of the reasons I am posting this is to say: I am very proud of my math colleagues here at LCC who are showing their commitment to students with courage and creativity.  We will deliver a course starting August 25 which did not exist anywhere on February 1 of this year.

