TPSE Math … Transforming Post Secondary Ed Mathematics

One of my Michigan colleagues recently reminded me of a national project on transforming post-secondary education mathematics, “TPSE Math”, which you can find at http://www.tpsemath.org/

This broad-based effort seeks to engage faculty and leadership from all segments of college mathematics, with an impressive leadership team.  I encourage you to check it out.

One of the first things I explored on their site deals with equity; they have a 2016 report on equity indicators (see http://www.pellinstitute.org/downloads/publications-Indicators_of_Higher_Education_Equity_in_the_US_2016_Historical_Trend_Report.pdf).  Interesting reading!

Another part of their web site that I want to look at in more detail is “MAG” (Mathematics Advisory Group), which is focused on an ‘action oriented role’.  Take a look at http://www.tpsemath.org/mag

I expect that we will all be involved with this TPSE work, to varying degrees.

 
Join Dev Math Revival on Facebook:

Progression in Math — A Different Perspective

Much is made these days of the “7 percent problem” (sometimes 8%) — the percent of those placing into the lowest math course who ever pass a college math course.  This progression ‘problem’ has fueled the pushes for big changes … including co-requisite remediation and/or the elimination of developmental mathematics.  The ‘problem’ is not as simple as these policy advocates suggest, and our job is to present a more complete picture of the real problem.

A policy brief was published in 2013 by folks at USC Rossier (Fong et al); it’s available at http://www.uscrossier.org/pullias/wp-content/uploads/2013/10/Different_View_Progression_Brief.pdf.  Their key finding is represented in this chart:

Progression alternate view USC Rossier 2013

The analysis here looks at actual student progression in a sequence, as opposed to overall counts of enrollment and passes.  This particular data is from California (more on that later), the Los Angeles City Colleges specifically.  Here is their methodology, using the arithmetic population as an example:

  1. Count those who place at a level: 15,106 place into Arithmetic
  2. In that group, count those who enroll in Arithmetic: 9,255 enroll in Arithmetic (61%)
  3. Of those enrolled, count those who pass Arithmetic: 5,961 pass Arithmetic (64%)
  4. Of those who pass Arithmetic, count those who enroll in Pre-Algebra: 4,310 enroll in Pre-Algebra (72%)
  5. Of those who pass Arithmetic and enroll in Pre-Algebra, count those who pass Pre-Algebra: 3,410 (79%)
  6. Compare this to those who place into Pre-Algebra: 68% of those placing into Pre-Algebra pass that course
  7. Of those who pass Arithmetic and then pass Pre-Algebra, count those who enroll in Elementary Algebra: 2,833 enroll in Elementary Algebra (83%)
  8. Of those who pass Arithmetic, then pass Pre-Algebra, and enroll in Elementary Algebra, count those who pass Elementary Algebra: 2,127 pass Elementary Algebra (75%)
  9. Compare this to those who place into Elementary Algebra: 70% of those placing into Elementary Algebra pass that course
  10. Of those who pass Arithmetic, then Pre-Algebra, and then Elementary Algebra, count those who enroll in Intermediate Algebra: 1,393 enroll in Intermediate Algebra (65%)
  11. Of those who pass Arithmetic, then Pre-Algebra, and then Elementary Algebra, then enroll in Intermediate Algebra, count those who pass Intermediate Algebra: 1,004 pass Intermediate Algebra (72%)
  12. Compare this to those who place directly into Intermediate Algebra: 73% of those placing into Intermediate Algebra pass that course

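The percentages in the steps above can be reproduced directly from the counts in the policy brief; each rate is simply one stage’s count divided by the previous stage’s count.  A minimal Python sketch:

```python
# Conditional progression rates for the USC Rossier arithmetic cohort.
# Counts come straight from the policy brief; each percentage is the
# count at a stage divided by the count at the stage before it.
cohort = [
    ("place into Arithmetic", 15106),
    ("enroll in Arithmetic", 9255),
    ("pass Arithmetic", 5961),
    ("enroll in Pre-Algebra", 4310),
    ("pass Pre-Algebra", 3410),
    ("enroll in Elementary Algebra", 2833),
    ("pass Elementary Algebra", 2127),
    ("enroll in Intermediate Algebra", 1393),
    ("pass Intermediate Algebra", 1004),
]

for (prev_label, prev_n), (label, n) in zip(cohort, cohort[1:]):
    print(f"{label}: {n} ({n / prev_n:.0%} of those who {prev_label})")
```

Running this reproduces the 61%, 64%, 72%, 79%, 83%, 75%, 65%, and 72% figures in the list above.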
One point of this perspective is the comparisons … in each case, the pass rates are approximately equal, and sometimes favor those who came up from the prior math course.  This is not the popular story line!

I would point out two things in addition to this data.  First, my own work on my institution’s data is not quite as positive as this; those ‘conditional probabilities’ show a disadvantage for the progression (especially at the pre-algebra to elementary algebra transition).  Second, the retention rates (from one course to the next) are of the magnitude I expect; in my presentations on ‘exponential attrition’ I often estimate this retention rate as being approximately equal to the course pass rate … and that is what their study found.

One of the points that the authors make is that the traditional progression data tends to assume that all students need to complete intermediate algebra (and then take a college math course).  Even prior to our pathways work, this assumption was not warranted — in community colleges, students have many programs to choose from, and some of them either require no mathematics or basic stuff (pre-algebra or elementary algebra).

The traditional analysis, then, is flawed in a basic, fatal way — it does not reflect real student choices and requirements.  For the same data that produced the chart above, this is the traditional analysis (from their policy brief):

Progression traditional view USC Rossier 2013

This is what we might call a ‘non-trivial difference in analysis’!  One methodology makes developmental mathematics look like a cemetery where student dreams go to die; the other makes it look like students will succeed as long as they don’t give up.   One says “Stop hurting students!!” while the other says “How can we make this even better?”

So, I’ve got to talk about the “California” comment earlier.  The policy brief reports that the math requirement changed for associate degrees, in California, during the period of their study: it started as elementary algebra, and was changed to intermediate algebra.  I don’t know if this is accurate — it fits some things I find online but conflicts with a few.  I do know that this requirement is not that appropriate (nor was elementary algebra) — these are variations of high school courses, and should not be used as a general education requirement in college.  We can do better than this.

This alternate view of progression does nothing to minimize the penalties of a long sequence.  A three-course sequence has a penalty of about 60% — we lose 60% of the students at the retention points between courses.  That is an unacceptable penalty; the New Life project provides a solution with Mathematical Literacy replacing both pre-algebra and elementary algebra (with no arithmetic either) and Algebraic Literacy replacing intermediate algebra (and also allowing about half of ‘elementary algebra students’ to start a course higher).
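That “about 60%” figure can be checked against the USC Rossier cohort: multiply the three course-to-course retention rates (the fraction of passers who enroll in the next course) and see what fraction survives all of the transition points.  A small Python sketch:

```python
# Estimate the "exponential attrition" penalty of a three-course sequence,
# using the course-to-course retention rates from the USC Rossier cohort
# (fraction of passers who enroll in the next course).
retention = [
    4310 / 5961,   # pass Arithmetic -> enroll in Pre-Algebra (~72%)
    2833 / 3410,   # pass Pre-Algebra -> enroll in Elementary Algebra (~83%)
    1393 / 2127,   # pass Elementary Algebra -> enroll in Intermediate Algebra (~65%)
]

survive = 1.0
for r in retention:
    survive *= r

print(f"fraction surviving all retention points: {survive:.0%}")
print(f"sequence penalty (lost between courses): {1 - survive:.0%}")
```

The compounded retention comes out just under 40%, so the loss at the between-course transition points alone is about 60% … consistent with the estimate in the paragraph above, before course pass rates are even considered.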

Let’s work on that question: “How can we make this even better?”


Placement in Math … What’s the GOAL?

So, I’ve been thinking quite a bit about college placement.  People are advocating ‘multiple measures’ (often done as ‘alternative measures’); Accuplacer is announcing “The Next Generation Accuplacer” during their summer conference (https://accuplacer.collegeboard.org/professionals/national-conference/concurrent-session-highlights).  There are people who believe that all placement tests are ‘evil’ (including some professional friends).  The placement tests are a tool used in a process … do we agree on the goal of that process?

At the highest level of analysis, there are 3 competing goals for math placement:

  1. Which course best reflects the student’s current skills?
  2. What is the highest course in which the student has a reasonable chance of success?
  3. What skills does a student have now, or lack now?

We often make the mistake of equating these 3 goals; they are separate ‘hypotheses’, and the outcome for a given student can be quite different.

My perception is that we, in the mathematics community, have tended to fixate on the first and last goal (skills oriented placement).  To consider how different the 3 goals are, consider this student prototype:

Cassie can solve basic equations (linear, quadratic by factoring), with sound understanding of polynomial basics.  Cassie knows some function basics, including graphing; however, her use of slope in a flexible way is lacking.  The main weakness Cassie has is in fractions, both arithmetic and algebraic — Cassie has some memorized rules that help in some cases, but produce wrong answers in general.

Goal #1 tends to find a ‘mean skill level’; since Cassie lacks some arithmetic, this mean is likely to be beginning algebra.  Goal #2 tends to find a ‘floor’ level of functioning, a minimum for success in the maximum course; depending on her program, Cassie would end up in Algebraic Literacy/Intermediate Algebra or a College Level course.  Goal #3 tends to push students down to the lowest level of gaps in skills [goal #3 is used in modular and emporium designs]; Cassie would end up in an arithmetic or pre-algebra class.

Many of us are drawn to the two skill-based goals, and that is a problem.  The first goal, with its focus on ‘mean skill level’, tends to under-place students (as seen by outsiders); more importantly, students are always starting in a course with less challenge and more review than needed.  The third goal creates the longest sequence of learning for every student.  Both of the skill-based goals operate from the assumption that reviewing skills produces the intended results; in 40+ years in the business, I have only seen modest effect sizes for review courses in general.

The second goal has more potential to both serve our students well and provide faculty with reasonable teaching challenges.  This goal operates from a viewpoint that skills are general indicators of capabilities, and it is these capabilities that allow a student to pass math courses.  The prototype student, Cassie, is based on a student I had in beginning algebra … she is currently taking a test in my intermediate algebra class; Cassie has been very pleasant about the two courses, but I think she’s wasted a semester of math.  [My department tries to do goal #2, but our operational system produces #1 in most cases.]

Of course, there are many basic connections between the placement system goals and the design of our curriculum.  If we already had Algebraic Literacy, my student (Cassie) could have succeeded in that course in her first semester.  Our traditional curriculum (basic math, pre-algebra, beginning algebra, etc) tends to be consistent with goal #3 (the maximum number of developmental math courses).

One of the steps we often take when setting cutoffs for placement is that we ask other colleges what they use … or we see a reference online or in manuals.  It’s very rare for the goal of placement to be shared when we get this information; for years, we all assumed that we shared one goal (one of the skill-based goals).

We live in a period of change in both our math courses and in the placement system.  Our developmental math courses are trending towards Math Literacy & Algebraic Literacy; the placement tests are undergoing similar shifts.  [The new Accuplacer tests will be less skill-based, more diverse, and include some reasoning.]  These changes provide us with an opportunity to re-focus our goal in placing students, and I am hoping we can develop a professional consensus on a goal of placement:

What is the highest course in which the student has a reasonable chance of success?

I am hoping that the changing tests (Accuplacer Next-Gen) will allow my institution to produce results consistent with this goal.  My draft of a Mathematical Literacy Placement Assessment is written with this goal in mind (http://www.devmathrevival.net/?p=2480).

The big question, though, is what our profession does!

 

Placement for a Modern Curriculum: A Fresh Start

The college mathematics profession has been dealing with a number of criticisms of how students are placed in their initial mathematics course.  Even before recent curricular changes, evidence suggested that the tests used for placement were likely to under-place students; a modern curriculum focusing on reasoning as well as procedure increases that placement struggle.  I’ve been working on a fresh start, and I am ready to share an initial version of a new assessment that might help solve both dimensions of the problem.

First, we need to understand the legacy of the current placement tests.  The genetic background of both Accuplacer and Compass is firmly grounded in the basic skills movements of the 1980s.  Specifically, Accuplacer grew out of instruments such as the New Jersey Basic Skills test.  The primary goal during that era was “fix all of the skills that students might lack”; the only reasoning typically included was word problems, with perhaps a bit of estimation.

Our placement needs have shifted greatly since that time.  Courses like Mathematical Literacy (and siblings Quantway [Carnegie Foundation] and Foundations of Mathematical Reasoning [Dana Center]) depend less on procedural skills like those found on placement tests; rather, the issues lie in the answers to this question:

What mathematics is this student capable of learning now?

As a coincidental benefit of the New Life Project, we have information on what these prerequisite abilities should be.  In the design of Mathematical Literacy (ML) and Algebraic Literacy (AL), our teams articulated a list of these prerequisite outcomes:

  • Mathematical Literacy prerequisite outcomes:
    1. Understand various meanings for basic operations, including relating each to diverse contextual situations
    2. Use arithmetic operations to solve stated problems (with and without the aid of technology)
    3. Order real numbers across types (decimal, fractional, and percent), including correct placement on a number line
    4. Use number sense and estimation to determine the reasonableness of an answer
    5. Apply understandings of signed numbers (integers in particular)
  • Algebraic Literacy prerequisite outcomes:
    1. Understand proportional relationships in a variety of settings, including paired data and graphs
    2. Apply properties of algebraic expressions, including distributing, like terms, and integer exponents
    3. Construct equations and inequalities to represent relationships
    4. Understand how to solve linear equations by reasoning
    5. Understand how to write and use linear and exponential functions
Using these 10 outcomes, I’ve written a “Mathematical Literacy Placement Assessment” (MLPA).  The draft 0.5 of the MLPA has 30 items, with slightly more than half on the AL prerequisites.  Here is the MLPA:

Mathematical Literacy Placement Assessment 2016 version 0.5

Note two things about the MLPA:  (1) the copyright license is Creative Commons by Attribution (you can use it, as long as you state the owner [Jack Rotman] … you can even modify it); (2) the MLPA has not been validated in any way … any use should begin with preliminary validation followed by improvements to the assessment.  In case you are not familiar with the Creative Commons by Attribution license, it allows others to both use and modify the material, as long as the attribution is provided.

The intent of the MLPA is to provide a modern assessment of students who might need a Math Literacy course.  The initial 12 items are prerequisites to that course, so a score on those 12 provides a measure (with reliability and validity to be determined) of a student’s readiness for Math Literacy.  The other 18 items are intended to assess whether the student needs the Math Literacy course; a score on those 18 items would indicate whether the student is ready for an Algebraic Literacy course (or possibly intermediate algebra).
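To make the two-section scoring concrete, here is a hypothetical Python sketch.  The 12-item and 18-item sections come from the description above; the cut scores (8 of 12, 12 of 18) and the function name are invented placeholders for illustration only, since (as noted) the MLPA has not been validated and real cut scores would have to come from local validation data.

```python
# A hypothetical scoring sketch for the MLPA's two sections: 12 items on
# Math Literacy prerequisites, 18 items on Algebraic Literacy prerequisites.
# The cut scores here are invented placeholders; real cuts must come from
# validation, since the instrument has not been validated in any way.
def mlpa_placement(ml_correct: int, al_correct: int,
                   ml_cut: int = 8, al_cut: int = 12) -> str:
    """Suggest a starting course from the two MLPA section scores."""
    if not (0 <= ml_correct <= 12 and 0 <= al_correct <= 18):
        raise ValueError("scores out of range for the 12- and 18-item sections")
    if al_correct >= al_cut:
        return "Algebraic Literacy (or intermediate algebra)"
    if ml_correct >= ml_cut:
        return "Mathematical Literacy"
    return "review/support before Mathematical Literacy"

print(mlpa_placement(ml_correct=10, al_correct=14))
print(mlpa_placement(ml_correct=9, al_correct=7))
```

Note the ordering of the checks: readiness for the higher course is tested first, consistent with the goal of placing students in the highest course in which they have a reasonable chance of success.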

If the MLPA has content validity, we would expect a significant but small correlation with other placement results (Accuplacer, Compass) — because the MLPA is intended to measure different properties than those assessments.  The content validity would need to be established directly, possibly by use in Math Literacy courses (as a pre-test and post-test).  Variations in the two sections of the MLPA should be highly correlated with the students’ work in the Math Lit course.

My goal in developing the MLPA version 0.5 is to provide a starting point for the community of practitioners — both mathematics faculty and companies involved in testing.  Ideally, people involved in this work would collaborate so that an improved MLPA would be available to others.  The hope is that many different institutions and organizations will become involved in developing — and using — a useful modern placement test, which would benefit both colleges and students.

If you would like an MS Word document of the MLPA, please send me a direct email; you are also free to work from the “pdf” version posted above.

 
