Increasing Success in Developmental Mathematics: Do we Have Any Idea?

Back in 2007, I had a conversation with two other leaders in AMATYC (Rikki Blair and Rob Kimball) concerning what we should do to make major and quantum improvements in developmental mathematics.  We shared a belief that the existing developmental mathematics courses were a disservice to students, and would remain a disservice even if we achieved 100% pass rates in those courses.  That conversation … and a whole lot of hard work … eventually led to a convening of content experts in 2009 (Seattle), the launching of the AMATYC New Life Project, and the beginnings of Carnegie Statway™ & Quantway™ as well as the Dana Center Mathways.

Imagine my surprise, then, when the National Academies of Sciences, Engineering, and Medicine released a publication this past winter titled “Increasing Student Success in Developmental Mathematics” (see https://www.nap.edu/catalog/25547/increasing-student-success-in-developmental-mathematics-proceedings-of-a-workshop), based on a workshop they held.  It is true that the planning team for this workshop did not invite me; they know that I am close to retirement, and travel has become less comfortable, so my absence is actually fine with me.  No, the surprises deal with the premise and outcomes of the workshop.

One premise of the workshop included this phrase:

“… particular attention to students who are unsuccessful in developmental mathematics …”

This phrase reflects a very biased point of view — that there is something about students which causes them to be unsuccessful in some developmental mathematics (course? courses?).  This is the biggest surprise in the report, and not a good surprise: we continue to believe and act as if there are properties of students which mean that we should change our treatment.

If you read that last sentence carefully, you may be surprised yourself at the fact that I am surprised.  After all, it’s very logical that our instructional and/or curricular treatment should vary based on some collection of student characteristics.  Yes, it is logical.  Unfortunately, the statement is anything but reasonable.  Matching instructional methods or curricular treatments to student traits is an age-old problem in education.  Research on the efficacy of trait-treatment matching has a long and somewhat depressing history, going back at least to the early 1970s.  (http://www.ets.org/research/policy_research_reports/publications/report/1972/hrav)

Trait-Treatment interactions can be managed well enough at micro-levels.  My classes, like most of ours, adjust for student differences.  The problems arise when we seek to match at higher levels of abstraction — whether it’s “learning styles” or “meta majors”.  What we know about traits quickly becomes overwhelmed by the unknown and unknowable.  A similar flaw becomes apparent when we try to impose a closed system (ALEKS for example) on an organic component of a process (a student).  Sure, if I had a series of one-on-one interviews (perhaps 10 hours worth), I could match a student with a treatment that is very likely to work for them — given what was known at the time.  That knowledge might become horribly incomplete within days, depending on the nature of changes the student is experiencing.

And even that is an optimistic ‘theoretical’ error rate for trait-treatment matching:  are you willing to be one of the “30%”?

The report itself spends quite a bit of time describing and documenting ‘promising’ practices.  The most common of these seems to involve some form of tracking — a student is in major A, or in meta-major α, so the math course is matched to the ‘needs’ of that major or meta-major.  I suspect that several people attending the workshop would disagree with my use of the word “tracking”; they might prefer “aligned with the major”.  I am surprised at the ease with which we allow this alignment/tracking to determine what is done in mathematics classes.  Actually, “discouraged” is the better word — because in most cases the mathematics chosen for a major deals with the specific quantitative work a student needs, often focusing on a minimal standard.  Are we really this willing to surrender our discipline?  Does the phrase “mathematically educated” now mean “avoids failing a low standard”?

To the extent the report deals with specific ways to ‘increase’ success in mathematics, I would rate the report as a failure.  However, when the report deals with concepts underlying our future work, you will find valid statements.  One of my favorites is from Linda Braddy:

Braddy asserted that administrators and educators are guilty of “educational malpractice” if they do not stop offering outdated, ineffective systems of mathematics instruction.  [p. 56]

I suspect the authors would not agree with me about what we should stop doing.  I also don’t know how to interpret the comma in “outdated, ineffective” — must something meet both conditions in order to be malpractice?  Should we read the comma as an “or”?  And how about dropping the word “instruction”?  It seems we should also address the outdated mathematics in our courses.

Although the workshop never claimed to be inclusive, I am also disappointed that the AMATYC New Life Project is never mentioned.  Not only did our Project produce as many implementations as (or more than) the specific efforts described in the report, but the Project’s genetic material was also used to begin two efforts which are mentioned.  The result is a report which supports the notion that AMATYC has done nothing to advance ‘success in developmental mathematics’ in the past 12 years.
