Category: Math curriculum in general

Math Education in the Face of Climate Change

Our profession is denying climate change.  We continue to create greenhouse gases with little regard for the planet or for the vulnerable organisms who become collateral damage of our ignorance.  Our organizations celebrate the isolated experiment in ‘smaller carbon footprint’ while the vast majority of our companies are focused more on tradition than on science.  We are the enemy.


This metaphor is meant to convey the tragic condition of mathematics education in the year 2019.  Some of us have had major changes imposed on us relative to developmental mathematics, changes which generally leave all other college mathematics unscathed.  Even in some of those places, we still offer just as many traditional developmental math courses.  In almost all cases, the important processes in our industry remain as they were 40 or 50 years ago.

One of the attack lines against developmental mathematics has been “remedial math is where dreams go to die”, and it is true that our traditional developmental math courses did not serve our students well.  The response to this criticism has been trimodal — some of us replaced the traditional courses with fewer and more modern courses, some of us eliminated dev math with ‘corequisite’ strategies, and the rest of us continue business as usual.

If you really want to see dream death in mathematics, study our ‘pre-calculus’ content and courses.  Students enter into the prep for calculus with dreams of being a scientist or engineer or computer scientist; they almost always experience a brain-deadening mix of algebraic procedures and memorization which seems to have the goal of eliminating the ‘unfit’ before calculus I.


In mainstream college mathematics, we hold to tradition … all of this ‘stuff’ is needed for calculus; the rationale: we have always taught this ‘stuff’ in pre-calculus.  We sometimes justify the greenhouse gases of pre-calculus by citing a contrived calculus problem which happens to require this contrived pre-calculus topic.  Our current books — including “OER” materials — for pre-calculus are still descendants of a gen-ed college algebra course never intended to be in a calculus path (see College Algebra … an Archeological Study).

Evidence of this bizarre mix of dream killing curriculum is our habit of having “college algebra for pre-calculus” as the prerequisite for pre-calculus.  College algebra has nothing to do with pre-calculus, just as pre-calculus has nothing to do with calculus (see College Algebra is Not Pre-Calculus, and Neither is Pre-calc and College Algebra is Still Not Pre-Calculus 🙁 ).

Pre-calculus is where STEM dreams go to die.

The most egregious contribution to ‘climate change’ in mathematics?  The fact that numerical methods and modeling are not integrated into the curriculum (at all levels).  All of our client disciplines are heavily dependent on a collection of matrix and modeling methods and technology.  Nobody ever needed all of the manual calculus methods we taught, but many of them were critical before computational mathematics.  With computational mathematics, fewer manual methods are needed — more conceptual rigor is required, along with content to support appropriate numerical methods.
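As one small, purely illustrative sketch of what ‘computational mathematics’ can look like (the model, numbers, and function name here are my own, not from any curriculum), consider approximating a logistic growth model with Euler’s method instead of manipulating it symbolically:

```python
# Euler's method for a logistic growth model -- the kind of numerical
# modeling that client disciplines lean on.  Illustrative sketch only.
def euler_logistic(p0, r, k, dt, steps):
    """Approximate dP/dt = r*P*(1 - P/k) with fixed-step Euler."""
    p = p0
    for _ in range(steps):
        p += dt * r * p * (1 - p / k)
    return p

# The population approaches the carrying capacity k = 1000.
print(euler_logistic(p0=10, r=0.5, k=1000, dt=0.1, steps=400))
```

A student who can reason about the step size, the stability of the method, and the long-run behavior of this model is doing exactly the conceptual work that manual antidifferentiation drills never provide.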


This image comes from a page on this blog, Envisioning Our Future, and is an attempt to envision a solution to our climate change problem.  Eliminating wasted energy and keeping dreams alive are essential criteria for judging the validity of such solutions, and I have no doubt that my ‘vision’ will not be our shared solution.  We need to work together to create viable solutions locally, share these solutions regionally, and eventually develop a national pattern of ‘good college mathematics’ courses.


Are you so attached to tradition that you are willing to contribute to this ‘climate change’ in higher education?  Or, do you want STEM dreams to live and thrive?  Perhaps you are willing to consider fundamental change to our curriculum just based on the criteria “teach good mathematics”. We might have pride in our individual teaching practices, but none of us can have pride in delivering bad or awful mathematics to tomorrow’s scientists.

Our traditional courses create dangerous levels of greenhouse gasses (bad mathematics) and contribute directly to climate change (the death of dreams).  We need to reduce our carbon footprint (more ‘good mathematics’) and actively improve the climate (student dreams).

What are you doing to ‘fight climate change’ in college mathematics?


Math Education: Changes, Progress and the Club

What works?  How do we help more of our students … all students … achieve goals supported by understanding mathematics in college?  To answer those questions, we need to correct our confusion between measurements and objects; we need to develop cohesive theories based on scientific knowledge.  We are far from that stage, and policy makers tend to be just as ill-equipped.

Co-requisite remediation in mathematics has been broadly implemented.  The process which led to widespread use of a non-solution illustrates some of my points.


The rush towards co-requisite models is based on two patterns in ‘data’:

  • Students who place into remedial mathematics courses tend to have significantly lower completion rates in programs
  • Students who pass a college math course in their first year tend to have significantly higher completion rates in programs

These patterns in the data are not in dispute.  The reason I call co-requisite remediation a non-solution, however, is the faulty reasoning which derives a solution from these ‘facts’.  I find it surprising that our administrators accept a treatment which never addresses the needs of the students — it’s all about responding to the data.  That approach, in fact, is a political method for resolving a disagreement where people on one side hold the power to make decisions.


Eliminating stand-alone developmental math courses will not enable more students to complete their college program, beyond the 15% to 25% who were underplaced by testing — and then only to the extent that this group failed to complete their remedial math courses.  The data on this question usually shows that the issue is not students failing remedial math courses; it’s retention, or never attempting the courses at all.
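A back-of-envelope bound makes the point.  The 15% to 25% underplacement range comes from the paragraph above; the fraction of underplaced students who would otherwise stall in remedial courses is a purely hypothetical number I chose for illustration:

```python
# Upper bound on completion gains from eliminating stand-alone dev math.
# The underplacement range is from the text; the 'stall' fraction is an
# ASSUMPTION for illustration only.
underplaced = (0.15, 0.25)      # share of students underplaced by testing
stall_fraction = 0.40           # assumed share of underplaced students who
                                # would otherwise fail to get past dev math

for share in underplaced:
    gain = share * stall_fraction
    print(f"underplaced {share:.0%} -> best-case completion gain {gain:.0%}")
```

Even under generous assumptions, the ceiling on program-completion gains is a single-digit to low-double-digit percentage of the cohort, not the transformation the policy marketing promises.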

Tennessee has been the most publicized ‘innovator’, and I have been confronted with the “club” that the leader of that effort was a mathematician — with the handle of the club saying “He’s a mathematician, and he thinks this is a good solution!”  As if mathematicians are never wrong :(.  Some evidence on related work in Tennessee is showing a lack of impact (https://www.insidehighered.com/news/2018/10/29/few-achievement-gains-tennessee-remedial-education-initiative).  I would expect that evidence to show that the main benefit of corequisite remediation is an increase in the proportion of students getting credit for a college math course in either statistics or quantitative reasoning (QR)/liberal arts math (LAM).

When data shows that many students can pass a college math course in statistics, QR, or LAM, I don’t see any evidence of corequisite remediation working.  Many of my trusted colleagues believe that stat, QR, and LAM require very minimal mathematics; some believe that the only real requirement is a pulse, as long as there is effort.

Why do students placed in remedial mathematics have lower completion rates?  This group of students is not homogeneous.  Some of them have reasonably solid mathematical knowledge that is ‘rusty’ … a short treatment, or just the opportunity to review, is sufficient.  A significant portion have a combination of ‘rust’ and ‘absence’; the absence usually involves core ideas of fractions, percents, and basic algebraic reasoning.  Corequisite remediation presumes to address these needs via a ‘just in time’ approach; ‘rust’ responds well to short-term treatments — absence, not so much.  Another portion of remedial math students have mostly an ‘absence’ of mathematical knowledge.

I’ve written previously on the question of ‘when will co-requisites work’ (Co-requisite Remediation: When it’s likely to work).  My comments here are more related to the meaning of ‘work’ — what is the problem we are addressing, and what measures show the degree to which we are solving it?

One of the primary reasons we have today’s mess in college mathematics is … ourselves.  For decades, we presented an image of mathematics where calculus was the default goal as shown in the image at the start of this post.  This image is so flawed that external attacks had an easy time imposing a political solution to an educational problem:  avoidance.  Of course, courses not preparing students for calculus are generally ‘easier’ to complete.  That is not the problem.  The problem is our lack of understanding related to the mathematical needs of students beyond what we find in guided pathways.

To clarify, the ‘our’ in that last sentence is a reference to everybody engaged with both policy and curriculum.  As an example of the lack of understanding, consider the question of the mathematics needed for biology in general and future medical doctors in particular.  We hear comments like “doctors don’t use calculus on the job”; I’m pretty confident that my doctor is not finding derivatives nor integrating a rational function when she determines appropriate treatment options.  However, mastering modern biology depends on conceptual understanding of rates of change along with decent modeling skills.  The problem is not that doctors don’t use calculus on the job; the problem is that our mathematics courses do not address their mathematical needs.

So, back to the two data points mentioned earlier.  The second of these dealt with an observation that students who complete a college mathematics course in their first year are more likely to complete their program.  This type of data is often used as a second rationale in the political effort to impose co-requisite remediation … get more students into that college math course right away.  Of course, the reasoning confuses correlation with causation; do groups of similar students have better completion with a college math course right away compared to delays?  Historically, students who are able to complete a college math course in the first year have one or more of these advantages:

  • Higher quality opportunities in their K-12 experience
  • Better family support
  • Lack of social risk factors
  • Presence of enabling economic factors

If these factors are involved, then we would expect a ‘math in first year’ effort to primarily mean that more students complete a college math course — not that more students complete their program.  This is, in fact, what the latest Tennessee data shows.  I have not been able to find much research on the question of whether ‘first year math’ produces the results I predict versus what the policy marketing teams suggest.  If you know of any research on this, please pass it along.
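The confounding argument is easy to demonstrate with a toy simulation (all numbers entirely made up): let a background ‘advantage’ factor drive both first-year math completion and program completion, give math timing zero causal effect, and the naive comparison still shows a large gap:

```python
import random

random.seed(0)

# Toy simulation: an unobserved 'advantage' factor (K-12 quality, family
# support, economics) drives BOTH passing a college math course in year
# one AND program completion.  Math timing itself has no causal effect.
def completion_rates(n=100_000):
    totals = {True: [0, 0], False: [0, 0]}  # math_first_year -> [completed, count]
    for _ in range(n):
        advantaged = random.random() < 0.5
        math_first_year = random.random() < (0.7 if advantaged else 0.3)
        completed = random.random() < (0.6 if advantaged else 0.2)
        totals[math_first_year][0] += completed
        totals[math_first_year][1] += 1
    return {k: done / count for k, (done, count) in totals.items()}

rates = completion_rates()
# The 'math in first year' group completes at a visibly higher rate,
# even though the model gave math timing zero causal effect.
print(rates)
```

Within each advantage stratum the two groups complete at identical rates; the aggregate gap is nothing but the correlation-causation confusion described above.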


We need to own our problems.  These problems include an antiquated curriculum in college mathematics combined with instructional practices which tend to not serve the majority of students.  Clubs with data engraved on the handle are poor choices for the forces needed to make changes; understanding based on long-term scientific theories provides a sustainable basis for progress which will serve all of our students.


Increasing Success in Developmental Mathematics: Do we Have Any Idea?

Back in 2007, I had a conversation with two other leaders in AMATYC (Rikki Blair and Rob Kimball) concerning what we should do to make major and quantum improvements in developmental mathematics.  We shared a belief that the existing developmental mathematics courses were a disservice to students, and would remain a disservice even if we achieved 100% pass rates in those courses.  That conversation … and a whole lot of hard work … eventually led to a convening of content experts in 2009 (Seattle), the launching of the AMATYC New Life Project, and the beginnings of Carnegie Statway™ & Quantway™ as well as the Dana Center Mathways.

Imagine my surprise, then, when a publication was released dealing with “Increasing Student Success in Developmental Mathematics” by the National Academies of Sciences (see https://www.nap.edu/catalog/25547/increasing-student-success-in-developmental-mathematics-proceedings-of-a-workshop) based on a workshop this past winter.  It is true that the planning team for this workshop did not invite me; they know that I am close to retirement, and travel has become less comfortable, so my absence is actually fine with me.  No, the surprises deal with the premise and outcomes of the workshop.

One premise of the workshop included this phrase:

” … particular attention to students who are unsuccessful in developmental mathematics …”

This phrase reflects a very biased point of view — that there is something about students which causes them to be unsuccessful in some developmental mathematics (course? courses?).  This is the biggest surprise in the report, and not a good surprise: we continue to believe and act as if there are properties of students which mean that we should change our treatment.

If you read that last sentence carefully, you may be surprised yourself at the fact that I am surprised.  After all, it’s very logical that our instructional and/or curricular treatment should vary based on some collection of student characteristics.  Yes, it is logical.  Unfortunately, the statement is anything but reasonable.  Matching instructional methods, or curricular treatments, to student traits is an age-old problem in education.  Research on the efficacy of trait-treatment matching has a long and somewhat depressing history, going back at least to the early 1970s.  (http://www.ets.org/research/policy_research_reports/publications/report/1972/hrav)


Trait-Treatment interactions can be managed well enough at micro-levels.  My classes, like most of ours, adjust for student differences.  The problems arise when we seek to match at higher levels of abstraction — whether it’s “learning styles” or “meta majors”.  What we know about traits quickly becomes overwhelmed by the unknown and unknowable.  A similar flaw becomes apparent when we try to impose a closed system (ALEKS for example) on an organic component of a process (a student).  Sure, if I had a series of one-on-one interviews (perhaps 10 hours worth), I could match a student with a treatment that is very likely to work for them — given what was known at the time.  That knowledge might become horribly incomplete within days, depending on the nature of changes the student is experiencing.

[Image caption: This is an optimistic ‘theoretical’ error rate:  are you willing to be one of the “30%”?]

The report itself spends quite a bit of time describing and documenting ‘promising’ practices.  The most common of these seems to involve one of various forms of tracking — a student is in major A, or in meta-major α, so the math course is matched to the ‘needs’ of that major or meta-major.  I suspect that several people attending the workshop would disagree with my use of the word “tracking”; they might prefer the phrase “aligned with the major”.  I am surprised at the ease with which we allow this alignment/tracking to determine what is done in mathematics classes.  Actually, I should use the word “discouraged” — because in most cases the mathematics chosen for a major deals with the specific quantitative work a student needs, often focusing on a minimal standard.  Are we really this willing to surrender our discipline?  Does the phrase “mathematically educated” now mean “avoids failing a low standard”?

To the extent the report deals with specific ways to ‘increase’ success in mathematics, I would rate the report as a failure.  However, when the report deals with concepts underlying our future work, you will find valid statements.  One of my favorites is from Linda Braddy:

Braddy asserted that administrators and educators are guilty of “educational malpractice” if they do not stop offering outdated, ineffective systems of mathematics instruction.  [pg 56]

I suspect the authors would not agree with me on what we should stop doing.  I also don’t know how to interpret the comma in “outdated, ineffective” — does something need to meet both conditions in order to be malpractice?  Should we insert an “or” in the statement where a comma shows?  How about if we drop the word “instruction”?  Seems like we should also address the outdated mathematics in our courses.

Although the workshop never claimed to be inclusive, I am also disappointed that the AMATYC New Life Project never gets mentioned.  Not only did our Project produce as many implementations as (or more than) the specifics described in the report, the genetic material from the Project was used to begin two efforts which are mentioned.  The result is a report which supports the notion that AMATYC has done nothing to advance ‘success in developmental mathematics’ in the past 12 years.


Where is the Position Paper on Co-Reqs? Math in the First Year?

Two major movements are “sweeping” across the college landscape — co-requisites in mathematics (and English), and “college math course in the first year”.  Those who have “drunk the Kool-Aid” see both changes as progress, while the academic response continues to be one of minor interest and waiting for real data on the impact of the changes on real students.  In this context, a lack of clear communication is equivalent to a yielding of the discussion.  In my view, AMATYC has done exactly that.

I want to make sure that you know of my long and generally positive relationship with AMATYC.  I first attended a conference of the “American Mathematical Association of Two-Year Colleges” in 1987. I had been teaching for about 14 years at that point, and was being impacted by a loss of enthusiasm for the work.  That conference was a major turning point for my professional life, as well as the start of many relationships that continued for decades.

AMATYC produces position and policy statements on a variety of topics, and these generally lag behind the need — understandable, given the processes that allow for broad input from members.  This lag time is normally a few years … when ‘handheld calculators’ were first an issue, the AMATYC statement on them was developed about 3 or 4 years later.  When issues came up about credentialing, the statement on qualifications came out about 4 years later.

We are now in the 5th year of the co-requisite infusion.  (Infusion suggests an external agent seeking to modify the internal functioning of a body.)  I don’t believe anybody in AMATYC is even considering a position statement on co-requisite mathematics.  Instead, the conferences are increasingly populated by sessions sharing experiences with implementations.  In the absence of official statements, the presence of multiple sessions on a practice amounts to an implicit approval of that practice.

Do we, as professionals or members of AMATYC, support co-requisite remediation without qualification?


Like good political strategy, the news cycle is being dominated by one ‘side’.  Our silence … individually and as AMATYC … relinquishes the power to those seeking to disrupt our work.  [And, yes, the groups have stated that they are working to disrupt our work.  I won’t name them, as I do not want to provide any more PR for them.  It’s time for our side of the story.]

I’ve posted before on the practice of co-requisite remediation; Why Does Co-Requisite Remediation “Work”? and TBR and the Co-Requisite Fraud as well as The Selfishness of the Corequisite Model.  Of course, we should consider new models.  Of course, our use of fundamentally new structures impacting students must be based on scientific evidence and research — not ‘data’ grown specifically to support a particular practice.

The other ‘movement’ (Math in the First Year) is based on some of the worst uses of statistics I have ever seen in academia.  Of course, I’ve posted on that: Policy based on Correlation: Institutionalizing Inequity.  The flaws in this movement are so obvious that I would expect any basic statistics student to spot them within a few minutes.

[Image caption: Do you see the flaw of ‘first year’ in this image?]

Does AMATYC have a position statement on ‘math in the first year’?  Nope.  There is a policy on placement, but not on a practice which places students in situations where they can suffer academic harm.  “Math in the first year” basically says that if you get a random student to pass a math course early, like the best students do, then all students will accrue the benefits of the best students (academic grit, financial stability, etc.).

I always strive to be honest, and that involves me divulging that my relationship with AMATYC has faded away.  Perhaps the current leadership is actively working with the committees to develop policy statements.  However, I do know that the latest ‘standards’ (IMPACT) have very little to guide our decisions on such basic issues; the web site for IMPACT has a link, but no content.  What does all of this silence say about us?

Will we continue to be silent?

 